Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks
Neural Information Processing Systems
Sigmoidal neural networks have been shown to approximate any continuous function arbitrarily well. This universal approximation property is one of the theoretical results most frequently cited to justify the use of sigmoidal neural networks in applications. Numerous results in the literature have established variants of this property by considering different classes of functions to be approximated, network architectures using different types of activation functions, and various approximation criteria; see, for instance, [1, 2, 3, 5, 6, 11, 12, 14, 15].
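As a rough illustration of the approximation property discussed above (this sketch is not from the paper and makes its own modeling choices): a one-hidden-layer network with sigmoidal activations can fit a continuous target such as sin on a compact interval. Here the hidden-layer weights are drawn at random and only the output weights are fit by least squares, which already drives the error down for a modest number of hidden units.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_sigmoidal_net(x, y, n_hidden=50):
    # Random hidden-layer weights and biases (fixed, not trained).
    w = rng.normal(scale=3.0, size=n_hidden)
    b = rng.normal(scale=3.0, size=n_hidden)
    # Hidden activations: one sigmoidal unit per column.
    H = sigmoid(np.outer(x, w) + b)
    # Output weights fit by linear least squares.
    c, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w, b, c

def predict(x, w, b, c):
    return sigmoid(np.outer(x, w) + b) @ c

# Continuous target on a compact interval.
x = np.linspace(-np.pi, np.pi, 200)
y = np.sin(x)

w, b, c = fit_sigmoidal_net(x, y)
err = np.max(np.abs(predict(x, w, b, c) - y))
print(f"max approximation error: {err:.4f}")
```

Increasing `n_hidden` drives the maximum error further down, in line with the universal approximation results the abstract refers to; the paper itself is concerned with lower bounds, i.e. how large such networks must be.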
Dec-31-2000