A General Constructive Upper Bound on Shallow Neural Nets Complexity

František Hakl, Vít Fojtík

arXiv.org Machine Learning 

We provide an upper bound on the number of neurons required in a shallow neural network to approximate a continuous function on a compact set to a given accuracy. The construction, inspired by a proof of the Stone-Weierstrass theorem, is explicit, and the resulting bound is more general than previous bounds of this kind, as it applies to any continuous function on any compact set.
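The setting described in the abstract can be illustrated with a minimal numerical sketch: a single-hidden-layer (shallow) network with random ReLU features whose output weights are fitted by least squares, approximating a continuous function on a compact interval. This is a generic demonstration of shallow-network approximation, not the paper's construction; the target function, weight distribution, and neuron count below are arbitrary choices for illustration.

```python
import numpy as np

def shallow_net_fit(f, a, b, n_neurons, n_grid=256, seed=0):
    """Fit a one-hidden-layer ReLU network to f on the compact set [a, b].

    Hidden weights/biases are sampled at random; only the output layer
    is trained (via least squares), so the model is a shallow network
    with n_neurons hidden units.
    """
    rng = np.random.default_rng(seed)
    x = np.linspace(a, b, n_grid)
    # Random hidden layer: ReLU(w * x + c) for each of the n_neurons units.
    w = rng.uniform(-5.0, 5.0, size=n_neurons)
    c = rng.uniform(-5.0, 5.0, size=n_neurons)
    phi = np.maximum(0.0, np.outer(x, w) + c)      # (n_grid, n_neurons)
    # Output weights minimizing the squared error on the grid.
    v, *_ = np.linalg.lstsq(phi, f(x), rcond=None)
    approx = phi @ v
    # Sup-norm error on the grid as a proxy for the uniform error.
    return np.max(np.abs(approx - f(x)))

err = shallow_net_fit(np.cos, 0.0, np.pi, n_neurons=200)
print(f"max grid error with 200 neurons: {err:.2e}")
```

In this sketch the uniform error typically shrinks as the number of hidden neurons grows, which is the quantity the paper's bound controls in terms of the target accuracy.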