An elementary proof of a universal approximation theorem
arXiv.org Artificial Intelligence
Several versions of the universal approximation theorem are known, including the very well-known ones from [1, 2, 3]. Each of them states that some collection of neural networks is dense in some space of continuous functions with respect to the uniform norm. In this short note, we present what we believe to be a new and atypically elementary proof of one such theorem. If σ is a 0-1 squashing function (a.k.a. a sigmoidal function), we show that the collection of neural networks with three hidden layers and activation function σ (except at the output) is dense in the space C(K) of real-valued continuous functions on a compact set K ⊆ ℝ^d.
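The classic elementary intuition behind such density results is that a steep squashing function σ(k(x − a)) approximates the step function 1{x > a}, so sums of shifted, scaled sigmoids can approximate any continuous function by a staircase. The sketch below (an assumption-laden illustration of that one-hidden-layer idea, not the paper's specific three-hidden-layer construction) approximates a continuous function on [0, 1] this way:

```python
import numpy as np

def sigma(t):
    """A 0-1 squashing (sigmoidal) function: tends to 0 at -inf, 1 at +inf."""
    return 1.0 / (1.0 + np.exp(-t))

def staircase_approx(f, x, n=200, k=2000.0):
    """Approximate f on [0, 1] by a sum of steep sigmoid 'steps'.

    For large k, sigma(k * (x - a)) is close to the indicator of x > a,
    so summing the increments of f over a grid yields a smooth staircase.
    """
    grid = np.linspace(0.0, 1.0, n + 1)
    vals = f(grid)
    out = np.full_like(x, vals[0])
    for j in range(1, n + 1):
        # Add the jump f(grid[j]) - f(grid[j-1]) as x crosses grid[j].
        out += (vals[j] - vals[j - 1]) * sigma(k * (x - grid[j]))
    return out

# Example: a smooth test function on the compact set [0, 1].
f = lambda x: np.sin(2 * np.pi * x) + x ** 2
x = np.linspace(0.0, 1.0, 1000)
err = np.max(np.abs(staircase_approx(f, x) - f(x)))
```

Increasing the number of grid points `n` (and the steepness `k`) drives the uniform error to zero, which is exactly density in C(K) with respect to the uniform norm.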
Jun-14-2024