Asymptotic properties of one-layer artificial neural networks with sparse connectivity

Hirsch, Christian, Neumann, Matthias, Schmidt, Volker

arXiv.org Machine Learning 

A law of large numbers for the empirical distribution of the parameters of a one-layer artificial neural network with sparse connectivity is derived in the regime where the number of neurons and the number of training iterations of stochastic gradient descent increase simultaneously.
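The following is a minimal numerical sketch of the setting described in the abstract, not the paper's construction: a one-layer network in which each neuron is wired to only a few input coordinates (sparse connectivity) is trained by stochastic gradient descent, and the empirical distribution of the outer weights is recorded as the number of neurons and the number of SGD steps grow together. All hyperparameters (input dimension, sparsity, step size, data model, scaling) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 20            # input dimension (assumed)
sparsity = 3      # nonzero input connections per neuron (assumed)

def train(N, steps, lr=0.5):
    # Random sparse masks: each neuron sees only `sparsity` input coordinates.
    mask = np.zeros((N, d))
    for i in range(N):
        mask[i, rng.choice(d, size=sparsity, replace=False)] = 1.0
    w = rng.normal(size=(N, d)) * mask      # inner weights (sparse)
    c = rng.normal(size=N)                  # outer weights

    for _ in range(steps):
        x = rng.normal(size=d)
        y = np.sin(x[0])                    # toy regression target (assumed)
        pre = w @ x
        act = np.tanh(pre)
        pred = c @ act / N                  # mean-field 1/N scaling (assumed)
        err = pred - y
        # SGD step on the squared loss; gradients respect the sparsity mask.
        grad_c = err * act / N
        grad_w = (err * c / N)[:, None] * (1 - act**2)[:, None] * x[None, :] * mask
        c -= lr * N * grad_c                # rescale so each neuron moves O(1)
        w -= lr * N * grad_w
    return c, w

# Empirical distribution of the outer weights for increasing N; under a law of
# large numbers it is expected to stabilise as N and the number of steps grow.
for N in (100, 1000):
    c, w = train(N, steps=10 * N)
    print(N, np.histogram(c, bins=5, range=(-3, 3))[0] / N)
```

The number of SGD steps is taken proportional to N here to mimic the simultaneous scaling of neurons and training iterations mentioned in the abstract; the particular proportionality constant is arbitrary.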