Stochastic Neural Networks with Infinite Width are Deterministic

Liu Ziyin, Hanlin Zhang, Xiangming Meng, Yuting Lu, Eric Xing, Masahito Ueda

arXiv.org Machine Learning 

Neural networks have achieved great success in a wide range of applications. A major extension of standard neural networks is to make them stochastic, i.e., to make the output a random function of the input. In a broad sense, stochastic neural networks include neural networks trained with dropout (Srivastava et al., 2014; Gal & Ghahramani, 2016), Bayesian networks (MacKay, 1992), variational autoencoders (VAE) (Kingma & Welling, 2013), and generative adversarial networks (Goodfellow et al., 2014). There are many reasons to make a neural network stochastic; two main ones are (1) regularization and (2) distribution modeling.
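To make the notion of "the output is a random function of the input" concrete, the following is a minimal sketch (not taken from the paper) of a one-hidden-layer ReLU network with dropout left active at inference, so that repeated forward passes on the same input return different outputs. The width, drop rate, and weight scaling are illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, width = 10, 128
W1 = rng.standard_normal((width, d_in)) / np.sqrt(d_in)   # input-to-hidden weights
w2 = rng.standard_normal(width) / np.sqrt(width)          # hidden-to-output weights

def stochastic_forward(x, p_drop=0.5):
    """One forward pass with dropout applied at inference time,
    making the scalar output a random function of the input x."""
    h = np.maximum(0.0, W1 @ x)              # ReLU hidden layer
    mask = rng.random(width) > p_drop        # fresh Bernoulli dropout mask per call
    h = h * mask / (1.0 - p_drop)            # inverted-dropout rescaling
    return float(w2 @ h)

x = rng.standard_normal(d_in)
samples = [stochastic_forward(x) for _ in range(1000)]
print(f"same input: mean output {np.mean(samples):.3f}, std {np.std(samples):.3f}")
```

The nonzero standard deviation over repeated calls is the stochasticity discussed above; the paper's title concerns how such output randomness behaves as the width is taken to infinity.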