Critical initialisation for deep signal propagation in noisy rectifier neural networks

Arnu Pretorius, Elan van Biljon, Steve Kroon, Herman Kamper

Neural Information Processing Systems 

Stochastic regularisation is an important weapon in the arsenal of a deep learning practitioner. However, despite recent theoretical advances, our understanding of how noise influences signal propagation in deep neural networks remains limited. By extending recent work based on mean field theory, we develop a new framework for signal propagation in stochastically regularised neural networks. Our noisy signal propagation theory can incorporate several common noise distributions, including additive and multiplicative Gaussian noise as well as dropout. We use this framework to investigate initialisation strategies for noisy ReLU networks. We show that no critical initialisation strategy exists when using additive noise: signal propagation explodes regardless of the selected noise distribution.
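The mean-field picture behind the abstract can be illustrated with a small simulation. For a ReLU layer with multiplicative noise ε (e.g. inverted dropout, where E[ε] = 1 and μ₂ = E[ε²] = 1/p for keep probability p), the per-layer variance map is q_l = σ_w² μ₂ q_{l-1} / 2, which suggests a noise-corrected critical weight variance of σ_w² = 2/μ₂ rather than the standard He value σ_w² = 2. The sketch below is an assumption-laden illustration of this reasoning, not the paper's code: it compares plain He initialisation against the corrected scale under dropout noise and tracks how the pre-activation variance evolves with depth.

```python
import numpy as np

def final_variance(depth, width, sigma_w2, keep_p, rng):
    """Propagate a random input through a deep noisy ReLU network
    and return the empirical variance of the final pre-activations.

    sigma_w2 is the per-layer weight variance scale (weights are
    drawn as N(0, sigma_w2 / width))."""
    h = rng.standard_normal(width)  # input with unit variance
    for _ in range(depth):
        # Inverted dropout noise: E[eps] = 1, E[eps^2] = 1 / keep_p.
        eps = rng.binomial(1, keep_p, width) / keep_p
        W = rng.standard_normal((width, width)) * np.sqrt(sigma_w2 / width)
        h = W @ (eps * np.maximum(h, 0.0))  # noisy ReLU layer
    return h.var()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    depth, width, p = 20, 1000, 0.6

    # Standard He init (sigma_w2 = 2): variance grows by ~1/p per layer.
    v_he = final_variance(depth, width, 2.0, p, rng)
    # Noise-corrected init (sigma_w2 = 2 / mu2 = 2p): variance stays stable.
    v_crit = final_variance(depth, width, 2.0 * p, p, rng)

    print(f"He init variance after {depth} layers:        {v_he:.2f}")
    print(f"Corrected init variance after {depth} layers: {v_crit:.2f}")
```

Under the variance map above, plain He initialisation should inflate the signal by roughly (1/p)^depth, while the corrected scale keeps it near its input value; the simulation makes that contrast visible at moderate width. The helper name `final_variance` and the specific depth/width/keep-probability settings are illustrative choices, not from the paper.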
