Review: Critical initialisation for deep signal propagation in noisy rectifier neural networks
–Neural Information Processing Systems
This paper analyzes signal propagation in vanilla fully-connected neural networks in the presence of noise. For ReLU networks, it concludes that the initial weight variance should be adjusted from the "He" initialization to account for the noise scale. Various empirical simulations corroborate this claim. Generally speaking, I believe studying signal propagation in random neural networks is a powerful way to build better initialization schemes and examining signal propagation in the presence of noise is an interesting direction. The paper is well-written and easy to read.
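To illustrate the claim under review, here is a minimal simulation sketch. It assumes multiplicative inverted-dropout noise with keep probability p (so the noise second moment is 1/p), and compares the standard "He" variance σ_w² = 2 against a noise-adjusted variance σ_w² = 2p; the function name and parameters are my own, not from the paper.

```python
import numpy as np

def propagate_variance(sigma_w2, keep_prob, depth=30, width=1000, seed=0):
    """Forward-propagate a random input through a deep fully-connected ReLU
    network with inverted-dropout noise; return per-layer pre-activation variance."""
    rng = np.random.default_rng(seed)
    h = rng.normal(size=width)  # unit-variance input pre-activations
    variances = []
    for _ in range(depth):
        x = np.maximum(h, 0)                  # ReLU
        mask = rng.random(width) < keep_prob  # multiplicative dropout noise
        x = x * mask / keep_prob              # inverted dropout: E[eps] = 1, E[eps^2] = 1/p
        W = rng.normal(0.0, np.sqrt(sigma_w2 / width), size=(width, width))
        h = W @ x
        variances.append(h.var())
    return np.array(variances)

p = 0.6
he = propagate_variance(2.0, p)        # standard "He" variance: signal explodes by ~1/p per layer
adjusted = propagate_variance(2.0 * p, p)  # noise-corrected variance: signal stays roughly constant
```

Under the adjusted initialization the variance map q^l = σ_w² · (1/p) · q^{l-1}/2 has a fixed point at q^l = q^{l-1}, which is the kind of depth-stable propagation the paper's simulations demonstrate.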
Oct-7-2024, 04:00:15 GMT