Synaptic Weight Noise During MLP Learning Enhances Fault-Tolerance, Generalization and Learning Trajectory
Murray, Alan F., Edwards, Peter J.
–Neural Information Processing Systems
Predictions are made in the light of these calculations which suggest that fault tolerance, generalisation ability and learning trajectory should be improved by such noise-injection. Extensive simulation experiments on two distinct classification problems substantiate the claims. The results appear to be perfectly general for all training schemes where weights are adjusted incrementally, and have wide-ranging implications for all applications, particularly those involving "inaccurate" analog neural VLSI.

1 Introduction

This paper demonstrates, both by consideration of the cost function and the learning equations, and by simulation experiments, that injection of random noise on to MLP weights during learning enhances fault-tolerance without additional supervision. We also show that the nature of the hidden node states and the learning trajectory is altered fundamentally, in a manner that improves training times and learning quality. The enhancement uses the mediating influence of noise to distribute information optimally across the existing weights.
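The technique described above can be illustrated with a minimal sketch: an MLP trained by gradient descent where each weight is perturbed by multiplicative Gaussian noise on every training pass, while the updates are applied to the clean weights. The network size, task (XOR), noise amplitude `sigma`, and learning rate below are all illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task and 2-4-1 sigmoid MLP (illustrative choices, not the paper's setup)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
sigma = 0.1   # assumed relative weight-noise amplitude
lr = 0.5      # assumed learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp(X, W1, b1, W2, b2):
    """Noise-free forward pass."""
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)

mse_before = np.mean((mlp(X, W1, b1, W2, b2) - y) ** 2)

for epoch in range(5000):
    # Inject noise: each weight becomes w * (1 + N(0, sigma^2)) for this pass
    n1 = W1 * (1 + sigma * rng.standard_normal(W1.shape))
    n2 = W2 * (1 + sigma * rng.standard_normal(W2.shape))
    h = sigmoid(X @ n1 + b1)
    out = sigmoid(h @ n2 + b2)
    err = out - y
    # Backpropagate through the *noisy* weights; update the clean ones
    d2 = err * out * (1 - out)
    d1 = (d2 @ n2.T) * h * (1 - h)
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

# Evaluate with the clean (noise-free) weights after training
pred = mlp(X, W1, b1, W2, b2)
mse_after = np.mean((pred - y) ** 2)
```

Because every update is computed under a different random perturbation, solutions that are sensitive to small weight changes incur a persistently higher cost, so training is pushed toward weight configurations that remain accurate under perturbation — the mechanism behind the fault-tolerance claim.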
Dec-31-1993