Whiteout: Gaussian Adaptive Noise Regularization in FeedForward Neural Networks

Yinan Li and Fang Liu

arXiv.org Machine Learning 

Noise injection (NI) is an approach to mitigating overfitting in feedforward neural networks (NNs). The Bernoulli NI procedure, as implemented in dropout and shakeout, has connections with $l_1$ and $l_2$ regularization of the NN model parameters and demonstrates the efficiency and feasibility of NI for regularizing NNs. We propose whiteout, a new NI regularization technique that injects adaptive Gaussian noise into NNs. Whiteout is more versatile than dropout and shakeout. We show that the optimization objective function associated with whiteout in generalized linear models has a closed-form penalty term connected with a wide range of regularization techniques, including the bridge, lasso, ridge, and elastic net penalties as special cases; it can also be extended to offer regularization similar to the adaptive lasso and the group lasso. We prove that whiteout can also be viewed as robust learning of NNs in the presence of small perturbations to the input and hidden nodes. We establish that the noise-perturbed empirical loss function with whiteout converges almost surely to the ideal loss function, and that the NN parameter estimates obtained by minimizing the former are consistent with those obtained by minimizing the latter. Computationally, whiteout can be easily incorporated into the back-propagation algorithm. The superiority of whiteout over dropout and shakeout in learning NNs from relatively small training datasets is demonstrated using the LSVT voice rehabilitation data and the LIBRAS hand movement data.
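
To make the penalty connection concrete, the following is a schematic of the closed-form result stated above for generalized linear models, assuming the whiteout noise variance takes the adaptive form $\sigma^2 |w_j|^{-\gamma} + \lambda$ with tuning parameters $\sigma^2 > 0$, $\gamma$, and $\lambda \ge 0$; the factor $c(\mathbf{x}, \mathbf{w})$ below is a placeholder for the model-dependent term, not the paper's exact expression:

\[
\mathbb{E}_{e}\big[\,\ell(\mathbf{w};\tilde{\mathbf{x}})\,\big] \;\approx\; \ell(\mathbf{w};\mathbf{x}) \;+\; c(\mathbf{x},\mathbf{w})\sum_{j}\Big(\sigma^{2}\,|w_{j}|^{\,2-\gamma} \;+\; \lambda\, w_{j}^{2}\Big).
\]

Under this schematic form, $\lambda = 0$ with $\gamma \in (0, 2)$ yields a bridge-type penalty; $\gamma = 1$ a lasso-type penalty; $\gamma = 0$ a ridge-type penalty; and $\gamma = 1$ with $\lambda > 0$ an elastic-net-type penalty, matching the special cases listed in the abstract.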

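The abstract notes that whiteout drops into back-propagation easily: noise is sampled afresh in each forward pass during training and switched off at test time. Below is a minimal NumPy sketch of one dense-layer forward pass with additive adaptive Gaussian noise, assuming the per-connection variance form sigma^2 * |W[j,k]|^(-gamma) + lambda; the function name, the eps guard, and the layer shapes are illustrative, not the authors' code:

import numpy as np

rng = np.random.default_rng(0)

def whiteout_dense_forward(x, W, b, sigma2=0.5, gamma=1.0, lam=0.0,
                           eps=1e-8, train=True):
    # x: (batch, n_in) inputs or previous hidden activations
    # W: (n_in, n_out) weights, b: (n_out,) biases
    if not train:
        return x @ W + b  # no noise injection at test time
    # Adaptive noise std per connection: smaller |w| draws larger noise,
    # pushing unimportant weights toward zero (a sparsifying effect).
    sd = np.sqrt(sigma2 * (np.abs(W) + eps) ** (-gamma) + lam)
    e = rng.normal(size=(x.shape[0],) + W.shape) * sd  # fresh draw per pass
    # Node k receives sum_j (x[b, j] + e[b, j, k]) * W[j, k] + b[k]
    return x @ W + np.einsum('bjk,jk->bk', e, W) + b

# Example: one noisy training pass through a 3 -> 2 layer
x = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
h = np.tanh(whiteout_dense_forward(x, W, np.zeros(2)))

Because the perturbed pre-activation is still an ordinary function of W given the sampled noise, the layer trains with the same back-propagation code as its noise-free counterpart, consistent with the abstract's claim.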