Overfitting in Neural Nets: Backpropagation, Conjugate Gradient, and Early Stopping

Rich Caruana, Steve Lawrence, C. Lee Giles

Neural Information Processing Systems 

The conventional wisdom is that backprop nets with excess hidden units generalize poorly. We show that nets with excess capacity generalize well when trained with backprop and early stopping. Experiments suggest two reasons for this: 1) Overfitting can vary significantly in different regions of the model. Excess capacity allows better fit to regions of high non-linearity, and backprop often avoids overfitting the regions of low non-linearity.
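The abstract turns on one mechanism: train an over-sized net with backprop while monitoring a held-out validation set, and stop when validation error stops improving. Below is a minimal NumPy sketch of such an early-stopping loop on a synthetic regression task; the network size, learning rate, patience, and data are illustrative assumptions, not the paper's experimental setup.

```python
# A minimal sketch of backprop with early stopping, assuming a small
# one-hidden-layer regression net on synthetic data (not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression task with a strongly non-linear target.
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(3 * X) + 0.1 * rng.normal(size=X.shape)

# Split into training and validation sets; the validation set drives early stopping.
X_tr, y_tr, X_va, y_va = X[:300], y[:300], X[300:], y[300:]

# "Excess capacity": far more hidden units than the task strictly needs.
n_hidden = 100
W1 = rng.normal(scale=0.5, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    return h, h @ W2 + b2             # hidden layer, network output

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

lr = 0.05
patience = 200                        # epochs to wait for a validation improvement
best_val, best_weights, wait = np.inf, None, 0

for epoch in range(20000):
    # --- plain batch backprop on the training set ---
    h, out = forward(X_tr)
    err = 2.0 * (out - y_tr) / len(X_tr)     # dLoss/dout for mean squared error
    grad_W2 = h.T @ err
    grad_b2 = err.sum(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)         # backprop through tanh
    grad_W1 = X_tr.T @ dh
    grad_b1 = dh.sum(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

    # --- early stopping: track the best validation loss seen so far ---
    val_loss = mse(forward(X_va)[1], y_va)
    if val_loss < best_val - 1e-6:
        best_val, wait = val_loss, 0
        best_weights = (W1.copy(), b1.copy(), W2.copy(), b2.copy())
    else:
        wait += 1
        if wait >= patience:
            break                             # stop before overfitting sets in

W1, b1, W2, b2 = best_weights                 # restore the best snapshot
print(f"stopped at epoch {epoch}, best validation MSE {best_val:.4f}")
```

The excess-capacity net is deliberately left unregularized apart from the stopping rule, mirroring the setup the abstract describes: the validation-loss snapshot, not the final weights, is what gets kept.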
