SIMPLIFYING NEURAL NETS BY DISCOVERING FLAT MINIMA

Sepp Hochreiter, Jürgen Schmidhuber

Neural Information Processing Systems 

We present a new algorithm for finding low complexity networks with high generalization capability. The algorithm searches for large connected regions of so-called "flat" minima of the error function. In the weight-space environment of a "flat" minimum, the error remains approximately constant. Using an MDL-based argument, flat minima can be shown to correspond to low expected overfitting. Although our algorithm requires the computation of second order derivatives, it has backprop's order of complexity. Experiments with feedforward and recurrent nets are described. In an application to stock market prediction, the method outperforms conventional backprop, weight decay, and "optimal brain surgeon".
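The following is a minimal sketch of the flatness notion the abstract describes, not the authors' Flat Minimum Search algorithm itself (which uses second order information as a regularizer during training). It illustrates the idea that around a flat minimum, weight perturbations of a given size leave the error approximately constant; all names here (flatness_score, loss_fn, epsilon, n_samples) are illustrative assumptions.

```python
import numpy as np

def flatness_score(loss_fn, w, epsilon=1e-2, n_samples=100, seed=0):
    """Estimate flatness at weight vector w: the mean increase in loss
    under random perturbations of size epsilon. Smaller is flatter."""
    rng = np.random.default_rng(seed)
    base = loss_fn(w)
    increases = []
    for _ in range(n_samples):
        # Perturb every weight uniformly within a box of half-width epsilon.
        delta = rng.uniform(-epsilon, epsilon, size=w.shape)
        increases.append(loss_fn(w + delta) - base)
    return float(np.mean(increases))

# Toy comparison: a sharp quadratic bowl vs. a flat one.
sharp = lambda w: 100.0 * np.sum(w ** 2)  # large curvature -> sharp minimum
flat = lambda w: 0.01 * np.sum(w ** 2)    # small curvature -> flat minimum

w0 = np.zeros(10)
print(flatness_score(sharp, w0))  # comparatively large loss increase
print(flatness_score(flat, w0))   # comparatively small loss increase
```

Both toy loss functions have their minimum at the same point; the score differs only because the error surface around the flat minimum rises far more slowly, which is exactly the property the MDL argument connects to low expected overfitting.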
