Chaitin-Kolmogorov Complexity and Generalization in Neural Networks

Neural Information Processing Systems 

We present a unified framework for a number of different ways of failing to generalize properly. During learning, sources of random information contaminate the network, effectively augmenting the training data with random information. The complexity of the function computed is therefore increased, and generalization is degraded. We analyze replicated networks, in which a number of identical networks are independently trained on the same data and their results averaged. We conclude that replication almost always results in a decrease in the expected complexity of the network, and that replication therefore increases expected generalization. Simulations confirming the effect are also presented.
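The following is a minimal NumPy sketch (not the authors' code) of the replication scheme the abstract describes: K identical networks are trained independently on the same data, differing only in their random initialization, and their outputs are averaged at test time. The architecture, toy regression task, and hyperparameters are illustrative assumptions, not details from the paper.

import numpy as np

def train_network(X, y, hidden=8, lr=0.1, steps=2000, seed=0):
    """Train one small tanh network on (X, y) by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    n = len(X)
    for _ in range(steps):
        h = np.tanh(X @ W1 + b1)        # hidden activations
        pred = h @ W2 + b2              # network output
        err = pred - y                  # gradient of MSE w.r.t. pred, up to a constant
        gW2 = h.T @ err / n
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h**2)  # backpropagate through tanh
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xq: np.tanh(Xq @ W1 + b1) @ W2 + b2

# Toy 1-D regression task with observation noise (an assumption for illustration).
rng = np.random.default_rng(42)
X = rng.uniform(-2, 2, (30, 1))
y = np.sin(X) + rng.normal(0, 0.1, X.shape)
X_test = np.linspace(-2, 2, 200).reshape(-1, 1)
y_test = np.sin(X_test)

K = 10  # number of replicas, identical except for random initialization
nets = [train_network(X, y, seed=k) for k in range(K)]
single_mse = np.mean((nets[0](X_test) - y_test) ** 2)
avg_pred = np.mean([net(X_test) for net in nets], axis=0)  # replicated (averaged) output
ensemble_mse = np.mean((avg_pred - y_test) ** 2)
print(f"single network test MSE: {single_mse:.4f}")
print(f"{K}-replica average MSE: {ensemble_mse:.4f}")

On a run of this sketch the averaged output will typically, though not invariably, show lower test error than a single replica, consistent with the abstract's claim that replication increases expected generalization by averaging out the random information each network absorbs during training.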