Reviews: Generalization Bounds for Neural Networks via Approximate Description Length
Neural Information Processing Systems
In this paper the authors establish upper bounds on the generalization error of classes of norm-bounded neural networks. There is a long line of literature on this exact question, and this paper claims to resolve an interesting open question in the area (at least when the depth of the network is viewed as a constant). In particular, the paper considers generalization bounds for a class of fully-connected networks of constant depth whose weight matrices have bounded norm. Work by Bartlett et al. ("Spectrally-normalized margin bounds for neural networks", ref [4] in the paper) proved an upper bound on generalization error that contains a factor growing with the (1,2)-matrix norm of each layer. If one further assumes that the depth as well as all the spectral norms are constants, then this is the dominant term (up to logarithmic factors) in their generalization bound.
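For concreteness, my recollection of the bound of [4] (notation mine, not the paper's): for a depth-$d$ network $f$ with weight matrices $A_1,\dots,A_d$, margin $\gamma$, and $n$ samples, the generalization gap is, up to logarithmic factors, on the order of

    \frac{\prod_{i=1}^{d} \|A_i\|_{\sigma}}{\gamma \sqrt{n}} \left( \sum_{i=1}^{d} \frac{\|A_i^{\top}\|_{2,1}^{2/3}}{\|A_i\|_{\sigma}^{2/3}} \right)^{3/2},

where $\|\cdot\|_{\sigma}$ is the spectral norm and $\|\cdot\|_{2,1}$ the matrix (2,1)-norm. With $d$ and the spectral norms treated as constants, the layer-wise matrix-norm sum is indeed the dominant factor.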
Jan-25-2025, 08:00:23 GMT