Can neural networks do better than the Vapnik-Chervonenkis bounds?
–Neural Information Processing Systems
These experiments are designed to test whether average generalization performance can surpass the worst-case bounds obtained from formal learning theory using the Vapnik-Chervonenkis dimension (Blumer et al., 1989). We indeed find that, in some cases, the average generalization is significantly better than the VC bound: the approach to perfect performance is exponential in the number of examples m, rather than the 1/m result of the bound. In other cases, we do find the 1/m behavior of the VC bound, and in these cases the numerical prefactor is closely related to the prefactor contained in the bound.
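The two scaling regimes contrasted in the abstract can be made concrete with a toy sketch. The snippet below compares the shapes of a worst-case VC-style bound, which decays like c/m, against an exponential approach to perfect performance, exp(-m/m0). The constants c and m0 are arbitrary illustrative choices, not values from the paper; the point is only that the exponential curve eventually falls far below the 1/m curve as the number of examples m grows.

```python
import math

def vc_style_bound(m, c=1.0):
    """Worst-case generalization error shape from a VC-style bound: ~ c/m."""
    return c / m

def exponential_decay(m, m0=20.0):
    """Exponential approach to perfect performance: exp(-m/m0)."""
    return math.exp(-m / m0)

# For small m the exponential curve can sit above the 1/m bound,
# but for large m it decays much faster.
for m in (10, 100, 1000):
    print(f"m={m:5d}  1/m bound: {vc_style_bound(m):.4e}  "
          f"exponential: {exponential_decay(m):.4e}")
```

With these illustrative constants, the exponential curve overtakes the 1/m bound somewhere between m = 10 and m = 100, mirroring the qualitative distinction the experiments probe.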
Dec-31-1991