Almost Linear VC Dimension Bounds for Piecewise Polynomial Networks
Bartlett, Peter L., Maiorov, Vitaly, Meir, Ron
We compute upper and lower bounds on the VC dimension of feedforward networks of units with piecewise polynomial activation functions. We show that if the number of layers is fixed, then the VC dimension grows as W log W, where W is the number of parameters in the network. The VC dimension is an important measure of the complexity of a class of binary-valued functions, since it characterizes the amount of data required for learning in the PAC setting (see [BEHW89, Vap82]). In this paper, we establish upper and lower bounds on the VC dimension of a specific class of multi-layered feedforward neural networks. Let F be the class of binary-valued functions computed by a feedforward neural network with W weights and k computational (non-input) units, each with a piecewise polynomial activation function.
Neural Information Processing Systems
Dec-31-1999
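
As a rough illustration of the quantities in the abstract, the sketch below (not from the paper; the architecture, layer widths, and helper name parameter_count are hypothetical) counts the parameters W of a fully connected feedforward network with ReLU units, a special case of piecewise polynomial activations, and evaluates W log W, the growth rate stated for the VC dimension when the number of layers is fixed.

```python
# Illustrative sketch only: count the parameters W of a fully connected
# feedforward network with piecewise-polynomial (here ReLU, i.e.
# piecewise-linear) units and evaluate W log W, the growth rate the
# abstract gives for the VC dimension at fixed depth.
import math

def parameter_count(layer_widths):
    """Number of weights and biases W for a fully connected network
    whose layer sizes (input, hidden..., output) are given."""
    return sum((fan_in + 1) * fan_out  # +1 accounts for each unit's bias
               for fan_in, fan_out in zip(layer_widths, layer_widths[1:]))

if __name__ == "__main__":
    widths = [10, 50, 50, 1]          # hypothetical two-hidden-layer architecture
    W = parameter_count(widths)
    print(f"W = {W}, W log W ~ {W * math.log(W):.0f}")
```

This only conveys the scale of the bound; the constants hidden in the W log W rate depend on the network depth and the degree of the polynomial pieces and are developed in the paper itself.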