From Margin to Sparsity

Thore Graepel, Ralf Herbrich, Robert C. Williamson

Neural Information Processing Systems 

We present an improvement of Novikoff's perceptron convergence theorem. Reinterpreting this mistake bound as a margin-dependent sparsity guarantee allows us to give a PAC-style generalisation error bound for the classifier learned by the perceptron learning algorithm. The bound value crucially depends on the margin a support vector machine would achieve on the same data set using the same kernel. Ironically, the bound yields better guarantees than are currently available for the support vector solution itself.
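The mistake bound underlying the abstract is Novikoff's theorem: on linearly separable data with margin gamma (for a unit-norm separator) and radius R = max ||x_i||, the perceptron makes at most (R / gamma)^2 updates; since only the mistaken points enter the solution, this is also a sparsity guarantee. Below is a minimal illustrative sketch of that bound on a synthetic data set; the data, the separator, and the margin value are assumptions for the demo, not from the paper.

```python
import numpy as np

# Illustrative sketch (assumed synthetic data, not from the paper):
# run the perceptron on linearly separable points and check that the
# number of updates obeys Novikoff's bound (R / gamma)^2.

rng = np.random.default_rng(0)

# Linearly separable 2-D data: label by the sign of the first coordinate,
# shifted away from the boundary so the margin is strictly positive.
X = rng.uniform(-1, 1, size=(200, 2))
X[:, 0] += np.where(X[:, 0] >= 0, 0.3, -0.3)
y = np.where(X[:, 0] >= 0, 1, -1)

w = np.zeros(2)
mistakes = 0
converged = False
while not converged:
    converged = True
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:   # mistake: update weights, keep sweeping
            w += yi * xi
            mistakes += 1
            converged = False

R = np.max(np.linalg.norm(X, axis=1))
# Margin achieved by the unit-norm separator u = (1, 0) on this sample.
gamma = np.min(y * X[:, 0])

# Novikoff's bound: the update count is at most (R / gamma)^2.
assert mistakes <= (R / gamma) ** 2
```

The number of updates equals the number of training points with a nonzero coefficient in the dual expansion of w, which is what lets the paper read the mistake bound as a sparsity guarantee.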
