Incremental and Decremental Support Vector Machine Learning

Gert Cauwenberghs, Tomaso Poggio

Neural Information Processing Systems 

An online recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of analytically computed steps. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance.
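
The sketch below is not the paper's incremental algorithm; it only illustrates the Kuhn-Tucker bookkeeping the abstract refers to, using scikit-learn's batch SVC solver as a stand-in and a hypothetical toy dataset. It partitions the trained points into the margin, error, and remaining sets implied by the KKT conditions, and then computes exact leave-one-out error by naive full retraining, which is the quantity the paper's decremental "unlearning" evaluates without retraining. All parameter choices (C, the RBF kernel, gamma) are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)), rng.normal(+1.0, 1.0, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

C = 1.0
clf = SVC(C=C, kernel="rbf", gamma=0.5).fit(X, y)

# KKT margin function g_i = y_i * f(x_i) - 1, where f is the decision function.
g = y * clf.decision_function(X) - 1.0

# Recover dual coefficients alpha_i (zero for non-support vectors).
alpha = np.zeros(len(X))
alpha[clf.support_] = np.abs(clf.dual_coef_.ravel())

# Partition induced by the KKT conditions at the solution.
tol = 1e-6
margin_set = np.where((alpha > tol) & (alpha < C - tol))[0]  # 0 < alpha < C, g ~ 0
error_set = np.where(alpha >= C - tol)[0]                    # alpha = C, g <= 0
rest_set = np.where(alpha <= tol)[0]                         # alpha = 0, g >= 0
print(len(margin_set), "margin,", len(error_set), "error,", len(rest_set), "remaining")

# Exact leave-one-out error by naive full retraining, shown only as a reference:
# decremental "unlearning" obtains the same result by reversing one increment.
loo_errors = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    pred = SVC(C=C, kernel="rbf", gamma=0.5).fit(X[mask], y[mask]).predict(X[i:i + 1])
    loo_errors += int(pred[0] != y[i])
print(f"leave-one-out errors: {loo_errors}/{len(X)}")
```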
