
 Vert, Régis


Consistency of one-class SVM and related algorithms

Neural Information Processing Systems

We determine the asymptotic limit of the function computed by support vector machines (SVMs) and related algorithms that minimize a regularized empirical convex loss function in the reproducing kernel Hilbert space of the Gaussian RBF kernel, in the regime where the number of examples tends to infinity, the bandwidth of the Gaussian kernel tends to 0, and the regularization parameter is held fixed.
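For concreteness, the minimization in question can be written as below. This is a standard reconstruction of the setup, not notation copied from the paper: the loss φ, the sample (X_i, Y_i), and the symbols for the bandwidth and regularization parameter are our choices. The limit is taken as n → ∞ and σ → 0 with λ held fixed; in the one-class case the labels Y_i are absent (equivalently, all set to +1).

```latex
% Regularized empirical phi-risk over the Gaussian RKHS H_sigma
% (a reconstruction in standard notation, not the paper's own):
\hat{f}_{n} \in \operatorname*{arg\,min}_{f \in H_\sigma}
  \frac{1}{n} \sum_{i=1}^{n} \phi\bigl( Y_i f(X_i) \bigr)
  + \lambda \, \| f \|_{H_\sigma}^{2},
\qquad
k_\sigma(x, x') = \exp\!\left( - \frac{\| x - x' \|^{2}}{2 \sigma^{2}} \right).
```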


Kernel Projection Machine: a New Tool for Pattern Recognition

Neural Information Processing Systems

This paper investigates the effect of Kernel Principal Component Analysis (KPCA) within the classification framework, in particular the regularization properties of this dimensionality-reduction method. KPCA has previously been used as a pre-processing step before applying an SVM, but we point out that this combination is somewhat redundant from a regularization point of view, and we propose a new algorithm, the Kernel Projection Machine, that avoids this redundancy, based on an analogy with the statistical framework of regression for a Gaussian white noise model. Preliminary experimental results show that this algorithm achieves the same performance as an SVM.
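A minimal sketch (not the authors' code) of the contrast the abstract draws: an RBF SVM regularized through its margin penalty, versus a KPM-style pipeline in which the number of retained kernel principal components d is itself the capacity-control knob, chosen by cross-validation. The dataset, kernel width, and candidate grid for d are illustrative assumptions.

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC, SVC

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: plain RBF SVM, regularized through the margin penalty C.
svm = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X_tr, y_tr)

# KPM-style model: project onto the first d kernel principal components,
# then fit an almost-unregularized linear classifier on top; capacity is
# controlled by the projection dimension d alone.
kpm = GridSearchCV(
    Pipeline([
        ("kpca", KernelPCA(kernel="rbf", gamma=1.0)),
        ("clf", LinearSVC(C=1e4, max_iter=50_000)),
    ]),
    param_grid={"kpca__n_components": [2, 5, 10, 20, 50]},
    cv=5,
).fit(X_tr, y_tr)

print("SVM test accuracy:", svm.score(X_te, y_te))
print("KPM test accuracy:", kpm.score(X_te, y_te),
      "with d =", kpm.best_params_["kpca__n_components"])
```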

