Collaborating Authors

Zwald, Laurent


On the Convergence of Eigenspaces in Kernel Principal Component Analysis

Neural Information Processing Systems

This paper presents a non-asymptotic statistical analysis of Kernel-PCA with a focus different from that of previous work on this topic. Here, instead of considering the reconstruction error of KPCA, we are interested in approximation error bounds for the eigenspaces themselves. We prove an upper bound that depends on the spacing between eigenvalues but not on the dimensionality of the eigenspace. As a consequence, this allows us to infer stability results for these estimated spaces.
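To make the objects in this abstract concrete, here is a minimal Python sketch, not the paper's code: it estimates the top-d empirical KPCA eigenspace from the doubly centered Gram matrix and reports the eigenvalue spacing that the bound depends on. The Gaussian RBF kernel and the function names (rbf_kernel, kpca_eigenspace) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Y (assumed kernel choice)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def kpca_eigenspace(X, d, gamma=1.0):
    # Top-d empirical KPCA eigenspace. Returns the leading eigenvalues of the
    # empirical covariance operator, the coefficients A such that the j-th
    # estimated eigenfunction is sum_i A[i, j] * k(x_i, .), and the eigenvalue
    # gap lambda_d - lambda_{d+1} on which the approximation bound depends.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = H @ K @ H                               # doubly centered Gram matrix
    evals, evecs = np.linalg.eigh(Kc)            # ascending eigenvalues
    evals, evecs = evals[::-1], evecs[:, ::-1]   # reorder to descending
    lam = evals / n                              # covariance-operator eigenvalues
    # normalize coefficients so the estimated eigenfunctions have unit norm
    A = evecs[:, :d] / np.sqrt(np.maximum(evals[:d], 1e-12))
    gap = lam[d - 1] - lam[d]                    # spacing entering the bound
    return lam[:d], A, gap
```

For example, lam, A, gap = kpca_eigenspace(np.random.randn(200, 5), d=3) estimates a 3-dimensional eigenspace; informally, the smaller gap is, the weaker the resulting stability guarantee.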


Kernel Projection Machine: a New Tool for Pattern Recognition

Neural Information Processing Systems

This paper investigates the effect of Kernel Principal Component Analysis (KPCA) within the classification framework, essentially the regularization properties of this dimensionality reduction method. KPCA has previously been used as a pre-processing step before applying an SVM, but we point out that this combination is somewhat redundant from a regularization point of view, and we propose a new algorithm, called the Kernel Projection Machine, that avoids this redundancy, based on an analogy with the statistical framework of regression for a Gaussian white noise model. Preliminary experimental results show that this algorithm achieves the same performance as an SVM.
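As a rough illustration of the idea, the following Python sketch implements a KPM-style classifier under stated assumptions: the projection dimension D acts as the sole regularizer (no additional SVM-style penalty), and a simple least-squares fit on the KPCA coordinates stands in for whatever empirical loss the paper actually minimizes. The names kpm_fit and kpm_predict are hypothetical, not the authors' implementation.

```python
import numpy as np

def kpm_fit(K, y, D):
    # Kernel Projection Machine sketch: project the data onto the top-D
    # KPCA coordinates, then fit by least squares; D is the only regularizer.
    # K: (n, n) training Gram matrix, y: labels in {-1, +1}.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                                  # centered Gram matrix
    evals, evecs = np.linalg.eigh(Kc)
    idx = np.argsort(evals)[::-1][:D]               # top-D directions
    lam, U = evals[idx], evecs[:, idx]
    Z = U * np.sqrt(np.maximum(lam, 0.0))           # training KPCA coordinates
    w, *_ = np.linalg.lstsq(Z, y.astype(float), rcond=None)
    return {"K": K, "U": U, "lam": lam, "w": w}

def kpm_predict(model, K_test):
    # K_test: (m, n) kernel values between test and training points.
    K, U, lam, w = model["K"], model["U"], model["lam"], model["w"]
    # center the test kernel rows consistently with the training centering
    Kc_test = (K_test - K_test.mean(axis=1, keepdims=True)
               - K.mean(axis=0) + K.mean())
    Z_test = Kc_test @ U / np.sqrt(np.maximum(lam, 1e-12))
    return np.sign(Z_test @ w)
```

In this sketch the model-selection problem reduces to choosing D, e.g. by validation error, which is the sense in which the KPCA step itself carries the regularization.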

