New Probabilistic Bounds on Eigenvalues and Eigenvectors of Random Kernel Matrices
Nima Reyhani, Hideitsu Hino, Ricardo Vigario
Kernel methods are successful approaches to a variety of machine learning problems. This success is rooted mainly in the use of feature maps and kernel matrices. Some methods rely on the eigenvalues/eigenvectors of the kernel matrix, while for others the spectral information can be used to estimate the excess risk. An important question remains: how close are the sample eigenvalues/eigenvectors to their population counterparts? In this paper, we improve earlier results on concentration bounds for eigenvalues of general kernel matrices. The obstacles to sharper bounds are also identified and partially addressed. As a case study, we derive a concentration inequality for sample kernel target-alignment.

1 INTRODUCTION

Kernel methods such as Spectral Clustering, Kernel Principal Component Analysis (KPCA), and Support Vector Machines are successful approaches to many practical machine learning and data analysis problems (Steinwart & Christmann, 2008). The main ingredient of these methods is the kernel matrix, which is built by evaluating the kernel function at the given sample points.
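To make the object of study concrete, the following is a minimal sketch of forming a sample kernel (Gram) matrix and extracting its eigenvalues; the choice of a Gaussian (RBF) kernel and the bandwidth parameter `sigma` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel_matrix(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    # Pairwise squared distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2<x, y>.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))       # 50 sample points in R^3
K = rbf_kernel_matrix(X)               # 50 x 50 symmetric PSD kernel matrix
eigvals = np.linalg.eigvalsh(K)[::-1]  # sample eigenvalues, descending order
```

These sample eigenvalues are the random quantities whose concentration around their population counterparts the paper's bounds address.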
Feb-14-2012