Positive semi-definite embedding for dimensionality reduction and out-of-sample extensions
Fanuel, Michaël, Aspeel, Antoine, Delvenne, Jean-Charles, Suykens, Johan A. K.
In machine learning and statistics, it is often desirable to reduce the dimensionality of a sample of data points in a high-dimensional space $\mathbb{R}^d$. This paper introduces a dimensionality reduction method in which the embedding coordinates are the eigenvectors of a positive semi-definite kernel obtained as the solution of an infinite-dimensional analogue of a semi-definite program. This embedding is adaptive and non-linear. A main feature of our approach is the existence of a non-linear out-of-sample extension formula for the embedding coordinates, called a projected Nystr\"om approximation. This extrapolation formula extends the kernel matrix to a data-dependent Mercer kernel function. Our empirical results indicate that this embedding method is more robust to outliers than a spectral embedding method.
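To make the out-of-sample idea concrete, here is a minimal sketch of the *classical* Nystr\"om extension of spectral embedding coordinates (not the paper's projected variant): given a PSD kernel matrix $K$ with eigenpairs $(\lambda_j, u_j)$, a new point $x$ receives the coordinate $\phi_j(x) = \lambda_j^{-1} \sum_i k(x, x_i)\, u_{ij}$. The RBF kernel, the sample size, and the function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sets X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))          # toy training sample
K = rbf_kernel(X, X)                  # PSD kernel matrix

# eigendecomposition of the symmetric PSD kernel matrix
lam, U = np.linalg.eigh(K)
k = 2                                 # embedding dimension
lam_k, U_k = lam[-k:], U[:, -k:]      # top-k eigenpairs (eigh sorts ascending)

def nystrom_extend(x_new):
    # classical Nystrom out-of-sample formula:
    #   phi_j(x) = (1 / lambda_j) * sum_i k(x, x_i) * U_ij
    kx = rbf_kernel(np.atleast_2d(x_new), X)
    return kx @ U_k / lam_k

# applied to the training points themselves, the formula reproduces
# the eigenvector coordinates exactly, since K @ U_k = U_k * lam_k
emb_train = nystrom_extend(X)
print(np.allclose(emb_train, U_k))    # True
```

The paper's projected Nystr\"om approximation modifies this formula non-linearly; the sketch above only shows the baseline extension it builds on.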
Oct-6-2020