A Bi-level Nonlinear Eigenvector Algorithm for Wasserstein Discriminant Analysis
Dong Min Roh, Zhaojun Bai, Ren-Cang Li
As widely used feature extraction approaches in machine learning, dimensionality reduction (DR) methods [53, 7, 20, 12] learn projections such that the projected lower-dimensional subspaces maintain the coherent structure of datasets while reducing the computational costs of classification or clustering. The linear projection obtained from a linear DR method takes the form of a matrix, so that embedding into the lower-dimensional subspace involves only matrix multiplications. Owing to this ease of interpretation and implementation, linear DR methods are often the favored choice among the many available DR methods. For example, principal component analysis (PCA) [24], one of the most common and well-known DR methods, seeks a linear projection that preserves the dataset's variation. Other well-known DR methods include Fisher linear discriminant analysis (LDA) [24], which takes class information into account and computes a linear projection that best separates the classes, and Mahalanobis metric learning [35], which seeks a distance metric, induced by a linear projection, that better models relationships within a dataset. Wasserstein discriminant analysis (WDA) [19] is a supervised linear DR method based on the use of regularized Wasserstein distances [13] as the distance metric. Similar to LDA, WDA seeks a projection matrix that maximizes the dispersion of projected points between different classes and minimizes the dispersion of projected points within the same class.
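As a rough sketch, the WDA criterion described above can be stated as a trace-ratio-style optimization in the form commonly used for WDA [19]. The symbols below ($P$ for the projection matrix, $X^c$ for the samples of class $c$, $W_\lambda$ for the entropy-regularized Wasserstein distance) are notation assumed here for illustration, not taken from this page:

```latex
% Hedged sketch of the WDA objective in the spirit of [19].
% Notation (P, X^c, W_lambda) is assumed for illustration only.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
\[
  \max_{\substack{P \in \mathbb{R}^{d \times p} \\ P^{\top} P = I_p}}
  \frac{\displaystyle\sum_{c < c'} W_{\lambda}\!\left(P^{\top} X^{c},\, P^{\top} X^{c'}\right)}
       {\displaystyle\sum_{c} W_{\lambda}\!\left(P^{\top} X^{c},\, P^{\top} X^{c}\right)},
\]
where $W_{\lambda}$ denotes the entropy-regularized Wasserstein distance
between the projected empirical distributions of two classes: the numerator
measures between-class dispersion and the denominator within-class dispersion.
\end{document}
```

Because the transport plans inside $W_\lambda$ themselves depend on $P$, the resulting problem is nonlinear in $P$, which is what motivates the bi-level nonlinear eigenvector algorithm of the title.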
arXiv.org Artificial Intelligence
Jul-27-2023