Invariant subspaces and PCA in nearly matrix multiplication time

Neural Information Processing Systems 

Approximating invariant subspaces of generalized eigenvalue problems (GEPs) is a fundamental computational problem at the core of machine learning and scientific computing. It lies, for example, at the root of Principal Component Analysis (PCA) for dimensionality reduction, data visualization, and noise filtering, and of Density Functional Theory (DFT), arguably the most popular method for calculating the electronic structure of materials.
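
The abstract frames PCA as an instance of a GEP, i.e. finding the dominant invariant subspace of a pencil (A, B). The following minimal sketch (not the paper's algorithm; it uses a dense SciPy solver and synthetic data purely for illustration) shows that connection: with A the sample covariance and B the identity, the top-k generalized eigenvectors span the principal subspace.

```python
# Minimal sketch: PCA as a generalized eigenvalue problem A v = lambda B v.
# Assumptions: dense symmetric A, SPD B = I, synthetic data; this is only an
# illustration of the problem the paper studies, not its fast algorithm.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))     # hypothetical data matrix (n samples x d features)
Xc = X - X.mean(axis=0)                # center the columns

A = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance
B = np.eye(A.shape[0])                 # B = I reduces the GEP to ordinary PCA

k = 5
evals, evecs = eigh(A, B)              # generalized eigenpairs, ascending eigenvalues
top = evecs[:, ::-1][:, :k]            # basis of the dominant k-dimensional invariant subspace

Z = Xc @ top                           # k-dimensional principal components
```

Choosing a nontrivial SPD matrix B (e.g. a whitening or metric matrix) yields the general GEP setting the abstract refers to; the dense solver above scales cubically, which is exactly the cost the paper's nearly matrix-multiplication-time approach aims to beat.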
