Robustness of Minimum-Volume Nonnegative Matrix Factorization under an Expanded Sufficiently Scattered Condition

Barbarino, Giovanni, Gillis, Nicolas, Saha, Subhayan

arXiv.org Machine Learning 

Low-rank approximations are a central tool in data analysis, being equivalent to linear dimensionality reduction techniques, with PCA and the truncated SVD as the workhorse approaches [60, 59, 45]. However, because such decompositions are highly non-unique, the information they provide is hardly interpretable. This motivated researchers to introduce more constrained low-rank approximations. Among them, nonnegative matrix factorization (NMF) focuses on nonnegative input matrices X and requires the factors, W and H, to be nonnegative entry-wise. Nonnegativity is motivated by physical constraints, such as nonnegative sources and activations in hyperspectral imaging [9], chemometrics [15], and audio source separation [52], and by probabilistic modeling, such as topic modeling [39, 3] and unmixing of independent distributions [38]. Moreover, NMF leads to an easily interpretable and parts-based representation of the data [39]. See also [13, 19, 25] and the references therein.
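To make the NMF model X ≈ WH with entry-wise nonnegative W and H concrete, here is a minimal numerical sketch (not the method studied in this paper): the classical Lee–Seung multiplicative updates for the Frobenius-norm objective, applied with NumPy to a small matrix that admits an exact rank-2 NMF. The matrix X, the rank r, and the iteration count are illustrative choices.

```python
import numpy as np

# Toy nonnegative data matrix; its third row is the sum of the first two,
# so it admits an exact nonnegative factorization of rank r = 2.
X = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0],
              [1.0, 1.0, 1.0]])
r = 2

# Random strictly positive initialization of the factors W (3 x r) and H (r x 3).
rng = np.random.default_rng(0)
W = rng.random((3, r)) + 0.1
H = rng.random((r, 3)) + 0.1

# Lee-Seung multiplicative updates for min ||X - WH||_F^2 s.t. W, H >= 0.
# Multiplying by nonnegative ratios keeps both factors nonnegative throughout.
eps = 1e-12  # guards against division by zero
for _ in range(2000):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

print(np.linalg.norm(X - W @ H))  # small reconstruction error
```

Note that this sketch only finds *a* nonnegative factorization; as the surrounding text emphasizes, such decompositions are generally non-unique, which is precisely what identifiability conditions such as the sufficiently scattered condition address.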