Upper Bound of Bayesian Generalization Error in Non-negative Matrix Factorization

Naoki Hayashi, Sumio Watanabe

arXiv.org Machine Learning 

Recently, nonnegative matrix factorization (NMF) [1, 2] has been applied to text mining [3], signal processing [4, 5, 6], bioinformatics [7], and consumer analysis [8]. Experiments have shown that NMF gives rise to new knowledge discovery methods; however, its mathematical properties as a learning machine have not yet been clarified, since it is not a regular statistical model. A statistical model is called regular if the map from a parameter to a probability density function is one-to-one and if the likelihood function can be approximated by a Gaussian function. It has been proved that, if a statistical model is regular and the true distribution is realizable by the model, then the generalization error is asymptotically equal to d/(2n), where d is the dimension of the parameter, n is the sample size, and the generalization error is the expected Kullback-Leibler divergence between the true distribution and the estimated learning machine. However, the statistical model used in NMF is not regular, because the map from a parameter to a probability density function is not injective.
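As a sketch in standard notation (the symbols q, p(x | X^n), G(n), U, V, and P below are our labels, not taken from the abstract itself), the generalization error and the regular-model asymptotics can be written as follows, together with a minimal illustration of the non-injectivity:

% Generalization error: expected Kullback-Leibler divergence between the
% true distribution q(x) and the Bayesian predictive distribution
% p(x | X^n) built from a sample X^n = (X_1, ..., X_n).
\[
  G(n) = \mathbb{E}_{X^n}\left[ \int q(x) \log \frac{q(x)}{p(x \mid X^n)} \, dx \right].
\]
% For a regular model whose true distribution is realizable, with a
% d-dimensional parameter and sample size n,
\[
  G(n) = \frac{d}{2n} + o\!\left(\frac{1}{n}\right).
\]
% Non-injectivity of the NMF parameterization: factor a nonnegative
% matrix as X \approx UV with U, V entrywise nonnegative. For any
% monomial matrix P (a permutation times a positive diagonal), P^{-1}
% is also nonnegative and
\[
  UV = (UP)(P^{-1}V),
\]
% so the distinct parameters (U, V) and (UP, P^{-1}V) induce the same
% distribution; the map from parameters to densities is not one-to-one.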
