Global Coordination of Local Linear Models
Roweis, Sam T., Saul, Lawrence K., Hinton, Geoffrey E.
Neural Information Processing Systems
High dimensional data that lies on or near a low dimensional manifold can be described by a collection of local linear models. Such a description, however, does not provide a global parameterization of the manifold, arguably an important goal of unsupervised learning. In this paper, we show how to learn a collection of local linear models that solves this more difficult problem. Our local linear models are represented by a mixture of factor analyzers, and the "global coordination" of these models is achieved by adding a regularizing term to the standard maximum likelihood objective function. The regularizer breaks a degeneracy in the mixture model's parameter space, favoring models whose internal coordinate systems are aligned in a consistent way. As a result, the internal coordinates change smoothly and continuously as one traverses a connected path on the manifold, even when the path crosses the domains of many different local models. The regularizer takes the form of a Kullback-Leibler divergence and illustrates an unexpected application of variational methods: not to perform approximate inference in intractable probabilistic models, but to learn more useful internal representations in tractable ones.
Dec-31-2002
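
The penalized objective described in the abstract can be written schematically as follows. This is a paraphrase of its general form, not the paper's exact notation: here x_n denotes a data point, s the index of a local factor analyzer, g the global (internal) coordinates, and Q a factorized variational posterior whose Gaussian form over g is what enforces the coordination.

\[
\Omega \;=\; \sum_{n}\Big[\,\log P(x_n)\;-\;D_{\mathrm{KL}}\big(Q(g,s\mid x_n)\,\big\|\,P(g,s\mid x_n)\big)\Big],
\qquad
Q(g,s\mid x_n)\;=\;Q(s\mid x_n)\,Q(g\mid x_n).
\]

Maximizing \(\Omega\) trades off data likelihood against the KL penalty; the penalty vanishes only when the true posterior itself factorizes and is unimodal in g, which is what favors mixture components whose internal coordinate systems are mutually aligned.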