A fast, universal algorithm to learn parametric nonlinear embeddings

Miguel A. Carreira-Perpinan, Max Vladymyrov

Neural Information Processing Systems 

Nonlinear embedding algorithms such as stochastic neighbor embedding do dimensionality reduction by optimizing an objective function involving similarities between pairs of input patterns. The result is a low-dimensional projection of each input pattern. A common way to define an out-of-sample mapping is to optimize the objective directly over a parametric mapping of the inputs, such as a neural net. This can be done using the chain rule and a nonlinear optimizer, but is very slow, because the objective involves a quadratic number of terms each dependent on the entire mapping's parameters. Using the method of auxiliary coordinates, we derive a training algorithm that works by alternating steps that train an auxiliary embedding with steps that train the mapping. This has two advantages: 1) The algorithm is universal in that a specific learning algorithm for any choice of embedding and mapping can be constructed by simply reusing existing algorithms for the embedding and for the mapping. A user can then try possible mappings and embeddings with less effort.
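The alternation described above can be illustrated with a minimal sketch. This is not the paper's implementation: the embedding objective below is a hypothetical attract/repel toy stand-in for an SNE-style objective, and the mapping is linear (F(x) = xA) rather than a neural net, so the F-step has a closed form. The Z-step optimizes the auxiliary coordinates Z under a quadratic penalty tying them to the mapping's outputs; the F-step refits the mapping to the current Z.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 points in 2-D, to be embedded in 1-D.
X = rng.normal(size=(20, 2))
W = np.exp(-np.sum((X[:, None] - X[None, :]) ** 2, axis=2))  # pairwise similarities

def embedding_grad(Z, lam=1.0):
    """Gradient of a toy embedding objective (hypothetical stand-in for
    an SNE-style objective): attract similar pairs, repel all pairs."""
    D = Z[:, None, :] - Z[None, :, :]                        # pairwise differences
    attract = (W[:, :, None] * D).sum(axis=1)                # pull similar points together
    repel = (np.exp(-np.sum(D**2, axis=2))[:, :, None] * D).sum(axis=1)
    return attract - lam * repel

def mac_train(X, mu=1.0, n_outer=20, n_inner=50, lr=1e-3):
    """MAC-style alternation with a linear mapping F(x) = x @ A.

    Z-step: gradient descent on embedding objective + (mu/2)||Z - XA||^2.
    F-step: least-squares fit of A to the current auxiliary coordinates Z.
    """
    N, d = X.shape
    Z = 0.1 * rng.normal(size=(N, 1))                        # auxiliary coordinates
    A = np.zeros((d, 1))
    for _ in range(n_outer):
        for _ in range(n_inner):                             # Z-step
            Z -= lr * (embedding_grad(Z) + mu * (Z - X @ A))
        A, *_ = np.linalg.lstsq(X, Z, rcond=None)            # F-step (closed form)
    return A, Z

A, Z = mac_train(X)
print(A.shape, Z.shape)
```

Because each step reuses a standard routine (any embedding optimizer for the Z-step, any regression fit for the F-step), swapping in a different embedding or mapping only changes the corresponding step, which is the universality the abstract describes.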
