Nearly Isometric Embedding by Relaxation

James McQueen, Marina Meila, Dominique Joncas

Neural Information Processing Systems 

Many manifold learning algorithms aim to create embeddings with low or no distortion (i.e., isometric embeddings). If the data has intrinsic dimension d, it is often impossible to obtain an isometric embedding in d dimensions, but possible in s > d dimensions. Yet most geometry-preserving algorithms cannot do the latter. This paper proposes an embedding algorithm to overcome this limitation. The algorithm accepts as input, besides the intrinsic dimension d, an embedding dimension s ≥ d. For any data embedding Y, we compute a Loss(Y), based on the push-forward Riemannian metric associated with Y, which measures the deviation of Y from isometry. Riemannian Relaxation iteratively updates Y in order to decrease Loss(Y). The experiments confirm the superiority of our algorithm in obtaining low-distortion embeddings.
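The core idea, iteratively updating an embedding Y by gradient descent to decrease a distortion loss, can be sketched as follows. This is a minimal illustration, not the paper's method: in place of the push-forward Riemannian metric loss it uses a much simpler surrogate (squared deviation of pairwise embedding distances from target distances), and the names `relax` and `local_stress` are hypothetical.

```python
import numpy as np

def local_stress(Y, pairs, d_target):
    """Simplified distortion surrogate: squared deviation of pairwise
    distances in the embedding Y from the target distances d_target."""
    diffs = Y[pairs[:, 0]] - Y[pairs[:, 1]]
    dists = np.linalg.norm(diffs, axis=1)
    return np.sum((dists - d_target) ** 2)

def relax(Y, pairs, d_target, lr=1e-3, n_iter=200):
    """Relaxation loop: repeatedly update Y along the negative gradient
    of the surrogate loss so that the distortion decreases."""
    Y = Y.copy()
    for _ in range(n_iter):
        diffs = Y[pairs[:, 0]] - Y[pairs[:, 1]]
        dists = np.linalg.norm(diffs, axis=1) + 1e-12  # avoid divide-by-zero
        # gradient of (||y_i - y_j|| - d_ij)^2 w.r.t. y_i is
        # 2 (||y_i - y_j|| - d_ij) (y_i - y_j) / ||y_i - y_j||
        g = (2.0 * (dists - d_target) / dists)[:, None] * diffs
        grad = np.zeros_like(Y)
        np.add.at(grad, pairs[:, 0], g)
        np.add.at(grad, pairs[:, 1], -g)
        Y -= lr * grad
    return Y

# Toy usage: perturb a 3-d point cloud and relax it back toward the
# configuration whose pairwise distances match the targets.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
pairs = np.array([(i, j) for i in range(30) for j in range(i + 1, 30)])
d_target = np.linalg.norm(X[pairs[:, 0]] - X[pairs[:, 1]], axis=1)
Y0 = X + 0.3 * rng.standard_normal(X.shape)  # distorted initial embedding
Y = relax(Y0, pairs, d_target)
```

In the paper, Loss(Y) is instead computed from the push-forward Riemannian metric, which measures local deviation from isometry directly rather than through pairwise distances; the gradient-descent relaxation structure is the part this sketch illustrates.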
