Aligning Hyperbolic Representations: an Optimal Transport-based approach

Hoyos-Idrobo, Andrés

arXiv.org Machine Learning 

Hyperbolic embeddings are state-of-the-art models for learning representations of data with an underlying hierarchical structure [18]. The hyperbolic space serves as a geometric prior for hierarchical structures, tree graphs, and heavy-tailed distributions, e.g., scale-free or power-law distributions [45]. A relevant tool for implementing algorithms in hyperbolic space is the Möbius gyrovector space, or gyrovector space [66]. Gyrovector spaces are an algebraic formalism that yields vector-like operations, i.e., gyrovectors, in the Poincaré model of hyperbolic space. Thanks to this formalism, we can readily build estimators that are well-suited to end-to-end optimization [6]. Gyrovector spaces are essential to the design of hyperbolic versions of several machine learning algorithms, such as Hyperbolic Neural Networks (HNN) [24], Hyperbolic Graph NN [36], Hyperbolic Graph Convolutional NN [12], learning latent feature representations [41, 46], word embeddings [62, 25], and image embeddings [29].

Modern machine learning algorithms rely on the ability to accumulate large volumes of data, often coming from various sources, e.g., different acquisition devices or languages. However, these massive amounts of heterogeneous data can hinder downstream learning tasks, since the data may follow different distributions. Alignment aims at building connections between two or more disparate data sets by aligning their underlying manifolds.
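As a concrete illustration of the vector-like operations provided by the gyrovector formalism, the sketch below implements Möbius addition in the Poincaré ball of curvature -c, the basic operation of gyrovector arithmetic used by HNN-style models. The NumPy implementation and the name mobius_add are illustrative assumptions, not code from the paper.

    import numpy as np

    def mobius_add(x, y, c=1.0):
        # Möbius addition of two points x, y inside the Poincaré ball
        # of curvature -c (Ganea et al., Hyperbolic Neural Networks).
        xy = np.dot(x, y)
        x2 = np.dot(x, x)
        y2 = np.dot(y, y)
        num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
        denom = 1 + 2 * c * xy + (c ** 2) * x2 * y2
        return num / denom

    # Example: adding two points of the 2-D Poincaré ball; the result
    # remains inside the unit ball.
    x = np.array([0.1, 0.2])
    y = np.array([-0.3, 0.05])
    print(mobius_add(x, y))

Unlike Euclidean addition, this operation is neither commutative nor associative, which is precisely what the gyrovector formalism accounts for when building hyperbolic analogues of linear layers and other estimators.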
