Convex Relaxation for Solving Large-Margin Classifiers in Hyperbolic Space
Sheng Yang, Peihan Liu, Cengiz Pehlevan
arXiv.org Artificial Intelligence
Representations embedded in hyperbolic space have demonstrated significant improvements over their Euclidean counterparts across a variety of datasets, including images [1], natural language [2], and complex tabular data such as single-cell sequencing [3]. On the other hand, learning and optimization in hyperbolic spaces are typically more involved than in Euclidean spaces: problems that are convex in Euclidean space become constrained non-convex problems in hyperbolic space. The hyperbolic Support Vector Machine (HSVM), as explored in recent studies [4, 5], exemplifies these challenges: it is a non-convex constrained program that has been solved predominantly via projected gradient descent. Attempts have been made to alleviate its non-convex nature through reparametrization [6], or by developing a hyperbolic perceptron algorithm that converges to a separator and is then fine-tuned with adversarial samples to approximate the large-margin solution [7].
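To make the projected-gradient-descent baseline concrete, the sketch below runs projected subgradient descent on a simplified hinge-type surrogate of the HSVM objective in the hyperboloid (Lorentz) model. This is an illustrative toy, not the exact formulation of [4, 5]: the loss, step sizes, and the projection onto the spacelike feasible set `{w : <w,w>_M > 0}` (here enforced by shrinking the time component) are all simplifying assumptions.

```python
import numpy as np

def mink(u, v):
    """Minkowski inner product: -u[0]*v[0] + sum_i u[i]*v[i]."""
    return -u[0] * v[0] + u[1:] @ v[1:]

def project_spacelike(w, eps=1e-3):
    """Return w to the feasible set {w : <w,w>_M >= eps} by shrinking
    its time component (a simple heuristic, not the exact Euclidean
    projection)."""
    if mink(w, w) < eps:
        w = w.copy()
        w[0] = np.sign(w[0]) * np.sqrt(max(w[1:] @ w[1:] - eps, 0.0))
    return w

def hsvm_pgd(X, y, C=1.0, lr=0.05, steps=300, seed=0):
    """Projected subgradient descent on a hinge-type HSVM surrogate.

    X: points on the hyperboloid (one per row), y: labels in {-1, +1}.
    The objective here -- 1/2 <w,w>_M plus a hinge penalty on the
    Minkowski margin y * <w, x>_M -- is a hypothetical stand-in for
    the paper's objective, used only to illustrate the projection step.
    """
    n, d = X.shape
    Jdiag = np.concatenate(([-1.0], np.ones(d - 1)))  # metric signature
    w = project_spacelike(np.random.default_rng(seed).normal(size=d))
    for _ in range(steps):
        g = Jdiag * w  # gradient of the 1/2 <w,w>_M term
        for xi, yi in zip(X, y):
            if yi * mink(w, xi) < 1.0:      # hinge constraint active
                g -= C * yi * (Jdiag * xi)  # hinge subgradient
        w = project_spacelike(w - lr * g / n)  # gradient step + projection
    return w
```

The non-convexity the abstract refers to lives in the constraint: the spacelike set `<w,w>_M >= eps` is not convex, so each gradient step must be followed by a projection back onto it, and only convergence to a stationary point is guaranteed.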
May-27-2024