A Riemannian ADMM

Jiaxiang Li, Shiqian Ma, Tejes Srivastava


Optimization over Riemannian manifolds has drawn significant attention due to its applications in machine learning and related disciplines, including low-rank matrix completion [6, 49], phase retrieval [3, 45], blind deconvolution [21], and dictionary learning [11, 43]. Riemannian optimization aims at minimizing an objective function over a Riemannian manifold. When the objective function is smooth, methods such as the Riemannian gradient method, the Riemannian quasi-Newton method, and the Riemannian trust-region method have been proposed; work along this line is summarized in the monographs [1, 5] as well as other references. Recently, driven by increasing demand from application areas such as machine learning, statistics, and signal processing, a line of work has developed efficient and scalable algorithms for Riemannian optimization problems with nonsmooth objectives, including the Riemannian subgradient method [33], the Riemannian proximal gradient method [10, 23], the Riemannian proximal point algorithm [9], the Riemannian proximal-linear algorithm [51], and zeroth-order Riemannian algorithms [32]. One question that has not been widely studied is how to design an alternating direction method of multipliers (ADMM) on manifolds.
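To make the smooth setting concrete, the following is a minimal sketch of the Riemannian gradient method mentioned above, specialized to the unit sphere. It is not the algorithm of this paper; the objective (a Rayleigh-quotient eigenvector problem), the fixed step size, and the normalization retraction are illustrative assumptions chosen because the sphere admits a closed-form tangent-space projection and retraction.

```python
import numpy as np

def riemannian_gradient_descent(egrad, x0, step=0.1, iters=200):
    """Sketch of Riemannian gradient descent on the unit sphere S^{n-1}.

    egrad: Euclidean gradient of the objective f at a point.
    The Riemannian gradient is the projection of the Euclidean gradient
    onto the tangent space T_x = {v : <v, x> = 0}; the retraction used
    here is renormalization back onto the sphere.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = egrad(x)
        rgrad = g - np.dot(g, x) * x   # project gradient onto tangent space at x
        x = x - step * rgrad           # take a step in the tangent direction
        x = x / np.linalg.norm(x)      # retract back onto the manifold
    return x

# Illustrative use: minimize f(x) = -x^T A x over the sphere,
# whose minimizer is a leading eigenvector of the symmetric matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2
x_star = riemannian_gradient_descent(lambda x: -2.0 * A @ x,
                                     rng.standard_normal(5))
print(x_star)
```

The projection-plus-retraction pattern shown here is the common template the smooth methods above instantiate; the nonsmooth and ADMM-type methods discussed in the paper require additional machinery beyond this sketch.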
