Collaborating Authors

 Xu, Conglong


Adaptive Stochastic Gradient Descents on Manifolds with an Application on Weighted Low-Rank Approximation

arXiv.org Artificial Intelligence

We prove a convergence theorem for stochastic gradient descents on manifolds with an adaptive learning rate and apply it to the weighted low-rank approximation problem.
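The general scheme the abstract refers to — take a stochastic Euclidean gradient, project it onto the tangent space, step with an adaptive (here, decaying) learning rate, and retract back onto the manifold — can be sketched as follows. This is a minimal illustration on the unit sphere with a least-squares objective, not the paper's algorithm; the objective, the decay schedule, and all constants are assumptions for the example.

```python
import numpy as np

# Illustrative retraction-based SGD on the unit sphere S^{n-1},
# minimizing f(x) = (1/2) * mean_i (a_i^T x - b_i)^2 over unit vectors x.
# The learning rate decays over time (a simple "adaptive" schedule).
rng = np.random.default_rng(0)
n, m = 5, 200
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
x_true /= np.linalg.norm(x_true)
b = A @ x_true                        # noiseless targets, so x_true is optimal

x = rng.standard_normal(n)
x /= np.linalg.norm(x)                # initialize on the sphere
eta0 = 0.5
for t in range(2000):
    i = rng.integers(m)
    g = (A[i] @ x - b[i]) * A[i]      # Euclidean stochastic gradient
    g_riem = g - (x @ g) * x          # project onto the tangent space at x
    eta = eta0 / (1.0 + 0.01 * t)     # decaying (adaptive) step size
    x = x - eta * g_riem              # tangent step...
    x /= np.linalg.norm(x)            # ...followed by retraction (renormalize)

alignment = abs(x @ x_true)           # should approach 1 as x converges
```

The projection/renormalization pair is the simplest example of a retraction; on other manifolds (e.g. fixed-rank matrices) the retraction is different, but the loop has the same shape.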


Weighted Low-rank Approximation via Stochastic Gradient Descent on Manifolds

arXiv.org Machine Learning

We solve a regularized weighted low-rank approximation problem by a stochastic gradient descent on a manifold. To guarantee the convergence of our stochastic gradient descent, we establish a convergence theorem on manifolds for retraction-based stochastic gradient descents admitting confinements. On sample data from the Netflix Prize training dataset, our algorithm outperforms the existing stochastic gradient descent on Euclidean spaces. We also compare the accelerated line search on this manifold to the existing accelerated line search on Euclidean spaces.
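For contrast, the Euclidean-space baseline mentioned in the abstract — regularized weighted low-rank approximation by plain SGD on the factorization X = U V^T, as commonly used on Netflix-style data — can be sketched as below. This is a hedged illustration of the baseline, not the paper's manifold algorithm; the synthetic data, the 0/1 weight pattern, and the constants `lam` and `eta` are assumptions.

```python
import numpy as np

# Regularized weighted low-rank approximation by SGD on U V^T:
# minimize sum_{ij} W_ij * (A_ij - (U V^T)_ij)^2 + lam * (|U|^2 + |V|^2).
rng = np.random.default_rng(1)
m, n, k = 30, 20, 3
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # rank-k target
W = (rng.random((m, n)) < 0.5).astype(float)  # 0/1 weights: observed entries

U = 0.1 * rng.standard_normal((m, k))
V = 0.1 * rng.standard_normal((n, k))
lam, eta = 0.01, 0.03                 # assumed regularization and step size
obs = np.argwhere(W > 0)              # indices of observed entries
for epoch in range(100):
    rng.shuffle(obs)                  # shuffle visiting order each epoch
    for i, j in obs:
        err = A[i, j] - U[i] @ V[j]
        u_old = U[i].copy()           # use pre-update U[i] in both updates
        U[i] += eta * (err * V[j] - lam * u_old)
        V[j] += eta * (err * u_old - lam * V[j])

loss = np.sum(W * (A - U @ V.T) ** 2)  # final weighted reconstruction error
```

The manifold approach of the paper replaces these unconstrained factor updates with retraction-based steps on a manifold, which is what the convergence theorem with confinements covers.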