Escaping from saddle points on Riemannian manifolds

Yue Sun, Nicolas Flammarion, Maryam Fazel

Neural Information Processing Systems 

We consider minimizing a nonconvex, smooth function f on a Riemannian manifold M. We show that a perturbed version of the Riemannian gradient descent algorithm converges to a second-order stationary point (and hence is able to escape saddle points on the manifold).
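To illustrate the idea behind the algorithm, the sketch below runs perturbed Riemannian gradient descent on the unit sphere: the Riemannian gradient is the Euclidean gradient projected onto the tangent space, steps are mapped back to the manifold by a retraction (here, renormalization), and isotropic tangent-space noise is injected whenever the gradient is small so the iterates can escape saddle points. This is a minimal sketch under these assumptions, not the paper's exact algorithm; the step size, perturbation scale, and stopping threshold are illustrative choices.

```python
import numpy as np

def tangent_proj(x, g):
    # Project a Euclidean vector g onto the tangent space of the unit sphere at x.
    return g - np.dot(x, g) * x

def retract(x, v):
    # Retraction on the sphere: take a step in the tangent direction, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def perturbed_rgd(grad_f, x0, step=0.1, eps=1e-3, noise=1e-2, iters=1000, seed=0):
    # Perturbed Riemannian gradient descent (illustrative version):
    # when the Riemannian gradient is small, add tangent-space noise to escape saddles.
    rng = np.random.default_rng(seed)
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = tangent_proj(x, grad_f(x))
        if np.linalg.norm(g) < eps:
            # Near a first-order stationary point: perturb within the tangent space.
            g = g + noise * tangent_proj(x, rng.normal(size=x.shape))
        x = retract(x, -step * g)
    return x

# Toy example: minimize f(x) = x^T A x over the sphere.
# Critical points are eigenvectors of A; the middle eigenvector is a saddle,
# and the minimizer is the eigenvector of the smallest eigenvalue.
A = np.diag([3.0, 1.0, -2.0])
grad = lambda x: 2.0 * A @ x
x0 = np.array([1e-6, 1.0, 1e-6])  # start very close to the saddle (middle eigenvector)
x_star = perturbed_rgd(grad, x0)
```

Started essentially at the saddle, plain Riemannian gradient descent would stall, while the perturbed iterates drift into the unstable direction and converge to (plus or minus) the bottom eigenvector.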