SSRGD: Simple Stochastic Recursive Gradient Descent for Escaping Saddle Points
Neural Information Processing Systems
We analyze stochastic gradient algorithms for optimizing nonconvex problems. In particular, our goal is to find local minima (second-order stationary points) rather than merely first-order stationary points, which may be poor, unstable saddle points.
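To illustrate the distinction the abstract draws, here is a minimal sketch (not the paper's SSRGD algorithm) showing why a first-order stationary point can be a bad saddle, and how a small random perturbation, the standard device behind saddle-escaping methods, moves the iterate off it. The function `f(x, y) = x^2 - y^2`, the step size, and the perturbation scale are all illustrative choices, not values from the paper.

```python
import random

def grad(x, y):
    # f(x, y) = x^2 - y^2 has a strict saddle at the origin:
    # the gradient vanishes there, but the Hessian has a
    # negative eigenvalue in the y direction.
    return 2 * x, -2 * y

def gd(x, y, lr=0.1, steps=200, perturb=0.0):
    """Plain gradient descent, optionally with uniform random
    perturbations added to each step (an illustrative stand-in
    for the perturbation step used by saddle-escaping methods)."""
    for _ in range(steps):
        gx, gy = grad(x, y)
        x = x - lr * gx + perturb * random.uniform(-1, 1)
        y = y - lr * gy + perturb * random.uniform(-1, 1)
    return x, y

random.seed(0)
# Started exactly on the x-axis, plain GD converges to the
# saddle (0, 0): a first-order stationary point, but not a
# local minimum (y stays exactly 0 the whole time).
print(gd(1.0, 0.0))
# With perturbation, the iterate is nudged off the x-axis and
# then escapes along the negative-curvature direction y.
print(gd(1.0, 0.0, perturb=1e-3))
```

The first run gets stuck because the negative-curvature direction is orthogonal to the gradient path; the second run escapes because any nonzero `y` component grows geometrically under the `-2y` descent step.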