Efficiently avoiding saddle points with zero order methods: No gradients required
Emmanouil-Vasileios Vlatakis-Gkaragkounis, Lampros Flokas, Georgios Piliouras
Neural Information Processing Systems
We consider derivative-free algorithms for non-convex optimization, also known as zero-order algorithms, which use only function evaluations rather than gradients. For a wide variety of gradient approximators based on finite differences, we establish asymptotic convergence to second-order stationary points using a carefully tailored application of the Stable Manifold Theorem. Regarding efficiency, we introduce a noisy zero-order method that converges to second-order stationary points, i.e., avoids saddle points.
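The two ingredients named in the abstract, finite-difference gradient approximation and noise injection to escape saddle points, can be illustrated with a minimal sketch. This is not the paper's actual algorithm or analysis; the function names, step size, and noise scale below are hypothetical choices for illustration only.

```python
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Central finite-difference estimate of the gradient of f at x,
    using 2n function evaluations and no derivative information."""
    n = x.size
    grad = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return grad

def noisy_zeroth_order_descent(f, x0, lr=1e-2, noise=1e-2, steps=1000, seed=0):
    """Descent on finite-difference gradients with isotropic Gaussian noise,
    the generic mechanism for escaping strict saddle points (illustrative
    sketch, not the method or step sizes from the paper)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = fd_gradient(f, x)
        x = x - lr * (g + noise * rng.standard_normal(x.size))
    return x

# f has a strict saddle at the origin and minima at (+-1, 0); started
# exactly at the saddle, where the estimated gradient vanishes, the
# injected noise pushes the iterates onto a descent direction.
f = lambda x: (x[0]**2 - 1.0)**2 + x[1]**2
print(noisy_zeroth_order_descent(f, np.zeros(2)))  # converges near (+-1, 0)
```

At the saddle the finite-difference gradient is (approximately) zero, so a deterministic zero-order method could stall there; the added noise is what makes convergence to second-order stationary points possible.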