Efficiently avoiding saddle points with zero order methods: No gradients required
Flokas, Lampros, Vlatakis-Gkaragkounis, Emmanouil-Vasileios, Piliouras, Georgios
We consider derivative-free algorithms for non-convex optimization, also known as zero-order algorithms, that use only function evaluations rather than gradients. For a wide variety of gradient approximators based on finite differences, we establish asymptotic convergence to second-order stationary points using a carefully tailored application of the Stable Manifold Theorem. Regarding efficiency, we introduce a noisy zero-order method that converges to second-order stationary points, i.e., avoids saddle points. Our algorithm uses only $\tilde{\mathcal{O}}(1 / \epsilon^2)$ approximate gradient calculations and thus matches the convergence rate guarantees of its exact-gradient counterparts up to constants. In contrast to previous work, our convergence rate analysis avoids imposing additional dimension-dependent slowdowns in the number of iterations required for non-convex zero-order optimization.
Oct-28-2019
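To make the setting concrete, here is a minimal sketch of the two ingredients the abstract describes: a finite-difference gradient approximator built purely from function evaluations, and a descent loop that injects small isotropic noise into each step so the iterate can leave strict saddle points. This is an illustrative toy, not the paper's actual algorithm; the step size, noise scale, and stopping rule are hypothetical choices.

```python
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Central finite-difference gradient estimate of f at x.

    Uses 2 * dim function evaluations and no analytic gradients,
    the basic zero-order gradient approximator.
    """
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def perturbed_zero_order_descent(f, x0, lr=0.1, noise=1e-3, steps=500, seed=0):
    """Gradient descent on finite-difference gradients with a small
    random perturbation added at every step; the noise pushes the
    iterate off the stable manifold of any strict saddle point.
    (Parameters and schedule here are illustrative assumptions.)
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = fd_gradient(f, x)
        x -= lr * (g + noise * rng.standard_normal(x.size))
    return x

# f(x, y) = x^2 - y^2 has a strict saddle at the origin. Started exactly
# there, a noiseless method would stall (the estimated gradient is zero),
# but the injected noise sends the iterate down the unstable y-direction.
f = lambda z: z[0] ** 2 - z[1] ** 2
x_final = perturbed_zero_order_descent(f, [0.0, 0.0])
```

After the run, `x_final[0]` remains near zero (the stable direction contracts) while `|x_final[1]|` has grown large, i.e., the method has escaped the saddle using only function evaluations.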