A short proof of near-linear convergence of adaptive gradient descent under fourth-order growth and convexity
Damek Davis, Dmitriy Drusvyatskiy
Davis, Drusvyatskiy, and Jiang showed that gradient descent with an adaptive stepsize converges locally at a nearly linear rate for smooth functions that grow at least quartically away from their minimizers. Their argument is intricate, relying on monitoring the performance of the algorithm relative to a certain manifold of slow growth, called the ravine. In this work, we provide a direct Lyapunov-based argument that bypasses these difficulties when the objective is, in addition, convex and has a unique minimizer. As a byproduct of the argument, we obtain a more adaptive variant of the original algorithm with encouraging numerical performance.
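As a concrete illustration of the setting, here is a minimal sketch, not the paper's exact method: it assumes the adaptive stepsize is the classical Polyak rule eta_k = (f(x_k) - inf f) / ||grad f(x_k)||^2, one natural instance of an adaptive stepsize, applied to the convex quartic f(x) = ||x||^4, which has a unique minimizer at the origin and exhibits fourth-order growth away from it. The function, stepsize rule, and iteration count are all illustrative assumptions.

```python
import numpy as np

# Sketch only: Polyak-stepsize gradient descent on a convex function
# with fourth-order growth and a unique minimizer. The stepsize rule
# is an assumption for illustration; the paper's algorithm and its
# more adaptive variant may differ in detail.

def f(x):
    # f(x) = ||x||^4: convex, unique minimizer at 0, quartic growth
    return np.linalg.norm(x) ** 4

def grad_f(x):
    # gradient of ||x||^4 is 4 ||x||^2 x
    return 4.0 * np.dot(x, x) * x

def polyak_gd(x0, f_min=0.0, iters=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        gn2 = np.dot(g, g)
        if gn2 == 0.0:  # already stationary
            break
        eta = (f(x) - f_min) / gn2  # Polyak adaptive stepsize
        x = x - eta * g
    return x

x_final = polyak_gd(np.ones(5))
print(f(x_final))  # function gap shrinks geometrically in this example
```

On this particular objective the Polyak update reduces to x <- (3/4) x, so the iterates contract at a fixed linear rate even though the quartic f is not strongly convex near its minimizer.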
Apr-16-2026