A Concise Lyapunov Analysis of Nesterov's Accelerated Gradient Method

Liu, Jun

arXiv.org Artificial Intelligence 

Among them, Nesterov's accelerated gradient method [7,8] has gained significant attention due to its provable acceleration on general convex functions beyond quadratics. A special focus has been on using dynamical-systems tools [12,10,3,14] and control-theoretic methods [5,9] for the analysis and design of such algorithms. In the standard textbook [8] by Nesterov, the convergence analysis of accelerated gradient methods is conducted using a technique known as estimating sequences. These are essentially auxiliary comparison functions used to prove the convergence rates of optimization algorithms. As pointed out in [14], estimating sequences are usually constructed inductively and can be difficult to understand and apply. This motivated the Lyapunov analysis in [14], which aims to unify the analysis of a broad class of accelerated algorithms. Despite this comprehensive work, to the best of the author's knowledge, a simple and direct Lyapunov analysis of the original scheme of Nesterov's accelerated gradient method is still lacking.
