On Dissipative Symplectic Integration with Applications to Gradient-Based Optimization

Guilherme França, Michael I. Jordan, René Vidal

arXiv.org, Machine Learning

Recently, continuous dynamical systems have proved useful in providing conceptual and quantitative insights into gradient-based optimization, widely used in modern machine learning and statistics. An important question that arises in this line of work is how to discretize the system in such a way that its stability and rates of convergence are preserved. In this paper we propose a geometric framework in which such discretizations can be realized systematically, enabling the derivation of "rate-matching" optimization algorithms without the need for a discrete convergence analysis. More specifically, we show that a generalization of symplectic integrators to dissipative Hamiltonian systems is able to preserve continuous rates of convergence up to a controlled error. Moreover, such methods preserve a perturbed Hamiltonian despite the absence of a conservation law, extending key results of symplectic integrators to dissipative cases. Our arguments rely on a combination of backward error analysis with fundamental results from symplectic geometry.
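
To make the idea concrete, below is a minimal Python sketch (not the authors' exact scheme) of a conformal symplectic Euler step for the dissipative Hamiltonian system dq/dt = p, dp/dt = -grad f(q) - gamma*p with H(q, p) = ||p||^2/2 + f(q): the friction term is integrated exactly and composed with a symplectic Euler step of the conservative part. The quadratic test problem, step size h, and friction gamma are illustrative assumptions, not the paper's experiments.

    # Sketch of a conformal symplectic Euler integrator for
    #     dq/dt = p,   dp/dt = -grad f(q) - gamma * p,
    # i.e. H(q, p) = ||p||^2 / 2 + f(q) with linear dissipation.
    # The dissipative part dp/dt = -gamma * p is solved exactly
    # (p -> exp(-gamma * h) * p) and composed with symplectic Euler
    # on the conservative part. f, h, and gamma are assumptions here.
    import numpy as np

    def conformal_symplectic_euler(grad_f, q, p, h=0.05, gamma=2.0, steps=200):
        """Integrate the dissipative system; return the trajectory of q."""
        traj = [q.copy()]
        damp = np.exp(-gamma * h)          # exact flow of the friction term
        for _ in range(steps):
            p = damp * p - h * grad_f(q)   # momentum: contract, then kick
            q = q + h * p                  # position: drift with new momentum
            traj.append(q.copy())
        return np.array(traj)

    # Example: minimize the quadratic f(q) = 0.5 * q^T A q (minimizer at 0).
    A = np.diag([1.0, 10.0])
    grad_f = lambda q: A @ q
    traj = conformal_symplectic_euler(grad_f,
                                      q=np.array([2.0, 1.0]),
                                      p=np.zeros(2))
    print("final iterate:", traj[-1])

Note that with mu = exp(-gamma * h), this update is algebraically the classical momentum (heavy-ball) iteration, which illustrates how discretizations of this kind connect the continuous dynamics to familiar gradient-based optimizers.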
