On Dissipative Symplectic Integration with Applications to Gradient-Based Optimization
Guilherme França, Michael I. Jordan, René Vidal
Recently, continuous dynamical systems have proved useful in providing conceptual and quantitative insights into gradient-based optimization, widely used in modern machine learning and statistics. An important question that arises in this line of work is how to discretize the system in such a way that its stability and rates of convergence are preserved. In this paper we propose a geometric framework in which such discretizations can be realized systematically, enabling the derivation of "rate-matching" optimization algorithms without the need for a discrete convergence analysis. More specifically, we show that a generalization of symplectic integrators to dissipative Hamiltonian systems is able to preserve continuous rates of convergence up to a controlled error. Moreover, such methods preserve a perturbed Hamiltonian despite the absence of a conservation law, extending key results of symplectic integrators to dissipative cases. Our arguments rely on a combination of backward error analysis with fundamental results from symplectic geometry.
Jun-25-2020
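The abstract describes generalizing symplectic integrators to dissipative Hamiltonian systems so that continuous convergence rates survive discretization. A standard instance of this idea (not taken from the paper itself, so treat it as an illustrative assumption) is a "conformal" symplectic Euler scheme for the damped system q' = p, p' = -∇f(q) - γp: the dissipative part is integrated exactly by an exponential contraction of the momentum, and the conservative part by one symplectic Euler step. A minimal sketch:

```python
import math

def conformal_symplectic_euler(grad, q, p, h, gamma, steps):
    """Integrate the dissipative system q' = p, p' = -grad(q) - gamma*p
    by operator splitting: the damping term is solved exactly
    (p -> p * exp(-gamma*h)), then one symplectic Euler step handles
    the conservative Hamiltonian part."""
    for _ in range(steps):
        p *= math.exp(-gamma * h)   # exact flow of the dissipative part
        p -= h * grad(q)            # symplectic Euler: momentum update
        q += h * p                  # then position update
    return q, p

# Check on the quadratic objective f(q) = q**2 / 2 (so grad(q) = q):
q, p = conformal_symplectic_euler(lambda x: x, q=1.0, p=0.0,
                                  h=0.1, gamma=1.0, steps=500)
```

On this damped quadratic the iterates contract toward the minimizer q = 0 at a geometric rate set by γ and h, which is the discrete analogue of the continuous-time convergence rate the paper aims to preserve; the splitting structure (exact dissipation composed with a symplectic map) is what makes the method conformal symplectic.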