Dimension-free convergence rates for gradient Langevin dynamics in RKHS
Boris Muzellec, Kanji Sato, Mathurin Massias, Taiji Suzuki
Gradient Langevin dynamics (GLD) and stochastic GLD (SGLD) have attracted considerable attention lately as a way to provide convergence guarantees in a non-convex setting. However, the known rates grow exponentially with the dimension of the space. In this work, we provide a convergence analysis of GLD and SGLD when the optimization space is an infinite-dimensional Hilbert space. More precisely, we derive non-asymptotic, dimension-free convergence rates for GLD/SGLD when performing regularized non-convex optimization in a reproducing kernel Hilbert space. Among other tools, the convergence analysis relies on the properties of a stochastic differential equation, its discrete-time Galerkin approximation, and the geometric ergodicity of the associated Markov chains.
Feb-29-2020
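For intuition, here is a minimal sketch of the standard (finite-dimensional, Euclidean) GLD iteration that the abstract builds on: a gradient step plus scaled Gaussian noise. The names `step_size` and `inverse_temp`, and the toy objective, are illustrative assumptions and not the paper's notation or setting; the paper itself works in an infinite-dimensional RKHS.

```python
import numpy as np

def gld_step(x, grad_f, step_size, inverse_temp, rng):
    """One gradient Langevin dynamics update:
    x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2 * eta / beta) * xi_k,
    where xi_k is standard Gaussian noise and beta is the inverse temperature.
    """
    noise = rng.standard_normal(x.shape)
    return x - step_size * grad_f(x) + np.sqrt(2.0 * step_size / inverse_temp) * noise

# Illustrative usage on a simple non-convex objective f(x) = (x^2 - 1)^2 (componentwise).
rng = np.random.default_rng(0)
grad_f = lambda x: 4 * x**3 - 4 * x  # gradient of (x^2 - 1)^2
x = rng.standard_normal(2)
for _ in range(1000):
    x = gld_step(x, grad_f, step_size=1e-3, inverse_temp=10.0, rng=rng)
```

The injected noise is what distinguishes GLD from plain gradient descent: it lets the iterates escape poor local minima, and its stationary distribution concentrates on low values of the objective as the inverse temperature grows.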