The Randomized Midpoint Method for Log-Concave Sampling

Ruoqi Shen, Yin Tat Lee

Neural Information Processing Systems 

Sampling from log-concave distributions is a well-studied problem with many applications in statistics and machine learning. In our paper, we propose a Markov chain Monte Carlo (MCMC) algorithm based on the underdamped Langevin diffusion (ULD). Our algorithm runs significantly faster than the previously best known algorithm for this problem, which requires $\tilde{O}\left(\kappa^{1.5}/\epsilon\right)$ steps \cite{chen2019optimal,dalalyan2018sampling}. Moreover, our algorithm can be easily parallelized to require only $O(\kappa\log\frac{1}{\epsilon})$ parallel steps. To solve the sampling problem, we propose a new framework for discretizing stochastic differential equations. We apply this framework to discretize and simulate ULD, which converges to the target distribution $p^{*}$.
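To illustrate the randomized midpoint idea behind the discretization framework, the sketch below applies it to the simpler *overdamped* Langevin SDE $dX_t = -\nabla f(X_t)\,dt + \sqrt{2}\,dW_t$: each step draws a uniformly random point in the step interval, takes an Euler step to that midpoint, and then uses the gradient evaluated there for the full step. This is a hedged illustration only; the paper's actual algorithm discretizes the underdamped diffusion with correlated Gaussian noise, and the function name and parameters here are our own.

```python
import numpy as np

def randomized_midpoint_langevin(grad_f, x0, h, n_steps, rng):
    """Simplified randomized-midpoint discretization of the overdamped
    Langevin SDE  dX_t = -grad_f(X_t) dt + sqrt(2) dW_t.

    Illustrative sketch only: the paper treats the underdamped diffusion
    (ULD) with correlated Gaussian increments.
    """
    x = np.array(x0, dtype=float)
    d = x.shape[0]
    for _ in range(n_steps):
        a = rng.uniform()               # random midpoint fraction in [0, 1]
        w1 = rng.standard_normal(d)     # Brownian increment over [0, a*h]
        w2 = rng.standard_normal(d)     # Brownian increment over [a*h, h]
        # Euler step to the random midpoint time a*h.
        x_mid = x - a * h * grad_f(x) + np.sqrt(2 * a * h) * w1
        # Full step of length h using the gradient at the midpoint;
        # the noise over [0, h] is the sum of the two sub-increments.
        x = (x - h * grad_f(x_mid)
             + np.sqrt(2 * a * h) * w1
             + np.sqrt(2 * (1 - a) * h) * w2)
    return x
```

For a standard Gaussian target, `grad_f = lambda x: x`, and running many independent chains from a fixed starting point should yield samples with mean near 0 and variance near 1 once the chain has mixed.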