Beyond Log-concavity: Provable Guarantees for Sampling Multi-modal Distributions using Simulated Tempering Langevin Monte Carlo
Holden Lee, Andrej Risteski, Rong Ge
Neural Information Processing Systems
A key task in Bayesian machine learning is sampling from distributions that are only specified up to a partition function (i.e., a constant of proportionality). One prevalent example of this is sampling posteriors in parametric distributions, such as latent-variable generative models. However, sampling (even very approximately) can be #P-hard. Classical results on sampling (going back to [BÉ85]) focus on log-concave distributions, and show that a natural Markov chain called Langevin diffusion mixes in polynomial time. However, all log-concave distributions are uni-modal, while in practice it is very common for the distribution of interest to have multiple modes.
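To illustrate the kind of sampler the abstract refers to, below is a minimal sketch of the (unadjusted) Langevin algorithm, the discretization of Langevin diffusion. It is not the paper's simulated tempering method; it only shows why the partition function is never needed: the update uses the gradient of the log-density, which is unaffected by the normalizing constant. The function names and step-size choice here are illustrative assumptions.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=0.05, n_steps=20000, seed=0):
    """Unadjusted Langevin algorithm: discretize the diffusion
    dX_t = grad log p(X_t) dt + sqrt(2) dW_t.
    Only grad log p is required, so p may be known
    only up to its partition function."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_p(x) + np.sqrt(2 * step) * noise
        samples[t] = x
    return samples

# Log-concave toy target: standard Gaussian, grad log p(x) = -x.
chain = langevin_sample(lambda x: -x, x0=np.zeros(2))
burned = chain[5000:]  # discard burn-in
print(burned.mean(axis=0), burned.var(axis=0))  # mean near 0, variance near 1
```

On a multi-modal target, this chain can stay stuck near one mode for an exponentially long time, which is precisely the failure the paper's simulated tempering scheme addresses.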