Forward-backward Gaussian variational inference via JKO in the Bures-Wasserstein Space
Michael Diao, Krishnakumar Balasubramanian, Sinho Chewi, Adil Salim
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a tractable family of distributions. Of key interest in statistics and machine learning is Gaussian VI, which approximates $\pi$ by minimizing the Kullback-Leibler (KL) divergence to $\pi$ over the space of Gaussians. In this work, we develop the (Stochastic) Forward-Backward Gaussian Variational Inference (FB-GVI) algorithm to solve Gaussian VI. Our approach exploits the composite structure of the KL divergence, which can be written as the sum of a smooth term (the potential) and a non-smooth term (the entropy) over the Bures-Wasserstein (BW) space of Gaussians endowed with the Wasserstein distance. For our proposed algorithm, we obtain state-of-the-art convergence guarantees when $\pi$ is log-smooth and log-concave, as well as the first convergence guarantees to first-order stationary solutions when $\pi$ is only log-smooth.
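As a rough sketch of the composite structure described in the abstract (the notation here is ours; see the paper for the precise discretization and the stochastic variant), the objective splits as
$$\mathrm{KL}(q \,\|\, \pi) \;=\; \underbrace{\mathbb{E}_q[V]}_{\text{smooth potential } \mathcal{V}(q)} \;+\; \underbrace{\mathbb{E}_q[\log q]}_{\text{nonsmooth entropy } \mathcal{H}(q)} \;+\; \mathrm{const}, \qquad \pi \propto e^{-V},$$
and one forward-backward (proximal gradient) step of size $h$ over the BW space takes the schematic form
$$q_{k+1} \;=\; \operatorname{prox}^{\mathrm{BW}}_{h\,\mathcal{H}}\!\bigl(\,(\mathrm{id} - h\,\nabla_{\mathrm{BW}}\mathcal{V}(q_k))_{\#}\, q_k\,\bigr).$$
Here the forward step is a Wasserstein gradient step on the potential, which keeps the iterate Gaussian since the BW gradient of $\mathcal{V}$ at a Gaussian is an affine map, and the backward step is a JKO (Wasserstein-proximal) step on the entropy, which remains tractable over the Gaussian family.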
arXiv.org Artificial Intelligence
Apr-10-2023