Stochastic variance-reduced Gaussian variational inference on the Bures-Wasserstein manifold
Luu, Hoang Phuc Hau, Yu, Hanlin, Williams, Bernardo, Hartmann, Marcelo, Klami, Arto
Optimization in the Bures-Wasserstein space has been gaining popularity in the machine learning community since it draws connections between variational inference and Wasserstein gradient flows. The variational inference objective of Kullback-Leibler divergence can be written as the sum of the negative entropy and the potential energy, making forward-backward Euler the method of choice. Notably, the backward step admits a closed-form solution in this case, which makes the scheme practical. However, the forward step is no longer exact, since the Bures-Wasserstein gradient of the potential energy involves intractable expectations. Recent approaches propose approximating these terms with the Monte Carlo method -- in practice a single-sample estimator -- resulting in high variance and poor performance. We propose a novel variance-reduced estimator based on the principle of control variates. We theoretically show that this estimator has a smaller variance than the Monte Carlo estimator in scenarios of interest. We also prove that variance reduction helps improve the optimization bounds of the current analysis. We demonstrate that the proposed estimator achieves order-of-magnitude improvements over previous Bures-Wasserstein methods.
Oct-3-2024
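The control-variate idea in the abstract can be illustrated with a minimal sketch. The paper's exact estimator is not reproduced here; instead, the code below shows the general principle on a toy problem: to estimate the expected gradient of a potential V under a Gaussian, subtract a zero-mean quantity (here, the hypothetical control variate A(x - m), using the Hessian A of the quadratic part of V) that is strongly correlated with the single-sample gradient. Both estimators remain unbiased, but the control-variate version has much lower variance when V is nearly quadratic. The potential, its parameters, and the choice of control variate are all illustrative assumptions, not the authors' construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2
A = np.array([[2.0, 0.3], [0.3, 1.5]])  # Hessian of the quadratic part of V (assumed)

def grad_V(x):
    # Gradient of an illustrative, mildly non-quadratic potential.
    return A @ x + 0.1 * np.sin(x)

# Gaussian variational distribution N(m, Sigma) on which the expectation is taken.
m = np.array([0.5, -0.2])
L = np.linalg.cholesky(np.array([[0.5, 0.1], [0.1, 0.4]]))

def sample():
    return m + L @ rng.standard_normal(d)

def mc_estimate(x):
    # Plain single-sample Monte Carlo estimate of E[grad V(X)].
    return grad_V(x)

def cv_estimate(x):
    # Control variate h(x) = A (x - m): it has zero mean under N(m, Sigma)
    # and is highly correlated with grad_V(x) when V is nearly quadratic,
    # so subtracting it preserves the mean but shrinks the variance.
    return grad_V(x) - A @ (x - m)

xs = [sample() for _ in range(20000)]
mc = np.array([mc_estimate(x) for x in xs])
cv = np.array([cv_estimate(x) for x in xs])

# Both estimators target the same mean; compare their total variances.
print("means:", mc.mean(axis=0), cv.mean(axis=0))
print("variances:", mc.var(axis=0).sum(), cv.var(axis=0).sum())
```

On this toy example the control-variate estimator's variance is dominated only by the small non-quadratic residual, so it is far below that of the plain single-sample estimator, mirroring the qualitative claim in the abstract.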