The Variational Gaussian Process
Dustin Tran, Rajesh Ranganath, David M. Blei
Variational inference is a powerful tool for approximate inference, and it has recently been applied to representation learning with deep generative models. We develop the variational Gaussian process (VGP), a Bayesian nonparametric variational family, which adapts its shape to match complex posterior distributions. The VGP generates approximate posterior samples by generating latent inputs and warping them through random non-linear mappings; the distribution over random mappings is learned during inference, enabling the transformed outputs to adapt to varying complexity. We prove a universal approximation theorem for the VGP, demonstrating its representative power for learning any model. For inference, we present a variational objective inspired by auto-encoders and perform black box inference over a wide class of models. The VGP achieves new state-of-the-art results for unsupervised learning, inferring models such as the deep latent Gaussian model and the recently proposed DRAW.
Apr-17-2016
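
To make the generative process described in the abstract concrete, here is a minimal NumPy sketch of one draw from the variational family: a latent input is sampled and warped through a random mapping drawn from a Gaussian process conditioned on variational data. The RBF kernel, the standard-normal latent distribution, and the names `S`, `T`, and `vgp_sample` are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    """Squared-exponential kernel between the row vectors of X and Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def vgp_sample(S, T, jitter=1e-6):
    """Draw one approximate posterior sample z = f(xi).

    S (m x c), T (m x d): hypothetical "variational data" whose values
    would be learned during inference; they condition the GP mapping.
    """
    m, c = S.shape
    xi = np.random.randn(1, c)                 # latent input xi ~ N(0, I_c)
    Kss = rbf_kernel(S, S) + jitter * np.eye(m)
    Kxs = rbf_kernel(xi, S)
    A = np.linalg.solve(Kss, Kxs.T).T          # Kxs @ Kss^{-1}
    mean = (A @ T).ravel()                     # GP posterior mean at xi, per output dim
    var = (rbf_kernel(xi, xi) - A @ Kxs.T).item()
    std = np.sqrt(max(var, 0.0))
    # Warp xi through a random mapping drawn from the conditioned GP;
    # the d output dimensions share one kernel, hence one posterior variance.
    return mean + std * np.random.randn(T.shape[1])

# Example: 10 variational data points mapping R^2 latent inputs to R^5 outputs.
S = np.random.randn(10, 2)
T = np.random.randn(10, 5)
z = vgp_sample(S, T)                           # one sample from the variational family
```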