MCMC for Variationally Sparse Gaussian Processes
James Hensman, Alexander G. Matthews, Maurizio Filippone, Zoubin Ghahramani
Neural Information Processing Systems
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been directed at three issues with GP models: how to compute efficiently when the number of data points is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate posteriors over covariance function parameters. This paper addresses all three simultaneously, using a variational approximation to the posterior that is sparse in the support of the function but otherwise free-form. The result is a Hybrid Monte Carlo sampling scheme that admits a non-Gaussian approximation over the function values and covariance parameters jointly, with efficient computation based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at github.com/sparseMCMC.
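The scheme sketched in the abstract can be illustrated in miniature: sample, by HMC, a joint posterior over inducing-point function values and a covariance hyperparameter in a sparse GP with a non-Gaussian (logistic) likelihood. This is not the authors' implementation; the toy data, the logistic link, the use of the sparse conditional mean for the latent function, and the finite-difference gradients are simplifying assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D classification data and inducing inputs (assumptions, not from the paper)
X = np.linspace(-3.0, 3.0, 40)
y = (np.sin(X) > 0).astype(float)
Z = np.linspace(-3.0, 3.0, 5)  # M = 5 inducing inputs

def rbf(a, b, log_ls):
    """Squared-exponential kernel with lengthscale exp(log_ls)."""
    ls = np.exp(log_ls)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def log_post(q):
    """Unnormalised log posterior over q = [u (M inducing values), log-lengthscale]."""
    u, log_ls = q[:-1], q[-1]
    Kmm = rbf(Z, Z, log_ls) + 1e-6 * np.eye(len(Z))
    Knm = rbf(X, Z, log_ls)
    f = Knm @ np.linalg.solve(Kmm, u)          # sparse conditional mean of f | u
    ll = np.sum(y * f - np.logaddexp(0.0, f))  # Bernoulli log-likelihood, logistic link
    # GP prior on u, plus a standard-normal prior on the log-lengthscale
    lp = -0.5 * u @ np.linalg.solve(Kmm, u) - 0.5 * np.linalg.slogdet(Kmm)[1]
    return ll + lp - 0.5 * log_ls ** 2

def grad(q, eps=1e-5):
    """Central finite-difference gradient (a sketch; real code would use exact gradients)."""
    g = np.zeros_like(q)
    for i in range(len(q)):
        e = np.zeros_like(q)
        e[i] = eps
        g[i] = (log_post(q + e) - log_post(q - e)) / (2 * eps)
    return g

def hmc_step(q, step=0.05, n_leap=10):
    """One HMC transition: leapfrog integration plus Metropolis accept/reject."""
    p = rng.standard_normal(len(q))
    q_new, p_new = q.copy(), p.copy()
    p_new = p_new + 0.5 * step * grad(q_new)
    for i in range(n_leap):
        q_new = q_new + step * p_new
        if i < n_leap - 1:
            p_new = p_new + step * grad(q_new)
    p_new = p_new + 0.5 * step * grad(q_new)
    h_old = log_post(q) - 0.5 * p @ p
    h_new = log_post(q_new) - 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h_new - h_old:
        return q_new, True
    return q, False

# Run a short chain over (u, log-lengthscale) jointly
q = np.zeros(len(Z) + 1)
for _ in range(20):
    q, accepted = hmc_step(q)
```

Because the latent function and the hyperparameter are sampled together, the chain explores a genuinely non-Gaussian joint posterior, which is the point of the free-form approximation; the O(M^3) cost of the Kmm solves is what the inducing-point sparsity buys.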
Dec-31-2015