Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems 

NIPS 2014, 8-11 December 2014, Montreal, Canada

Paper ID: 1660
Title: Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

Current Reviews

First provide a summary of the paper, and then address the following criteria: quality, clarity, originality and significance.

The authors present a novel way to distribute the optimisation of variational Bayesian sparse GPs (for both regression and latent variable models). By reformulating Titsias' (2009) variational lower bound so that the training data become independent given the inducing points, training can be parallelised across nodes on a cluster or network via a Map-Reduce implementation. There is, of course, some overhead from the optimisation of the global parameters, but it is negligible compared with the overall speed-up and scaling, as the authors demonstrate with a simple experiment. It is nice that the reformulation of the variational lower bound unifies both cases: regression becomes a special case of the LVM when the inputs are fixed.
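The key property the review highlights — that, given the inducing points, the bound's data-dependent terms decompose into sums over data points — is what makes the Map-Reduce scheme possible. The sketch below illustrates this for the regression case with an RBF kernel and Gaussian noise. It is an illustrative reconstruction, not the authors' code: function names (`map_stats`, `reduce_stats`, `lower_bound`), the specific kernel, and the hyperparameter values are all assumptions; the bound evaluated is the standard collapsed (Titsias-style) bound, computed from partial statistics that each node could produce independently.

```python
import numpy as np

def rbf(a, b, ls=1.0, var=1.0):
    # Squared-exponential kernel between the rows of a and b.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

def map_stats(Xp, yp, Z, ls=1.0, var=1.0):
    # Map step: sufficient statistics over one data partition.
    # Each term is a sum over the partition's points, so partials
    # from different nodes can simply be added together.
    Knm = rbf(Xp, Z, ls, var)
    return dict(n=len(yp),
                yy=float(yp @ yp),       # sum_i y_i^2
                c=Knm.T @ yp,            # sum_i y_i k(Z, x_i)
                Phi=Knm.T @ Knm,         # sum_i k(Z, x_i) k(Z, x_i)^T
                trK=len(yp) * var)       # sum_i k(x_i, x_i) for RBF

def reduce_stats(stats_list):
    # Reduce step: sum the partial statistics from all nodes.
    out = dict(stats_list[0])
    for s in stats_list[1:]:
        for k in out:
            out[k] = out[k] + s[k]
    return out

def lower_bound(stats, Z, sigma2=0.1, ls=1.0, var=1.0):
    # Collapsed variational lower bound on log p(y), evaluated from
    # the global statistics only (data never touch this node).
    Kmm = rbf(Z, Z, ls, var) + 1e-8 * np.eye(len(Z))  # jitter
    A = Kmm + stats["Phi"] / sigma2
    n, yy, c = stats["n"], stats["yy"], stats["c"]
    _, logdetA = np.linalg.slogdet(A)
    _, logdetK = np.linalg.slogdet(Kmm)
    quad = yy / sigma2 - (c @ np.linalg.solve(A, c)) / sigma2**2
    trace = (stats["trK"] - np.trace(np.linalg.solve(Kmm, stats["Phi"]))) / sigma2
    return -0.5 * (n * np.log(2 * np.pi * sigma2)
                   + logdetA - logdetK + quad + trace)
```

Because the statistics are plain sums, the bound is identical however the data are partitioned across nodes; only the small global quantities (`c`, `Phi`, scalars) are communicated, which is the source of the modest overhead the review mentions.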