Scalable Cross Validation Losses for Gaussian Process Models
Martin Jankowiak, Geoff Pleiss
We introduce a simple and scalable method for training Gaussian process (GP) models that exploits cross-validation and nearest neighbor truncation. To accommodate binary and multi-class classification we leverage Pólya-Gamma auxiliary variables and variational inference. In an extensive empirical comparison with a number of alternative methods for scalable GP regression and classification, we find that our method offers fast training and excellent predictive performance. We argue that the good predictive performance can be traced to the non-parametric nature of the resulting predictive distributions as well as to the cross-validation loss, which provides robustness against model mis-specification.
May-24-2021
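As a rough illustration of the kind of objective described in the abstract, the sketch below computes a leave-one-out cross-validation loss for GP regression in which each held-out point is conditioned only on its nearest neighbors. This is not the authors' implementation: the RBF kernel, the hyperparameters, and the function names (`rbf_kernel`, `loo_cv_nn_loss`) are illustrative assumptions, and the classification case (Pólya-Gamma augmentation, variational inference) is not covered here.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, outputscale=1.0):
    # Squared-exponential kernel matrix between two sets of inputs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return outputscale * np.exp(-0.5 * d2 / lengthscale**2)

def loo_cv_nn_loss(X, y, k=8, lengthscale=1.0, outputscale=1.0, noise=0.1):
    """Negative leave-one-out log predictive density, where each held-out
    point is conditioned only on its k nearest neighbors (hypothetical
    illustration of a cross-validation loss with nearest-neighbor truncation)."""
    n = X.shape[0]
    loss = 0.0
    for i in range(n):
        # Indices of the k nearest neighbors of x_i, excluding i itself.
        dists = ((X - X[i]) ** 2).sum(-1)
        dists[i] = np.inf
        nn = np.argsort(dists)[:k]

        # Posterior predictive p(y_i | y_nn) under a zero-mean GP.
        K_nn = rbf_kernel(X[nn], X[nn], lengthscale, outputscale) + noise * np.eye(k)
        k_i = rbf_kernel(X[i:i+1], X[nn], lengthscale, outputscale)[0]
        mean = k_i @ np.linalg.solve(K_nn, y[nn])
        var = outputscale + noise - k_i @ np.linalg.solve(K_nn, k_i)

        # Accumulate the negative Gaussian log-likelihood of the held-out y_i.
        loss += 0.5 * (np.log(2 * np.pi * var) + (y[i] - mean) ** 2 / var)
    return loss

# Toy usage: evaluate the loss on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
print(loo_cv_nn_loss(X, y, k=8))
```

In practice one would minimize such a loss with respect to the kernel hyperparameters (lengthscale, outputscale, noise), with the nearest-neighbor truncation keeping each term's cost independent of the full dataset size.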