Quantile Propagation for Wasserstein-Approximate Gaussian Processes
Approximate inference techniques are the cornerstone of probabilistic methods based on Gaussian process priors. Despite this, most work approximately optimizes standard divergence measures such as the Kullback-Leibler (KL) divergence, which lack the basic desiderata for the task at hand, while chiefly offering merely technical convenience. We develop a new approximate inference method for Gaussian process models which overcomes the technical challenges arising from abandoning these convenient divergences. Our method---dubbed Quantile Propagation (QP)---is similar to expectation propagation (EP) but minimizes the $L_2$ Wasserstein distance (WD) instead of the KL divergence. The WD exhibits all the required properties of a distance metric, while respecting the geometry of the underlying sample space. We show that QP matches quantile functions rather than moments as in EP and has the same mean update but a smaller variance update than EP, thereby alleviating EP's tendency to over-estimate posterior variances. Crucially, despite the significant complexity of dealing with the WD, QP has the same favorable locality property as EP, and thereby admits an efficient algorithm. Experiments on classification and Poisson regression show that QP outperforms both EP and variational Bayes.
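The abstract's claim about quantile matching can be illustrated numerically in one dimension, where the $L_2$ WD between a distribution with quantile function $F^{-1}$ and a Gaussian has the closed form $\int_0^1 (F^{-1}(u) - \mu - \sigma\Phi^{-1}(u))^2\,du$. Minimizing over $(\mu,\sigma)$ gives $\mu = \int_0^1 F^{-1}(u)\,du$ (the mean, as in EP) and $\sigma = \int_0^1 F^{-1}(u)\Phi^{-1}(u)\,du$, which Cauchy-Schwarz bounds by the standard deviation that EP would match. The sketch below is not from the paper; the bimodal mixture target and grid resolution are illustrative choices standing in for a non-Gaussian tilted distribution:

```python
import numpy as np
from scipy.stats import norm

# Illustrative non-Gaussian target: a symmetric two-component Gaussian
# mixture (a stand-in for a tilted site distribution in EP/QP).
w, m1, s1, m2, s2 = 0.5, -2.0, 0.6, 2.0, 0.6

# Quantile function F^{-1} of the mixture, obtained by numerically
# inverting its CDF on a fine grid.
x = np.linspace(-8.0, 8.0, 20001)
cdf = w * norm.cdf(x, m1, s1) + (1 - w) * norm.cdf(x, m2, s2)
mask = (cdf > 1e-9) & (cdf < 1 - 1e-9)      # drop flat float tails
u = np.linspace(1e-6, 1 - 1e-6, 20001)
Finv = np.interp(u, cdf[mask], x[mask])

# EP-style moment matching: mean and standard deviation of the target.
mu_ep = Finv.mean()
sd_ep = np.sqrt(((Finv - mu_ep) ** 2).mean())

# QP-style L2 Wasserstein projection onto the Gaussian family:
# same mean, but sigma = E[F^{-1}(U) * Phi^{-1}(U)] with U ~ Uniform(0,1).
mu_qp = mu_ep
sd_qp = (Finv * norm.ppf(u)).mean()

# sd_qp is strictly smaller than sd_ep for any non-Gaussian target.
print(f"EP: mu={mu_ep:.3f}, sd={sd_ep:.3f}")
print(f"QP: mu={mu_qp:.3f}, sd={sd_qp:.3f}")
```

The gap between `sd_ep` and `sd_qp` widens the further the target is from Gaussian, which is one way to read the abstract's claim that QP alleviates EP's variance over-estimation.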
Review for NeurIPS paper: Quantile Propagation for Wasserstein-Approximate Gaussian Processes
Weaknesses: After reading the rebuttals and reviewer discussion, I realise that I was wrong about EP overestimating the variance and about the strength of the paper's empirical results, so I have decided to downgrade my score. I still believe this paper should be accepted, but I am less confident of that. Here are the points on which I changed my mind, toward a more critical view:

- Does EP really overestimate the posterior variance? EP should overestimate the *support* of distributions, because the forward KL covers all modes with a (unimodal) Gaussian. But this does not necessarily imply that the variance is overestimated, and locally the variance is matched exactly.