PAC-Bayesian Theory Meets Bayesian Inference
Pascal Germain, Francis Bach, Alexandre Lacoste, Simon Lacoste-Julien
Neural Information Processing Systems
We exhibit a strong link between frequentist PAC-Bayesian bounds and the Bayesian marginal likelihood. Specifically, for the negative log-likelihood loss function, we show that minimizing PAC-Bayesian generalization bounds maximizes the Bayesian marginal likelihood. This provides an alternative explanation of the Bayesian Occam's razor criterion, under the assumption that the data is generated i.i.d. from some distribution. Moreover, as the negative log-likelihood is an unbounded loss function, we motivate and propose a PAC-Bayesian theorem tailored to the sub-gamma loss family, and we show that our approach is sound on classical Bayesian linear regression tasks.
Dec-31-2016
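The marginal likelihood that the paper connects to PAC-Bayesian bounds has a well-known closed form in the Bayesian linear regression setting used in the experiments. The sketch below computes that log evidence for a zero-mean Gaussian prior and Gaussian noise; the function name, the prior/noise parameterization, and the synthetic data are illustrative assumptions, not the paper's code.

```python
import numpy as np

def log_marginal_likelihood(X, y, prior_var=1.0, noise_var=0.1):
    """Log evidence of Bayesian linear regression with a
    N(0, prior_var * I) prior on weights and N(0, noise_var) noise.

    Marginalizing the weights gives y | X ~ N(0, K) with
    K = prior_var * X X^T + noise_var * I, so the log evidence is
    the log density of y under that multivariate Gaussian.
    """
    n = X.shape[0]
    K = prior_var * (X @ X.T) + noise_var * np.eye(n)
    _, logdet = np.linalg.slogdet(K)  # stable log-determinant
    quad = y @ np.linalg.solve(K, y)  # y^T K^{-1} y without explicit inverse
    return -0.5 * (quad + logdet + n * np.log(2.0 * np.pi))

# Synthetic data (hypothetical): true noise standard deviation 0.1.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

# The evidence prefers the well-specified noise level over a grossly
# misspecified one -- the Occam's razor effect the abstract refers to.
well_specified = log_marginal_likelihood(X, y, noise_var=0.01)
misspecified = log_marginal_likelihood(X, y, noise_var=10.0)
```

Maximizing this quantity over hyperparameters is the model-selection step that, per the paper's result, corresponds to minimizing a PAC-Bayesian bound for the negative log-likelihood loss.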