PAC-Bayes Bounds for the Risk of the Majority Vote and the Variance of the Gibbs Classifier

Alexandre Lacasse, François Laviolette, Mario Marchand, Pascal Germain, Nicolas Usunier

Neural Information Processing Systems 

We propose new PAC-Bayes bounds for the risk of the weighted majority vote that depend on the mean and variance of the error of its associated Gibbs classifier. We show that these bounds can be smaller than the risk of the Gibbs classifier and can even be arbitrarily close to zero when the risk of the Gibbs classifier is close to 1/2. Moreover, these bounds can be uniformly estimated on the training data for all possible posteriors Q, and they can be improved by using a large sample of unlabelled data.
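The role of the mean and variance can be illustrated with a one-sided Chebyshev (Cantelli) argument, which is the kind of reasoning such mean-and-variance bounds rest on; the sketch below is an illustration under these assumptions, not the paper's exact bound. Let W_Q(x, y) denote the Q-weighted fraction of voters that err on example (x, y), so that its expectation over the data-generating distribution is the Gibbs risk R(G_Q) and the majority vote B_Q can err only when W_Q \ge 1/2. If R(G_Q) < 1/2 and \sigma^2 denotes the variance of W_Q, then

R(B_Q) \;\le\; \Pr\!\big( W_Q \ge \tfrac{1}{2} \big) \;\le\; \frac{\sigma^2}{\sigma^2 + \big( \tfrac{1}{2} - R(G_Q) \big)^2}.

The right-hand side tends to zero as \sigma^2 \to 0, even when R(G_Q) is close to 1/2, which is why a bound involving both the mean and the variance of the Gibbs error can be much smaller than the Gibbs risk itself.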
