PAC-Bayes Tree: weighted subtrees with guarantees
Neural Information Processing Systems
We present a weighted-majority classification approach over subtrees of a fixed tree that provably achieves excess risk of the same order as the best pruning of the tree. Furthermore, the computational efficiency of pruning is maintained at both training and testing time, despite aggregating over an exponential number of subtrees. We believe this is the first subtree-aggregation approach with such guarantees. The guarantees are obtained via a simple combination of insights from PAC-Bayes theory, which we believe should be of independent interest: it generically implies consistency of weighted-voting classifiers with respect to the Bayes classifier, whereas usual PAC-Bayes approaches only establish consistency of Gibbs classifiers.
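To illustrate why aggregating over exponentially many subtrees can be tractable, here is a hypothetical sketch (not the paper's exact algorithm) in the spirit of context-tree weighting: if each internal node carries a weight on "prune here" and a pruning's weight is the product of its nodes' choices, then the mixture prediction at a query point telescopes along that point's root-to-leaf path. All names (`Node`, `predict`, `w`, `feature_threshold`) are illustrative assumptions.

```python
# Hypothetical sketch: exact weighted-majority prediction over all prunings
# of a fixed binary tree, computed in O(depth) per query rather than by
# enumerating the exponentially many prunings.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    probs: List[float]                        # class-probability estimate at this node
    w: float = 1.0                            # weight on "prune here" (a leaf always has w = 1)
    feature_threshold: Optional[float] = None # 1-D threshold split, for illustration
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def predict(root: Node, x: float) -> List[float]:
    """Mixture prediction over all prunings with product weights.

    A pruning that cuts at node u on x's root-to-leaf path has total
    weight w_u * prod over ancestors a of (1 - w_a); prunings that cut
    off the path do not change the prediction at x, so the sum over all
    prunings telescopes along the path.
    """
    k = len(root.probs)
    out = [0.0] * k
    keep = 1.0  # mass of prunings that have not yet cut the path
    node = root
    while node is not None:
        is_leaf = node.left is None and node.right is None
        w = 1.0 if is_leaf else node.w
        for i in range(k):
            out[i] += keep * w * node.probs[i]
        if is_leaf:
            break
        keep *= 1.0 - w
        node = node.left if x <= node.feature_threshold else node.right
    return out
```

For example, a depth-1 tree whose root has prune weight 0.25 blends the root's prediction (mass 0.25) with the reached leaf's prediction (mass 0.75), exactly as the two possible prunings would vote.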