Tight Margin-Based Generalization Bounds for Voting Classifiers over Finite Hypothesis Sets
Kasper Green Larsen, Natascha Schalburg
arXiv.org Artificial Intelligence
Ensemble learning is a powerful machine learning tool: it enables us to transform weak learners (hypothesis classes that are barely better than random guessing) into learners with state-of-the-art performance. In essence, ensemble methods take a set of base classifiers, weigh those classifiers according to their performance on the training set, and obtain the final prediction by aggregating according to those weights. An important historical example is AdaBoost (Freund and Schapire [1997]), a type of voting classifier that builds the ensemble sequentially: new base classifiers are added to correct the mistakes of the current ensemble. AdaBoost was the first efficient and practical implementation of a boosting algorithm, and hence the relevance of ensemble learners is often attributed to it. Much theoretical research has been devoted to explaining the impressive practical performance of AdaBoost and other ensemble methods.
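The voting scheme described above (base classifiers weighted by training performance, with the final prediction taken as a weighted vote) can be sketched as a minimal AdaBoost over 1-D threshold stumps. This is an illustrative sketch, not the paper's construction: the stump learner, round count, and function names are all choices made here for brevity.

```python
import numpy as np

def stump_predict(x, thresh, polarity):
    # A decision stump: predict +1/-1 by thresholding a 1-D feature.
    return polarity * np.where(x >= thresh, 1.0, -1.0)

def adaboost(x, y, rounds=10):
    """Minimal AdaBoost sketch: sequentially add the stump with lowest
    weighted error, then reweight examples toward current mistakes."""
    n = len(x)
    w = np.full(n, 1.0 / n)            # distribution D_t over examples
    ensemble = []                      # list of (alpha, thresh, polarity)
    for _ in range(rounds):
        # Weak learner: exhaustively pick the best stump under weights w.
        best = None
        for t in np.unique(x):
            for p in (1.0, -1.0):
                err = np.sum(w[stump_predict(x, t, p) != y])
                if best is None or err < best[0]:
                    best = (err, t, p)
        err, t, p = best
        if err >= 0.5:
            break                      # no stump beats random guessing
        err = max(err, 1e-10)          # avoid division by zero below
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this classifier
        pred = stump_predict(x, t, p)
        w *= np.exp(-alpha * y * pred)          # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, t, p))
    return ensemble

def predict(ensemble, x):
    # Final prediction: sign of the alpha-weighted vote.
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return np.sign(score)
```

The classifier weight alpha_t = (1/2) ln((1 - err_t) / err_t) is the standard AdaBoost choice: more accurate base classifiers get a larger say in the vote, and the exponential reweighting forces later rounds to focus on the examples the current ensemble still gets wrong.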
Nov-26-2025