Direct 0-1 Loss Minimization and Margin Maximization with Boosting
Neural Information Processing Systems
We propose DirectBoost, a boosting method based on a greedy coordinate descent algorithm that builds an ensemble of weak classifiers by directly minimizing the empirical classification error over the labeled training examples. Once the training classification error reaches a coordinatewise local minimum, DirectBoost runs a greedy coordinate ascent algorithm that continues to add weak classifiers so as to maximize any targeted margin, until it reaches a coordinatewise local maximum of that margin in a certain sense.
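To make the two-phase greedy coordinate procedure concrete, here is a minimal sketch. It is an illustration under stated assumptions, not the paper's exact algorithm: the function names are hypothetical, updates take a fixed unit step along one coordinate per round, and the targeted margin is taken to be the minimum normalized sample margin (one of several margins the abstract allows).

```python
import numpy as np


def direct_boost_sketch(H, y, max_rounds=200, step=1.0):
    """Hedged sketch of a two-phase greedy coordinate procedure
    in the spirit of DirectBoost (names and the unit-step rule
    are assumptions, not the paper's exact method).

    H : (n_samples, n_learners) array of weak-learner outputs in {-1, +1}
    y : (n_samples,) array of labels in {-1, +1}
    """
    n, m = H.shape
    alpha = np.zeros(m)      # coordinate (weak-learner) weights
    score = np.zeros(n)      # ensemble score f(x_i) = sum_j alpha_j h_j(x_i)

    def zero_one_error(s):
        # Empirical 0-1 classification error of the current ensemble.
        return np.mean(y * s <= 0)

    def min_margin(s, a):
        # Minimum sample margin, normalized by the total coordinate weight.
        w = a.sum()
        return np.min(y * s) / w if w > 0 else -np.inf

    # Phase 1: greedy coordinate descent on the 0-1 training error.
    for _ in range(max_rounds):
        errs = np.array([zero_one_error(score + step * H[:, j])
                         for j in range(m)])
        j = int(errs.argmin())
        if errs[j] >= zero_one_error(score):
            break            # coordinatewise local minimum reached
        alpha[j] += step
        score += step * H[:, j]

    # Phase 2: greedy coordinate ascent on the targeted margin.
    for _ in range(max_rounds):
        gains = np.array([
            min_margin(score + step * H[:, j],
                       alpha + step * (np.arange(m) == j))
            for j in range(m)
        ])
        j = int(gains.argmax())
        if gains[j] <= min_margin(score, alpha):
            break            # coordinatewise local maximum reached
        alpha[j] += step
        score += step * H[:, j]

    return alpha
```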