
Neural Information Processing Systems 

This paper proposes a new boosting method that represents a trade-off between online and offline learning. The main idea is to maintain a fixed-size reservoir of training examples from which to train the weak learners. At each boosting iteration, new examples are added to the reservoir, and a selection strategy then reduces the reservoir back to its fixed size before the weak learner is trained. Several naive selection strategies are proposed, but the main contribution of the paper is a more sophisticated strategy whose goal is to remove examples from the reservoir so that a weak learner trained on the reduced set minimizes the error computed on the full set before reduction. The resulting algorithm is evaluated on four computer vision datasets, where it outperforms several other online boosting methods. The idea of using a reservoir is original and very interesting.
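To make the summary above concrete, here is a minimal sketch of one round of the reservoir scheme as described. All names, the toy 1-D stump weak learner, and the greedy removal loop are illustrative assumptions on my part, not the paper's actual algorithm; the only property taken from the summary is the selection goal (drop examples so that a learner trained on the reduced reservoir still minimizes error on the full pre-reduction set).

```python
RESERVOIR_SIZE = 6  # fixed reservoir capacity (illustrative value)

def train_weak_learner(examples):
    # Toy weak learner: a threshold stump on 1-D inputs that predicts
    # +1 for x >= thr, with thr chosen to minimize training error.
    best = None
    for x, _ in examples:
        err = sum(1 for xi, yi in examples
                  if (1 if xi >= x else -1) != yi)
        if best is None or err < best[1]:
            best = (x, err)
    thr = best[0]
    return lambda x: 1 if x >= thr else -1

def error(h, examples):
    return sum(1 for x, y in examples if h(x) != y) / len(examples)

def select(reservoir):
    # Greedy selection (an assumed simplification): repeatedly drop the
    # example whose removal lets a weak learner trained on the remainder
    # best fit the *full* pre-reduction reservoir.
    full = list(reservoir)
    while len(reservoir) > RESERVOIR_SIZE:
        best_idx, best_err = 0, float("inf")
        for i in range(len(reservoir)):
            reduced = reservoir[:i] + reservoir[i + 1:]
            e = error(train_weak_learner(reduced), full)
            if e < best_err:
                best_idx, best_err = i, e
        reservoir = reservoir[:best_idx] + reservoir[best_idx + 1:]
    return reservoir

# One boosting round on a toy stream: new examples arrive, the reservoir
# is reduced back to its fixed size, then a weak learner is trained on it.
stream = [(x, 1 if x > 5 else -1) for x in range(10)]
reservoir = stream[:RESERVOIR_SIZE]
reservoir = select(reservoir + stream[RESERVOIR_SIZE:])
h = train_weak_learner(reservoir)
print(len(reservoir))        # reservoir is back at its fixed size
print(error(h, stream))      # error of the new weak learner on the stream
```

In this toy run, the greedy step discards examples far from the decision boundary first, since a stump trained without them still classifies the full reservoir correctly; a real implementation would presumably use the paper's weak learners and a less expensive selection criterion.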