AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods
Robert M. Freund, Paul Grigas, Rahul Mazumder
Boosting methods are highly popular and effective supervised learning techniques that combine weak learners into a single, accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FS$_\varepsilon$), by establishing their precise connections to the Mirror Descent algorithm, a first-order method in convex optimization. As a consequence of these connections, we obtain novel computational guarantees for these boosting methods. In particular, we characterize convergence bounds for AdaBoost, with respect to both the margin and the log-exponential loss function, for any step-size sequence. Furthermore, this paper presents, for the first time, precise computational complexity results for FS$_\varepsilon$.
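For context, a minimal sketch of Incremental Forward Stagewise Regression as it is usually described in the statistical learning literature (e.g., Hastie, Tibshirani, and Friedman): at each iteration, find the feature most correlated with the current residual and move its coefficient by a small fixed step $\varepsilon$ in the sign of that correlation. This is a generic textbook rendering, not the paper's own pseudocode, and the parameter names `eps` and `n_iter` are illustrative.

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, n_iter=1000):
    """Sketch of Incremental Forward Stagewise Regression (FS_eps).

    X: (n, p) design matrix; y: (n,) response vector.
    Each iteration takes a small step of size eps on the coefficient
    of the feature most correlated with the current residual.
    """
    n, p = X.shape
    beta = np.zeros(p)
    residual = y.astype(float).copy()
    for _ in range(n_iter):
        corr = X.T @ residual            # correlation of each feature with the residual
        j = np.argmax(np.abs(corr))      # index of the most correlated feature
        step = eps * np.sign(corr[j])    # small step in the sign of that correlation
        beta[j] += step
        residual -= step * X[:, j]       # update the residual accordingly
    return beta
```

The small, constant step size is what distinguishes FS$_\varepsilon$ from greedy methods that fully fit the selected feature at each step; the paper's contribution is to give precise computational complexity results for this scheme via its connection to Mirror Descent.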
Jul-3-2013