Parameter-Free Online Learning via Model Selection
Foster, Dylan J., Kale, Satyen, Mohri, Mehryar, Sridharan, Karthik
We introduce an efficient algorithmic framework for model selection in online learning, also known as parameter-free online learning. Departing from previous work, which has focused on highly structured function classes such as nested balls in Hilbert space, we propose a generic meta-algorithm framework that achieves online model selection oracle inequalities under minimal structural assumptions. We give the first computationally efficient parameter-free algorithms that work in arbitrary Banach spaces under mild smoothness assumptions; previous results applied only to Hilbert spaces. We further derive new oracle inequalities for matrix classes, non-nested convex sets, and $\mathbb{R}^{d}$ with generic regularizers. Finally, we generalize these results by providing oracle inequalities for arbitrary non-linear classes in the online supervised learning model. These results are all derived through a unified meta-algorithm scheme using a novel "multi-scale" algorithm for prediction with expert advice based on random playout, which may be of independent interest.
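To convey the flavor of the multi-scale guarantee (an informal paraphrase; constants, lower-order terms, and the precise conditions are given in the paper), suppose expert $i$ suffers losses $g_{t,i} \in [-c_i, c_i]$ and is assigned prior weight $\pi_i$, and let $p_t$ denote the algorithm's distribution over experts at round $t$. The multi-scale experts algorithm guarantees, simultaneously for every expert $i$,
$$
\sum_{t=1}^{n} \langle p_t, g_t \rangle \;-\; \sum_{t=1}^{n} g_{t,i} \;\le\; O\!\left( c_i \sqrt{\,n \log\!\left( n\, c_i / \pi_i \right)\,} \right),
$$
so each expert's regret penalty scales with its own loss range $c_i$ rather than with the largest range $\max_i c_i$. Treating sub-algorithms for a nested sequence of classes $\mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots$ as the experts then yields oracle inequalities of the schematic form $\mathrm{Regret}_n(f) \le \mathrm{Reg}_k(n) + \tilde{O}(R_k \sqrt{n})$ for all $f \in \mathcal{F}_k$ and all $k$, where $\mathrm{Reg}_k(n)$ is the regret of the $k$-th sub-algorithm and $R_k$ is the scale of $\mathcal{F}_k$.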
Neural Information Processing Systems
Dec-31-2017
- Country:
  - Europe > United Kingdom > England (0.14)
  - North America > United States (0.28)
- Genre:
  - Instructional Material (0.35)
- Industry:
  - Education > Educational Setting > Online (0.83)