Hyper-parameter Tuning under a Budget Constraint

Zhiyun Lu, Chao-Kai Chiang, Fei Sha

arXiv.org, Machine Learning

Hyper-parameter tuning is of crucial importance to designing and deploying machine learning systems. Broadly, hyper-parameters include the architecture of the learning model, regularization parameters, optimization methods and their parameters, and other "knobs" to be tuned. It is challenging to explore the vast space of hyper-parameters efficiently and identify the optimal configuration. Quite a few approaches have been proposed and investigated: random search, Bayesian Optimization (BO) [30, 29], bandit-based Hyperband [17, 24], and meta-learning [5, 1, 10]. Many of these prior studies have focused on reducing as much as possible the computational cost of finding the optimal configuration. In this work, we take a different but important perspective on hyper-parameter optimization: given a fixed time/computation budget, how can we improve performance as much as possible?
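To make the budget-constrained setting concrete, the sketch below (not from the paper; the names train_and_eval, sample_config, and budget_seconds are hypothetical) shows the simplest anytime baseline: random search that keeps evaluating sampled configurations until a wall-clock budget is exhausted and returns the best one found. This is the regime the work aims to improve on with smarter strategies than pure random sampling.

    import random
    import time

    def random_search_under_budget(train_and_eval, sample_config, budget_seconds):
        # Sample and evaluate configurations until the wall-clock budget runs
        # out; return the best configuration seen so far (anytime behavior).
        deadline = time.monotonic() + budget_seconds
        best_config, best_score = None, float("-inf")
        while time.monotonic() < deadline:
            config = sample_config()
            score = train_and_eval(config)  # e.g., validation accuracy
            if score > best_score:
                best_config, best_score = config, score
        return best_config, best_score

    # Usage sketch: tune a regularization strength under a 5-second budget.
    def sample_config():
        return {"reg": 10 ** random.uniform(-5, 0)}

    def train_and_eval(config):
        # Stand-in objective; a real run would train and validate a model here.
        time.sleep(0.1)  # mimic the cost of a training run
        return -abs(config["reg"] - 0.01)

    best, score = random_search_under_budget(train_and_eval, sample_config, 5.0)
    print(best, score)

Under a hard budget, the quantity of interest is the best performance attainable within that budget rather than the cost of reaching the global optimum, which is why an anytime loop like this is the natural point of comparison.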
