Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges
Bernd Bischl, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, Theresa Ullmann, Marc Becker, Anne-Laure Boulesteix, Difan Deng, Marius Lindauer
Most machine learning algorithms are configured by one or several hyperparameters that must be carefully chosen and that often considerably impact performance. To avoid a time-consuming and unreproducible manual trial-and-error process for finding well-performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods, e.g., based on resampling error estimation for supervised machine learning, can be employed. After introducing HPO from a general perspective, this paper reviews important HPO methods such as grid or random search, evolutionary algorithms, Bayesian optimization, Hyperband, and racing. It gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, how to combine HPO with ML pipelines, runtime improvements, and parallelization.
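As a minimal, hedged illustration of the basic HPO loop the abstract describes (sampling candidate configurations and scoring each by resampling-based error estimation), the sketch below runs a random search over an SVM's hyperparameters with 5-fold cross-validation using scikit-learn's RandomizedSearchCV. The dataset, estimator, and search-space bounds are illustrative choices and are not taken from the paper.

```python
# Sketch only: random-search HPO with cross-validated resampling error estimation.
from scipy.stats import loguniform
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset, not from the paper

# Search space: log-uniform priors over the SVM's C and gamma (assumed bounds).
param_distributions = {
    "C": loguniform(1e-3, 1e3),
    "gamma": loguniform(1e-4, 1e1),
}

# Each of the 25 randomly sampled configurations is evaluated by 5-fold cross-validation,
# i.e., its performance is estimated via resampling, as described in the abstract.
search = RandomizedSearchCV(
    SVC(),
    param_distributions=param_distributions,
    n_iter=25,
    cv=5,
    scoring="accuracy",
    random_state=0,
)
search.fit(X, y)

print("best configuration:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```

The same pattern generalizes to the other HPO methods the paper reviews (e.g., Bayesian optimization or Hyperband) by swapping the sampling strategy while keeping the resampling-based evaluation of each candidate configuration.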
Jul-14-2021
- Country:
  - Europe (1.00)
  - North America > United States
- Genre:
  - Overview (1.00)
  - Research Report > Experimental Study (0.92)
- Industry:
  - Education (0.92)
  - Energy > Oil & Gas (0.92)
  - Health & Medicine > Pharmaceuticals & Biotechnology (1.00)
- Technology:
  - Information Technology > Artificial Intelligence
    - Machine Learning
      - Evolutionary Systems (1.00)
      - Neural Networks > Deep Learning (0.93)
      - Performance Analysis > Accuracy (1.00)
      - Statistical Learning > Regression (0.92)
    - Representation & Reasoning
      - Expert Systems (0.92)
      - Optimization (1.00)
      - Search (0.90)