A Comparative Study of Hyperparameter Tuning Methods
Subhasis Dasgupta, Jaydip Sen
arXiv.org Artificial Intelligence
The study emphasizes the challenge of finding the optimal trade-off between bias and variance, especially as hyperparameter optimization grows in complexity. Through empirical analysis, three hyperparameter tuning algorithms are evaluated across regression and classification tasks: Tree-structured Parzen Estimator (TPE), Genetic Search, and Random Search. The results show that nonlinear models with properly tuned hyperparameters significantly outperform linear models. Interestingly, Random Search excelled in regression tasks, while TPE was more effective for classification tasks. This suggests that there is no one-size-fits-all solution: different algorithms perform better depending on the task and model type. The findings underscore the importance of selecting the appropriate tuning method and highlight the computational challenges involved in optimizing machine learning models, particularly as search spaces expand.
Aug-29-2024
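To make the comparison concrete, here is a minimal sketch of Random Search, one of the three tuning methods the abstract evaluates. The objective function, hyperparameter names (`lr`, `depth`), and search ranges are hypothetical illustrations, not taken from the paper.

```python
import random

def objective(params):
    # Hypothetical "validation loss" surface, minimized at lr=0.1, depth=6.
    # In practice this would train a model and return a validation metric.
    return (params["lr"] - 0.1) ** 2 + (params["depth"] - 6) ** 2 / 100

def random_search(n_trials, seed=0):
    """Sample hyperparameters uniformly at random and keep the best trial."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            "lr": rng.uniform(1e-4, 1.0),  # continuous hyperparameter
            "depth": rng.randint(1, 12),   # discrete hyperparameter
        }
        loss = objective(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

best, loss = random_search(n_trials=200)
```

Unlike TPE, which fits a probabilistic model of promising regions from past trials, Random Search treats every trial independently; this simplicity is one reason its relative performance can vary by task, as the study observes.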