Supercharging Hyperparameter Tuning with Dask
Hyperparameter tuning is a crucial, and often painful, part of building machine learning models. Squeezing every bit of performance out of your model may mean the difference of millions of dollars in ad revenue, or life and death for patients in healthcare models. Even if your model takes only one minute to train, you can end up waiting hours for a grid search to complete (think of a 10x10 grid, cross-validation, etc.). Each wait for a search to finish breaks your iteration cycle and increases the time it takes to produce value with your model. In this post, we will show how you can speed up your hyperparameter search by over 100x by replacing a few lines of your scikit-learn pipeline with Dask code on Saturn Cloud.
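To make the "few lines" claim concrete, here is a minimal sketch of the kind of scikit-learn grid search the post is about. The dataset, estimator, and parameter grid are hypothetical placeholders, not taken from the post; the comment at the end notes the Dask-based drop-in swap (dask-ml's `GridSearchCV`) that a distributed version would use.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Tiny illustrative dataset (hypothetical sizes, not from the post)
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# A small parameter grid; real searches (e.g. a 10x10 grid) are far larger
params = {"C": [0.1, 1.0, 10.0]}
search = GridSearchCV(LogisticRegression(max_iter=1000), params, cv=3)
search.fit(X, y)
print(search.best_params_)

# To distribute the same search across a Dask cluster, dask-ml offers a
# drop-in replacement for the search object:
#   from dask_ml.model_selection import GridSearchCV
```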
Jul-27-2020, 00:40:13 GMT