Hyperparameter Search with Iterative Sweeps


I've spent several years reproducing and optimizing various deep learning models, primarily for computer vision and NLP, often with extremely short deadlines. My distilled high-level strategy for hyperparameter search is bounded exploration (try a wider range of values for fewer variables) and faster iteration (more short phases of exploration building on each other). I hope this overview of hyperparameter search helps you tune deep learning models a bit faster regardless of the framework or tools you use.

Hyperparameter search (also called tuning or optimization) is the task of finding the best hyperparameters for a learning algorithm. Such tuning could be done entirely by hand: run a controlled experiment (keep all hyperparameters constant except one), analyze the effect of the single value change, decide based on that which hyperparameter to change next, run the next experiment, and repeat.
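The iterative strategy above can be sketched in a few lines of Python. This is a minimal, framework-agnostic illustration, not a real training pipeline: `evaluate` is a hypothetical stand-in for training a model and returning a validation score, and the parameter names and value ranges are invented for the example. Each phase is a controlled experiment that varies one hyperparameter while holding the rest fixed, and later phases build on the best values found earlier.

```python
def evaluate(params):
    # Hypothetical stand-in for "train a model with these hyperparameters
    # and return a validation score to maximize". A real run would launch
    # an actual training job here.
    lr, batch_size = params["lr"], params["batch_size"]
    # Toy objective that peaks near lr=1e-3 and batch_size=64.
    return -((lr - 1e-3) ** 2) * 1e6 - ((batch_size - 64) / 64) ** 2

def sweep(base, name, values):
    """Controlled experiment: vary one hyperparameter, hold the rest fixed."""
    best_value, best_score = None, float("-inf")
    for v in values:
        params = {**base, name: v}          # all constant except `name`
        score = evaluate(params)
        if score > best_score:
            best_value, best_score = v, score
    return best_value

# Phase 1: wide sweep over the learning rate, everything else fixed.
base = {"lr": 1e-2, "batch_size": 32}
base["lr"] = sweep(base, "lr", [1e-4, 3e-4, 1e-3, 3e-3, 1e-2])

# Phase 2: build on phase 1's result and sweep the batch size.
base["batch_size"] = sweep(base, "batch_size", [16, 32, 64, 128])

print(base)  # -> {'lr': 0.001, 'batch_size': 64}
```

Note how the second sweep reuses the learning rate found in the first: that is the "short phases of exploration building on each other" idea, as opposed to one giant grid over every combination.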
