CHOPT : Automated Hyperparameter Optimization Framework for Cloud-Based Machine Learning Platforms

Jinwoong Kim, Minkyu Kim, Heungseok Park, Ernar Kusdavletov, Dongjun Lee, Adrian Kim, Ji-Hoon Kim, Jung-Woo Ha, Nako Sung

arXiv.org Machine Learning 

Deep neural networks (DNNs) have become an essential method for solving difficult tasks in computer vision, signal processing, and natural language processing (He et al., 2016; Choi et al., 2018; Han et al., 2017; Van Den Oord et al., 2016; Seo et al., 2016; Vaswani et al., 2017). As the capabilities of deep learning have expanded with more modular architectures and advanced optimization methods, the number of hyperparameters has generally increased. This growth in the number of hyperparameters makes it more difficult for a researcher to optimize a model, wasting substantial human effort and potentially leading to unfair comparisons. This reinforces the importance of efficient automated hyperparameter tuning methods and interfaces. To address this problem, several hyperparameter optimization (HyperOpt) methods have been proposed (Jaderberg et al., 2017; Falkner et al., 2018; Li et al., 2017). These methods offer advantages such as strong final performance, parallelism, and early stopping, which significantly improve both computing resource efficiency and optimization time.
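To illustrate the kind of early-stopping HyperOpt strategy the abstract cites (e.g. successive halving, a building block of Hyperband in Li et al., 2017), here is a minimal sketch. The objective function, configuration space, and parameter names are hypothetical stand-ins; a real platform would train a DNN at each budget instead of calling a toy function, and this is not CHOPT's actual implementation.

```python
import random

def evaluate(config, budget):
    # Toy objective standing in for validation accuracy after `budget`
    # epochs of training; peaks at lr = 0.1 (hypothetical example).
    return 1.0 - (config["lr"] - 0.1) ** 2 + 0.01 * budget

def successive_halving(n_configs=8, min_budget=1, eta=2):
    # Random search: sample candidate configurations uniformly.
    configs = [{"lr": random.uniform(0.0, 0.5)} for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        # Evaluate every surviving configuration at the current budget.
        scores = [(evaluate(c, budget), c) for c in configs]
        scores.sort(key=lambda pair: pair[0], reverse=True)
        # Early stopping: keep only the top 1/eta, then grow the budget,
        # so poor configurations stop consuming compute early.
        configs = [c for _, c in scores[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

best = successive_halving()
```

Because each survivor round can evaluate its configurations independently, this loop also parallelizes naturally across workers, which is the property cloud platforms exploit.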
