Bag of Tricks for Neural Architecture Search
Elsken, Thomas, Staffler, Benedikt, Zela, Arber, Metzen, Jan Hendrik, Hutter, Frank
arXiv.org Artificial Intelligence
While neural architecture search methods have been successful in previous years and led to new state-of-the-art performance on various problems, they have also been criticized for being unstable, being highly sensitive with respect to

This allows searching for architectures by using alternating stochastic gradient descent, which (in each batch) alternates between updates of the network parameters and the real-valued weights parameterizing the architecture. However, directly using this alternating optimization has been reported to lead to premature convergence in the architectural space [26].
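The alternating scheme described above can be sketched on a toy problem. This is a minimal illustration, not the paper's implementation: the candidate operations, the variables `w` (network parameters) and `alpha` (architecture weights), and the use of numerical gradients are all assumptions made to keep the example self-contained.

```python
import numpy as np

# Toy "mixed operation": two candidate ops on a scalar input x,
# weighted by a softmax over architecture parameters alpha.
# All names here (w, alpha, the ops themselves) are illustrative.

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def forward(x, w, alpha):
    ops = np.array([w[0] * x, w[1] * x ** 2])  # two candidate operations
    return softmax(alpha) @ ops                # architecture-weighted mixture

def loss(x, y, w, alpha):
    return (forward(x, w, alpha) - y) ** 2

def num_grad(f, p, eps=1e-6):
    # Central-difference gradient, used here to avoid an autodiff dependency.
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p)
        d[i] = eps
        g[i] = (f(p + d) - f(p - d)) / (2 * eps)
    return g

x, y = 2.0, 3.0
w = np.array([0.1, 0.1])
alpha = np.zeros(2)
lr = 0.05
for step in range(200):
    # (1) gradient step on the network parameters w
    w -= lr * num_grad(lambda p: loss(x, y, p, alpha), w)
    # (2) gradient step on the real-valued architecture weights alpha
    alpha -= lr * num_grad(lambda p: loss(x, y, w, p), alpha)

print(loss(x, y, w, alpha))
```

In one-shot methods such as DARTS, step (1) would use a minibatch of training data and step (2) a minibatch of validation data; the premature-convergence issue noted in the text concerns the joint dynamics of exactly this kind of alternation.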
Jul-8-2021