Traditional and accelerated gradient descent for neural architecture search
Nicolas Garcia Trillos, Felix Morales, Javier Morales
In this paper, we introduce two algorithms for neural architecture search (NASGD and NASAGD), building on theoretical work by two of the authors [4] that introduced the conceptual basis for new notions of traditional and accelerated gradient descent for optimizing a function over a semi-discrete space, using ideas from optimal transport theory. Our methods, which use the network morphism framework introduced in [3] as a baseline, can analyze forty times as many architectures as the hill-climbing methods of [3, 11] while using the same computational resources and time, and they achieve comparable levels of accuracy.
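The abstract contrasts traditional and accelerated gradient descent. As generic background only (this is not the paper's semi-discrete, optimal-transport-based method), a minimal sketch of the two update rules on a smooth function might look as follows; the function names and hyperparameters here are illustrative assumptions:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    # Traditional gradient descent: x_{k+1} = x_k - lr * grad(x_k)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def nesterov_accelerated(grad, x0, lr=0.1, momentum=0.9, steps=100):
    # Nesterov-style acceleration: take the gradient at a look-ahead point
    # x + momentum * v, then update the velocity and the iterate.
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = momentum * v - lr * grad(x + momentum * v)
        x = x + v
    return x

# Example: minimize f(x) = 0.5 * ||x||^2, whose gradient is x;
# both iterations should approach the minimizer at the origin.
grad = lambda x: x
x_gd = gradient_descent(grad, [5.0, -3.0])
x_nag = nesterov_accelerated(grad, [5.0, -3.0])
```

The paper's contribution lies in extending such continuous-space updates to the semi-discrete setting of architecture search, where the search space mixes discrete graph structure with continuous weights.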
Jul-2-2020