carpedm20/ENAS-pytorch
ENAS reduces the computational requirement (GPU-hours) of Neural Architecture Search (NAS) by 1000x via parameter sharing between models that are sub-graphs within a large computational graph. More configurations can be found here.

Efficient Neural Architecture Search (ENAS) consists of two sets of learnable parameters: the controller LSTM parameters θ and the shared model parameters ω. The two sets are trained alternately, and only the trained controller is used to derive novel architectures. For each node in the sampled graph, the controller LSTM decides 1) which activation function to use and 2) which previous node to connect to.
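The two per-node decisions above can be sketched as follows. This is a minimal, hypothetical illustration, not the repo's actual controller: the real ENAS controller is an LSTM whose learned logits parameterize these choices, whereas here uniform random sampling stands in for it, and the names `ACTIVATIONS` and `sample_architecture` are made up for the example.

```python
import random

# Candidate activations per node (a common ENAS-style search space; illustrative).
ACTIVATIONS = ["tanh", "relu", "identity", "sigmoid"]

def sample_architecture(num_nodes, rng=None):
    """Sample a recurrent-cell DAG the way an ENAS-style controller would:
    for each node, pick one earlier node to connect to and one activation.
    (Random sampling stands in for the learned LSTM controller.)"""
    rng = rng or random.Random(0)
    arch = []
    for node in range(1, num_nodes + 1):
        prev = rng.randrange(node)      # decision 1: which previous node feeds this one
        act = rng.choice(ACTIVATIONS)   # decision 2: which activation to apply
        arch.append((prev, act))
    return arch

# Example: one sampled 4-node cell, e.g. [(0, 'relu'), (1, 'tanh'), ...]
cell = sample_architecture(4)
```

Every sampled architecture is a sub-graph of the full search space, so all of them can share the same weights ω, which is what makes the search cheap.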
Feb-16-2018, 10:49:31 GMT