Efficient Neural Architecture Search: A Broad Version

Ding, Zixiang, Chen, Yaran, Li, Nannan, Zhao, Dongbin, Chen, C. L. Philip

arXiv.org Machine Learning

Efficient Neural Architecture Search (ENAS) achieves novel efficiency for learning high-performance architectures via parameter sharing, but suffers from the slow propagation speed of a search model with deep topology. In this paper, we propose a Broad version of ENAS (BENAS) to solve this issue by learning a broad architecture, whose propagation speed is fast, with the reinforcement learning and parameter sharing used in ENAS, thereby achieving higher search efficiency. In particular, we elaborately design the Broad Convolutional Neural Network (BCNN), the search paradigm of BENAS, which obtains satisfactory performance with a broad topology, i.e., fast forward and backward propagation. The proposed BCNN extracts multi-scale features and enhancement representations and feeds them into a global average pooling layer to yield more reasonable and comprehensive representations, so that BCNN delivers promising performance despite its shallow topology. To verify the effectiveness of BENAS, several experiments are performed; the results show that 1) BENAS completes the search in 0.23 days, 2x less expensive than ENAS; 2) small-size BCNNs built from the architecture learned by BENAS, with 0.5 and 1.1 million parameters, obtain state-of-the-art performance of 3.63% and 3.40% test error on CIFAR-10; and 3) a BCNN based on the learned architecture achieves 25.3% top-1 error on ImageNet using only 3.9 million parameters.

1 Introduction

Recently, Neural Architecture Search (NAS) [25], which automates the process of model design, has been gaining ground. However, early approaches [20, 25, 26] suffer from inefficiency. To solve this issue, several one-shot approaches [1, 6, 11, 14, 19] have been proposed.
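The fusion step the abstract describes — pooling multi-scale feature maps into a single comprehensive representation via global average pooling — can be sketched as follows. This is a minimal pure-Python illustration of the general idea, not the paper's implementation; the function names and the list-of-lists tensor layout are our own assumptions.

```python
def global_avg_pool(feature_map):
    """Average each channel of a feature map down to a single scalar.

    feature_map: a list of channels, each channel a 2D list of floats.
    Returns a vector with one value per channel.
    """
    return [
        sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
        for channel in feature_map
    ]

def fuse_multiscale(feature_maps):
    """Concatenate the pooled vectors from feature maps at different scales.

    Spatial sizes may differ across scales; pooling removes that mismatch,
    so the fused vector simply has one entry per channel per scale.
    """
    fused = []
    for fm in feature_maps:
        fused.extend(global_avg_pool(fm))
    return fused
```

For example, pooling a 2-channel 4x4 map together with a 3-channel 2x2 map yields a fused 5-dimensional vector, regardless of the differing spatial resolutions — which is why the shallow, broad topology can still feed one classifier head.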