NASA: Neural Architecture Search and Acceleration for Hardware Inspired Hybrid Networks
Shi, Huihong, You, Haoran, Zhao, Yang, Wang, Zhongfeng, Lin, Yingyan
Deep neural networks (DNNs) have powered solutions in numerous real-world applications. However, the multiplications used extensively in DNNs dominate their energy consumption and have largely limited DNNs' achievable hardware efficiency, motivating multiplication-free DNNs that adopt hardware-friendly operators, such as additions and bit-wise shifts, which require a smaller unit energy and area cost than multiplications [26]. In particular, pioneering works on multiplication-free DNNs include (1) DeepShift [6], which adopts merely shift layers for DNNs; (2) AdderNet [20], which advocates using adder layers to implement DNNs, trading the massive multiplications for lower-cost additions; and (3) ShiftAddNet [26], which combines both shift and adder layers to construct DNNs for a better trade-off between the achievable accuracy and hardware efficiency.

To this end, we propose a Neural Architecture Search and Acceleration framework dubbed NASA, which enables automated multiplication-reduced DNN development and integrates a dedicated multiplication-reduced accelerator to boost DNNs' achievable efficiency. Specifically, NASA adopts neural architecture search (NAS) spaces that augment the state-of-the-art one with hardware-inspired multiplication-free operators, such as shift and adder, armed with a novel progressive pretrain strategy (PGP) and customized training recipes to automatically search for optimal multiplication-reduced DNNs. On top of that, NASA develops a dedicated accelerator featuring a chunk-based template and an auto-mapper tailored to NASA-NAS resulting DNNs, to better leverage their algorithmic properties for boosting hardware efficiency.
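To make the shift and adder operators concrete, below is a minimal NumPy sketch, not the authors' implementation: the names adder_layer and shift_layer are hypothetical. It shows how an AdderNet-style layer replaces dot products with negative L1 distances (additions only) and how a DeepShift-style layer restricts weights to signed powers of two so each multiplication reduces to a bit shift and a sign flip in hardware.

    import numpy as np

    def adder_layer(x, w):
        """AdderNet-style fully connected layer: instead of the dot
        product sum(x * w), the output is the negative L1 distance
        between the input and each weight vector, so only additions
        and subtractions are needed.
        x: (batch, in_features), w: (out_features, in_features)."""
        # Broadcasting builds a (batch, out_features, in_features)
        # difference tensor; summing |x - w| needs no multiplications.
        return -np.abs(x[:, None, :] - w[None, :, :]).sum(axis=-1)

    def shift_layer(x, p, s):
        """DeepShift-style layer: each weight is s * 2**p with sign s
        in {-1, +1} and integer exponent p, so a multiply becomes a
        bit shift plus a sign flip in hardware. Emulated here in
        floating point for clarity.
        p, s: (out_features, in_features) exponents and signs."""
        w = s * np.exp2(p)   # effective power-of-two weights
        return x @ w.T       # emulation; hardware uses shifts, not MACs

    # Tiny usage example with random data.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 8))
    w = rng.standard_normal((3, 8))
    p = rng.integers(-3, 1, size=(3, 8))      # exponents in [-3, 0]
    s = rng.choice([-1.0, 1.0], size=(3, 8))  # signs
    print(adder_layer(x, w).shape, shift_layer(x, p, s).shape)  # (4, 3) (4, 3)

A hybrid network in the spirit of ShiftAddNet, or of the multiplication-reduced architectures NASA searches for, would mix such layers with conventional convolutions, which is what makes a dedicated chunk-based accelerator template attractive for mapping the heterogeneous operators.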
arXiv.org Artificial Intelligence
Dec-18-2022