Small-footprint slimmable networks for keyword spotting
Akhtar, Zuhaib; Khursheed, Mohammad Omar; Du, Dongsu; Liu, Yuzong
In this work, we present Slimmable Neural Networks applied to the problem of small-footprint keyword spotting. We show that slimmable neural networks allow us to create super-nets from Convolutional Neural Networks and Transformers, from which sub-networks of different sizes can be extracted. We demonstrate the usefulness of these models on in-house voice assistant data and Google Speech Commands, and focus our efforts on models for the on-device use case, limiting ourselves to less than 250k parameters. We show that slimmable …

Dynamic neural networks are another paradigm in which the network dynamically adapts its computation graph and parameters to different inputs, permitting a tradeoff between accuracy and inference efficiency [3]. Another notable work, the Once-for-All (OFA) network, was proposed in [4]; it allows one to train a single super-network once and derive multiple sub-networks that meet different resource constraints. OFA also mitigates the large computational cost of conventional neural architecture search (NAS) by decoupling network training from the search.
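As a rough illustration of the weight-sharing idea behind such super-nets, the sketch below shows a linear layer whose active output width can be switched at run time, so sub-networks of different sizes reuse the same parameters. This is not the authors' implementation; the SlimmableLinear class and the width_mult attribute are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlimmableLinear(nn.Linear):
    """Linear layer that can run at a fraction of its full output width."""

    def __init__(self, in_features, out_features, width_mult=1.0):
        super().__init__(in_features, out_features)
        self.width_mult = width_mult  # active fraction of output units (assumed knob)

    def forward(self, x):
        out_dim = max(1, int(self.out_features * self.width_mult))
        # Smaller sub-networks reuse the leading rows of the shared super-net
        # weights, so no extra parameters are stored for the slim configurations.
        return F.linear(x, self.weight[:out_dim], self.bias[:out_dim])

layer = SlimmableLinear(64, 128)
x = torch.randn(8, 64)
for w in (0.25, 0.5, 1.0):   # switch which sub-network is "extracted"
    layer.width_mult = w
    print(layer(x).shape)    # torch.Size([8, 32]), [8, 64], [8, 128]
```

In a full slimmable network, downstream layers would also slice their input dimension to match the chosen width, and the shared weights would be trained jointly at several widths; the sketch only illustrates the parameter-sharing mechanism, not the CNN and Transformer backbones used in the paper.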
arXiv.org Artificial Intelligence
Apr-21-2023