Review: NAT: Neural Architecture Transformer for Accurate and Compact Architectures

Neural Information Processing Systems

This paper proposes a novel search space for neural architecture post-processing that seeks to reduce the resource consumption of trained models without sacrificing performance. Following the author feedback, all reviewers scored this paper above the acceptance threshold. They also continue to highlight important areas for improvement, which I hope the authors will address in the camera-ready version.