Reviews: ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions

Neural Information Processing Systems 

The paper proposes channel-wise convolutions, which replace the dense connections between input and output feature maps with sparse connections implemented as 1-D convolutions along the channel dimension. This significantly reduces the number of parameters and FLOPs while maintaining high accuracy. The authors demonstrate this with results on ImageNet classification, comparing against VGG and MobileNet variants.

Strengths: The paper is well written and easy to follow. Background and related work (standard convolutional and fully-connected layers used in neural networks, and the MobileNet and ShuffleNet variants that reduce computation) are described in sufficient detail.
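To make the core idea concrete, here is a minimal NumPy sketch of a channel-wise convolution as I understand it from the paper: a single 1-D kernel slides along the channel axis of the feature maps, shared across all spatial positions, so the parameter count is the kernel size k rather than the C_in x C_out of a dense 1x1 convolution. The function name and 'valid' padding choice are my own illustration, not the authors' code.

```python
import numpy as np

def channel_wise_conv(x, kernel):
    """Slide a 1-D kernel along the channel axis of x
    (shape: channels x height x width), sharing the same
    kernel weights at every spatial location.
    With 'valid' padding the output has (C - k + 1) channels."""
    c, h, w = x.shape
    k = kernel.shape[0]
    out = np.empty((c - k + 1, h, w))
    for i in range(c - k + 1):
        # weighted sum over a window of k consecutive input channels
        out[i] = np.tensordot(kernel, x[i:i + k], axes=1)
    return out

# Example: 8 input channels, kernel size 3 -> 6 output channels,
# using only 3 parameters instead of the 8 * 6 a dense mixing would need.
x = np.random.rand(8, 4, 4)
y = channel_wise_conv(x, np.array([0.25, 0.5, 0.25]))
print(y.shape)  # (6, 4, 4)
```

This sparse connection pattern is what drives the parameter and FLOP savings the review refers to: each output channel looks at only k neighboring input channels instead of all of them.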