
Pelee: A Real-Time Object Detection System on Mobile Devices

Neural Information Processing Systems

The increasing need to run Convolutional Neural Network (CNN) models on mobile devices with limited computing power and memory resources has encouraged studies on efficient model design. A number of efficient architectures have been proposed in recent years, for example, MobileNet, ShuffleNet, and MobileNetV2. However, all these models are heavily dependent on depthwise separable convolution, which lacks an efficient implementation in most deep learning frameworks. In this study, we propose an efficient architecture named PeleeNet, which is built with conventional convolution instead.
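As a rough illustration of why depthwise separable convolution is attractive on paper, the parameter counts of the two designs can be compared directly. This is a sketch; the function names and channel sizes are illustrative, not taken from the paper:

```python
# Illustrative parameter counts: standard 3x3 convolution versus a
# depthwise separable one (depthwise 3x3 followed by pointwise 1x1).
def conv_params(c_in, c_out, k=3):
    # standard convolution: one k x k filter per (input, output) channel pair
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k=3):
    # depthwise: one k x k filter per input channel; pointwise: 1x1 channel mixing
    return c_in * k * k + c_in * c_out

c_in, c_out = 128, 256
std = conv_params(c_in, c_out)                  # 294912
sep = depthwise_separable_params(c_in, c_out)   # 33920
print(std, sep, round(std / sep, 1))            # roughly an 8.7x reduction
```

The parameter (and FLOP) savings are real, but as the abstract notes, the grouped memory-access pattern of the depthwise step is often poorly optimized in practice, which motivates PeleeNet's use of conventional convolutions.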


PrivCirNet: Efficient Private Inference via Block Circulant Transformation

Neural Information Processing Systems

Homomorphic encryption (HE)-based deep neural network (DNN) inference protects data and model privacy but suffers from significant computation overhead. We observe that transforming the DNN weights into circulant matrices converts general matrix-vector multiplications into HE-friendly 1-dimensional convolutions, drastically reducing the HE computation cost.
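The algebraic fact behind this observation can be checked numerically: a circulant matrix-vector product is exactly a 1-D circular convolution, which can also be computed with an FFT. This is a minimal sketch of that identity, not the paper's HE protocol:

```python
import numpy as np

def circulant(c):
    """Build a circulant matrix whose column j is c rotated down by j."""
    n = len(c)
    return np.column_stack([np.roll(c, j) for j in range(n)])

rng = np.random.default_rng(0)
c = rng.standard_normal(8)  # first column defines the circulant weight block
x = rng.standard_normal(8)

# General matvec with the circulant matrix...
direct = circulant(c) @ x
# ...equals circular convolution of c and x, computable via FFT.
via_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real

assert np.allclose(direct, via_fft)
```

Because C[i, j] = c[(i - j) mod n], the matvec (Cx)_i = sum_j c[(i - j) mod n] x_j is precisely the circular-convolution sum, which is why circulant weight blocks map naturally onto 1-D convolution primitives.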


on ResNet-50 and by 7.3% on MobileNetV2

Neural Information Processing Systems

Our gains are indeed large. EvoNorm-S0 is the state-of-the-art in the small batch size regime (Table 4), outperforming BN-ReLU by 7.8%. We achieve clear gains over other influential works such as GroupNorm (GN). We'd also like to emphasize that EvoNorms beat BN-ReLU on 12 (out of 14) different classification models/training setups. These results are significant considering the predominance of BN-ReLU in ML models.

R3: "the overall search algorithm lacks some novelty." We hope the work is not read as "yet another AutoML paper" (with the expectation that some fancy search algorithm must be proposed), but rather judged on its empirical findings.

R2, R4: Can EvoNorms generalize to deeper variants (e.g., ResNet-101) and architecture families not included in the search? MnasNet, EfficientNet-B5, Mask R-CNN + FPN/SpineNet, and BigGAN: none of them was used during search.
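For context, EvoNorm-S0 combines a Swish-like numerator, x * sigmoid(v * x), with a grouped standard deviation in the denominator, followed by an affine transform. The following NumPy sketch is a hedged reconstruction for NHWC tensors; details such as the epsilon placement and parameter shapes are assumptions, not taken from this response:

```python
import numpy as np

def evonorm_s0(x, v, gamma, beta, groups=32, eps=1e-5):
    """Sketch of EvoNorm-S0: x * sigmoid(v*x) / group_std(x) * gamma + beta."""
    n, h, w, c = x.shape
    g = x.reshape(n, h, w, groups, c // groups)
    # standard deviation over spatial dims and channels within each group
    std = np.sqrt(g.var(axis=(1, 2, 4), keepdims=True) + eps)
    std = np.broadcast_to(std, g.shape).reshape(n, h, w, c)
    num = x * (1.0 / (1.0 + np.exp(-v * x)))  # x * sigmoid(v * x)
    return num / std * gamma + beta

x = np.random.default_rng(1).standard_normal((2, 4, 4, 64)).astype(np.float32)
y = evonorm_s0(x,
               v=np.ones(64, np.float32),
               gamma=np.ones(64, np.float32),
               beta=np.zeros(64, np.float32))
print(y.shape)  # (2, 4, 4, 64)
```

Note that, unlike BN-ReLU, this layer uses only per-sample statistics, which is consistent with the strong small-batch results the response highlights.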