Paraphrasing Complex Network: Network Compression via Factor Transfer
Jangho Kim, Seonguk Park, Nojun Kwak
Neural Information Processing Systems
Many researchers have sought model compression methods that reduce the size of a deep neural network (DNN) with minimal performance degradation, so that DNNs can be deployed in embedded systems. Among these methods, knowledge transfer trains a student network with the help of a stronger teacher network. In this paper, we propose a novel knowledge transfer method that uses convolutional operations to paraphrase the teacher's knowledge and to
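As a rough illustration of the knowledge-transfer idea described in the abstract (a minimal sketch, not the paper's exact paraphraser/translator architecture), the core objective can be viewed as matching normalized "factors" extracted from teacher and student feature maps under a p-norm. The array shapes, helper names, and the simple L2 normalization below are assumptions for illustration only:

```python
import numpy as np

def normalize(factor):
    # Flatten the factor and L2-normalize it; the epsilon is an
    # assumed guard against division by zero.
    flat = factor.reshape(-1)
    return flat / (np.linalg.norm(flat) + 1e-8)

def factor_transfer_loss(teacher_factor, student_factor, p=1):
    # p-norm distance between the normalized teacher and student
    # factors; p=1 is one common choice for this kind of loss.
    diff = normalize(teacher_factor) - normalize(student_factor)
    return float(np.sum(np.abs(diff) ** p))

# Toy arrays standing in for the outputs of the teacher-side
# paraphraser and the student-side translator (hypothetical shapes).
rng = np.random.default_rng(0)
teacher_factor = rng.standard_normal((8, 4, 4))
student_factor = rng.standard_normal((8, 4, 4))
loss = factor_transfer_loss(teacher_factor, student_factor)
```

In a real training loop the student would minimize this loss (alongside its task loss), pulling its translated factors toward the teacher's paraphrased factors.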