Paraphrasing Complex Network: Network Compression via Factor Transfer

Jangho Kim, Seonguk Park, Nojun Kwak

Neural Information Processing Systems 

Many researchers have sought model compression methods to reduce the size of a deep neural network (DNN) with minimal performance degradation, so that DNNs can be used in embedded systems. Among the model compression methods, knowledge transfer trains a student network with the help of a stronger teacher network. In this paper, we propose a novel knowledge transfer method which uses convolutional operations to paraphrase the teacher's knowledge and to translate it for the student.
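The core quantity in factor transfer is a distance between the teacher's and the student's extracted "factors" after each is L2-normalized, so that only the direction of the factor matters, not its scale. A minimal NumPy sketch of such a normalized factor-matching loss is shown below; the function name `factor_transfer_loss` and the toy inputs are illustrative assumptions, not code from the paper.

```python
import numpy as np

def factor_transfer_loss(teacher_factor, student_factor, p=1):
    """Mean p-norm distance between L2-normalized, flattened factors.

    teacher_factor / student_factor: arrays of matching size, e.g. the
    outputs of a paraphraser (teacher side) and translator (student side).
    """
    ft = teacher_factor.reshape(-1)
    fs = student_factor.reshape(-1)
    # Normalize each factor so only its direction is compared.
    ft = ft / np.linalg.norm(ft)
    fs = fs / np.linalg.norm(fs)
    return np.mean(np.abs(ft - fs) ** p)

# Scaled copies of the same factor incur zero loss after normalization.
t = np.array([[1.0, 2.0], [3.0, 4.0]])
print(factor_transfer_loss(t, 2.0 * t))  # → 0.0
```

In the paper's setting this loss would be added to the student's ordinary task loss, with the paraphraser pretrained as an autoencoder on the teacher's feature maps; the sketch above only shows the matching term itself.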
