Unified Optimization Framework
Reviews: Model Compression with Adversarial Robustness: A Unified Optimization Framework
This paper formulates a new problem and proposes a reasonable algorithm. However, I am not fully convinced that the current joint optimization is significantly better than a two-step approach of (1) first performing adversarial training and then (2) compressing the model by pruning. As shown in Figure 2, the performance of the proposed method is almost the same as the two-step approach (adversarial training followed by pruning). Moreover, the current paper could be improved in the following aspects:
- Eq. (3): what about the non-convolutional layers?
- I wish to see results with other adversarial training methods.
- Is it possible to provide experiments for larger neural networks?
GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework
Wang, Haotao, Gui, Shupeng, Yang, Haichuan, Liu, Ji, Wang, Zhangyang
Generative adversarial networks (GANs) have gained increasing popularity in various computer vision applications, and have recently started to be deployed to resource-constrained mobile devices. Like other deep models, state-of-the-art GANs suffer from high parameter complexity. That has recently motivated the exploration of compressing GANs (usually generators). Compared to the vast literature and prevailing success in compressing deep classifiers, the study of GAN compression remains in its infancy, so far leveraging individual compression techniques rather than more sophisticated combinations. We observe that, due to the notorious instability of training GANs, heuristically stacking different compression techniques yields unsatisfactory results. To this end, we propose the first unified optimization framework combining multiple compression means for GAN compression, dubbed GAN Slimming (GS). GS seamlessly integrates three mainstream compression techniques: model distillation, channel pruning and quantization, together with the GAN minimax objective, into one unified optimization form that can be efficiently optimized end to end. Without bells and whistles, GS largely outperforms existing options in compressing image-to-image translation GANs. Specifically, we apply GS to compress CartoonGAN, a state-of-the-art style transfer network, by up to 47 times, with minimal visual quality degradation. Codes and pre-trained models can be found at https://github.com/TAMU-VITA/GAN-Slimming.
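The "one unified optimization form" described in the abstract can be illustrated with a toy single-layer sketch: a distillation term plus an L1 sparsity penalty on channel scaling factors (channel pruning), evaluated with quantized weights. This is a minimal illustration under assumed names and hyperparameters (`rho`, `n_bits`, `gamma`), not the paper's actual objective or implementation:

```python
import numpy as np

def quantize(w, n_bits=4):
    """Uniform weight quantization to 2**n_bits levels (toy version)."""
    scale = np.abs(w).max() + 1e-12
    levels = 2 ** (n_bits - 1)
    return np.round(w / scale * levels) / levels * scale

def gs_objective(w, gamma, x, teacher_out, rho=0.01, n_bits=4):
    """One combined loss: distillation MSE + L1-sparse channel scales,
    with quantized weights in the forward pass. All three compression
    means appear in a single differentiable-in-spirit objective."""
    wq = quantize(w, n_bits)
    student_out = (x @ wq) * gamma        # channel-wise scaling factors
    distill = np.mean((student_out - teacher_out) ** 2)
    sparsity = rho * np.abs(gamma).sum()  # drives whole channels toward 0
    return distill + sparsity

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4))
gamma = np.ones(4)                         # one scale per output channel
x = rng.normal(size=(16, 8))
teacher_out = x @ w                        # stand-in for a teacher's output
loss = gs_objective(w, gamma, x, teacher_out)
```

In an actual GAN setting the distillation term would compare generator outputs and sit inside the minimax objective; the point of the sketch is only that pruning, quantization, and distillation can share one loss rather than being applied as separate post-hoc stages.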
Model Compression with Adversarial Robustness: A Unified Optimization Framework
Gui, Shupeng, Wang, Haotao N., Yang, Haichuan, Yu, Chen, Wang, Zhangyang, Liu, Ji
Deep model compression has been extensively studied, and state-of-the-art methods can now achieve high compression ratios with minimal accuracy loss. Previous literature suggested that the goals of robustness and compactness might sometimes conflict. We propose a novel Adversarially Trained Model Compression (ATMC) framework. ATMC constructs a unified constrained optimization formulation, where existing compression means (pruning, factorization, quantization) are all integrated into the constraints. An efficient algorithm is then developed.
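The constrained formulation above — compression means expressed as constraints rather than penalties — can be sketched as projected gradient descent: take a loss step, then project the weights back onto the constraint set (here, top-k sparsity followed by snapping to a fixed set of quantization levels). This is a hedged toy illustration; the function names, the choice of projection, and the level grid are assumptions, not ATMC's actual algorithm:

```python
import numpy as np

def project_sparse(w, k):
    """Keep only the k largest-magnitude entries (pruning as a constraint)."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w).ravel())[-k:]
    out.ravel()[idx] = w.ravel()[idx]
    return out

def project_quantized(w, levels):
    """Snap each weight to its nearest allowed level (quantization)."""
    return levels[np.argmin(np.abs(w[..., None] - levels), axis=-1)]

def projected_step(w, grad, lr, k, levels):
    """One projected-gradient update: descend on the (adversarial) loss,
    then re-enter the compression constraint set."""
    w = w - lr * grad
    return project_quantized(project_sparse(w, k), levels)

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
levels = np.linspace(-1.0, 1.0, 9)   # 9 quantization levels, including 0
# Using grad = w as a stand-in gradient; in ATMC this would come from an
# adversarial training loss.
w_new = projected_step(w, grad=w, lr=0.1, k=6, levels=levels)
```

After the step, `w_new` has at most 6 nonzero entries and every entry lies on the level grid — the constraints hold exactly at every iterate, which is what distinguishes this view from adding pruning/quantization penalties to the objective.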