Supplementary Material: Attribution Preservation in Network Compression for Reliable Network Interpretation

Neural Information Processing Systems 

ImageNet class labels - the class labels are unusable. In the fine-tuning phase, the pruned network is fine-tuned for 10 epochs with a batch size of 180. We conduct experiments with structured pruning methods on ImageNet and observe the same tendencies in the results (Table 4): our method outperforms naive compression in terms of maintaining the attribution maps.
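The claim that attribution maps are maintained can be quantified by comparing the maps of the original and compressed networks. A minimal sketch of one such comparison, using cosine similarity between flattened maps (the function name `attribution_similarity` and the toy inputs are illustrative, not from the paper):

```python
import numpy as np

def attribution_similarity(attr_original, attr_compressed):
    """Cosine similarity between two flattened attribution maps.

    Returns a value in [-1, 1]; values near 1 indicate the compressed
    network's attributions closely match the original's.
    """
    a = attr_original.ravel().astype(float)
    b = attr_compressed.ravel().astype(float)
    # Small epsilon guards against division by zero for all-zero maps.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Toy example: an unchanged map yields similarity close to 1.
m = np.array([[0.1, 0.5], [0.3, 0.9]])
print(round(attribution_similarity(m, m), 4))
```

In practice the paper's evaluation would apply such a metric over attribution maps (e.g., saliency-style heatmaps) computed on a held-out image set for both the dense and pruned networks.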