Supplementary Material: Progressive Kernel Based Knowledge Distillation for Adder Neural Networks

Neural Information Processing Systems 

In this section, we present additional experimental results for PKKD. We then demonstrate the superiority of the proposed method over traditional CNN distillation; the results with ResNet-18 on ImageNet are listed below.

Model           Top-1 acc   Top-5 acc
ResNet-18       69.8%       89.1%
PKKD            73.1%       91.3%
Vanilla KD [1]  72.5%       90.9%

Finally, we show the experimental results of using different settings of PKKD on ImageNet with ResNet-50 in Tab. 3.
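For reference, the vanilla KD [1] baseline in the table is the standard Hinton-style distillation loss: a temperature-softened KL term between teacher and student predictions plus a cross-entropy term on the ground-truth label. The sketch below is a minimal, dependency-free illustration of that loss for a single example; the function name, the temperature `T=4.0`, and the weight `alpha=0.9` are illustrative assumptions, not values taken from the paper.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def vanilla_kd_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.9):
    """Vanilla KD loss (Hinton et al. [1]) for one example:
    alpha * T^2 * KL(teacher_T || student_T) + (1 - alpha) * CE(student, label).
    T and alpha are illustrative hyperparameter choices."""
    p = softmax(teacher_logits, T)  # softened teacher distribution
    q = softmax(student_logits, T)  # softened student distribution
    # KL divergence between the softened distributions; T^2 rescales
    # the gradient magnitude back to that of the hard-label loss.
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    # Ordinary cross-entropy against the ground-truth class index.
    ce = -math.log(softmax(student_logits)[label])
    return alpha * (T * T) * kl + (1 - alpha) * ce
```

When the student matches the teacher exactly, the KL term vanishes and only the cross-entropy term remains, which is why distillation can still improve a student whose teacher is imperfect.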
