PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient
Neural Information Processing Systems
Knowledge distillation (KD) is a widely used technique for training compact models in object detection. However, how to distill between heterogeneous detectors remains under-studied. In this paper, we empirically find that better FPN features from a heterogeneous teacher detector can help the student even though their detection heads and label assignments differ. However, directly aligning the feature maps to distill detectors suffers from two problems. First, the difference in feature magnitude between the teacher and the student could impose overly strict constraints on the student.
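To sidestep the magnitude mismatch described above, a distillation loss can be made invariant to per-channel scale and shift by standardizing each feature channel before comparison; half the MSE of standardized features then equals one minus the Pearson correlation coefficient. The sketch below is a minimal NumPy illustration of this idea, not the paper's implementation; the function name and shapes are assumptions.

```python
import numpy as np

def pcc_distill_loss(f_s, f_t, eps=1e-8):
    """Hypothetical PCC-based feature distillation loss.

    f_s, f_t: student/teacher FPN feature maps of shape (C, H, W).
    Each channel is standardized to zero mean and unit variance, so the
    loss is insensitive to per-channel magnitude differences; half the
    MSE of standardized channels equals 1 - Pearson correlation.
    """
    C = f_s.shape[0]
    s = f_s.reshape(C, -1)
    t = f_t.reshape(C, -1)
    # Standardize each channel (eps guards against zero variance).
    s = (s - s.mean(axis=1, keepdims=True)) / (s.std(axis=1, keepdims=True) + eps)
    t = (t - t.mean(axis=1, keepdims=True)) / (t.std(axis=1, keepdims=True) + eps)
    return 0.5 * np.mean((s - t) ** 2)
```

Because the loss depends only on the correlation between channels, scaling or shifting the teacher's features leaves it unchanged, which is exactly the relaxation the magnitude-mismatch problem calls for.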