Contrastive Distillation Is a Sample-Efficient Self-Supervised Loss Policy for Transfer Learning