CompRess: Self-Supervised Learning by Compressing Representations
Neural Information Processing Systems
Self-supervised learning aims to learn good representations from unlabeled data. Recent works have shown that larger models benefit more from self-supervised learning than smaller models. As a result, the gap between supervised and self-supervised learning has been greatly reduced for larger models. In this work, instead of designing a new pseudo task for self-supervised learning, we develop a model compression method to compress an already learned, deep self-supervised model (teacher) into a smaller one (student). We train the student model so that it mimics the relative similarity between data points in the teacher's embedding space. For AlexNet, our method outperforms all previous methods, including the fully supervised model, on ImageNet linear evaluation (59.0%).
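The similarity-mimicking objective described above lends itself to a short sketch. The snippet below assumes the student is trained to match the teacher's softmax-normalized cosine similarities to a bank of anchor embeddings via a KL divergence; the anchor bank, temperature `tau`, and the exact loss form are illustrative assumptions, not details stated in the abstract.

```python
# A minimal sketch of similarity-based distillation, assuming the "relative
# similarity" objective is realized as a KL divergence between the teacher's
# and student's softmax-normalized cosine similarities to a set of anchor
# embeddings. The anchor bank, temperature, and KL formulation are assumptions
# made for illustration, not the paper's exact recipe.
import torch
import torch.nn.functional as F


def similarity_distillation_loss(student_emb, teacher_emb,
                                 student_anchors, teacher_anchors, tau=0.04):
    """KL divergence between teacher and student similarity distributions.

    student_emb:     (B, Ds) student embeddings of the current batch
    teacher_emb:     (B, Dt) teacher embeddings of the same images
    student_anchors: (N, Ds) anchor embeddings in the student space
    teacher_anchors: (N, Dt) anchor embeddings in the teacher space
    """
    # Cosine similarity = dot product of L2-normalized vectors.
    s = F.normalize(student_emb, dim=1) @ F.normalize(student_anchors, dim=1).t()
    t = F.normalize(teacher_emb, dim=1) @ F.normalize(teacher_anchors, dim=1).t()

    # Turn similarities into probability distributions over the anchors.
    p_teacher = F.softmax(t / tau, dim=1)
    log_p_student = F.log_softmax(s / tau, dim=1)

    # The student matches the teacher's distribution; the teacher is frozen.
    return F.kl_div(log_p_student, p_teacher.detach(), reduction="batchmean")
```

In this sketch the student only has to reproduce how the teacher ranks each image against the anchors, so its embedding dimension and architecture can differ freely from the teacher's.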