Slimmed Asymmetrical Contrastive Learning and Cross Distillation for Lightweight Model Training Jian Meng, Li Yang

Neural Information Processing Systems 

Contrastive learning (CL) has been widely investigated with various learning mechanisms and achieves strong capability in learning representations of data in a self-supervised manner using unlabeled data. A common practice in this line of work is to employ large-sized encoders to achieve performance comparable to that of the supervised learning counterpart. Despite the success of label-free training, current contrastive learning algorithms fail to achieve good performance with lightweight (compact) models, e.g., MobileNet, while the requirement of heavy encoders impedes energy-efficient computation, especially for resource-constrained AI applications. Motivated by this, we propose a new self-supervised CL scheme, named SACL-XD, consisting of two technical components, Slimmed Asymmetrical Contrastive Learning (SACL) and Cross-Distillation (XD), which collectively enable efficient CL with compact models.
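
To make the setting concrete, below is a minimal, hypothetical PyTorch sketch of an asymmetrical contrastive objective (a compact encoder trained against a larger one) combined with a feature-distillation term. The abstract does not specify SACL-XD's actual losses or architectures, so every name, dimension, and loss weighting here is an illustrative assumption, not the authors' method.

    # Hypothetical sketch of asymmetrical contrastive learning with a
    # distillation term. NOT the SACL-XD algorithm itself; encoder sizes,
    # the InfoNCE temperature, and the 0.5 weighting are all assumed.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def mlp_encoder(in_dim, hidden, out_dim):
        # Stand-in encoder; a real setup would use e.g. MobileNet / ResNet.
        return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                             nn.Linear(hidden, out_dim))

    # Asymmetry: a compact "student" encoder learns alongside a larger
    # "teacher" encoder, instead of two identical heavy branches.
    student = mlp_encoder(in_dim=512, hidden=128, out_dim=64)   # lightweight
    teacher = mlp_encoder(in_dim=512, hidden=1024, out_dim=64)  # heavy

    def info_nce(z1, z2, temperature=0.2):
        # Standard InfoNCE between two batches of matched views.
        z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
        logits = z1 @ z2.t() / temperature      # (B, B) similarity matrix
        targets = torch.arange(z1.size(0))      # positives on the diagonal
        return F.cross_entropy(logits, targets)

    # Two augmented views of the same unlabeled batch (random stand-ins).
    view_a, view_b = torch.randn(32, 512), torch.randn(32, 512)

    z_student = student(view_a)
    with torch.no_grad():                       # teacher is frozen here
        z_teacher = teacher(view_b)

    # The contrastive term pulls matched views together across the two
    # encoders; the distillation term additionally aligns the student's
    # features with the teacher's.
    loss = info_nce(z_student, z_teacher) \
           + 0.5 * F.mse_loss(F.normalize(z_student, dim=1),
                              F.normalize(z_teacher, dim=1))
    loss.backward()                             # updates the student only

Gradients flow only through the compact student, which is the deployable model; the heavy teacher serves purely as a training-time target, matching the abstract's goal of efficient CL with compact models.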
