Non-Linguistic Supervision for Contrastive Learning of Sentence Embeddings
Neural Information Processing Systems
Semantic representation learning for sentences is an important and well-studied problem in NLP. The current trend for this task involves training a Transformer-based sentence encoder through a contrastive objective with text, i.e., clustering semantically similar sentences together and scattering dissimilar ones apart. In this work, we find that the performance of Transformer models as sentence encoders can be improved by training with multi-modal multi-task losses, using unpaired examples from another modality (e.g., sentences and unrelated image/audio data).
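As a rough illustration of the kind of objective described above (a minimal sketch, not the authors' actual training code), the snippet below combines a SimCSE-style InfoNCE loss over two dropout views of each sentence with an auxiliary InfoNCE loss over features of unpaired images (or audio clips), trained jointly in a multi-task fashion. The function names, the `lam` weight, and the assumption that both modalities are already encoded into fixed-size embeddings are hypothetical choices for this sketch.

```python
# Hypothetical multi-task contrastive sketch; assumes sentence embeddings come
# from two dropout-noised forward passes (SimCSE-style) and that the unpaired
# non-linguistic examples have been encoded into embeddings of the same width.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE loss: matching rows of z1 and z2 are positive pairs;
    every other row in the batch serves as an in-batch negative."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                    # (B, B) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)   # diagonal = positives
    return F.cross_entropy(logits, labels)

def multitask_loss(sent_a, sent_b, nonling_a, nonling_b, lam: float = 0.1) -> torch.Tensor:
    """Sum the text contrastive loss with an auxiliary contrastive loss on
    unpaired non-linguistic features; lam is a hypothetical task weight."""
    l_text = info_nce(sent_a, sent_b)          # two dropout views of the same sentences
    l_aux = info_nce(nonling_a, nonling_b)     # two views of unrelated images/audio
    return l_text + lam * l_aux
```

Note that because the non-linguistic examples are unpaired with the sentences, the auxiliary loss never aligns the two modalities directly; in this sketch it acts only as an additional contrastive task sharing the encoder's parameters.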