Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision
Zhen Wan, Fei Cheng, Qianying Liu, Zhuoyuan Mao, Haiyue Song, Sadao Kurohashi
Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method that leverages the supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach over two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL
arXiv.org Artificial Intelligence
Feb-10-2023
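To make the idea in the abstract concrete, the sketch below shows one way per-instance reliability weights could enter an InfoNCE-style contrastive loss: each distantly supervised positive pair is down-weighted by an estimated reliability score. This is a minimal illustration, not the paper's implementation; the function name, the exact loss form, and the way the weights are produced (e.g., by a scorer trained on the supervised data) are assumptions made for clarity.

import torch
import torch.nn.functional as F

def weighted_contrastive_loss(anchor, positives, negatives, weights, temperature=0.07):
    """Illustrative InfoNCE-style loss with per-positive reliability weights.

    anchor:    (d,) embedding of the anchor instance
    positives: (P, d) embeddings sharing the anchor's distantly supervised relation
    negatives: (N, d) embeddings labeled with other relations
    weights:   (P,) reliability scores in [0, 1] for each positive instance
    """
    anchor = F.normalize(anchor, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = positives @ anchor / temperature          # (P,)
    neg_sim = negatives @ anchor / temperature          # (N,)
    denom = torch.logsumexp(torch.cat([pos_sim, neg_sim]), dim=0)

    # Each positive contributes a standard InfoNCE term, scaled by its
    # reliability so that noisy distant-supervision pairs have less influence.
    per_pos_loss = -(pos_sim - denom)                   # (P,)
    return (weights * per_pos_loss).sum() / weights.sum().clamp(min=1e-8)

# Toy usage with random embeddings and hypothetical reliability scores.
d = 768
anchor = torch.randn(d)
positives, negatives = torch.randn(4, d), torch.randn(32, d)
weights = torch.tensor([0.9, 0.7, 0.2, 0.5])  # e.g., estimated from the supervised data
loss = weighted_contrastive_loss(anchor, positives, negatives, weights)

With uniform weights this reduces to an ordinary (non-weighted) contrastive objective, which is the behavior of the baselines the abstract compares against.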