Domain-Adaptive Pretraining Methods for Dialogue Understanding
Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song
arXiv.org Artificial Intelligence
Language models such as BERT and SpanBERT, pretrained on open-domain data, have obtained impressive gains on various NLP tasks. In this paper, we probe the effectiveness of domain-adaptive pretraining objectives on downstream tasks. In particular, three objectives, including a novel one for modeling predicate-argument relations, are evaluated on two challenging dialogue understanding tasks. Experimental results demonstrate that domain-adaptive pretraining with suitable objectives can significantly improve a strong baseline on these tasks, achieving new state-of-the-art performance.
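The listing does not detail the training setup; as a rough illustration of domain-adaptive pretraining in general, the sketch below continues masked language modeling on an in-domain dialogue corpus before task fine-tuning, using the Hugging Face transformers and datasets libraries. The corpus file name and hyperparameters are hypothetical, and the paper's novel predicate-argument objective is not reproduced here.

# Minimal sketch of domain-adaptive pretraining: continue BERT's masked
# language modeling (MLM) objective on raw dialogue text. Illustrative
# only; not the paper's exact objectives or hyperparameters.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical in-domain corpus: one dialogue utterance per line.
corpus = load_dataset("text", data_files={"train": "dialogue_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_set = corpus["train"].map(tokenize, batched=True, remove_columns=["text"])

# Dynamic masking of 15% of tokens, as in standard BERT pretraining.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dapt-checkpoint",
                           per_device_train_batch_size=16,
                           num_train_epochs=1),
    train_dataset=train_set,
    data_collator=collator,
)
trainer.train()  # the adapted encoder is then fine-tuned on the dialogue task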
May 28, 2021