Non-Parametric Unsupervised Domain Adaptation for Neural Machine Translation
Xin Zheng, Zhirui Zhang, Shujian Huang, Boxing Chen, Jun Xie, Weihua Luo, Jiajun Chen
arXiv.org Artificial Intelligence
Recently, $k$NN-MT has shown a promising capability for domain adaptation without retraining: it augments a pre-trained neural machine translation (NMT) model with domain-specific token-level $k$-nearest-neighbor ($k$NN) retrieval. Despite being conceptually attractive, the approach relies heavily on high-quality in-domain parallel corpora, which limits its applicability to unsupervised domain adaptation, where in-domain parallel corpora are scarce or nonexistent. In this paper, we propose a novel framework that directly uses in-domain monolingual sentences in the target language to construct an effective datastore for $k$-nearest-neighbor retrieval. To this end, we first introduce an autoencoder task based on the target language, and then insert lightweight adapters into the original NMT model to map the token-level representations of this task to the ideal representations of the translation task. Experiments on multi-domain datasets demonstrate that our proposed approach significantly improves translation accuracy with target-side monolingual data, while achieving performance comparable to back-translation.
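To make the retrieval step concrete, below is a minimal sketch of the $k$NN-MT-style decoding step that the framework builds on. It assumes a datastore of (hidden state, next token) pairs has already been constructed; in the paper's setting those keys would come from target-side monolingual sentences passed through the autoencoder task and mapped by the adapters, which this sketch does not implement. The function name `knn_mt_step` and the hyperparameters `k`, `temperature`, and `lam` are illustrative choices, not the paper's API.

```python
# A minimal sketch of kNN-MT-style interpolation at one decoding step.
# Assumption: keys/values stand in for adapter-mapped decoder states cached
# from target-language monolingual data; all names here are illustrative.
import torch

def knn_mt_step(h, keys, values, p_nmt, k=8, temperature=10.0, lam=0.5):
    """Blend the NMT next-token distribution with a kNN retrieval distribution.

    h:      (d,)   decoder hidden state at the current target position
    keys:   (N, d) datastore keys (cached decoder states)
    values: (N,)   datastore values (the token id that followed each key)
    p_nmt:  (V,)   the NMT model's softmax distribution for this step
    """
    # L2 distance from the query state to every key, then keep the k nearest.
    dists = torch.cdist(h.unsqueeze(0), keys).squeeze(0)   # (N,)
    knn_d, idx = dists.topk(k, largest=False)
    # Convert negative distances into neighbor weights that sum to 1.
    w = torch.softmax(-knn_d / temperature, dim=0)         # (k,)
    # Aggregate neighbor weights onto the tokens they point to.
    p_knn = torch.zeros_like(p_nmt)
    p_knn.scatter_add_(0, values[idx], w)
    # Interpolate the retrieval and model distributions.
    return lam * p_knn + (1 - lam) * p_nmt

# Toy usage with a random datastore standing in for real cached states.
d, N, V = 16, 100, 32
keys, values = torch.randn(N, d), torch.randint(V, (N,))
p = knn_mt_step(torch.randn(d), keys, values,
                torch.softmax(torch.randn(V), dim=0))
assert torch.isclose(p.sum(), torch.tensor(1.0))
```

The interpolation weight `lam` controls how far decoding leans on in-domain retrieval versus the general-domain model, which is why the quality of the datastore representations (the target of the adapter mapping) matters so much.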
Sep-14-2021