Unified Language Model Pre-training for Natural Language Understanding and Generation
Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, Ming Zhou, Hsiao-Wuen Hon
Neural Information Processing Systems
This paper presents a new unified pre-trained language model (UniLM) that can be fine-tuned for both natural language understanding and generation tasks. The model is pre-trained using three types of language modeling tasks: unidirectional, bidirectional, and sequence-to-sequence prediction. The unified modeling is achieved by employing a shared Transformer network and utilizing specific self-attention masks to control what context the prediction conditions on.
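The abstract describes the masking idea only in prose. As a minimal sketch of how such masks could be constructed (not the authors' implementation; the function name, the numpy encoding, and the src/tgt split are illustrative assumptions), the snippet below builds the three self-attention masks that a shared Transformer would apply, where entry (i, j) = 1 means token i may attend to token j:

```python
import numpy as np

def attention_mask(src_len: int, tgt_len: int, mode: str) -> np.ndarray:
    """Build an (L, L) self-attention mask with L = src_len + tgt_len.

    - "bidirectional": every token attends to every token (BERT-style).
    - "unidirectional": token i attends only to tokens j <= i (GPT-style).
    - "seq2seq": source tokens attend to the whole source; target tokens
      attend to the whole source plus earlier target tokens.
    """
    L = src_len + tgt_len
    if mode == "bidirectional":
        return np.ones((L, L), dtype=np.int8)
    if mode == "unidirectional":
        # Lower-triangular matrix: left-to-right (causal) attention.
        return np.tril(np.ones((L, L), dtype=np.int8))
    if mode == "seq2seq":
        mask = np.zeros((L, L), dtype=np.int8)
        # All tokens (source and target) may see every source token.
        mask[:, :src_len] = 1
        # Target tokens additionally see earlier target tokens (causal).
        mask[src_len:, src_len:] = np.tril(
            np.ones((tgt_len, tgt_len), dtype=np.int8)
        )
        # Source rows never see target columns; those entries stay 0.
        return mask
    raise ValueError(f"unknown mode: {mode}")

# Example: a 3-token source segment followed by a 2-token target segment.
print(attention_mask(src_len=3, tgt_len=2, mode="seq2seq"))
```

In practice such a 0/1 mask is converted into an additive term on the pre-softmax attention scores (0 where attention is allowed, a large negative value where it is blocked), which is how a single set of Transformer parameters can serve all three pre-training objectives.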