Recent Advances in Natural Language Processing via Large Pre-Trained Language Models: A Survey
Min, Bonan, Ross, Hayley, Sulem, Elior, Veyseh, Amir Pouran Ben, Nguyen, Thien Huu, Sainz, Oscar, Agirre, Eneko, Heintz, Ilana, Roth, Dan
Large, pre-trained transformer-based language models such as BERT have drastically changed the Natural Language Processing (NLP) field. We present a survey of recent work that uses these large language models to solve NLP tasks via pre-training then fine-tuning, prompting, or text generation approaches. We also present approaches that use pre-trained language models to generate data for training augmentation or other purposes. We conclude with a discussion of limitations and suggested directions for future research.