Multi-CLS BERT: An Efficient Alternative to Traditional Ensembling
Chang, Haw-Shiuan, Sun, Ruei-Yao, Ricci, Kathryn, McCallum, Andrew
arXiv.org Artificial Intelligence
Ensembling BERT models often significantly improves accuracy, but at the cost of significantly more computation and a larger memory footprint. In this work, we propose Multi-CLS BERT, a novel ensembling method for CLS-based prediction tasks that is almost as efficient as a single BERT model. Multi-CLS BERT uses multiple CLS tokens with a parameterization and objective that encourage their diversity. Thus, instead of fine-tuning each BERT model in an ensemble (and running them all at test time), we need only fine-tune our single Multi-CLS BERT model (and run that one model at test time, ensembling just the multiple final CLS embeddings). To test its effectiveness, we build Multi-CLS BERT on top of a state-of-the-art pretraining method for BERT (Aroca-Ouellette and Rudzicz, 2020). In experiments on GLUE and SuperGLUE, we show that Multi-CLS BERT reliably improves both overall accuracy and confidence estimation. When only 100 training samples are available in GLUE, the Multi-CLS BERT_Base model can even outperform the corresponding BERT_Large model. We analyze the behavior of Multi-CLS BERT, showing that it has many of the same characteristics and behaviors as a typical 5-way BERT ensemble, but with nearly 4 times less computation and memory.
May-20-2023
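
The mechanism described in the abstract (multiple CLS tokens passed through a single encoder, with the final CLS embeddings ensembled at the classification head) can be illustrated with a minimal PyTorch sketch. The names below (MultiCLSClassifier, num_cls, the stand-in encoder) are illustrative assumptions, not the authors' released code, and the paper's specific parameterization and diversity-encouraging objective are omitted.

```python
import torch
import torch.nn as nn

class MultiCLSClassifier(nn.Module):
    """Minimal sketch of the Multi-CLS idea: prepend K learned CLS
    embeddings to the token sequence, run the encoder once, score each
    of the K final CLS states with a classification head, and average
    the K logit vectors as the "ensemble" prediction. The paper's actual
    parameterization and diversity objective are not reproduced here."""

    def __init__(self, encoder, hidden_size, num_labels, num_cls=5):
        super().__init__()
        self.encoder = encoder                      # any (B, T, H) -> (B, T, H) encoder
        self.num_cls = num_cls
        # K learned CLS embeddings (illustrative initialization).
        self.cls_emb = nn.Parameter(torch.randn(num_cls, hidden_size) * 0.02)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, token_embeddings):            # token_embeddings: (B, T, H)
        batch_size = token_embeddings.size(0)
        cls = self.cls_emb.unsqueeze(0).expand(batch_size, -1, -1)  # (B, K, H)
        x = torch.cat([cls, token_embeddings], dim=1)               # (B, K+T, H)
        hidden = self.encoder(x)                                    # one forward pass
        cls_hidden = hidden[:, : self.num_cls]                      # (B, K, H)
        logits = self.classifier(cls_hidden)                        # (B, K, num_labels)
        return logits.mean(dim=1)                                   # ensemble of K CLS "heads"

if __name__ == "__main__":
    # Stand-in encoder and inputs; a real setup would feed BERT token embeddings.
    layer = nn.TransformerEncoderLayer(d_model=768, nhead=12, batch_first=True)
    encoder = nn.TransformerEncoder(layer, num_layers=2)
    model = MultiCLSClassifier(encoder, hidden_size=768, num_labels=2, num_cls=5)
    dummy = torch.randn(4, 16, 768)                 # (batch, seq_len, hidden)
    print(model(dummy).shape)                       # torch.Size([4, 2])
```

Because all K CLS states share one encoder forward pass and only the final embeddings are combined, the inference cost stays close to that of a single BERT model, which is the efficiency advantage the abstract claims over a conventional 5-way ensemble.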