Aya Model: An Instruction Finetuned Open-Access Multilingual Language Model
Üstün, Ahmet, Aryabumi, Viraat, Yong, Zheng-Xin, Ko, Wei-Yin, D'souza, Daniel, Onilude, Gbemileke, Bhandari, Neel, Singh, Shivalika, Ooi, Hui-Lee, Kayid, Amr, Vargus, Freddie, Blunsom, Phil, Longpre, Shayne, Muennighoff, Niklas, Fadaee, Marzieh, Kreutzer, Julia, Hooker, Sara
arXiv.org Artificial Intelligence
Recent breakthroughs in large language models (LLMs) have centered around a handful of data-rich languages. What does it take to broaden access to breakthroughs beyond first-class-citizen languages? Our work introduces Aya, a massively multilingual generative language model that follows instructions in 101 languages, of which over 50% are considered lower-resourced. Aya outperforms mT0 and BLOOMZ on the majority of tasks while covering double the number of languages. We introduce extensive new evaluation suites that broaden the state of the art for multilingual evaluation across 99 languages -- including discriminative and generative tasks, human evaluation, and simulated win rates that cover both held-out tasks and in-distribution performance. Furthermore, we conduct detailed investigations on the optimal finetuning mixture composition, data pruning, as well as the toxicity, bias, and safety of our models. We open-source our instruction datasets and our model at https://hf.co/CohereForAI/aya-101
Feb-12-2024