Less is More: Pre-Training Cross-Lingual Small-Scale Language Models with Cognitively-Plausible Curriculum Learning Strategies
Suchir Salhan, Richard Diehl Martinez, Zébulon Goriely, Paula Buttery
–arXiv.org Artificial Intelligence
Curriculum Learning has been a popular strategy for improving the cognitive plausibility of Small-Scale Language Models (SSLMs) in the BabyLM Challenge. However, it has not led to considerable improvements over non-curriculum models. We assess whether theories of language acquisition can be used to specify more fine-grained curriculum learning strategies, creating age-ordered corpora of Child-Directed Speech for four typologically distant language families to implement SSLMs and acquisition-inspired curricula cross-lingually. Comparing the success of three objective curricula (Growing, Inwards and MMM) that precisely replicate the predictions of acquisition theories on a standard SSLM architecture, we find that fine-grained acquisition-inspired curricula can outperform non-curriculum baselines, and that the performance benefits of curriculum strategies in SSLMs can be obtained by specifying fine-grained, language-specific curricula that precisely replicate language acquisition theories.
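The age-ordered corpora described above can be sketched as a simple data-ordering curriculum: utterances of Child-Directed Speech tagged with the target child's age are sorted so the model trains on earlier-acquired speech first. This is a minimal illustration, not the paper's implementation; the names `Utterance` and `age_ordered_batches` are hypothetical.

```python
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class Utterance:
    text: str
    age_months: int  # age of the target child when the utterance was recorded

def age_ordered_batches(corpus: List[Utterance], batch_size: int) -> Iterator[List[Utterance]]:
    """Yield training batches in ascending order of child age,
    so the model sees developmentally earlier input first."""
    ordered = sorted(corpus, key=lambda u: u.age_months)
    for i in range(0, len(ordered), batch_size):
        yield ordered[i:i + batch_size]

# Toy Child-Directed Speech corpus (illustrative examples only)
corpus = [
    Utterance("where did the ball go?", 30),
    Utterance("look, a doggy!", 12),
    Utterance("do you want more juice?", 20),
]
batches = list(age_ordered_batches(corpus, batch_size=2))
```

The fine-grained curricula in the paper (Growing, Inwards, MMM) additionally vary the training objective per stage; this sketch covers only the age-based data ordering they share.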
Oct-30-2024