Model Connectomes: A Generational Approach to Data-Efficient Language Models
arXiv.org Artificial Intelligence
Biological neural networks are shaped both by evolution across generations and by individual learning within an organism's lifetime, whereas standard artificial neural networks undergo a single, large training procedure without inherited constraints. In this preliminary work, we propose a framework that incorporates this crucial generational dimension, an "outer loop" of evolution that shapes the "inner loop" of learning, so that artificial networks better mirror the effects of evolution and individual learning in biological organisms. Focusing on language, we train a model that inherits a "model connectome" from the outer evolution loop before exposing it to a developmental-scale corpus of 100M tokens. Compared with two closely matched control models, the connectome model performs on par with or better than them on natural language processing tasks as well as on alignment to human behavior and brain data. These findings suggest that a model connectome serves as an efficient prior for learning in low-data regimes, narrowing the gap between single-generation artificial models and biologically evolved neural networks.

How does the brain quickly and robustly learn to perform a wide array of tasks?
May 1, 2025
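The two-loop framework the abstract describes can be illustrated with a toy sketch: an outer "evolutionary" loop searches over binary connectivity masks (a stand-in for the model connectome), scoring each mask by how well a fresh individual learns under it in an inner "developmental" loop. This is a hypothetical illustration on a linear-regression task, not the paper's actual method; the mutation scheme, task, and all hyperparameters here are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for lifetime learning: linear regression whose true
# solution uses only 5 of 20 input features (assumed for illustration).
d = 20
X = rng.normal(size=(200, d))
w_true = np.zeros(d)
w_true[:5] = 1.0
y = X @ w_true

def inner_loop(mask, steps=100, lr=0.2):
    """Inner (developmental) loop: gradient descent restricted to the
    inherited connectome mask; only masked-in weights are plastic."""
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad * mask
    return float(np.mean((X @ w - y) ** 2))

def outer_loop(generations=30, pop=16, density=0.3):
    """Outer (evolutionary) loop: hill-climb over binary connectivity
    masks, scoring each mask by the loss an individual reaches under it."""
    best = (rng.random(d) < density).astype(float)
    init_fit = best_fit = inner_loop(best)
    for _ in range(generations):
        for _ in range(pop):
            child = best.copy()
            i = rng.integers(d)
            child[i] = 1.0 - child[i]   # single-connection mutation
            fit = inner_loop(child)
            if fit < best_fit:          # keep strictly better masks only
                best, best_fit = child, fit
    return best, best_fit, init_fit

mask, evolved_loss, initial_loss = outer_loop()
print(f"initial-mask loss {initial_loss:.3f} -> evolved-mask loss {evolved_loss:.3f}")
```

Because the outer loop only accepts strict improvements, the evolved mask is guaranteed to support learning at least as well as a random one; in the paper's setting, the inherited connectome plays the analogous role of a prior that makes inner-loop learning data-efficient.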