The Rise and Down of Babel Tower: Investigating the Evolution Process of Multilingual Code Large Language Model
Jiawei Chen, Wentao Chen, Jing Su, Jingjing Xu, Hongyu Lin, Mengjie Ren, Yaojie Lu, Xianpei Han, Le Sun
arXiv.org Artificial Intelligence
Large language models (LLMs) have shown significant multilingual capabilities. However, the mechanisms underlying the development of these capabilities during pre-training are not well understood. In this paper, we use code LLMs as an experimental platform to explore the evolution of multilingual capabilities in LLMs during the pre-training process. Based on our observations, we propose the Babel Tower Hypothesis, which describes the entire process by which LLMs acquire new language capabilities. During the learning process, multiple languages initially share a single knowledge system dominated by the primary language and gradually develop language-specific knowledge systems. Experimental results show that the internal state changes of the LLM are consistent with our Babel Tower Hypothesis. Building on these insights, we propose a novel method to construct an optimized pre-training corpus for multilingual code LLMs, which significantly outperforms LLMs trained on the original corpus. The proposed Babel Tower Hypothesis provides new insights into designing pre-training data distributions to achieve optimal multilingual capabilities in LLMs.

"A united human race speaking a single language migrates to Shinar, where they agree to build a great city with a tower that would reach the sky. Yahweh, observing these efforts and remarking on humanity's power in unity, confounds their speech so that they can no longer understand each other and scatters them around the world, leaving the city unfinished."
Dec-10-2024