The Structural Sources of Verb Meaning Revisited: Large Language Models Display Syntactic Bootstrapping
Zhu, Xiaomeng, McCoy, R. Thomas, Frank, Robert
arXiv.org Artificial Intelligence
Syntactic bootstrapping (Gleitman, 1990) is the hypothesis that children use the syntactic environments in which a verb occurs to learn its meaning. In this paper, we examine whether large language models exhibit similar behavior. We do this by training RoBERTa and GPT-2 on perturbed datasets in which syntactic information is ablated. Our results show that models' verb representations degrade more when syntactic cues are removed than when co-occurrence information is removed. Furthermore, the representations of mental verbs, for which syntactic bootstrapping has been shown to be particularly crucial in human verb learning, are more negatively affected in such training regimes than those of physical verbs. In contrast, models' noun representations are affected more when co-occurrences are distorted than when syntax is distorted. Beyond reinforcing the important role of syntactic bootstrapping in verb learning, our results demonstrate the viability of testing developmental hypotheses at a larger scale by manipulating the learning environments of large language models.
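The abstract contrasts two kinds of perturbation: removing syntactic cues while keeping lexical co-occurrence, and removing co-occurrence while keeping the syntactic frame. The paper's exact ablation procedures are not given here, so the following is only an illustrative sketch under assumed implementations: `ablate_syntax` scrambles word order (destroying syntax but preserving the bag of co-occurring words), and `ablate_cooccurrence` swaps context words for random vocabulary items (destroying co-occurrence while keeping the target verb's positional frame). Both function names and the vocabulary list are hypothetical.

```python
import random

def ablate_syntax(sentence, seed=0):
    # Assumed ablation: shuffle word order so syntactic cues are lost
    # but the set of co-occurring words is unchanged.
    tokens = sentence.split()
    rng = random.Random(seed)
    rng.shuffle(tokens)
    return " ".join(tokens)

def ablate_cooccurrence(sentence, vocab, keep, seed=0):
    # Assumed ablation: replace every word except the target verb with a
    # random vocabulary item, preserving sentence length and the verb's
    # position (a crude proxy for its syntactic frame).
    rng = random.Random(seed)
    return " ".join(
        tok if tok == keep else rng.choice(vocab)
        for tok in sentence.split()
    )

sent = "she thinks that the ball rolled"
print(ablate_syntax(sent))
print(ablate_cooccurrence(sent, vocab=["cat", "ran", "blue"], keep="thinks"))
```

Under this sketch, a mental verb such as "thinks" keeps its sentential-complement frame in the co-occurrence ablation but loses it in the syntax ablation, which is the contrast the experiments exploit.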
Aug-19-2025