What Can String Probability Tell Us About Grammaticality?
Jennifer Hu, Ethan Gotlieb Wilcox, Siyuan Song, Kyle Mahowald, Roger P. Levy
arXiv.org Artificial Intelligence
What have language models (LMs) learned about grammar? This question remains hotly debated, with major ramifications for linguistic theory. However, since probability and grammaticality are distinct notions in linguistics, it is not obvious what string probabilities can reveal about an LM's underlying grammatical knowledge. We present a theoretical analysis of the relationship between grammar, meaning, and string probability, based on simple assumptions about the generative process of corpus data. Our framework makes three predictions, which we validate empirically using 280K sentence pairs in English and Chinese: (1) correlation between the probability of strings within minimal pairs, i.e., string pairs with minimal semantic differences; (2) correlation between models' and humans' deltas within minimal pairs; and (3) poor separation in probability space between unpaired grammatical and ungrammatical strings. Our analyses give theoretical grounding for using probability to learn about LMs' structural knowledge, and suggest directions for future work in LM grammatical evaluation.
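The core quantity behind predictions (1) and (2) is the within-pair probability delta: the difference in string log-probability between the two members of a minimal pair. A minimal sketch of that computation, using hypothetical per-token probabilities in place of a real LM's conditional distributions (the sentences and numbers below are illustrative, not from the paper's data):

```python
import math

# Hypothetical per-token probabilities for one minimal pair, standing in
# for an actual LM's conditional probabilities P(token | prefix).
#   Grammatical:   "the keys are on the table"
#   Ungrammatical: "the keys is on the table"
good_token_probs = [0.2, 0.05, 0.6, 0.3, 0.4, 0.25]
bad_token_probs = [0.2, 0.05, 0.1, 0.3, 0.4, 0.25]

def sentence_logprob(token_probs):
    """String log-probability = sum of per-token conditional log-probabilities."""
    return sum(math.log(p) for p in token_probs)

# The within-pair delta: positive when the model assigns higher probability
# to the grammatical variant. Only the token that differs ("are" vs "is")
# contributes to the delta; shared tokens cancel.
delta = sentence_logprob(good_token_probs) - sentence_logprob(bad_token_probs)
print(delta > 0)
```

In practice one would obtain the per-token probabilities from a causal LM and aggregate such deltas over many pairs; the sign of the delta within each pair is what minimal-pair benchmarks score, while prediction (3) concerns comparing raw probabilities *across* unpaired strings, where length and content differences dominate.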
Nov-10-2025