Modelling Child Learning and Parsing of Long-range Syntactic Dependencies
Louis Mahon, Mark Johnson, Mark Steedman
arXiv.org Artificial Intelligence
This work develops a probabilistic child language acquisition model to learn a range of linguistic phenomena, most notably long-range syntactic dependencies of the sort found in object wh-questions, among other constructions. The model is trained on a corpus of real child-directed speech, where each utterance is paired with a logical form as a meaning representation. From this input it learns word meanings and language-specific syntax simultaneously. After training, the model can deduce the correct parse tree and word meanings for a given utterance-meaning pair, and can infer the meaning when given only the utterance. The successful modelling of long-range dependencies is theoretically important because it exploits aspects of the model that are, in general, trans-context-free.
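To make the setup described in the abstract concrete, the sketch below illustrates what an utterance-meaning training pair and the two query modes (parsing a pair vs. interpreting a bare utterance) might look like. It is not the authors' code; the class, method names, and the example logical form are hypothetical placeholders assumed for illustration only.

```python
# Illustrative sketch (not the paper's implementation): training data pairs each
# child-directed utterance with a logical form; the learner is queried either with
# both (to recover a parse and word meanings) or with the utterance alone (to
# infer the meaning). All names here are hypothetical.

from dataclasses import dataclass


@dataclass
class Example:
    utterance: list[str]   # tokenised child-directed utterance
    logical_form: str      # meaning representation, e.g. a lambda-calculus term


# A single utterance-meaning pair: an object wh-question whose long-range
# dependency links "what" to the object position of "eat".
example = Example(
    utterance=["what", "did", "you", "eat"],
    logical_form="Q(lambda x. eat(you, x))",
)


class ChildLearner:
    """Hypothetical stand-in for the probabilistic acquisition model."""

    def train(self, corpus: list[Example]) -> None:
        # Jointly estimate word meanings and language-specific syntax
        # from utterance-meaning pairs (details are in the paper).
        ...

    def parse(self, ex: Example):
        # Given an utterance and its meaning, return the inferred parse
        # tree and lexical meanings.
        ...

    def interpret(self, utterance: list[str]) -> str:
        # Given only the utterance, return the inferred logical form.
        ...
```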
Mar-17-2025