On Unsupervised Training of Link Grammar Based Language Models
arXiv.org Artificial Intelligence
In this short note we explore what is needed for unsupervised training of graph language models based on link grammars. First, we introduce the termination-tags formalism required to build a language model on the link grammar formalism of Sleator and Temperley [21], and discuss the influence of context on the unsupervised learning of link grammars. Second, we propose a statistical link grammar formalism that allows for statistical language generation. Third, based on this formalism, we show that Yuret's classical dissertation [25] on the discovery of linguistic relations using lexical attraction ignores contextual properties of language, and that an approach to unsupervised language learning relying on bigrams alone is therefore flawed. This correlates well with the unimpressive results of unsupervised training of graph language models based on Yuret's bigram approach.
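The bigram lexical-attraction measure the note critiques can be sketched as pointwise mutual information (PMI) scored over adjacent word pairs. The following is a minimal illustrative sketch, not code from the paper; the tiny corpus and the function name `lexical_attraction` are assumptions introduced here for illustration. Note that the score for a pair depends only on the two words' co-occurrence counts, with no conditioning on surrounding context — the property the note identifies as the flaw.

```python
# Illustrative sketch (not from the paper): Yuret-style lexical
# attraction scored as pointwise mutual information over adjacent
# word pairs. Corpus and names are hypothetical examples.
import math
from collections import Counter

def lexical_attraction(corpus):
    """Return PMI(x, y) for each adjacent word pair in the corpus."""
    unigrams = Counter()
    bigrams = Counter()
    total = 0
    for sentence in corpus:
        words = sentence.lower().split()
        unigrams.update(words)
        bigrams.update(zip(words, words[1:]))
        total += len(words)
    n_bigrams = sum(bigrams.values())
    pmi = {}
    for (x, y), count in bigrams.items():
        p_xy = count / n_bigrams          # joint probability of the pair
        p_x = unigrams[x] / total         # marginal probability of x
        p_y = unigrams[y] / total         # marginal probability of y
        pmi[(x, y)] = math.log2(p_xy / (p_x * p_y))
    return pmi

corpus = [
    "the dog chased the cat",
    "the cat chased the mouse",
]
scores = lexical_attraction(corpus)
```

A pair such as ("the", "dog") receives a positive score here purely from co-occurrence statistics, regardless of what links the surrounding grammar would license.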
Aug-27-2022