Compositional Operators in Distributional Semantics

arXiv.org Artificial Intelligence

This survey presents in some detail the main advances that have recently taken place in Computational Linguistics towards the unification of two prominent semantic paradigms: the compositional view of formal semantics and the distributional, vector-space models of meaning. After an introduction to these two approaches, I review the most important models that aim to provide compositionality in distributional semantics. I then present in more detail a particular framework by Coecke, Sadrzadeh and Clark (2010), based on the abstract mathematical setting of category theory, as a more complete example that demonstrates the diversity of techniques and scientific disciplines this kind of research can draw from. The paper concludes with a discussion of important open issues that future research needs to address.
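As a concrete point of reference for the kind of composition models such a survey covers, the sketch below illustrates the two simplest operators, additive and multiplicative composition (in the spirit of Mitchell and Lapata's work). The vectors and their dimensionality are invented for illustration and are not taken from the survey.

```python
import numpy as np

def additive(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Phrase vector as the sum of its word vectors."""
    return u + v

def multiplicative(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Phrase vector as the element-wise product of its word vectors."""
    return u * v

# Toy 4-dimensional distributional vectors for "red" and "car".
red = np.array([0.9, 0.1, 0.4, 0.0])
car = np.array([0.2, 0.8, 0.5, 0.3])

print(additive(red, car))        # [1.1 0.9 0.9 0.3]
print(multiplicative(red, car))  # [0.18 0.08 0.2  0.  ]
```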


Don't Blame Distributional Semantics if it can't do Entailment

arXiv.org Artificial Intelligence

Distributional semantics has emerged as a promising model of certain 'conceptual' aspects of linguistic meaning (e.g., Landauer and Dumais 1997; Turney and Pantel 2010; Baroni and Lenci 2010; Lenci 2018) and as an indispensable component of applications in Natural Language Processing (e.g., reference resolution, machine translation, image captioning; especially since Mikolov et al. 2013). Yet its theoretical status within a general theory of meaning, and of language and cognition more generally, is not clear (e.g., Lenci 2008; Erk 2010; Boleda and Herbelot 2016; Lenci 2018). In particular, it is not clear whether distributional semantics can be understood as an actual model of expression meaning - what Lenci (2008) calls the 'strong' view of distributional semantics - or merely as a model of something that correlates with expression meaning in certain partial ways - the 'weak' view. In this paper we aim to resolve, in favor of the 'strong' view, the question of what exactly distributional semantics models, what its role should be in an overall theory of language and cognition, and how its contribution to state-of-the-art applications can be understood. We do so in part by clarifying its frequently discussed but still obscure relation to formal semantics. Our proposal relies crucially on the distinction between what linguistic expressions mean outside of any particular context and what speakers mean by them in a particular context of utterance.


Compositionality for Recursive Neural Networks

arXiv.org Artificial Intelligence

Modelling compositionality has been a longstanding area of research in the field of vector space semantics. The categorical approach to compositionality maps grammar onto vector spaces in a principled way, but comes under fire for requiring very high-dimensional matrices and tensors and therefore being computationally infeasible. In this paper I show how a linear simplification of recursive neural tensor network models can be mapped directly onto the categorical approach, giving a way of computing the required matrices and tensors. This mapping suggests a number of lines of research, both for categorical compositional vector space models of meaning and for recursive neural network models of compositionality.
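To make the connection concrete, here is a minimal, hedged sketch of the composition step in a recursive neural tensor network (in the style of Socher et al. 2013) and its linear simplification; the variable names, shapes, and random initialization are assumptions for illustration, not code from the paper. Dropping the nonlinearity leaves a plain multilinear (bilinear plus linear) map, which is the kind of object the categorical approach traffics in.

```python
import numpy as np

d = 4                                   # toy embedding dimensionality (assumed)
rng = np.random.default_rng(0)
V = rng.normal(size=(d, 2 * d, 2 * d))  # composition tensor (assumed shape)
W = rng.normal(size=(d, 2 * d))         # composition matrix (assumed shape)

def rntn_compose(a: np.ndarray, b: np.ndarray, linear: bool = False) -> np.ndarray:
    """Compose child vectors a and b into a parent vector.

    With linear=False this is the usual RNTN step p = tanh(c^T V c + W c),
    where c = [a; b]; with linear=True the tanh is dropped, leaving a
    multilinear map of the kind the categorical approach uses.
    """
    c = np.concatenate([a, b])
    p = np.einsum('kij,i,j->k', V, c, c) + W @ c  # bilinear term + linear term
    return p if linear else np.tanh(p)

a, b = rng.normal(size=d), rng.normal(size=d)
print(rntn_compose(a, b, linear=True))
```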


Phrase Type Sensitive Tensor Indexing Model for Semantic Composition

AAAI Conferences

Compositional semantics aims at constructing the meaning of phrases or sentences from the meanings of their component words. In this paper, we propose to jointly learn the representations of individual words and of extracted high-frequency phrases. The representations of extracted phrases are treated as a gold standard for learning more general operations that compose the representations of unseen phrases. We propose a grammatical-type-specific model that improves compositional flexibility by adopting vector-tensor-vector operations. Our model embodies the compositional characteristics of the traditional additive and multiplicative models. Empirical results show that our model outperforms state-of-the-art composition methods on the task of computing phrase similarities.
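Below is a minimal sketch of what a vector-tensor-vector composition with one tensor per grammatical phrase type might look like; the tensor names, shapes, and the two phrase types are illustrative assumptions, not the authors' implementation. The final assertion shows one sense in which such a model embodies the multiplicative operator as a special case.

```python
import numpy as np

d = 4                             # toy embedding dimensionality (assumed)
rng = np.random.default_rng(1)
# One composition tensor per phrase type (e.g. adjective-noun, verb-object);
# the type labels here are hypothetical.
tensors = {
    'AN': rng.normal(size=(d, d, d)),
    'VO': rng.normal(size=(d, d, d)),
}

def compose(u: np.ndarray, v: np.ndarray, phrase_type: str) -> np.ndarray:
    """Vector-tensor-vector composition: p_k = u^T T_k v per output dimension."""
    T = tensors[phrase_type]
    return np.einsum('i,kij,j->k', u, T, v)

u, v = rng.normal(size=d), rng.normal(size=d)
print(compose(u, v, 'AN'))

# Choosing T_k = e_k e_k^T recovers the multiplicative model p = u * v,
# since u^T (e_k e_k^T) v = u_k * v_k.
T_mult = np.stack([np.outer(np.eye(d)[k], np.eye(d)[k]) for k in range(d)])
assert np.allclose(np.einsum('i,kij,j->k', u, T_mult, v), u * v)
```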


Distributional Semantic Features as Semantic Primitives — Or Not

AAAI Conferences

We argue that distributional semantics can serve as the basis for a semantic representation of words and phrases that serves many of the purposes semantic primitives were designed for, without running into many of their philosophical, empirical, and practical problems.