Discriminating Among Word Meanings By Identifying Similar Contexts

AAAI Conferences

Word sense discrimination is an unsupervised clustering problem that seeks to discover which instances of a word are used with the same meaning. This is done strictly on the basis of information found in raw corpora, without using any sense-tagged text or other existing knowledge sources. Our particular focus is to systematically compare the efficacy of a range of lexical features, context representations, and clustering algorithms when applied to this problem.
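
As a rough illustration of the clustering setup described above (not the paper's exact pipeline; the example contexts, the bag-of-words features, and the cluster count are illustrative assumptions), occurrences of an ambiguous word can be grouped by the similarity of their surrounding contexts:

# Minimal sketch: cluster instances of a target word by the similarity of
# their contexts; each resulting cluster is taken to be one sense.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical instances of the ambiguous word "bank" in context.
contexts = [
    "the bank raised interest rates on savings accounts",
    "deposits at the bank grew after the rate cut",
    "they walked along the river bank at sunset",
    "fishing from the muddy bank of the stream",
]

# Represent each context as a weighted bag of its co-occurring words.
vectors = TfidfVectorizer(stop_words="english").fit_transform(contexts)

# Group contexts with similar vectors; the number of clusters is assumed here.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
for text, label in zip(contexts, labels):
    print(label, text)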


Tran-Thanh

AAAI Conferences

We propose a new representation for coalitional games, called the coalitional skill vector model, where there is a set of skills in the system, and each agent has a skill vector -- a vector consisting of values that reflect the agent's level in different skills. Furthermore, there is a set of goals, each with requirements expressed in terms of the minimum skill level necessary to achieve the goal. Agents can form coalitions to aggregate their skills and achieve goals otherwise unachievable. We show that this representation is fully expressive, that is, it can represent any characteristic function game. We also show that, for some interesting classes of games, our representation is significantly more compact than the classical representation, and it facilitates the development of efficient algorithms for solving the coalition structure generation problem, as well as the problem of computing the core and/or the least core. We also demonstrate that, by using the coalitional skill vector representation, our solver can handle up to 500 agents.
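
The following is a minimal sketch of the representation described above: agents carry skill vectors, goals carry minimum skill requirements, and a coalition achieves a goal when its aggregated skills meet every requirement. The coordinate-wise sum used as the aggregation rule, and the toy agents and goal, are illustrative assumptions, not necessarily the paper's definition.

# Minimal sketch of the coalitional skill vector idea.
from typing import Dict, List, Sequence

Agent = str
SkillVector = Sequence[float]

def coalition_skills(coalition: List[Agent],
                     skills: Dict[Agent, SkillVector]) -> List[float]:
    """Aggregate the members' skill vectors coordinate-wise (assumed: sum)."""
    dims = len(next(iter(skills.values())))
    return [sum(skills[a][d] for a in coalition) for d in range(dims)]

def achieves(coalition: List[Agent],
             skills: Dict[Agent, SkillVector],
             goal_requirements: SkillVector) -> bool:
    """A goal is achieved if every aggregated skill meets its minimum level."""
    aggregated = coalition_skills(coalition, skills)
    return all(a >= r for a, r in zip(aggregated, goal_requirements))

skills = {"a1": (2.0, 0.0), "a2": (1.0, 3.0), "a3": (0.0, 1.0)}
goal = (3.0, 2.0)  # minimum level required in each skill
print(achieves(["a1", "a2"], skills, goal))  # True: (3.0, 3.0) meets (3.0, 2.0)
print(achieves(["a1", "a3"], skills, goal))  # False: (2.0, 1.0) falls short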


Handling Missing Words by Mapping Across Word Vector Representations

AAAI Conferences

Vector-based word representation models are often developed from very large corpora. However, real-world applications often involve words that are not available in a given vector model. In this paper, we present a novel Neural Network (NN) based approach for obtaining representations for words in a target model from another model, called the source model, in which representations for those words are available, effectively pooling together their vocabularies. Our experiments show that the transformed vectors are well correlated with the native target model representations, and that an extrinsic evaluation based on a word-to-word similarity task using the SimLex-999 dataset leads to results close to those obtained using native model representations.
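
A minimal sketch of the general idea, assuming a small feed-forward regressor trained on words shared by both vocabularies; the dimensions, synthetic data, and network size are illustrative stand-ins, not the paper's configuration:

# Learn a mapping from a source embedding space to a target space using the
# shared vocabulary, then transform a word missing from the target model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
src_dim, tgt_dim, shared = 50, 100, 2000

# Toy stand-ins for embeddings of words present in both models.
X_src = rng.normal(size=(shared, src_dim))                  # source-model vectors
W_true = rng.normal(size=(src_dim, tgt_dim))
Y_tgt = X_src @ W_true + 0.01 * rng.normal(size=(shared, tgt_dim))  # target-model vectors

# Train a small feed-forward network on the shared vocabulary.
mapper = MLPRegressor(hidden_layer_sizes=(200,), max_iter=500, random_state=0)
mapper.fit(X_src, Y_tgt)

# A word available only in the source model gets a predicted target-space vector.
missing_word_src_vec = rng.normal(size=(1, src_dim))
predicted_tgt_vec = mapper.predict(missing_word_src_vec)
print(predicted_tgt_vec.shape)  # (1, 100)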


Ontologies for Dates and Duration

AAAI Conferences

Reasoning with dates and duration has long been addressed by the community. Existing duration ontologies, however, lack complete axiomatizations of their intended models; many simply represent time durations as real numbers and treat the duration function as a metric on the timeline. We show that such approaches are inadequate and provide a first-order ontology of duration that overcomes these limitations.
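
One common reading of the metric-style treatment the abstract criticizes assigns each interval a real-valued duration directly from its endpoints; written roughly (this is the approach argued to be inadequate, not the paper's proposal):

\[
  \mathit{duration}(i) \;=\; \mathit{end}(i) - \mathit{begin}(i),
  \qquad \mathit{duration}(i) \in \mathbb{R}.
\]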