Goto

Collaborating Authors

A Logical Study of Partial Entailment

AAAI Conferences

We introduce a novel logical notion, partial entailment, into propositional logic. In contrast with classical entailment, the statement that a formula P partially entails another formula Q with respect to a background formula set Γ intuitively means that, under the circumstances of Γ, if P is true then some "part" of Q will also be true. We distinguish three different kinds of partial entailment and formalize them using an extended notion of prime implicant. We study their semantic properties, which show that, surprisingly, partial entailments fail for many simple inference rules. We then study the related computational properties, which indicate that partial entailments are relatively difficult to compute. Finally, we consider a potential application of partial entailment in reasoning about rational agents.


A Logical Study of Partial Entailment

Journal of Artificial Intelligence Research

We introduce a novel logical notion, partial entailment, into propositional logic. In contrast with classical entailment, the statement that a formula P partially entails another formula Q with respect to a background formula set Γ intuitively means that, under the circumstances of Γ, if P is true then some "part" of Q will also be true. We distinguish three different kinds of partial entailment and formalize them using an extended notion of prime implicant. We study their semantic properties, which show that, surprisingly, partial entailments fail for many simple inference rules. We then study the related computational properties, which indicate that partial entailments are relatively difficult to compute. Finally, we consider a potential application of partial entailment in reasoning about rational agents.
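As an informal illustration of this intuition (the example is ours and glosses over the paper's exact prime-implicant definitions): let Γ = ∅, P = p, and Q = p ∧ q. Classically P does not entail Q, since a model making p true and q false satisfies P but not Q. However, P does entail p, and p is a conjunct, i.e. a "part", of the prime implicant p ∧ q of Q, so on the intuitive reading above P partially entails Q with respect to Γ. By contrast, for Q' = q no part of Q' follows from P alone, so not even partial entailment holds.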


Bridging Knowledge Gaps in Neural Entailment via Symbolic Models

arXiv.org Artificial Intelligence

Most textual entailment models focus on lexical gaps between the premise text and the hypothesis, but rarely on knowledge gaps. We focus on filling these knowledge gaps in the Science Entailment task by leveraging an external structured knowledge base (KB) of science facts. Our new architecture combines standard neural entailment models with a knowledge lookup module. To facilitate this lookup, we propose a fact-level decomposition of the hypothesis and verify the resulting sub-facts against both the textual premise and the structured KB. Our model, NSnet, learns to aggregate predictions from these heterogeneous data formats. On the SciTail dataset, NSnet outperforms a simpler combination of the two predictions by 3% and the base entailment model by 5%.
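A minimal sketch of the aggregation idea described above, under our own assumptions: the decomposition, scoring, and aggregation functions below are invented placeholders for illustration and do not correspond to NSnet's released code or learned components.

# Hypothetical sketch: combine a neural entailment score with KB support
# for decomposed sub-facts of the hypothesis. All names and heuristics
# here are invented stand-ins, not the actual NSnet implementation.

def decompose_hypothesis(hypothesis):
    # Placeholder fact-level decomposition: naively treat each
    # comma-separated span as a sub-fact.
    return [part.strip() for part in hypothesis.split(",") if part.strip()]

def neural_entailment_score(premise, fact):
    # Stand-in for a neural entailment model's probability that the
    # premise entails the sub-fact; here a crude token-overlap proxy.
    p, f = set(premise.lower().split()), set(fact.lower().split())
    return len(p & f) / max(len(f), 1)

def kb_support_score(fact, kb):
    # Stand-in for the structured-KB lookup: coverage of the sub-fact's
    # tokens by the best-matching KB tuple.
    f = set(fact.lower().split())
    return max(len(f & set(t.lower().split())) / max(len(f), 1) for t in kb) if kb else 0.0

def aggregate(premise, hypothesis, kb, alpha=0.5):
    # Combine per-sub-fact evidence from the text and the KB, then
    # average over sub-facts (a stand-in for a learned aggregation layer).
    subs = decompose_hypothesis(hypothesis)
    scores = [alpha * neural_entailment_score(premise, s)
              + (1 - alpha) * kb_support_score(s, kb) for s in subs]
    return sum(scores) / len(scores) if scores else 0.0

premise = "Plants use sunlight to make food through photosynthesis"
hypothesis = "Photosynthesis requires sunlight, photosynthesis produces food"
kb = ["photosynthesis requires sunlight", "photosynthesis produces glucose"]
print(round(aggregate(premise, hypothesis, kb), 3))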


Nie

AAAI Conferences

For the task of entity disambiguation, mention contexts and entity descriptions both contain various kinds of information, only a subset of which is helpful for disambiguation. In this paper, we propose a type-aware co-attention model for entity disambiguation, which tries to identify the most discriminative words from mention contexts and the most relevant sentences from the corresponding entity descriptions simultaneously. To bridge the semantic gap between mention contexts and entity descriptions, we further incorporate entity type information to enhance the co-attention mechanism. Our evaluation shows that the proposed model outperforms the state of the art on three public datasets. Further analysis also confirms that both the co-attention mechanism and the type-aware mechanism are effective.
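The core co-attention step can be sketched roughly as follows. This is our own simplified illustration with random toy embeddings and without the type-aware enhancement; it is not the authors' model or code.

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Toy embeddings: 5 mention-context words and 3 entity-description
# sentences, each represented by a 16-dimensional random vector.
context_words = rng.normal(size=(5, 16))
desc_sentences = rng.normal(size=(3, 16))

# Affinity matrix between every context word and every description
# sentence, followed by row/column softmaxes: attention over sentences
# for each word, and attention over words for each sentence.
affinity = context_words @ desc_sentences.T        # shape (5, 3)
word_to_sent = softmax(affinity, axis=1)           # attend over sentences
sent_to_word = softmax(affinity, axis=0)           # attend over words

# Attention-weighted summaries: which description content each context
# word focuses on, and which context words each sentence focuses on.
word_summaries = word_to_sent @ desc_sentences      # shape (5, 16)
sentence_summaries = sent_to_word.T @ context_words # shape (3, 16)

print(word_summaries.shape, sentence_summaries.shape)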


How Does Machine Learning Handle Ambiguity?

#artificialintelligence

The study focuses on linguistic aspects such as word choice for machine translation, part-of-speech tagging, and word-sense disambiguation. The study's research paper treats the language-learning process as a disambiguation problem and applies the linear separator technique. The disambiguation problem is formally defined in terms of word predicates, their classifications, and the features used for the learning problem. In addition, the study emphasises that various disambiguation methods can be cast as linear separators.
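A toy illustration of treating word-sense disambiguation as a linear-separation problem: the features, examples, and perceptron below are invented for illustration and do not come from the study.

# Decide whether "bank" is used in its financial sense (+1) or its
# river sense (-1) from a few hand-crafted context features, using a
# plain perceptron as the linear separator.
import numpy as np

# Features: [has_money_word, has_water_word, prev_word_is_the]
X = np.array([
    [1, 0, 1],   # "deposited cash at the bank"   -> financial
    [1, 0, 0],   # "bank approved my loan"        -> financial
    [0, 1, 1],   # "fished from the bank"         -> river
    [0, 1, 0],   # "bank of the river eroded"     -> river
])
y = np.array([1, 1, -1, -1])

# Learn a weight vector w and bias b so that sign(w . x + b)
# separates the two senses.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(10):                   # a few passes over the data
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:    # misclassified -> update
            w += yi * xi
            b += yi

test = np.array([0, 1, 1])            # context contains a water word
print("financial" if w @ test + b > 0 else "river")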