Knowledge Compilation Properties of Trees-of-BDDs, Revisited

AAAI Conferences

Recent results have shown the practical interest of trees-of-BDDs as a suitable target language for propositional knowledge compilation. In the present paper, the concept of tree-of-BDDs is extended to additional classes of data structures C, thus leading to trees-of-C representations (ToC). We provide a number of generic results enabling one to determine the queries/transformations satisfied by ToC depending on those satisfied by C. We also present some results about the spatial efficiency of the ToC languages. Focusing on the ToB language (and other related languages), we address a number of issues that remained open in (Subbarayan et al. 2007). We show that beyond CO and VA, the ToB fragment satisfies IM and ME but satisfies neither CD nor any query among CE, SE unless P = NP. Among other results, we prove that ToB is not comparable w.r.t. succinctness with any of CNF, DNF, DNNF unless the polynomial hierarchy collapses. This contributes to explaining some empirical results reported in (Subbarayan et al. 2007).
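To make the queries named above concrete, the following Python sketch (not drawn from the paper) builds a small reduced ordered BDD, the component that ToB organizes into a tree, and shows that the CO (consistency) and VA (validity) queries amount to inspecting its root; the Bdd class, its methods, and the example formula are illustrative assumptions.

    # A minimal sketch (not the paper's implementation) of an ordered, reduced BDD
    # (the "B" in ToB) supporting the CO (consistency) and VA (validity) queries.
    # The Bdd class, its method names, and the usage example are illustrative assumptions.

    class Bdd:
        def __init__(self, order):
            self.order = {v: i for i, v in enumerate(order)}  # variable -> level
            self.FALSE, self.TRUE = 0, 1
            self.nodes = {}                    # (level, low, high) -> node id (sharing)
            self.table = {0: None, 1: None}    # node id -> (level, low, high)

        def _mk(self, level, low, high):
            if low == high:                    # redundant test: skip the node
                return low
            key = (level, low, high)
            if key not in self.nodes:          # share isomorphic subgraphs
                nid = len(self.table)
                self.nodes[key] = nid
                self.table[nid] = key
            return self.nodes[key]

        def var(self, v):
            return self._mk(self.order[v], self.FALSE, self.TRUE)

        def apply(self, op, u, v, memo=None):
            memo = {} if memo is None else memo
            if (u, v) in memo:
                return memo[(u, v)]
            if u in (0, 1) and v in (0, 1):    # both terminals: evaluate the connective
                res = int(op(bool(u), bool(v)))
            else:
                lu = self.table[u][0] if u > 1 else float("inf")
                lv = self.table[v][0] if v > 1 else float("inf")
                level = min(lu, lv)            # expand on the topmost variable
                u0, u1 = (self.table[u][1], self.table[u][2]) if lu == level else (u, u)
                v0, v1 = (self.table[v][1], self.table[v][2]) if lv == level else (v, v)
                res = self._mk(level,
                               self.apply(op, u0, v0, memo),
                               self.apply(op, u1, v1, memo))
            memo[(u, v)] = res
            return res

        # On a reduced BDD, CO and VA reduce to inspecting the root node.
        def consistent(self, u):   # CO: is some model represented?
            return u != self.FALSE

        def valid(self, u):        # VA: are all assignments models?
            return u == self.TRUE

    import operator
    bdd = Bdd(["x", "y"])
    f = bdd.apply(operator.and_, bdd.var("x"), bdd.var("y"))   # the formula x AND y
    print(bdd.consistent(f), bdd.valid(f))                     # True False

A tree-of-BDDs would arrange several such diagrams, one per cluster of a tree decomposition, which is why queries easy on a single BDD need not remain easy on the tree.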


Natural Language State Representation for Reinforcement Learning

arXiv.org Artificial Intelligence

Recent advances in Reinforcement Learning have highlighted the difficulties in learning within complex high-dimensional domains. We argue that one of the main reasons current approaches do not perform well is that the information is represented sub-optimally. A natural way to describe what we observe is through natural language. In this paper, we implement a natural language state representation to learn and complete tasks. Our experiments suggest that natural-language-based agents are more robust, converge faster, and perform better than vision-based agents, showing the benefit of using natural language representations for Reinforcement Learning.
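As a toy illustration of the idea (not the paper's actual pipeline), the Python sketch below renders a grid-world observation as an English sentence and maps it to a fixed-size bag-of-words feature vector that a policy network could consume; the vocabulary and the describe() rules are assumptions made only for this example.

    # A minimal sketch (not the paper's code) of a natural language state representation:
    # a toy grid-world observation is rendered as an English sentence and then mapped to a
    # fixed-size bag-of-words vector a policy network could consume. The vocabulary and the
    # describe() rules are assumptions made purely for this example.

    from collections import Counter

    VOCAB = ["agent", "goal", "north", "south", "east", "west", "near", "far", "nothing"]

    def describe(agent_xy, goal_xy):
        """Render a toy grid-world state as an English sentence."""
        dx, dy = goal_xy[0] - agent_xy[0], goal_xy[1] - agent_xy[1]
        ew = "east" if dx > 0 else "west" if dx < 0 else ""
        ns = "north" if dy > 0 else "south" if dy < 0 else ""
        dist = "near" if abs(dx) + abs(dy) <= 2 else "far"
        parts = [p for p in (ns, ew) if p]
        where = " ".join(parts) if parts else "nothing"
        return f"agent sees goal {dist} to the {where}"

    def bag_of_words(sentence):
        """Fixed-size feature vector over VOCAB; a stand-in for a learned text encoder."""
        counts = Counter(sentence.split())
        return [counts.get(w, 0) for w in VOCAB]

    text = describe(agent_xy=(0, 0), goal_xy=(3, 1))
    print(text)                 # "agent sees goal far to the north east"
    print(bag_of_words(text))   # [1, 1, 1, 0, 1, 0, 0, 1, 0]

In practice the sentence would be produced by the environment or a captioning component and encoded with a learned language model rather than a hand-built vocabulary; the sketch only shows where the textual state sits in the loop.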


McFate

AAAI Conferences

The naturalness of qualitative reasoning suggests that qualitative representations might be an important component of the semantics of natural language. Prior work showed that frame-based representations of qualitative process theory constructs could indeed be extracted from natural language texts. That technique relied on the parser recognizing specific syntactic constructions, which had limited coverage. This paper describes a new approach, using narrative function to represent the higher-order relationships between the constituents of a sentence and between sentences in a discourse. We outline how narrative function combined with query-driven abduction enables the same kinds of information to be extracted from natural language texts. We also show how the same technique can be used to extract type-level qualitative representations from text and to improve performance in playing a strategy game.
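To give a rough picture of the frame-based QP constructs mentioned above, the Python sketch below shows what an extracted heat-flow frame and a type-level counterpart might look like; it is our own illustration rather than the paper's representation, and every slot name in it is an assumption.

    # A small illustration (ours, not the paper's actual representation) of a frame-style
    # encoding of a Qualitative Process theory construct that might be extracted from a
    # sentence such as "Heat flows from the brick to the water, warming it."
    # All slot names and predicate spellings below are assumptions.

    heat_flow_frame = {
        "process": "HeatFlow",
        "participants": {"source": "brick", "destination": "water"},
        "conditions": ["greaterThan(temperature(brick), temperature(water))"],
        "consequences": [
            "increasing(temperature(water))",   # qualitative change at the destination
            "decreasing(temperature(brick))",   # and at the source
        ],
    }

    # A type-level counterpart abstracts the specific entities away, along the lines of the
    # type-level qualitative representations discussed above.
    heat_flow_type = {
        "process": "HeatFlow",
        "participants": {"source": "ThermalObject", "destination": "ThermalObject"},
        "conditions": ["greaterThan(temperature(source), temperature(destination))"],
    }

    print(heat_flow_frame["consequences"][0])   # increasing(temperature(water))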



Allen

AAAI Conferences

Current semantic parsers compute either shallow representations over a wide range of input or deeper representations in very limited domains. We describe a system that provides broad-coverage, deep semantic parsing designed to work in any domain using a core domain-general lexicon, ontology, and grammar. This paper discusses how this core system can be customized for a particularly challenging domain, namely reading research papers in biology.