Review for NeurIPS paper: Faithful Embeddings for Knowledge Base Queries
When vacuous sketches are used in the intermediate steps, e.g. in R1 of the MetaQA model, what is the intermediate output? Is it the dense-sparse representation of the entities in the top-k facts? Isn't that a problem when k is large? Won't this also be an issue when a template requires an intersection in addition to unions? 3. For a given query, EmQL ranks all entities (i.e., gives a distribution over entities) instead of explicitly returning a set as the answer.
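For context on the sketch question above: a count-min sketch stores a small grid of hash-bucketed counters, so membership can be tested approximately and two set sketches can be combined elementwise (min over-approximates an intersection). The following is a toy Python illustration of that mechanism only; the class name, hashing scheme, and sizes are made up here and are not the paper's implementation.

```python
import hashlib

class CountMinSketch:
    """Toy count-min sketch for approximate set membership
    (illustrative only; not EmQL's actual implementation)."""

    def __init__(self, width=64, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _hash(self, item, row):
        # One deterministic hash per row, bucketed into [0, width).
        digest = hashlib.sha256(f"{row}:{item}".encode()).hexdigest()
        return int(digest, 16) % self.width

    def add(self, item):
        for row in range(self.depth):
            self.table[row][self._hash(item, row)] += 1

    def contains(self, item):
        # The min over rows upper-bounds the true count,
        # so 0 means the item is definitely absent.
        return min(self.table[row][self._hash(item, row)]
                   for row in range(self.depth)) > 0

    def intersect(self, other):
        # Elementwise min over-approximates the intersection:
        # no false negatives, but hash collisions can give
        # false positives.
        out = CountMinSketch(self.width, self.depth)
        out.table = [[min(a, b) for a, b in zip(ra, rb)]
                     for ra, rb in zip(self.table, other.table)]
        return out
```

The reviewer's concern maps onto the false-positive behavior: a vacuous (all-pass) sketch filters nothing, so the dense top-k retrieval alone must carry the set semantics, which degrades as k grows.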
Review for NeurIPS paper: Faithful Embeddings for Knowledge Base Queries
This paper presents an embedding-based neural query language called EmQL that generalizes to unknown facts in the KB and also performs logical entailment better than existing methods like Query2Box.
Strengths
• The proposed method is sound and novel.
Weaknesses
• There are underlying assumptions in the proposed approach.
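For readers unfamiliar with the dense half of the dense-sparse representation the reviews refer to: roughly, a set of entities is summarized by a (weighted) centroid of its members' embeddings, and candidate members are recovered by top-k similarity search over all entity embeddings. A minimal NumPy sketch of that idea, with all names, dimensions, and weights invented for illustration:

```python
import numpy as np

# Toy entity embedding table: one row per KB entity.
rng = np.random.default_rng(0)
entity_emb = rng.normal(size=(500, 64))

def encode_set(member_ids, weights=None):
    """Summarize a set as the weighted centroid of its
    members' embeddings (uniform weights by default)."""
    members = entity_emb[member_ids]
    if weights is None:
        weights = np.ones(len(member_ids))
    return (weights[:, None] * members).sum(axis=0) / weights.sum()

def topk_members(centroid, k=5):
    """Score every entity against the centroid by dot product
    and keep the k highest-scoring candidates."""
    scores = entity_emb @ centroid
    return np.argsort(-scores)[:k]

centroid = encode_set([3, 7, 42])
candidates = topk_members(centroid, k=10)
```

This also illustrates the weakness noted in the second review excerpt: ranking all entities by similarity yields a distribution over entities, not an explicit set, so a threshold or sketch filter is needed to recover crisp set semantics.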
Faithful Embeddings for Knowledge Base Queries
The deductive closure of an ideal knowledge base (KB) contains exactly the logical queries that the KB can answer. In practice, however, KBs are both incomplete and over-specified, failing to answer some queries that have real-world answers. Query embedding (QE) systems answer such queries approximately in a learned embedding space, but experiments in this paper show that QE systems may disagree with deductive reasoning even on answers that require no generalization or relaxation. We address this problem with a novel QE method that is more faithful to deductive reasoning, and show that this leads to better performance on complex queries over incomplete KBs. Finally, we show that inserting this new QE module into a neural question-answering system leads to substantial improvements over the state of the art.