Natural Language Deduction with Incomplete Information
Zayne Sprague, Kaj Bostrom, Swarat Chaudhuri, Greg Durrett
arXiv.org Artificial Intelligence
A growing body of work studies how to answer a question or verify a claim by generating a natural language "proof": a chain of deductive inferences yielding the answer based on a set of premises. However, these methods can only make sound deductions when they follow from evidence that is given. We propose a new system that can handle the underspecified setting where not all premises are stated at the outset; that is, additional assumptions need to be materialized to prove a claim. By using a natural language generation model to abductively infer a premise given another premise and a conclusion, we can impute missing pieces of evidence needed for the conclusion to be true. Our system searches over two fringes in a bidirectional fashion, interleaving deductive (forward-chaining) and abductive (backward-chaining) generation steps. We sample multiple possible outputs for each step to achieve coverage of the search space, at the same time ensuring correctness by filtering low-quality generations with a round-trip validation procedure. Results on a modified version of the EntailmentBank dataset and a new dataset called Everyday Norms: Why Not? show that abductive generation with validation can recover premises across in- and out-of-domain settings.
Nov-1-2022
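The search procedure the abstract outlines — interleaving deductive (forward-chaining) steps with abductive (backward-chaining) steps, and keeping an abduced premise only if it passes a round-trip validation check — can be sketched as a toy loop. Everything below is hypothetical: the hard-coded `DEDUCTIONS`/`ABDUCTIONS` tables stand in for the paper's learned natural language generation models, and the real system samples multiple candidate generations per step rather than doing a lookup.

```python
# Toy stand-ins for the learned generation models. In the actual system,
# deduction and abduction are performed by a seq2seq model over natural
# language; here they are lookup tables so the sketch is runnable.
DEDUCTIONS = {
    ("it is raining", "rain makes the ground wet"): "the ground is wet",
}
ABDUCTIONS = {
    ("rain makes the ground wet", "the ground is wet"): "it is raining",
}

def deduce(p1, p2):
    """Forward step: combine two premises into a conclusion (or None)."""
    return DEDUCTIONS.get((p1, p2)) or DEDUCTIONS.get((p2, p1))

def abduce(premise, goal):
    """Backward step: propose a missing premise that, together with
    `premise`, would entail `goal` (or None)."""
    return ABDUCTIONS.get((premise, goal))

def validate(missing, premise, goal):
    """Round-trip validation: the abduced premise must deduce back
    to the goal when combined with the premise it was abduced from."""
    return deduce(missing, premise) == goal

def prove(premises, goal, max_steps=10):
    """Bidirectional search: a deductive fringe grows forward from the
    premises while abduction works backward from the goal. Returns the
    list of materialized assumptions on success, or None on failure."""
    forward = set(premises)   # deductive fringe
    backward = {goal}         # abductive fringe
    assumptions = []
    for _ in range(max_steps):
        if forward & backward:
            return assumptions      # the fringes met: claim is proved
        new = set()
        # Deductive (forward-chaining) step over pairs of known facts.
        for p1 in forward:
            for p2 in forward:
                c = deduce(p1, p2)
                if c and c not in forward:
                    new.add(c)
        # Abductive (backward-chaining) step, filtered by validation.
        for p in forward:
            for g in backward:
                m = abduce(p, g)
                if m and m not in forward and validate(m, p, g):
                    new.add(m)              # assumption joins the premises
                    assumptions.append(m)
        if not new:
            return None                     # search exhausted
        forward |= new
    return None
```

With a complete premise set the proof needs no assumptions; with an underspecified one, the missing premise is abduced and returned as a materialized assumption. This sketch collapses the two fringes' bookkeeping and omits the sampling and scoring the paper relies on; it only illustrates the control flow.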