Few-shot Relational Reasoning via Connection Subgraph Pretraining

Neural Information Processing Systems 

The few-shot knowledge graph (KG) completion task aims to perform inductive reasoning over the KG: given only a few support triplets of a new relation \bowtie (e.g., (chop, \bowtie, kitchen), (read, \bowtie, library)), the goal is to predict query triplets of the same unseen relation \bowtie, e.g., (sleep, \bowtie, ?). Current approaches cast the problem in a meta-learning framework, where the model must first be jointly trained over many training few-shot tasks, each defined by its own relation, so that learning/prediction on the target few-shot task can be effective. However, in real-world KGs, curating many training tasks is a challenging ad hoc process. Here we propose Connection Subgraph Reasoner (CSR), which can make predictions for the target few-shot task directly, without pre-training on a human-curated set of training tasks. The key to CSR is that we explicitly model a shared connection subgraph between support and query triplets, inspired by the principle of eliminative induction.
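To make the task setup concrete, here is a minimal Python sketch of the few-shot KG completion interface: a handful of support triplets for an unseen relation, a query head, and a ranking over candidate tails. All names (`UNSEEN_REL`, `rank_candidates`, the toy scoring function) are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch of a few-shot KG completion task.
# A triplet is (head, relation, tail); the new relation is shared
# across support and query, and was never seen during training.
UNSEEN_REL = "unseen_rel"  # stands in for the new relation \bowtie

# Few support triplets of the unseen relation (from the abstract's example).
support = [("chop", UNSEEN_REL, "kitchen"),
           ("read", UNSEEN_REL, "library")]

# Query triplet with an unknown tail: (sleep, \bowtie, ?)
query_head = "sleep"

def rank_candidates(support, query_head, candidates, score_fn):
    """Rank candidate tails for (query_head, UNSEEN_REL, ?) by a
    scoring function that compares each candidate against the
    support triplets (e.g., via a shared connection subgraph)."""
    scored = [(score_fn(support, query_head, cand), cand)
              for cand in candidates]
    scored.sort(reverse=True)  # highest score first
    return [cand for _, cand in scored]

# Toy stand-in scorer for illustration only: a real model would score
# how well the candidate completion matches the pattern shared by the
# support triplets.
def toy_score(support, head, cand):
    return len(cand)

ranked = rank_candidates(support, query_head,
                         ["kitchen", "bedroom", "gym"], toy_score)
```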