Schema-aware Reference as Prompt Improves Data-Efficient Knowledge Graph Construction
Yunzhi Yao, Shengyu Mao, Ningyu Zhang, Xiang Chen, Shumin Deng, Xi Chen, Huajun Chen
With the development of pre-trained language models, many prompt-based approaches to data-efficient knowledge graph construction have been proposed and have achieved impressive performance. However, existing prompt-based learning methods for knowledge graph construction remain susceptible to several potential limitations: (i) a semantic gap between natural language and the output structured knowledge with a pre-defined schema, which means the model cannot fully exploit semantic knowledge under the constrained templates; (ii) representation learning with locally individual instances limits performance given the insufficient features, which cannot unleash the potential analogical capability of pre-trained language models. Motivated by these observations, we propose a retrieval-augmented approach that retrieves a schema-aware Reference As Prompt (RAP) for data-efficient knowledge graph construction. It can dynamically leverage schema and knowledge inherited from human-annotated and weakly supervised data as a prompt for each sample; the approach is model-agnostic and can be plugged into widespread existing approaches. Experimental results demonstrate that previous methods integrated with RAP achieve impressive performance gains in low-resource settings on five datasets of relational triple extraction and event extraction for knowledge graph construction. Code is available at https://github.com/zjunlp/RAP.
arXiv.org Artificial Intelligence
Sep-18-2023
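To make the retrieval-as-prompt idea concrete, below is a minimal sketch of how a schema-aware reference could be retrieved and prepended to each input before extraction. This is not the authors' implementation (see the linked repository for that): the TF-IDF retriever, the reference store, and the prompt format are all illustrative assumptions standing in for the paper's learned retrieval over human-annotated and weakly supervised data.

```python
# Illustrative sketch of retrieval-augmented prompt construction in the spirit of RAP:
# for each input sentence, retrieve the most similar annotated reference (with its
# schema-aware annotation) and prepend it to the prompt fed to the extraction model.
# TF-IDF similarity stands in for the paper's retriever; all names are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical reference store: (sentence, schema-aware annotation) pairs drawn
# from human-annotated or weakly supervised data.
references = [
    ("Steve Jobs founded Apple in 1976.",
     "(Steve Jobs, org:founded_by, Apple)"),
    ("The earthquake struck Tokyo on Monday.",
     "Event: Disaster.Earthquake | Place: Tokyo | Time: Monday"),
]

vectorizer = TfidfVectorizer().fit([sent for sent, _ in references])
ref_matrix = vectorizer.transform([sent for sent, _ in references])

def build_rap_prompt(sample: str, top_k: int = 1) -> str:
    """Retrieve the top-k most similar references and build a schema-aware prompt."""
    sims = cosine_similarity(vectorizer.transform([sample]), ref_matrix)[0]
    top = sims.argsort()[::-1][:top_k]
    ref_block = "\n".join(
        f"Reference: {references[i][0]} -> {references[i][1]}" for i in top
    )
    return f"{ref_block}\nInput: {sample}\nOutput:"

# The resulting prompt would then be passed to any existing extraction model,
# which is why such a reference-as-prompt step can be plugged in model-agnostically.
print(build_rap_prompt("Bill Gates founded Microsoft in 1975."))
```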