Task-specific Pre-training and Prompt Decomposition for Knowledge Graph Population with Language Models
Tianyi Li, Wenyu Huang, Nikos Papasarantopoulos, Pavlos Vougiouklis, Jeff Z. Pan
arXiv.org Artificial Intelligence
We present a system for knowledge graph population with Language Models, evaluated on the Knowledge Base Construction from Pre-trained Language Models (LM-KBC) challenge at ISWC 2022. Our system involves task-specific pre-training to improve the LM's representation of the masked object tokens, prompt decomposition for progressive generation of candidate objects, and other methods for higher-quality retrieval. Our system won track 1 of the LM-KBC challenge, which is based on the BERT LM; it achieves a 55.0% F1 score on the hidden test set of the challenge.
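The "progressive generation of candidate objects" idea can be illustrated with a minimal sketch: a multi-object query is decomposed into a sequence of single-mask prompts, each conditioned on the objects already generated, and generation stops when the LM's confidence drops. This is a hypothetical illustration, not the authors' implementation; `lm_fill`, `relation_template`, and the toy scorer are assumptions standing in for the BERT masked-LM scoring used in the paper.

```python
def decompose_and_generate(subject, relation_template, lm_fill,
                           max_objects=3, threshold=0.5):
    """Progressively generate objects for one (subject, relation) query.

    At each step, a single-mask prompt is built that mentions the objects
    generated so far, the LM fills the mask, and the best unseen candidate
    is kept if its score clears the threshold.
    """
    objects = []
    while len(objects) < max_objects:
        known = ", ".join(objects)
        prompt = relation_template.format(subject=subject, known=known)
        # lm_fill returns (token, score) pairs, highest score first.
        candidates = [(t, s) for t, s in lm_fill(prompt) if t not in objects]
        if not candidates or candidates[0][1] < threshold:
            break  # LM is no longer confident: stop generating
        objects.append(candidates[0][0])
    return objects


def fake_lm(prompt):
    # Toy stand-in for a masked LM: fixed scored candidates.
    return [("France", 0.9), ("Germany", 0.7), ("Italy", 0.4)]


result = decompose_and_generate(
    "the EU", "{subject} member states besides {known}: [MASK].", fake_lm)
print(result)  # ["France", "Germany"]: Italy falls below the threshold
```

In a real system, `lm_fill` would query BERT's masked-token distribution; the stopping threshold plays the role of deciding how many objects a relation has for a given subject.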
Aug-31-2022