Enhancing In-Context Learning with Semantic Representations for Relation Extraction
Han, Peitao; Pereira, Lis Kanashiro; Cheng, Fei; She, Wan Jou; Aramaki, Eiji
arXiv.org Artificial Intelligence
In this work, we employ two AMR-enhanced semantic representations for ICL on RE: one that exploits the AMR structure generated for a sentence at the subgraph level (the shortest AMR path between entities), and another that exploits the full AMR structure generated for a sentence. In both cases, we demonstrate that all settings benefit from AMR's fine-grained semantic structure. We evaluate our model on four RE datasets. Our results show that our model outperforms the GPT-based baselines, achieving SOTA performance on two of the datasets and competitive performance on the other two.
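As a rough illustration of the subgraph-level representation described in the abstract, the sketch below finds the shortest path between two entity nodes in a toy AMR-style graph using plain BFS. The graph triples, node names, and entities here are invented for demonstration and do not reproduce the authors' implementation or their AMR parser.

```python
from collections import deque

def shortest_amr_path(edges, start, goal):
    """BFS shortest path over an AMR-style graph, treated as undirected.

    edges: list of (source, relation, target) triples, as produced by
    flattening an AMR graph into its edge set.
    Returns the list of nodes on the shortest path, or None if unreachable.
    """
    adj = {}
    for s, r, t in edges:
        adj.setdefault(s, []).append(t)
        adj.setdefault(t, []).append(s)  # ignore edge direction for path finding
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy AMR-style edge set for "the company acquired the small startup" (invented):
edges = [
    ("acquire-01", ":ARG0", "company"),
    ("acquire-01", ":ARG1", "startup"),
    ("startup", ":mod", "small"),
]
print(shortest_amr_path(edges, "company", "startup"))
# ['company', 'acquire-01', 'startup']
```

The path connecting the two entity mentions passes through the predicate node, which is exactly the fine-grained relational signal the subgraph-level representation is meant to surface for the in-context examples.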
Jun-14-2024