Enhancing In-Context Learning with Semantic Representations for Relation Extraction

Han, Peitao, Pereira, Lis Kanashiro, Cheng, Fei, She, Wan Jou, Aramaki, Eiji

arXiv.org Artificial Intelligence 

In this work, we employ two AMR-enhanced semantic representations for ICL on RE: one that explores the AMR structure generated for a sentence at the subgraph level (the shortest AMR path), and another that explores the full AMR structure generated for a sentence. In both cases, we demonstrate that all settings benefit from AMR's fine-grained semantic structure. We evaluate our model on four RE datasets. Our results show that our model outperforms the GPT-based baselines, achieving SOTA performance on two of the datasets and competitive performance on the other two.
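To make the subgraph-level representation concrete, the sketch below finds the shortest path between two entity nodes in an AMR graph via breadth-first search and linearizes it as a node/role sequence. This is an illustrative toy only: the hand-written AMR graph, node labels, and the `shortest_amr_path` helper are assumptions for demonstration, not the authors' implementation (a real pipeline would obtain the graph from an AMR parser).

```python
from collections import deque

# Hand-written toy AMR for "Steve Jobs founded Apple."
# Each edge is (head concept, AMR role, tail concept).
amr_edges = [
    ("f / found-01", ":ARG0", "p / person"),
    ("p / person", ":name", "n / name"),
    ("n / name", ":op1", '"Steve"'),
    ("n / name", ":op2", '"Jobs"'),
    ("f / found-01", ":ARG1", "c / company"),
    ("c / company", ":name", "n2 / name"),
    ("n2 / name", ":op1", '"Apple"'),
]

def shortest_amr_path(edges, src, dst):
    """BFS over the AMR graph (treated as undirected, with inverse
    roles marked '-of'); returns the node/role sequence src..dst."""
    adj = {}
    for head, role, tail in edges:
        adj.setdefault(head, []).append((role, tail))
        adj.setdefault(tail, []).append((role + "-of", head))
    queue = deque([(src, [src])])
    seen = {src}
    while queue:
        node, path = queue.popleft()
        if node == dst:
            return path
        for role, nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [role, nxt]))
    return None

# Shortest AMR path between the two entity concept nodes.
path = shortest_amr_path(amr_edges, "p / person", "c / company")
print(" ".join(path))
# The linearized path can then be appended to each ICL demonstration
# as an extra semantic feature alongside the raw sentence.
```

The linearized path (`p / person :ARG0-of f / found-01 :ARG1 c / company`) compactly exposes the predicate connecting the two entities, which is the intuition behind using the shortest AMR path as a fine-grained signal for RE.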
