Semantic Role Labeling with Associated Memory Network
Chaoyu Guan, Yuhao Cheng, Hai Zhao
–arXiv.org Artificial Intelligence
Semantic role labeling (SRL) is the task of recognizing all the predicate-argument pairs of a sentence; its performance has plateaued despite a series of recent works. This paper proposes a novel syntax-agnostic SRL model enhanced by an associated memory network (AMN), which uses inter-sentence attention over label-known associated sentences as a kind of memory to further enhance dependency-based SRL. In detail, we use sentences and their labels from the training dataset as an associated memory cue to help label the target sentence. Furthermore, we compare several strategies for selecting associated sentences and several label merging methods in AMN, in order to find and exploit the labels of associated sentences while attending to them. By leveraging this attentive memory built from known training data, our full model reaches state-of-the-art on the CoNLL-2009 benchmark datasets in the syntax-agnostic setting, pointing to an effective line of SRL enhancement beyond exploiting external resources such as well-pretrained language models.

1 Introduction

Semantic role labeling (SRL) is the task of recognizing all the predicate-argument pairs of a given sentence and its predicates. It is a shallow semantic parsing task that has been widely used in a range of natural language processing (NLP) tasks, such as information extraction (Liu et al., 2016) and question answering (Abujabal et al., 2017). Generally, SRL is decomposed into four classification subtasks in pipeline systems: predicate identification, predicate disambiguation, argument identification, and argument classification.
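To make the core idea concrete, the following is a minimal NumPy sketch of inter-sentence attention over a label-known associated sentence, not the authors' exact AMN: each target token attends to every token of the associated sentence, and the attention weights pool that sentence's label embeddings into a "memory" vector concatenated onto the token representation. All names, dimensions, and the dot-product scoring function here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def associated_memory_attention(target, assoc, assoc_labels):
    """Sketch of inter-sentence attention with label memory (assumed API).

    target:       (T, d) token representations of the target sentence
    assoc:        (S, d) token representations of one label-known sentence
    assoc_labels: (S, d) embeddings of that sentence's role labels
    """
    scores = target @ assoc.T                 # (T, S) inter-sentence attention scores
    weights = softmax(scores, axis=-1)        # each target token attends over assoc tokens
    memory = weights @ assoc_labels           # (T, d) label memory pooled by attention
    # Concatenate the attended label memory onto each token representation.
    return np.concatenate([target, memory], axis=-1)   # (T, 2d)

rng = np.random.default_rng(0)
out = associated_memory_attention(rng.normal(size=(5, 8)),   # 5 target tokens
                                  rng.normal(size=(7, 8)),   # 7 associated tokens
                                  rng.normal(size=(7, 8)))   # their label embeddings
```

In a full model, several associated sentences would be retrieved, their memories merged (one of the label merging strategies the paper compares), and the enriched representations fed to the role classifier.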
Aug-5-2019