Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention: Supplementary Material

Neural Information Processing Systems 

To gain further insight into how our model compares with the current state of the art in sentence compression, we report results of our method and its ablations alongside ablations of the method by Zhao et al. [4] (see Table 1). In their work, the authors add a syntax-based language model, trained to learn the syntactic dependencies between lexical items in the input sequence, to their sentence compression network. Together with this language model, they use a reinforcement learning algorithm to refine the token deletions proposed by their Bi-LSTM model, obtaining the state-of-the-art F1 score of 85.1. With a naive language model without syntactic features, their model obtains an F1 score of 85.0.
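To make the comparison concrete, the core of such a deletion-based compressor is a Bi-LSTM tagger that emits a keep/delete decision per token. The following is a minimal illustrative sketch in PyTorch; the class name, vocabulary size, and layer dimensions are our own assumptions, not the configuration used by Zhao et al. [4], and the syntax-based language model and reinforcement learning refinement are omitted.

```python
import torch
import torch.nn as nn

class BiLSTMDeleter(nn.Module):
    """Bi-LSTM tagger scoring each token for deletion (illustrative sketch;
    hyperparameters are assumptions, not those of Zhao et al.)."""
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Two logits per token: keep (class 0) vs. delete (class 1).
        self.classifier = nn.Linear(2 * hidden_dim, 2)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))  # (batch, seq_len, 2*hidden)
        return self.classifier(h)                # (batch, seq_len, 2)

model = BiLSTMDeleter()
tokens = torch.randint(0, 1000, (1, 12))   # one 12-token sentence
logits = model(tokens)
deletions = logits.argmax(dim=-1)          # per-token keep/delete decision
```

In the full method, these per-token deletion decisions would then be rescored by the language model and refined with reinforcement learning rather than taken greedily as above.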
