Improving Question Generation with Sentence-level Semantic Matching and Answer Position Inferring
Ma, Xiyao, Zhu, Qile, Zhou, Yanlin, Li, Xiaolin, Wu, Dapeng
arXiv.org Artificial Intelligence
Taking an answer and its context as input, sequence-to-sequence models have made considerable progress on question generation. However, we observe that these approaches often generate wrong question words or keywords and copy answer-irrelevant words from the input. We believe the key root causes are a lack of global question semantics and insufficient exploitation of answer position-awareness. In this paper, we propose a neural question generation model with two concrete modules: sentence-level semantic matching and answer position inferring. Further, we enhance the initial state of the decoder via an answer-aware gated fusion mechanism. Experimental results demonstrate that our model outperforms state-of-the-art (SOTA) models on the SQuAD and MARCO datasets. Owing to its generality, our work also significantly improves existing models.
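The answer-aware gated fusion mentioned above can be sketched in a minimal form: a learned gate blends a context representation with an answer representation to produce the decoder's initial state. The function name, shapes, and plain-numpy weights below are illustrative assumptions, not the paper's actual implementation (which operates inside a trained seq2seq network):

```python
import numpy as np

def answer_aware_gated_fusion(context_vec, answer_vec, W, b):
    """Hypothetical sketch of gated fusion:
        g  = sigmoid(W [context; answer] + b)
        h0 = g * context + (1 - g) * answer
    The result is an element-wise convex combination of the two vectors,
    usable as a decoder initial state."""
    concat = np.concatenate([context_vec, answer_vec])
    gate = 1.0 / (1.0 + np.exp(-(W @ concat + b)))  # element-wise sigmoid in (0, 1)
    return gate * context_vec + (1.0 - gate) * answer_vec

# Toy example with hidden size 4 and random (untrained) weights.
rng = np.random.default_rng(0)
d = 4
W = rng.standard_normal((d, 2 * d))
b = np.zeros(d)
context_vec = rng.standard_normal(d)
answer_vec = rng.standard_normal(d)
h0 = answer_aware_gated_fusion(context_vec, answer_vec, W, b)
```

Because the gate lies in (0, 1) per dimension, each component of `h0` falls between the corresponding components of the context and answer vectors, so the fused state never drifts outside the span of its two inputs.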
Dec-18-2019
- Country:
- North America > United States (0.68)
- Genre:
- Research Report > New Finding (0.34)
- Industry:
- Education (0.47)
- Government (0.46)