arXiv:1809.02637 [cs.CL]

Neural Generation of Diverse Questions using Answer Focus, Contextual and Linguistic Features

Vrindavan Harrison, Marilyn Walker

Published 2018-09-07 (Version 1)

Question Generation is the task of automatically creating questions from textual input. In this work we present a new Attentional Encoder-Decoder Recurrent Neural Network model for automatic question generation. Our model incorporates linguistic features and an additional sentence embedding to capture meaning at both the sentence and word levels. The linguistic features are designed to capture information related to named entity recognition, word case, and entity coreference resolution. In addition, our model uses a copying mechanism and a special answer signal that enable the generation of numerous diverse questions for a given sentence. Our model achieves a state-of-the-art result of 19.98 BLEU-4 on a benchmark Question Generation dataset, outperforming all previously published results by a significant margin. A human evaluation also shows that these added features improve the quality of the generated questions.
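The BLEU-4 score reported above is the geometric mean of modified 1- to 4-gram precisions multiplied by a brevity penalty. As a rough illustration of how such a score is computed (a minimal stdlib-only sentence-level sketch, not the paper's actual corpus-level evaluation script, which may differ in smoothing and aggregation):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu4(candidate, references):
    """Sentence-level BLEU-4: geometric mean of clipped (modified)
    1..4-gram precisions times a brevity penalty."""
    precisions = []
    for n in range(1, 5):
        cand_counts = Counter(ngrams(candidate, n))
        # Clip each candidate n-gram count by its max count in any reference.
        max_ref = Counter()
        for ref in references:
            for gram, c in Counter(ngrams(ref, n)).items():
                max_ref[gram] = max(max_ref[gram], c)
        clipped = sum(min(c, max_ref[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(clipped / total)
    if min(precisions) == 0:
        return 0.0  # unsmoothed: any zero n-gram precision zeroes the score
    # Brevity penalty: penalize candidates shorter than the closest reference.
    ref_len = min((len(r) for r in references),
                  key=lambda rl: (abs(rl - len(candidate)), rl))
    bp = 1.0 if len(candidate) >= ref_len else math.exp(1 - ref_len / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / 4)
```

For example, a candidate question identical to a reference scores 1.0, while a paraphrase sharing only some 4-grams scores strictly between 0 and 1.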

Related articles: Most relevant | Search more
arXiv:2104.06335 [cs.CL] (Published 2021-04-13)
On the Use of Linguistic Features for the Evaluation of Generative Dialogue Systems
arXiv:2404.03184 [cs.CL] (Published 2024-04-04)
The Death of Feature Engineering? BERT with Linguistic Features on SQuAD 2.0
arXiv:2112.08831 [cs.CL] (Published 2021-12-16, updated 2022-03-19)
Bridging between Cognitive Processing Signals and Linguistic Features via a Unified Attentional Network