arXiv:1807.01763 [cs.CL]

Seq2RDF: An end-to-end application for deriving Triples from Natural Language Text

Yue Liu, Tongtao Zhang, Zhicheng Liang, Heng Ji, Deborah L. McGuinness

Published 2018-07-04 (Version 1)

We present an end-to-end approach that takes unstructured textual input and generates structured output compliant with a given vocabulary. Inspired by recent successes in neural machine translation, we treat the triples within a given knowledge graph as an independent graph language and propose an encoder-decoder framework with an attention mechanism that leverages knowledge graph embeddings. Our model learns to map natural language text to triples in subject-predicate-object form using the selected knowledge graph vocabulary. Experiments on three data sets show that our simple yet effective approach achieves competitive F1 measures against the baselines. A demo video is included.
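The abstract describes the architecture only at a high level. As a rough illustration, the sketch below shows one plausible way such a model could be wired up in PyTorch: a BiLSTM encoder over the sentence, an attentive LSTM decoder whose output vocabulary is the knowledge-graph vocabulary (entities and relations), and a decoder embedding table that may be initialised from pretrained KG embeddings. All class names, layer sizes, and the teacher-forcing loop are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2TripleSketch(nn.Module):
    """Hypothetical sketch: encoder-decoder with attention whose decoder
    vocabulary is a KG vocabulary; the decoder embeddings can be initialised
    from pretrained KG embeddings (e.g. TransE vectors). Not the paper's code."""

    def __init__(self, src_vocab, kg_vocab, emb_dim=100, hid_dim=256,
                 pretrained_kg_emb=None, bos_id=0):
        super().__init__()
        self.bos_id = bos_id
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.kg_emb = nn.Embedding(kg_vocab, emb_dim)
        if pretrained_kg_emb is not None:   # optional pretrained KG embedding table
            self.kg_emb.weight.data.copy_(pretrained_kg_emb)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTMCell(emb_dim + 2 * hid_dim, hid_dim)
        self.attn = nn.Linear(hid_dim + 2 * hid_dim, 1)
        self.out = nn.Linear(hid_dim, kg_vocab)

    def forward(self, src_ids, tgt_ids):
        enc_out, _ = self.encoder(self.src_emb(src_ids))   # (B, S, 2H)
        B, S, _ = enc_out.shape
        h = enc_out.new_zeros(B, self.decoder.hidden_size)
        c = enc_out.new_zeros(B, self.decoder.hidden_size)
        prev = torch.full((B,), self.bos_id, dtype=torch.long, device=src_ids.device)
        logits = []
        for t in range(tgt_ids.size(1)):    # three steps: subject, predicate, object
            # Additive attention over the encoder states.
            scores = self.attn(torch.cat(
                [h.unsqueeze(1).expand(-1, S, -1), enc_out], dim=-1)).squeeze(-1)
            ctx = (F.softmax(scores, dim=-1).unsqueeze(-1) * enc_out).sum(dim=1)
            h, c = self.decoder(torch.cat([self.kg_emb(prev), ctx], dim=-1), (h, c))
            logits.append(self.out(h))
            prev = tgt_ids[:, t]            # teacher forcing with the gold token
        return torch.stack(logits, dim=1)   # (B, T, kg_vocab)

# Toy usage with random ids: 2 sentences of length 5, 3-token <s, p, o> targets.
model = Seq2TripleSketch(src_vocab=1000, kg_vocab=500)
src = torch.randint(0, 1000, (2, 5))
tgt = torch.randint(0, 500, (2, 3))
print(model(src, tgt).shape)                # torch.Size([2, 3, 500])
```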

Comments: Proceedings of the ISWC 2018 Posters & Demonstrations
Categories: cs.CL, cs.AI
Related articles:
arXiv:2007.14071 [cs.CL] (Published 2020-07-28)
Emotion Correlation Mining Through Deep Learning Models on Natural Language Text
arXiv:1909.08927 [cs.CL] (Published 2019-09-19)
Extracting Conceptual Knowledge from Natural Language Text Using Maximum Likelihood Principle
arXiv:2305.03960 [cs.CL] (Published 2023-05-06)
Beyond Rule-based Named Entity Recognition and Relation Extraction for Process Model Generation from Natural Language Text