arXiv Analytics


arXiv:1910.13634 [cs.CL]

An Augmented Transformer Architecture for Natural Language Generation Tasks

Hailiang Li, Adele Y. C. Wang, Yang Liu, Du Tang, Zhibin Lei, Wenye Li

Published 2019-10-30 (Version 1)

Transformer-based neural networks have shown significant advantages on most evaluations of natural language processing and other sequence-to-sequence tasks, owing to the inherent strengths of the architecture. Although the main architecture of the Transformer has been explored continuously, little attention has been paid to the positional encoding module. In this paper, we enhance the sinusoidal positional encoding algorithm by maximizing the variance between encodings of consecutive positions, yielding additional performance gains. Furthermore, we propose an augmented Transformer architecture that encodes additional linguistic knowledge, such as Part-of-Speech (POS) tags, to boost performance on natural language generation tasks such as automatic translation and summarization. Experiments show that the proposed architecture consistently outperforms the vanilla Transformer.
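The abstract does not spell out the variance-maximizing modification, but it builds on the standard sinusoidal positional encoding of the vanilla Transformer. As context, a minimal NumPy sketch of that baseline encoding (function name and dimensions are illustrative, not from the paper):

```python
import numpy as np

def sinusoidal_positional_encoding(max_len, d_model):
    """Vanilla sinusoidal positional encoding (the baseline the paper enhances).

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, None]                # shape (max_len, 1)
    # Frequency term 10000^(-2i / d_model) for each even dimension index 2i
    div_terms = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)            # even dimensions
    pe[:, 1::2] = np.cos(positions * div_terms)            # odd dimensions
    return pe

pe = sinusoidal_positional_encoding(max_len=50, d_model=16)
print(pe.shape)  # (50, 16)
```

The paper's contribution, per the abstract, is to choose the encoding so that consecutive positions are maximally spread apart (higher variance between adjacent encodings), making positions easier to distinguish downstream.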

Comments: To appear in the ICDM MLCS 2019 workshop
Categories: cs.CL, cs.LG
Related articles:
arXiv:2210.12828 [cs.CL] (Published 2022-10-23)
Towards Pragmatic Production Strategies for Natural Language Generation Tasks
arXiv:2105.02544 [cs.CL] (Published 2021-05-06)
SGG: Learning to Select, Guide, and Generate for Keyphrase Generation
arXiv:1909.06564 [cs.CL] (Published 2019-09-14)
ALTER: Auxiliary Text Rewriting Tool for Natural Language Generation