arXiv:1806.04357 [cs.CL]

Multi-Task Neural Models for Translating Between Styles Within and Across Languages

Xing Niu, Sudha Rao, Marine Carpuat

Published: 2018-06-12 (Version 1)

Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.
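A minimal sketch of how such joint multi-task training is commonly set up: a single shared sequence-to-sequence model is fed a mix of monolingual formality-transfer pairs and bilingual translation pairs, with the desired output formality signaled by a tag prepended to the source. The abstract does not specify the exact mechanism, so this tagging scheme is an assumption; the tag strings and the helper make_example are illustrative, not from the paper.

# Hypothetical data setup for multi-task formality transfer + translation.
# Assumption: target formality is signaled via a source-side tag so one
# shared encoder-decoder serves both tasks. All names are illustrative.

FORMALITY_TAGS = {"formal": "<2F>", "informal": "<2I>"}

def make_example(src_tokens, tgt_tokens, target_formality):
    """Prepend a formality tag so a single shared model handles both tasks."""
    return [FORMALITY_TAGS[target_formality]] + src_tokens, tgt_tokens

# Monolingual formality transfer (English -> English):
ft_example = make_example(
    ["gotta", "go", "now"],
    ["I", "have", "to", "leave", "now", "."],
    target_formality="formal",
)

# Formality-sensitive translation (French -> English); note the parallel
# data itself needs no style annotation, matching the abstract's claim:
mt_example = make_example(
    ["je", "dois", "partir"],
    ["I", "must", "leave", "."],
    target_formality="formal",
)

# Multi-task training would interleave batches drawn from both streams.
for src, tgt in (ft_example, mt_example):
    print(src, "->", tgt)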

Comments: Accepted at the 27th International Conference on Computational Linguistics (COLING 2018)
Categories: cs.CL