arXiv Analytics

arXiv:1912.02164 [cs.CL]

Plug and Play Language Models: a Simple Approach to Controlled Text Generation

Sumanth Dathathri, Andrea Madotto, Janice Lan, Jane Hung, Eric Frank, Piero Molino, Jason Yosinski, Rosanne Liu

Published 2019-12-04 (Version 1)

Large transformer-based language models (LMs) trained on huge text corpora have shown unparalleled generation capabilities. However, controlling attributes of the generated language (e.g., switching topic or sentiment) is difficult without modifying the model architecture or fine-tuning on attribute-specific data, which entails the significant cost of retraining. We propose a simple alternative: the Plug and Play Language Model (PPLM) for controllable language generation, which combines a pretrained LM with one or more simple attribute classifiers that guide text generation without any further training of the LM. In the canonical scenario we present, the attribute models are simple classifiers consisting of a user-specified bag of words or a single learned layer with 100,000 times fewer parameters than the LM. Sampling entails a forward and backward pass in which gradients from the attribute model push the LM's hidden activations and thus guide the generation. Model samples demonstrate control over a range of topics and sentiment styles, and extensive automated and human-annotated evaluations show attribute alignment and fluency. PPLMs are flexible in that any combination of differentiable attribute models may be used to steer text generation, which will allow for diverse and creative applications beyond the examples given in this paper.
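The core mechanism in the abstract (gradients from a bag-of-words attribute model nudging the LM's hidden activations) can be sketched in a toy NumPy setting. This is a minimal illustration, not the paper's actual GPT-2 implementation: the vocabulary size, hidden dimension, projection matrix `W`, bag indices `B`, and step size are all hypothetical stand-ins. The frozen "LM head" `W` maps a hidden state `h` to vocabulary logits, and we ascend the gradient of the log-probability mass that the softmax places on the bag words, as PPLM does at each decoding step.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical toy sizes (the real PPLM uses a full GPT-2 LM).
rng = np.random.default_rng(0)
vocab, dim = 20, 8
W = rng.normal(size=(vocab, dim))   # frozen LM output projection
h = rng.normal(size=dim)            # hidden activation at one decoding step
B = [3, 7, 11]                      # indices of the attribute bag-of-words

def bow_log_prob(h):
    """Attribute score: log of the probability mass on the bag words."""
    p = softmax(W @ h)
    return np.log(p[B].sum())

def bow_grad(h):
    """Analytic gradient of bow_log_prob with respect to the hidden state."""
    p = softmax(W @ h)
    s = p[B].sum()
    mask = np.zeros(vocab)
    mask[B] = 1.0
    dlogits = p * (mask - s) / s    # d log(s) / d logits, via the softmax Jacobian
    return W.T @ dlogits

# PPLM-style update: a few small steps of gradient ascent on the hidden state,
# leaving the LM weights W untouched.
step = 0.5
before = bow_log_prob(h)
for _ in range(10):
    h = h + step * bow_grad(h)
after = bow_log_prob(h)
```

After the updates, the next-token distribution `softmax(W @ h)` assigns more mass to the bag words (`after > before`), which is the sense in which the attribute model "pushes" generation; the full method also re-runs the LM forward pass and mixes in a KL term to preserve fluency.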

Related articles:
arXiv:2212.09282 [cs.CL] (Published 2022-12-19)
APOLLO: A Simple Approach for Adaptive Pretraining of Language Models for Logical Reasoning
arXiv:2212.02924 [cs.CL] (Published 2022-12-06)
Controlled Text Generation using T5 based Encoder-Decoder Soft Prompt Tuning and Analysis of the Utility of Generated Text in AI
arXiv:2006.03535 [cs.CL] (Published 2020-06-05)
CoCon: A Self-Supervised Approach for Controlled Text Generation