Pretrained Language Models for Text Generation: A Survey
Text generation has become one of the most important yet challenging tasks in
natural language processing (NLP). The resurgence of deep learning has greatly
advanced this field through neural generation models, especially the paradigm of
pretrained language models (PLMs). In this paper, we present an overview of the
major advances achieved in the topic of PLMs for text generation. As
preliminaries, we present the general task definition and briefly describe the
mainstream architectures of PLMs for text generation. As the core content, we
discuss how to adapt existing PLMs to model different types of input data and to
satisfy special properties of the generated text. We further summarize several
important fine-tuning strategies for text generation. Finally, we present
several future directions and conclude this paper. Our survey aims to provide
text generation researchers with a synthesis of, and a pointer to, related research.