Diffusion-LM Improves Controllable Text Generation
We develop Diffusion-LM, a new non-autoregressive language model based on continuous diffusion. Diffusion-LM iteratively denoises a sequence of Gaussian vectors into word vectors, yielding a sequence of intermediate latent variables.
The continuous, hierarchical nature of these intermediate variables enables a simple gradient-based algorithm to perform complex, controllable generation tasks.
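The gradient-based control described above can be illustrated with a minimal sketch. The idea is that, because the intermediate latents are continuous, one can nudge them at each denoising step toward satisfying a constraint classifier while staying close to the model's own denoised estimate. The `denoiser`, `classifier`, and `step_size` names below are illustrative assumptions, not the paper's actual interfaces:

```python
import torch

def control_step(x_t, denoiser, classifier, step_size=0.1):
    """One controlled update on a continuous latent x_t (illustrative sketch).

    Balances a fluency term (stay near the denoiser's estimate of the
    clean latent) against a control term (raise the classifier's
    log-probability that the constraint is satisfied, taken here as
    class index 1 of a hypothetical binary classifier).
    """
    x = x_t.clone().requires_grad_(True)
    # Fluency: penalize distance from the model's denoised prediction.
    x0_hat = denoiser(x)
    fluency = -((x - x0_hat) ** 2).sum()
    # Control: log-probability of the target attribute under the classifier.
    control = classifier(x).log_softmax(-1)[..., 1].sum()
    # Ascend the combined objective with one gradient step on the latent.
    grad, = torch.autograd.grad(fluency + control, x)
    return (x_t + step_size * grad).detach()
```

In the paper's setting this update would be applied at each step of the reverse diffusion process; here the denoiser and classifier are stand-ins for the trained Diffusion-LM and attribute classifier.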
We demonstrate successful control of Diffusion-LM on six challenging fine-grained control tasks, significantly outperforming prior work.
Authors
Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto