Generative Distributional Control (GDC) is a general framework for imposing constraints on samples from pretrained language models. The constraints can be either pointwise (e.g. all samples must be non-offensive) or distributional (e.g. a specified percentage of samples must mention females).
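As a rough illustration (a sketch, not this repo's API), both kinds of constraints can be viewed as feature functions φ(x) whose expected value under the controlled distribution is fixed: a pointwise constraint requires E[φ(x)] = 1.0 (every sample satisfies it), while a distributional constraint fixes E[φ(x)] to some target moment such as 0.5. The helper names below (`is_offensive`, `mentions_female`) are hypothetical keyword-based stand-ins for the trained classifiers used in the papers.

```python
from typing import Callable, List


def is_offensive(sample: str) -> bool:
    # Toy keyword check; a hypothetical stand-in for a real classifier.
    return "offensive_word" in sample.lower()


def mentions_female(sample: str) -> bool:
    # Toy keyword check; a hypothetical stand-in for a real classifier.
    return any(w in sample.lower().split() for w in ("she", "her", "woman"))


def pointwise_phi(x: str) -> float:
    """Pointwise constraint: require E[phi(x)] = 1.0 (all samples non-offensive)."""
    return 0.0 if is_offensive(x) else 1.0


def distributional_phi(x: str) -> float:
    """Distributional constraint: require E[phi(x)] = 0.5 (50% mention females)."""
    return float(mentions_female(x))


def empirical_moment(phi: Callable[[str], float], samples: List[str]) -> float:
    """Monte-Carlo estimate of E[phi(x)] over a batch of model samples."""
    return sum(phi(x) for x in samples) / len(samples)


samples = ["she wrote the paper", "the model generated text"]
print(empirical_moment(distributional_phi, samples))  # 0.5
```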
This repo contains code accompanying the following three papers:
- `/dpg`: A Distributional Approach to Controlled Text Generation (ICLR 2021)
- `/cdpg`: Controlling Conditional Language Models without Catastrophic Forgetting (ICML 2022)
- `/rm_vs_dm`: On Reinforcement Learning and Distribution Matching for Fine-Tuning Language Models with no Catastrophic Forgetting (NeurIPS 2022)