Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing
This paper surveys and organizes research works in a new paradigm in natural language processing, which we dub "prompt-based learning."
Unlike traditional supervised learning, which trains a model to take in an input x and predict an output y as p(y|x), prompt-based learning is based on language models that model the probability of text directly.
To use these models to perform prediction tasks, the original input x is modified using a template into a textual string prompt x' that has some unfilled slots, and then the language model is used to probabilistically fill in the unfilled information to obtain a final string x̂, from which the final output y can be derived.
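The template-fill-derive pipeline above can be sketched in a few lines. This is a minimal illustration, not an implementation from the survey: the template, the answer words, the verbalizer mapping, and especially the toy scoring function are all hypothetical stand-ins, where a real system would use a pre-trained language model to score each filled prompt.

```python
# Sketch of prompt-based prediction: template -> fill slot -> derive label.
# All names and the scorer are illustrative stand-ins for a real LM.

def apply_template(x: str) -> str:
    """Turn input x into a prompt x' containing an unfilled slot [Z]."""
    return f"{x} Overall, it was a [Z] movie."

def toy_lm_score(text: str) -> float:
    """Stand-in for a language model's score of a text string.
    Rates a filled prompt higher when its words are sentimentally
    coherent (dominantly positive or dominantly negative)."""
    positive = {"love", "great", "wonderful"}
    negative = {"hate", "terrible", "boring"}
    words = set(text.lower().replace(".", "").split())
    return abs(len(words & positive) - len(words & negative))

def predict(x: str, verbalizer: dict[str, str]) -> str:
    """Fill [Z] with each candidate answer word, keep the filled
    string x_hat that scores highest, then map its answer word to
    the final output y via the verbalizer."""
    x_prime = apply_template(x)
    best = max(verbalizer, key=lambda w: toy_lm_score(x_prime.replace("[Z]", w)))
    return verbalizer[best]

# Usage: sentiment classification recast as slot filling.
print(predict("I love this film.", {"great": "positive", "terrible": "negative"}))
```

The key design point mirrored here is that classification is recast as text completion: the model never sees the label set directly, only candidate answer words whose scores are compared, and a verbalizer maps the winning word back to y.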
This framework is powerful and attractive for a number of reasons: it allows the language model to be pre-trained on massive amounts of raw text, and by defining a new prompting function the model is able to perform few-shot or even zero-shot learning, adapting to new scenarios with few or no labeled data.