The Pile: An 800GB Dataset of Diverse Text for Language Modeling
CPM: A Large-scale Generative Chinese Pre-trained Language Model
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
T5: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems
Controllable Generation from Pre-trained Language Models via Inverse Prompting