Mishra et al. (2022). Cross-Task Generalization via Natural Language Crowdsourcing Instructions. ACL.
Sanh et al. (2022). Multitask Prompted Training Enables Zero-Shot Task Generalization (T0). ICLR.
Banko and Brill (2001). Scaling to Very Very Large Corpora for Natural Language Disambiguation. ACL.
Sun et al. (2017). Revisiting Unreasonable Effectiveness of Data in Deep Learning Era. ICCV.
Raffel et al. (2020). Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (T5). JMLR.
Xu et al. (2022). ZeroPrompt: Scaling Prompt-Based Pretraining to 1,000 Tasks Improves Zero-Shot Generalization. arXiv preprint.