Bibliography (4):

  1. Raffel et al. "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (T5). JMLR, 2020.

  2. Devlin et al. "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". NAACL, 2019.

  3. Lewis et al. "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension". ACL, 2020.

  4. Mitchell et al. "Fast Model Editing at Scale" (MEND). Project page: https://sites.google.com/view/mend-editing