Bibliography (6):

  1. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (arXiv:1910.13461)

  2. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (arXiv:1810.04805)

  3. RoBERTa: A Robustly Optimized BERT Pretraining Approach (arXiv:1907.11692)

  4. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (arXiv:1804.07461)

  5. Wikipedia articles:

    1. ROUGE (metric)

    2. BLEU