Bibliography (5):
Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL-HLT 2019.
Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., and Stoyanov, V. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv:1907.11692.
Wang, A., Singh, A., Michael, J., Hill, F., Levy, O., and Bowman, S. R. (2019). GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding. ICLR 2019.
Wikipedia articles:
ROUGE (metric)
BLEU