Bibliography (4):

  1. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

  2. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

  3. RoBERTa: A Robustly Optimized BERT Pretraining Approach

  4. XLNet: Generalized Autoregressive Pretraining for Language Understanding