Bibliography:

  1. Yang, Z., Dai, Z., Yang, Y., Carbonell, J., Salakhutdinov, R., and Le, Q. V. (2019). XLNet: Generalized Autoregressive Pretraining for Language Understanding. NeurIPS.

  2. Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., and Stoyanov, V. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv:1907.11692.

  3. Rajpurkar, P., Zhang, J., Lopyrev, K., and Liang, P. (2016). SQuAD: 100,000+ Questions for Machine Comprehension of Text. EMNLP.

  4. Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL-HLT.