Bibliography (3):

  1. Unsupervised Cross-lingual Representation Learning at Scale

  2. RoBERTa: A Robustly Optimized BERT Pretraining Approach

  3. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding