Bibliography (3):
1. Unsupervised Cross-lingual Representation Learning at Scale
2. RoBERTa: A Robustly Optimized BERT Pretraining Approach
3. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding