- XLNet: Generalized Autoregressive Pretraining for Language Understanding
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
- SQuAD: 100,000+ Questions for Machine Comprehension of Text
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding