- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- XLNet: Generalized Autoregressive Pretraining for Language Understanding
- Language Models are Unsupervised Multitask Learners
- GROVER: Defending Against Neural Fake News
-
2019