BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
XLNet: Generalized Autoregressive Pretraining for Language Understanding
CTRL: A Conditional Transformer Language Model for Controllable Generation