Bibliography (3):
Attention Is All You Need
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter