Bibliography (3):

  1. Attention Is All You Need

  2. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

  3. DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter