Bibliography (4):

  1. Devlin, J., Chang, M.-W., Lee, K., and Toutanova, K. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL-HLT 2019.

  2. Vaswani, A., et al. Attention Is All You Need. NeurIPS 2017.

  3. Jiao, X., et al. TinyBERT: Distilling BERT for Natural Language Understanding. Findings of EMNLP 2020.

  4. Wang, A., et al. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding. ICLR 2019.