Bibliography (6):

  1. Attention Is All You Need

  2. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

  3. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

  4. A domain-specific supercomputer for training deep neural networks

  5. Long Range Arena: A Benchmark for Efficient Transformers

  6. Fourier transform (Wikipedia)