Bibliography (8):

  1. Attention Is All You Need

  2. Reformer: The Efficient Transformer

  3. Linformer: Self-Attention with Linear Complexity

  4. Rethinking Attention with Performers (FAVOR+)

  5. Longformer: The Long-Document Transformer

  6. Efficient Transformers: A Survey (https://arxiv.org/pdf/2009.06732.pdf#page=5)

  7. Efficient Transformers: A Survey (https://arxiv.org/pdf/2009.06732.pdf#page=6)

  8. Reinforcement learning (Wikipedia)