Bibliography (9):

  1. https://github.com/google-deepmind/deepmind-research/tree/master/perceiver

  2. Perceiver IO: A Scalable, Fully-Attentional Model That Works on Any Modality

  3. Perceiver: General Perception with Iterative Attention

  4. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

  5. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

  6. Attention Is All You Need

  7. Grandmaster level in StarCraft II using multi-agent reinforcement learning

  8. Pointer Networks

  9. https://proceedings.neurips.cc/paper/1988/file/812b4ba287f5ee0bc9d43bbf5bbe87fb-Paper.pdf