Bibliography (9):

  1. Vaswani, A., et al. (2017). Attention Is All You Need. NeurIPS 2017.

  2. Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. NAACL 2019.

  3. hopfield-layers (GitHub repository): https://github.com/ml-jku/hopfield-layers

  4. Hopfield Layers project page: https://ml-jku.github.io/hopfield-layers/

  5. Looking at the Performer from a Hopfield Point of View (ICLR 2022 Blog Track): https://iclr-blog-track.github.io/2022/03/25/Looking-at-the-Performer-from-a-Hopfield-point-of-view/

  6. Widrich, M., et al. (2020). Modern Hopfield Networks and Attention for Immune Repertoire Classification. NeurIPS 2020.

  7. Krotov, D., & Hopfield, J. J. (2021). Large Associative Memory Problem in Neurobiology and Machine Learning. ICLR 2021.

  8. Geva, M., Schuster, R., Berant, J., & Levy, O. (2021). Transformer Feed-Forward Layers Are Key-Value Memories. EMNLP 2021.

  9. Hopfield network. Wikipedia: https://en.wikipedia.org/wiki/Hopfield_network