Links
- “Token Turing Machines”, Ryoo et al 2022 (2022-11-16)
- “MeMViT: Memory-Augmented Multiscale Vision Transformer for Efficient Long-Term Video Recognition”, Wu et al 2022 (2022-01-20)
- “ABC: Attention With Bounded-memory Control”, Peng et al 2021 (2021-10-06)
- “Memorizing Transformers”, Wu et al 2021 (2021-10-05)
- “Recursively Summarizing Books With Human Feedback”, Wu et al 2021 (2021-09-22)
- “∞-former: Infinite Memory Transformer”, Martins et al 2021 (2021-09-01)
- “Perceiver IO: A General Architecture for Structured Inputs & Outputs”, Jaegle et al 2021 (2021-07-30)
- “Not All Memories Are Created Equal: Learning to Forget by Expiring”, Sukhbaatar et al 2021 (2021-05-13)
- “Perceiver: General Perception With Iterative Attention”, Jaegle et al 2021 (2021-03-04)
- “Learning to Summarize Long Texts With Memory Compression and Transfer”, 2020 (2020-10-21)
- “Memory Transformer”, Burtsev et al 2020 (2020-06-20)
- “Compressive Transformers for Long-Range Sequence Modelling”, Rae et al 2019 (2019-11-13)
- “Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks”, Lee et al 2018 (2018-10-01)
- “Generating Wikipedia by Summarizing Long Sequences”, Liu et al 2018 (2018-01-30)
Link Bibliography
- https://arxiv.org/abs/2203.08913#google: “Memorizing Transformers”, Yuhuai Wu, Markus Norman Rabe, DeLesley Hutchins, Christian Szegedy
- https://arxiv.org/abs/2109.10862#openai: “Recursively Summarizing Books With Human Feedback”, Jeff Wu, Long Ouyang, Daniel M. Ziegler, Nisan Stiennon, Ryan Lowe, Jan Leike, Paul Christiano
- https://arxiv.org/abs/2107.14795#deepmind: “Perceiver IO: A General Architecture for Structured Inputs & Outputs”, Andrew Jaegle et al
- https://arxiv.org/abs/2103.03206#deepmind: “Perceiver: General Perception With Iterative Attention”, Andrew Jaegle, Felix Gimeno, Andrew Brock, Andrew Zisserman, Oriol Vinyals, Joao Carreira