‘GPT-2 nonfiction’ tag
- See Also
- Links
- “Alignment of Brain Embeddings and Artificial Contextual Embeddings in Natural Language Points to Common Geometric Patterns”, Goldstein et al 2024
- “Deep De Finetti: Recovering Topic Distributions from Large Language Models”, Zhang et al 2023
- “Attention Approximates Sparse Distributed Memory”, Bricken & Pehlevan 2021
- “Mind the Gap: Assessing Temporal Generalization in Neural Language Models § Scaling”, Lazaridou et al 2021
- “GPT-2 Folk Music”, Gwern & Presser 2019
- “MuseNet: a Deep Neural Network That Can Generate 4-Minute Musical Compositions With 10 Different Instruments, and Can Combine Styles from Country to Mozart to the Beatles”, Payne 2019
- Bibliography
See Also
Links
“Alignment of Brain Embeddings and Artificial Contextual Embeddings in Natural Language Points to Common Geometric Patterns”, Goldstein et al 2024
“Deep De Finetti: Recovering Topic Distributions from Large Language Models”, Zhang et al 2023
“Attention Approximates Sparse Distributed Memory”, Bricken & Pehlevan 2021
“Mind the Gap: Assessing Temporal Generalization in Neural Language Models § Scaling”, Lazaridou et al 2021
“GPT-2 Folk Music”, Gwern & Presser 2019
“MuseNet: a Deep Neural Network That Can Generate 4-Minute Musical Compositions With 10 Different Instruments, and Can Combine Styles from Country to Mozart to the Beatles”, Payne 2019
Bibliography
https://arxiv.org/abs/2102.01951#scaling&org=deepmind: “Mind the Gap: Assessing Temporal Generalization in Neural Language Models § Scaling”, Lazaridou et al 2021
gpt-2-music: “GPT-2 Folk Music”, Gwern & Presser 2019
https://openai.com/research/musenet: “MuseNet: a Deep Neural Network That Can Generate 4-Minute Musical Compositions With 10 Different Instruments, and Can Combine Styles from Country to Mozart to the Beatles”, Payne 2019