“‘AI Emergence’ Tag”, 2021-04-17:
Bibliography for tag ai/scaling/emergence, most recent first: 4 related tags, 35 annotations, & 13 links (parent).
- Links
- “Foundational Challenges in Assuring Alignment and Safety of Large Language Models”, et al 2024
- “A Phase Transition between Positional and Semantic Learning in a Solvable Model of Dot-Product Attention”, et al 2024
- “Compositional Capabilities of Autoregressive Transformers: A Study on Synthetic, Interpretable Tasks”, et al 2023
- “Training Dynamics of Contextual N-Grams in Language Models”, et al 2023
- “Compositional Abilities Emerge Multiplicatively: Exploring Diffusion Models on a Synthetic Task”, et al 2023
- “A Theory for Emergence of Complex Skills in Language Models”, 2023
- “Teaching Arithmetic to Small Transformers”, et al 2023
- “Schema-Learning and Rebinding As Mechanisms of In-Context Learning and Emergence”, et al 2023
- “8 Things to Know about Large Language Models”, 2023
- “The Quantization Model of Neural Scaling”, et al 2023
- “Toolformer: Language Models Can Teach Themselves to Use Tools”, et al 2023
- “Interactive-Chain-Prompting (INTERCPT): Ambiguity Resolution for Crosslingual Conditional Generation With Interaction”, et al 2023
- “Broken Neural Scaling Laws”, et al 2022
- “U-PaLM: Transcending Scaling Laws With 0.1% Extra Compute”, et al 2022
- “Challenging BIG-Bench Tasks (BBH) and Whether Chain-Of-Thought Can Solve Them”, et al 2022
- “Language Models Are Multilingual Chain-Of-Thought Reasoners”, et al 2022
- “Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit”, et al 2022
- “Emergent Abilities of Large Language Models”, et al 2022
- “Beyond the Imitation Game: Quantifying and Extrapolating the Capabilities of Language Models”, et al 2022
- “Data Distributional Properties Drive Emergent Few-Shot Learning in Transformers”, et al 2022
- “PaLM: Scaling Language Modeling With Pathways”, et al 2022
- “In-Context Learning and Induction Heads”, et al 2022
- “Predictability and Surprise in Large Generative Models”, et al 2022
- “The Effects of Reward Misspecification: Mapping and Mitigating Misaligned Models”, et al 2022
- “A Mathematical Framework for Transformer Circuits”, et al 2021
- “Scaling Language Models: Methods, Analysis & Insights from Training Gopher”, et al 2021
- “A General Language Assistant As a Laboratory for Alignment”, et al 2021
- “Mapping Language Models to Grounded Conceptual Spaces”, 2021
- “Program Synthesis With Large Language Models”, et al 2021
- “MMLU: Measuring Massive Multitask Language Understanding”, et al 2020
- “GPT-3: Language Models Are Few-Shot Learners”, et al 2020
- “Emergence in Cognitive Science”, 2010
- “Observed Universality of Phase Transitions in High-Dimensional Geometry, With Implications for Modern Data Analysis and Signal Processing”, 2009
- “The Phase Transition In Human Cognition § Phase Transitions in Language Processing”, et al 2009 (page 13)
- “A Dynamic Systems Model of Cognitive and Language Growth”, 1991