‘information theory’ tag
- See Also
- Gwern
- Links
- Sort By Magic
- Wikipedia
- Miscellaneous
- Bibliography
See Also
Gwern
“How Complex Are Individual Differences?”, Gwern 2010
“Death Note: L, Anonymity & Eluding Entropy”, Gwern 2011
Links
“Channel Capacity of a Telegraph”
https://www.johndcook.com/blog/2024/10/19/channel-capacity-of-a-telegraph/
“The Unbearable Slowness of Being”, Zheng & Meister 2024
“An Information-Theoretic Analysis of In-Context Learning”, Jeon et al 2024
“Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities”, Kirsch & Gal 2023
“Market Microstructure and Informational Efficiency: The Role of Intermediation”, Guthmann & Albrecht 2023
“Electrochemical Potential Enables Dormant Spores to Integrate Environmental Signals”, Kikuchi et al 2022
“What Do We Maximize in Self-Supervised Learning?”, Shwartz-Ziv et al 2022
“Macaques Preferentially Attend to Intermediately Surprising Information”, Wu et al 2022
“The Cost of Information Acquisition by Natural Selection”, McGee et al 2022
“Efficiently Irrational: Deciphering the Riddle of Human Choice”, Glimcher 2022
“First Contact: Unsupervised Human-Machine Co-Adaptation via Mutual Information Maximization”, Reddy et al 2022
“The InterModel Vigorish (IMV): A Flexible and Portable Approach for Quantifying Predictive Accuracy With Binary Outcomes”, Domingue et al 2022
“Intelligence and Unambitiousness Using Algorithmic Information Theory”, Cohen et al 2021
“Entropy Trade-Offs in Artistic Design: A Case Study of Tamil kolam”, Tran et al 2021
“Computation in the Human Cerebral Cortex Uses Less Than 0.2 Watts yet This Great Expense Is Optimal When Considering Communication Costs”, Levy & Calvert 2020
“On the Measure of Intelligence”, Chollet 2019
“Neural Networks Are a Priori Biased towards Boolean Functions With Low Entropy”, Mingard et al 2019
“A Unified Bellman Optimality Principle Combining Reward Maximization and Empowerment”, Leibfried et al 2019
“Common Neural Code for Reward and Information Value”, Kobayashi & Hsu 2019
“Humans Store about 1.5 Megabytes of Information during Language Acquisition”, Mollica & Piantadosi 2019
“Predictability and Uncertainty in the Pleasure of Music: A Reward for Learning?”, Gold et al 2019
“Accounting Theory As a Bayesian Discipline”, Johnstone 2018
“Measurement Invariance Explains the Universal Law of Generalization for Psychological Perception”, Frank 2018
“Information Flow Reveals Prediction Limits in Online Social Activity”, Bagrow et al 2017
“Predicting Green: Really Radical (plant) Predictive Processing”, Calvo & Friston 2017
“Exploration and Exploitation of Victorian Science in Darwin’s Reading Notebooks”, Murdock et al 2016
“Multiplicative LSTM for Sequence Modeling”, Krause et al 2016
“Capacity-Approaching DNA Storage”, Erlich & Zielinski 2016
“Energy-Efficient Algorithms”, Demaine et al 2016
Advances in Physarum Machines: Sensing and Computing With Slime Mould, Adamatzky 2016
“Ed Fredkin and the Physics of Information: An Inside Story of an Outsider Scientist”, Hagar 2016
“Nanoconnectomic Upper Bound on the Variability of Synaptic Plasticity”, Bartol et al 2015
“Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Soler-Toscano et al 2014
“The Physics of Bacterial Decision Making”, Ben-Jacob et al 2014
“Understanding Predictive Information Criteria for Bayesian Models”, Gelman 2013
“A Widely Applicable Bayesian Information Criterion”, Watanabe 2012
“A Cross-Language Perspective On Speech Information Rate”, Pellegrino et al 2011
“Information Geometry and Evolutionary Game Theory”, Harper 2009
“The Epic Story of Maximum Likelihood”, Stigler 2007
“A Methodology for Studying Various Interpretations of the N,N-Dimethyltryptamine-Induced Alternate Reality”, Rodriguez 2006
“What Color Are Your Bits?”, Skala 2004
“Ultimate Physical Limits to Computation”, Lloyd 1999
“The Physical Limits of Communication”, Lachmann et al 1999
“Analog Versus Digital: Extrapolating from Electronics to Neurobiology”, Sarpeshkar 1998
“Quantum Effects in Algorithms”, Jozsa 1998
“Information Theory and an Extension of the Maximum Likelihood Principle”, Akaike 1998
“Information and the Accuracy Attainable in the Estimation of Statistical Parameters”, Rao 1992
“The Total Evidence Theorem for Probability Kinematics”, Graves 1989
“Profile of Claude Shannon”, Liversidge & Shannon 1987
“How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-Term Memory”, Landauer 1986
“The Fundamental Physical Limits of Computation”, Bennett & Landauer 1985
“Randomness Conservation Inequalities; Information and Independence in Mathematical Theories”, Levin 1984
“A Convergent Gambling Estimate Of The Entropy Of English”, Cover & King 1978
“Quantum Effects in Communications Systems”, Gordon 1962
“Coding Theorems for a Discrete Source With a Fidelity Criterion”, Shannon 1959
“Chance Remarks”, Pierce 1949
“Chance Remarks § Shannon’s n-Gram Generations”, Pierce 1949 (page 4)
“Gyrophone: Recognizing Speech From Gyroscope Signals”, Michalevsky et al 2014
“How Many Persons Can There Be: Brain Reconstruction and Big Numbers”
“Harder Drive: Hard Drives We Didn’t Want or Need”, tom7 2022
Sort By Magic
Annotations sorted by machine learning into inferred 'tags'. This provides an alternative way to browse: instead of by date order, one can browse in topic order. The 'sorted' list has been automatically clustered into multiple sections & auto-labeled for easier browsing.
Beginning with the newest annotation, the sorter uses each annotation’s embedding to find its nearest-neighbor annotations, creating a progression of topics (a code sketch of the idea follows the tag list below).
entropy-bias
quantum-communication
memory-estimation
information-theory
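As a concrete illustration of that nearest-neighbor sort, here is a minimal Python sketch. It assumes each annotation already has an embedding vector from some text-embedding model; the function names, the use of k-means, and the cluster count are illustrative assumptions, not the site’s actual pipeline.

```python
# A minimal sketch of embedding-based "sort by magic": greedily chain items by
# cosine similarity starting from the newest, then cluster them into sections.
# All names here are hypothetical; labeling the clusters would be a separate step.
import numpy as np
from sklearn.cluster import KMeans

def sort_by_similarity(embeddings: np.ndarray) -> list[int]:
    """Greedy nearest-neighbor ordering, starting from item 0 (the newest annotation)."""
    # Normalize rows so dot products are cosine similarities.
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    order, remaining = [0], set(range(1, len(unit)))
    while remaining:
        last = unit[order[-1]]
        nearest = max(remaining, key=lambda i: float(unit[i] @ last))  # most similar unvisited item
        order.append(nearest)
        remaining.remove(nearest)
    return order

def cluster_sections(embeddings: np.ndarray, k: int = 4) -> np.ndarray:
    """Assign each annotation to one of k auto-generated sections."""
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embeddings)

# Demo with random stand-in embeddings (real ones would come from a text-embedding model):
emb = np.random.default_rng(0).normal(size=(30, 64))
print(sort_by_similarity(emb)[:10])  # topic progression over the first 10 items
print(cluster_sections(emb))         # section assignment per annotation
```

The greedy chaining step produces the smooth topic progression described above, while the clustering step yields section assignments analogous to the four auto-generated labels listed here.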
Wikipedia
Miscellaneous
Bibliography
- https://openreview.net/forum?id=UVDAKQANOW: “Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities”, Kirsch & Gal 2023
- 2022-glimcher.pdf: “Efficiently Irrational: Deciphering the Riddle of Human Choice”, Glimcher 2022
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4014489/: “Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Soler-Toscano et al 2014
- 2007-rodriguez.pdf: “A Methodology for Studying Various Interpretations of the N,N-Dimethyltryptamine-Induced Alternate Reality”, Rodriguez 2006
- 1986-landauer.pdf: “How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-Term Memory”, Landauer 1986