See Also
Gwern
“How Complex Are Individual Differences?”, Gwern 2010
“Death Note: L, Anonymity & Eluding Entropy”, Gwern 2011
Links
“Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities”, Kirsch & Gal 2023
“Market Microstructure and Informational Efficiency: The Role of Intermediation”, Guthmann & Albrecht 2023
“Electrochemical Potential Enables Dormant Spores to Integrate Environmental Signals”, Kikuchi et al 2022
“What Do We Maximize in Self-Supervised Learning?”, Shwartz-Ziv et al 2022
“Macaques Preferentially Attend to Intermediately Surprising Information”, Wu et al 2022
“The Cost of Information Acquisition by Natural Selection”, McGee et al 2022
“Efficiently Irrational: Deciphering the Riddle of Human Choice”, Glimcher 2022
“First Contact: Unsupervised Human-Machine Co-Adaptation via Mutual Information Maximization”, Reddy et al 2022
“The Inter-Model Vigorish (IMV): A Flexible and Portable Approach for Quantifying Predictive Accuracy With Binary Outcomes”, Domingue et al 2022
“Intelligence and Unambitiousness Using Algorithmic Information Theory”, Cohen et al 2021
“Entropy Trade-Offs in Artistic Design: A Case Study of Tamil kolam”, Tran et al 2021
“Computation in the Human Cerebral Cortex Uses Less Than 0.2 Watts yet This Great Expense Is Optimal When Considering Communication Costs”, Levy & Calvert 2020
“On the Measure of Intelligence”, Chollet 2019
“Neural Networks Are a Priori Biased towards Boolean Functions With Low Entropy”, Mingard et al 2019
“A Unified Bellman Optimality Principle Combining Reward Maximization and Empowerment”, Leibfried et al 2019
“Common Neural Code for Reward and Information Value”, Kobayashi & Hsu 2019
“Humans Store about 1.5 Megabytes of Information during Language Acquisition”, Mollica & Piantadosi 2019
“Predictability and Uncertainty in the Pleasure of Music: A Reward for Learning?”, Gold et al 2019
“Accounting Theory As a Bayesian Discipline”, Johnstone 2018
“Measurement Invariance Explains the Universal Law of Generalization for Psychological Perception”, Frank 2018
“Information Flow Reveals Prediction Limits in Online Social Activity”, Bagrow et al 2017
“Predicting Green: Really Radical (plant) Predictive Processing”, Calvo & Friston 2017
“Exploration and Exploitation of Victorian Science in Darwin’s Reading Notebooks”, Murdock et al 2016
“Multiplicative LSTM for Sequence Modelling”, Krause et al 2016
“Capacity-Approaching DNA Storage”, Erlich & Zielinski 2016
“Energy-Efficient Algorithms”, Demaine et al 2016
“Advances in Physarum Machines: Sensing and Computing With Slime Mould”, Adamatzky 2016
“Ed Fredkin and the Physics of Information: An Inside Story of an Outsider Scientist”, Hagar 2016
“Nanoconnectomic Upper Bound on the Variability of Synaptic Plasticity”, Bartol et al 2015
“Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Soler-Toscano et al 2014
“The Physics of Bacterial Decision Making”, Ben-Jacob et al 2014
“Understanding Predictive Information Criteria for Bayesian Models”, Gelman 2013
“A Widely Applicable Bayesian Information Criterion”, Watanabe 2012
“A Cross-Language Perspective On Speech Information Rate”, Pellegrino et al 2011
“Information Geometry and Evolutionary Game Theory”, Harper 2009
“The Epic Story of Maximum Likelihood”, Stigler 2007
“A Methodology for Studying Various Interpretations of the N,N-Dimethyltryptamine-Induced Alternate Reality”, Rodriguez 2006
“What Color Are Your Bits?”, Skala 2004
“Ultimate Physical Limits to Computation”, Lloyd 1999
“The Physical Limits of Communication”, Lachmann et al 1999
“Analog Versus Digital: Extrapolating from Electronics to Neurobiology”, Sarpeshkar 1998
“Quantum Effects in Algorithms”, Jozsa 1998
“Information Theory and an Extension of the Maximum Likelihood Principle”, Akaike 1998
“Information and the Accuracy Attainable in the Estimation of Statistical Parameters”, Rao 1992
“The Total Evidence Theorem for Probability Kinematics”, Graves 1989
“How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-Term Memory”, Landauer 1986
“Randomness Conservation Inequalities; Information and Independence in Mathematical Theories”, Levin 1984
“A Convergent Gambling Estimate Of The Entropy Of English”, Cover & King 1978
“Coding Theorems for a Discrete Source With a Fidelity Criterion”, Shannon 1959
“How Many Persons Can There Be: Brain Reconstruction and Big Numbers”
Sort By Magic
Annotations sorted by machine learning into inferred ‘tags’. This provides an alternative way to browse: instead of by date order, one can browse in topic order. The ‘sorted’ list has been automatically clustered into multiple sections & auto-labeled for easier browsing.
Beginning with the newest annotation, it uses the embedding of each annotation to attempt to create a list of nearest-neighbor annotations, creating a progression of topics.
predictive-modeling
knowledge-acquisition
information-theory
Wikipedia
Miscellaneous
Link Bibliography

“Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities”, Andreas Kirsch, Yarin Gal (https://openreview.net/forum?id=UVDAKQANOW)
“Efficiently Irrational: Deciphering the Riddle of Human Choice”, Paul W. Glimcher (2022glimcher.pdf)
“Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Fernando Soler-Toscano, Hector Zenil, Jean-Paul Delahaye, Nicolas Gauvrit (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4014489/)
“A Methodology for Studying Various Interpretations of the N,N-Dimethyltryptamine-Induced Alternate Reality”, Marko A. Rodriguez (2007rodriguez.pdf)
“How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-Term Memory”, Thomas K. Landauer (1986landauer.pdf)