Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities
Market Microstructure and Informational Efficiency: The Role of Intermediation
Electrochemical potential enables dormant spores to integrate environmental signals
Macaques preferentially attend to intermediately surprising information
Efficiently irrational: deciphering the riddle of human choice
First Contact: Unsupervised Human-Machine Co-Adaptation via Mutual Information Maximization
The InterModel Vigorish (IMV): A flexible and portable approach for quantifying predictive accuracy with binary outcomes
Intelligence and Unambitiousness Using Algorithmic Information Theory
Entropy trade-offs in artistic design: A case study of Tamil kolam
Computation in the human cerebral cortex uses less than 0.2 watts yet this great expense is optimal when considering communication costs
Neural networks are a priori biased towards Boolean functions with low entropy
A Unified Bellman Optimality Principle Combining Reward Maximization and Empowerment
Humans store about 1.5 megabytes of information during language acquisition
Predictability and Uncertainty in the Pleasure of Music: A Reward for Learning?
Measurement invariance explains the universal law of generalization for psychological perception
Information flow reveals prediction limits in online social activity
Predicting green: really radical (plant) predictive processing
Exploration and exploitation of Victorian science in Darwin’s reading notebooks
Advances in Physarum Machines: Sensing and Computing with Slime Mould
Ed Fredkin and the Physics of Information: An Inside Story of an Outsider Scientist
Nanoconnectomic upper bound on the variability of synaptic plasticity
Calculating Kolmogorov complexity from the output frequency distributions of small Turing machines
Understanding Predictive Information Criteria for Bayesian Models
A Methodology for Studying Various Interpretations of the N,N-dimethyltryptamine-Induced Alternate Reality
Analog Versus Digital: Extrapolating from Electronics to Neurobiology
Information Theory and an Extension of the Maximum Likelihood Principle
Information and the Accuracy Attainable in the Estimation of Statistical Parameters
How much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-Term Memory
Randomness conservation inequalities; information and independence in mathematical theories
Coding Theorems for a Discrete Source With a Fidelity Criterion
How Many Persons Can There Be: Brain Reconstruction and Big Numbers
https://bishopfox.com/blog/unredacter-tool-never-pixelation
https://physics.stackexchange.com/questions/816698/how-many-photons-are-received-per-bit-transmitted-from-voyager-1
https://pure.mpg.de/rest/items/item_2383162_7/component/file_2456978/content
https://www.quantamagazine.org/researchers-defeat-randomness-to-create-ideal-code-20211124/