- See Also
- Links
- “Gzip versus Bag-of-words for Text Classification With k-NN”, Opitz 2023
- “U-Net CNN in APL: Exploring Zero-Framework, Zero-Library Machine Learning”, Hsu & Serrão 2023
- “Blockwise Parallel Transformer for Long Context Large Models”, Liu & Abbeel 2023
- “You And Your Research”, Hamming 2023
- “DIRAC: Neural Image Compression With a Diffusion-Based Decoder”, Goose et al 2023
- “Less Is More: Parameter-Free Text Classification With Gzip”, Jiang et al 2022
- “Monolith: Real Time Recommendation System With Collisionless Embedding Table”, Liu et al 2022
- “A Library for Representing Python Programs As Graphs for Machine Learning”, Bieber et al 2022
- “TextWorldExpress: Simulating Text Games at One Million Steps Per Second”, Jansen & Côté 2022
- “Learning With Combinatorial Optimization Layers: a Probabilistic Approach”, Dalle et al 2022
- “Overwatch: Learning Patterns in Code Edit Sequences”, Zhang et al 2022
- “Heisenbugs: The Most Elusive Kind of Bug, and How to Capture Them With Perfect Replayability—Eliminate Heisenbugs and Endless Debugging Sessions!”, Ovadia 2022
- “Progress in Mathematical Programming Solvers 2001–2020”, Koch et al 2022
- “DiffC: Lossy Compression With Gaussian Diffusion”, Theis et al 2022
- “Searching for Cyclic TV Reference Paradoxes”, Pinheiro 2022
- “Fast Text Placement Scheme for ASCII Art Synthesis”, Chung & Kwon 2022
- “Monarch: Expressive Structured Matrices for Efficient and Accurate Training”, Dao et al 2022
- “Maximum Flow and Minimum-Cost Flow in Almost-Linear Time”, Chen et al 2022
- “What Goes into Making an OS to Be Unix Compliant Certified?”, Lambert 2022
- “Silent Bugs in Deep Learning Frameworks: An Empirical Study of Keras and TensorFlow”, Tambon et al 2021
- “Improving Real-time Rendering of Dynamic Digital Characters in Cycles”, Dietrich 2021
- “Real Time Cluster Path Tracing”, Xie et al 2021
- “Learning a Large Neighborhood Search Algorithm for Mixed Integer Programs”, Sonnerat et al 2021
- “Real-time Neural Radiance Caching for Path Tracing”, Müller et al 2021
- “Randomness In Neural Network Training: Characterizing The Impact of Tooling”, Zhuang et al 2021
- “How Developers Choose Names”, Feitelson et al 2021
- “Pretrained Transformers As Universal Computation Engines”, Lu et al 2021
- “Entropy Trade-offs in Artistic Design: A Case Study of Tamil kolam”, Tran et al 2021
- “Investment vs. Reward in a Competitive Knapsack Problem”, Neumann & Gros 2021
- “MLGO: a Machine Learning Guided Compiler Optimizations Framework”, Trofin et al 2021
- “NNUE: The Neural Network of the Stockfish Chess Engine”, Goucher 2021
- “I Know What You Bought At Chipotle for $9.81 by Solving A Linear Inverse Problem”, Fleder & Shah 2020
- “Presyn: Modeling Black-Box Components With Probabilistic Synthesis”, Collie et al 2020
- “Why Johnny Won’t Upgrade”, Mattheij 2020
- “Optimal Peanut Butter and Banana Sandwiches”, Rosenthal 2020
- “Measuring Hardware Overhang”, hippke 2020
- “A Time Leap Challenge for SAT Solving”, Fichte et al 2020
- “A Bayesian Approach to the Simulation Argument”, Kipping 2020
- “Algorithms With Predictions”, Mitzenmacher & Vassilvitskii 2020
- “BanditPAM: Almost Linear Time k-Medoids Clustering via Multi-Armed Bandits”, Tiwari et al 2020
- “Lessons Learned from Bugs in Models of Human History”, Ragsdale et al 2020
- “Funnel-Transformer: Filtering out Sequential Redundancy for Efficient Language Processing”, Dai et al 2020
- “The Scaling Hypothesis”, Gwern 2020
- “Measuring the Algorithmic Efficiency of Neural Networks”, Hernandez & Brown 2020
- “Computation in the Human Cerebral Cortex Uses Less Than 0.2 Watts yet This Great Expense Is Optimal When considering Communication Costs”, Levy & Calvert 2020
- “Bringing GNU Emacs to Native Code”, Corallo et al 2020
- “Learning-based Memory Allocation for C++ Server Workloads”, Maas et al 2020
- “The History of the URL”, Bloom 2020
- “Zip Files: History, Explanation and Implementation”, Wennborg 2020
- “Quantifying Independently Reproducible Machine Learning”, Raff 2020
- “Taxonomy of Real Faults in Deep Learning Systems”, Humbatova et al 2019
- “They Might Never Tell You It’s Broken”, Chevalier-Boisvert 2019
- “Hyrum’s Law: An Observation on Software Engineering”, Wright 2019
- “Co-dfns: A Data Parallel Compiler Hosted on the GPU”, Hsu 2019c
- “Grandmaster Level in StarCraft II Using Multi-agent Reinforcement Learning”, Vinyals et al 2019
- “Local-First Software: You Own Your Data, in spite of the Cloud [paper]”, Kleppmann et al 2019
- “Neural Networks Are a Priori Biased towards Boolean Functions With Low Entropy”, Mingard et al 2019
- “A Step Toward Quantifying Independently Reproducible Machine Learning Research”, Raff 2019
- “Different Languages, Similar Encoding Efficiency: Comparable Information Rates across the Human Communicative Niche”, Coupé et al 2019
- “A View on Deep Reinforcement Learning in System Optimization”, Haj-Ali et al 2019
- “Moral Permissibility of Action Plans”, Lindner et al 2019
- “ParPaRaw: Massively Parallel Parsing of Delimiter-Separated Raw Data”, Stehle & Jacobsen 2019
- “Real-world Dynamic Programming: Seam Carving”, Das 2019
- “Unraveling the JPEG: JPEG Images Are Everywhere in Our Digital Lives, but behind the Veil of Familiarity Lie Algorithms That Remove Details That Are Imperceptible to the Human Eye. This Produces the Highest Visual Quality With the Smallest File Size—but What Does That Look Like? Let’s See What Our Eyes Can’t See!”, Shehata 2019
- “Local-first Software: You Own Your Data, in spite of the Cloud [web]”, Kleppmann et al 2019
- “GAP: Generalizable Approximate Graph Partitioning Framework”, Nazi et al 2019
- “Parsing Gigabytes of JSON per Second”, Langdale & Lemire 2019
- “AutoPhase: Compiler Phase-Ordering for High Level Synthesis With Deep Reinforcement Learning”, Haj-Ali et al 2019
- “Humans Store about 1.5 Megabytes of Information during Language Acquisition”, Mollica & Piantadosi 2019
- “Test Driving 'Power of Two Random Choices' Load Balancing”, Tarreau 2019
- “Meta-Learning Neural Bloom Filters”, Rae et al 2019
- “Reinventing the Wheel: Discovering the Optimal Rolling Shape With PyTorch”, Wiener 2019
- “Slow Software”, McGranaghan 2018
- “Learning to Perform Local Rewriting for Combinatorial Optimization”, Chen & Tian 2018
- “How to Shuffle a Big Dataset”, Hardin 2018
- “Deterministic Implementations for Reproducibility in Deep Reinforcement Learning”, Nagarajan et al 2018
- “Learning to Optimize Join Queries With Deep Reinforcement Learning”, Krishnan et al 2018
- “Umineko: The Opium Of The Magics”, Gwern 2018
- “Learning to Optimize Tensor Programs”, Chen et al 2018
- “Optimizing Query Evaluations Using Reinforcement Learning for Web Search”, Rosset et al 2018
- “Learning Memory Access Patterns”, Hashemi et al 2018
- “Tensor Comprehensions: Framework-Agnostic High-Performance Machine Learning Abstractions”, Vasilache et al 2018
- “Innovation and Cumulative Culture through Tweaks and Leaps in Online Programming Contests”, Miu et al 2018
- “The Case for Learned Index Structures”, Kraska et al 2017
- “Automatic Differentiation in PyTorch”, Paszke et al 2017
- “From Punched Cards to Flat Screens: A Technical Autobiography”, Hazel 2017
- “DAG Reduction: Fast Answering Reachability Queries”, Zhou et al 2017
- “Stochastic Constraint Programming As Reinforcement Learning”, Prestwich et al 2017
- “Learning to Superoptimize Programs”, Bunel et al 2017
- “Web Bloat”, Luu 2017
- “Resource-Efficient Machine Learning in 2 KB RAM for the Internet of Things”, Kumar et al 2017
- “P≟NP § AI”, Aaronson 2017 (page 5)
- “Machine Learning for Systems and Systems for Machine Learning”, Dean 2017
- “Full Resolution Image Compression With Recurrent Neural Networks”, Toderici et al 2016
- “Energy-Efficient Algorithms”, Demaine et al 2016
- “A Discrete and Bounded Envy-Free Cake Cutting Protocol for Any Number of Agents”, Aziz & Mackenzie 2016
- “Why WhatsApp Only Needs 50 Engineers for Its 900M Users: One of the (many) Intriguing Parts of the WhatsApp Story Is That It Has Achieved Such Enormous Scale With Such a Tiny Team”, Metz 2015
- “Scalability! But at What COST?”, McSherry et al 2015
- “Inferring Algorithmic Patterns With Stack-Augmented Recurrent Nets”, Joulin & Mikolov 2015
- “The Ph.D. Grind: A Ph.D. Student Memoir”, Guo 2015
- “The Mystery Machine: End-to-end Performance Analysis of Large-scale Internet Services”, Chow et al 2014
- “How Inefficient Can a Sort Algorithm Be?”, Lerma 2014
- “Core-Guided MaxSAT With Soft Cardinality Constraints”, Morgado et al 2014
- “Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Soler-Toscano et al 2014
- “Algorithmic Progress in Six Domains”, Grace 2013
- “Intelligence Explosion Microeconomics”, Yudkowsky 2013
- “Bounded Kolmogorov Complexity Based on Cognitive Models”, Strannegård et al 2013
- “The Algebraic Combinatorial Approach for Low-Rank Matrix Completion”, Király et al 2012
- “Evaluating the Design of the R Language: Objects and Functions for Data Analysis”, Morandat et al 2012
- “The International SAT Solver Competitions”, Järvisalo et al 2012
- “Uniform Random Generation of Large Acyclic Digraphs”, Kuipers & Moffa 2012
- “National Cryptologic Museum Opens New Exhibit on Dr. John Nash”, NSA 2012
- “A Brief History of NP-Completeness, 1954–2012”, Johnson 2012
- “STEPS Toward Expressive Programming Systems: "A Science Experiment"”, Ohshima et al 2012 (page 2)
- “Cutting the Pipe: Achieving Sub-Second Iteration Times”, Frykholm 2012
- “A Cross-Language Perspective On Speech Information Rate”, Pellegrino et al 2011
- “Why Philosophers Should Care About Computational Complexity”, Aaronson 2011
- “Notes on a New Philosophy of Empirical Science”, Burfoot 2011
- “New Strategy of Lossy Text Compression”, Al-Dubaee & Ahmad 2010
- “You’re Doing It Wrong: Think You’ve Mastered the Art of Server Performance? Think Again.”, Kamp 2010
- “Formal Theory of Creativity & Fun & Intrinsic Motivation (1990–2010)”, Schmidhuber 2010
- “Coolex: The Coolest Way to Generate Combinations”, Ruskey & Williams 2009
- “De-anonymizing Social Networks”, Narayanan & Shmatikov 2009
- “The Gödel Letter”, Gödel 2009
- “Producing Wrong Data Without Doing Anything Obviously Wrong!”, Mytkowicz et al 2009
- “Is There A Fourth Futamura Projection?”, Glück 2009
- “Dual-Pivot Quicksort Algorithm”, Yaroslavskiy 2009
- “Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes”, Schmidhuber 2008
- “Aggregating Inconsistent Information: Ranking and Clustering”, Ailon et al 2008
- “Interview With Donald Knuth”, Binstock 2008
- “Optimal Boarding Method for Airline Passengers”, Steffen 2008
- “Harnessing Vision for Computation”, Changizi 2008
- “Communication in Economic Mechanisms”, Segal 2006b
- “How To Break Anonymity of the Netflix Prize Dataset”, Narayanan & Shmatikov 2006
- “Oral History of Butler Lampson § WWW”, Lampson & Kay 2006 (page 36)
- “History of Combinatorial Generation (The Art of Computer Programming: Volume 4: Pre-Fascicle 4B: §7.2.1.7) § Pg22”, Knuth 2005 (page 22)
- “NP-complete Problems and Physical Reality”, Aaronson 2005
- “Lower-Stretch Spanning Trees”, Elkin et al 2004
- “Clustering by Compression”, Cilibrasi & Vitanyi 2003
- “A Short History of Computational Complexity”, Fortnow et al 2003
- “Least Effort and the Origins of Scaling in Human Language”, Cancho & Sole 2003
- “Extended Comment on Language Trees and Zipping”, Goodman 2002
- “Solving Real-World Linear Programs: A Decade and More of Progress”, Bixby 2002
- “A Bit-Vector Algorithm for Computing Levenshtein and Damerau Edit Distances”, Hyyrö 2002
- “Estimating and Comparing Entropy across Written Natural Languages Using PPM Compression”, Behr et al 2002
- “Naked Objects: a Technique for Designing More Expressive Systems”, Pawson & Matthews 2001
- “Language Trees and Zipping”, Benedetto et al 2001
- “On Proebsting's Law”, Scott 2001
- “Peopleware: Why Measure Performance”, DeMarco & Lister 2001
- “Fast Text Compression With Neural Networks”, Mahoney 2000
- “The Effects of Moore’s Law and Slacking on Large Computations”, Gottbrath et al 1999
- “The Physical Limits of Communication”, Lachmann et al 1999
- “Quantum Effects in Algorithms”, Jozsa 1998
- “Applications of Randomness in System Performance Measurement”, Blackwell 1998
- “Proebsting’s Law: Compiler Advances Double Computing Power Every 18 Years”, Proebsting 1998
- “The Entropy of English Using PPM-Based Models”, Data Compression Conference 1996 (DCC '96 Proceedings)
- “Entropy of Natural Languages: Theory and Experiment”, Levitin & Reingold 1994
- “Building a Large Annotated Corpus of English: The Penn Treebank”, Marcus et al 1993
- “On the Computational Complexity of the Jones and Tutte Polynomials”, Jaeger et al 1990
- “Planning and Learning in Permutation Groups”, Mose et al 1989
- “Separating Strings With Small Automata”, Robson 1989
- “Optimal Nonlinear Approximation”, DeVore et al 1989
- “Hypertext and the Oxford English Dictionary”, Raymond & Tompa 1988
- “Three Scientists and Their Gods: Looking for Meaning in an Age of Information”, Wright 1988
- “Average Case Complete Problems”, Levin 1986
- “Probabilistic Counting Algorithms for Data Base Applications”, Flajolet & Martin 1985
- “Randomness Conservation Inequalities; Information and Independence in Mathematical Theories”, Levin 1984
- “The Competitive Allocation Process Is Informationally Efficient Uniquely”, Jordan 1982
- “Epigrams on Programming”, Perlis 1982
- “Procedural Reflection in Programming Languages”, Smith 1982
- “On Holy Wars and a Plea for Peace”, Cohen 1981
- “Mutation Analysis Of Program Test Data”, Budd 1980
- “A Correct Preprocessing Algorithm for Boyer-Moore String-Searching”, Rytter 1980
- “A Convergent Gambling Estimate Of The Entropy Of English”, Cover & King 1978
- “Algorithms for Loop Matchings”, Nussinov et al 1978
- “Fast Pattern Matching in Strings”, Knuth et al 1977
- “Structured Programming With go to Statements”, Knuth 1974
- “Interstellar Communication: Scientific Perspectives”, Ponnamperuma & Cameron 1974
- “A Parallel Algorithm for the Efficient Solution of a General Class of Recurrence Equations”, Kogge & Stone 1973
- “The Dangers of Computer-Science Theory”, Knuth 1973
- “Universal Sequential Search Problems”, Levin 1973
- “The Humble Programmer [EWD340]”, Dijkstra 1972
- “A Simple Randomization Procedure”, Sandelius 1962
- “Generation of Random Permutations of Given Number of Elements Using Random Sampling Numbers”, Rao 1961
- “Possible Principles Underlying the Transformations of Sensory Messages”, Barlow 1961
- “Coding Theorems for a Discrete Source With a Fidelity Criterion”, Shannon 1959
- “The Codeless Code: Case 96: ‘Stateless’”
- “Submission #6347: Chef Stef’s NES Arkanoid 'warpless' in 11:11.18”
- “TSP Art”
- Sort By Magic
- Wikipedia
- Miscellaneous
- Link Bibliography
See Also
Links
“Gzip versus Bag-of-words for Text Classification With k-NN”, Opitz 2023
“U-Net CNN in APL: Exploring Zero-Framework, Zero-Library Machine Learning”, Hsu & Serrão 2023
“Blockwise Parallel Transformer for Long Context Large Models”, Liu & Abbeel 2023
“You And Your Research”, Hamming 2023
“DIRAC: Neural Image Compression With a Diffusion-Based Decoder”, Goose et al 2023
“Less Is More: Parameter-Free Text Classification With Gzip”, Jiang et al 2022
“Monolith: Real Time Recommendation System With Collisionless Embedding Table”, Liu et al 2022
“A Library for Representing Python Programs As Graphs for Machine Learning”, Bieber et al 2022
“TextWorldExpress: Simulating Text Games at One Million Steps Per Second”, Jansen & Côté 2022
“Learning With Combinatorial Optimization Layers: a Probabilistic Approach”, Dalle et al 2022
“Overwatch: Learning Patterns in Code Edit Sequences”, Zhang et al 2022
“Heisenbugs: The Most Elusive Kind of Bug, and How to Capture Them With Perfect Replayability—Eliminate Heisenbugs and Endless Debugging Sessions!”, Ovadia 2022
“Progress in Mathematical Programming Solvers 2001–2020”, Koch et al 2022
“DiffC: Lossy Compression With Gaussian Diffusion”, Theis et al 2022
“Searching for Cyclic TV Reference Paradoxes”, Pinheiro 2022
“Fast Text Placement Scheme for ASCII Art Synthesis”, Chung & Kwon 2022
“Monarch: Expressive Structured Matrices for Efficient and Accurate Training”, Dao et al 2022
“Maximum Flow and Minimum-Cost Flow in Almost-Linear Time”, Chen et al 2022
“What Goes into Making an OS to Be Unix Compliant Certified?”, Lambert 2022
“Silent Bugs in Deep Learning Frameworks: An Empirical Study of Keras and TensorFlow”, Tambon et al 2021
“Improving Real-time Rendering of Dynamic Digital Characters in Cycles”, Dietrich 2021
“Real Time Cluster Path Tracing”, Xie et al 2021
“Learning a Large Neighborhood Search Algorithm for Mixed Integer Programs”, Sonnerat et al 2021
“Real-time Neural Radiance Caching for Path Tracing”, Müller et al 2021
“Randomness In Neural Network Training: Characterizing The Impact of Tooling”, Zhuang et al 2021
“How Developers Choose Names”, Feitelson et al 2021
“Pretrained Transformers As Universal Computation Engines”, Lu et al 2021
“Entropy Trade-offs in Artistic Design: A Case Study of Tamil kolam”, Tran et al 2021
“Investment vs. Reward in a Competitive Knapsack Problem”, Neumann & Gros 2021
“MLGO: a Machine Learning Guided Compiler Optimizations Framework”, Trofin et al 2021
“NNUE: The Neural Network of the Stockfish Chess Engine”, Goucher 2021
“I Know What You Bought At Chipotle for $9.81 by Solving A Linear Inverse Problem”, Fleder & Shah 2020
“Presyn: Modeling Black-Box Components With Probabilistic Synthesis”, Collie et al 2020
“Why Johnny Won’t Upgrade”, Mattheij 2020
“Optimal Peanut Butter and Banana Sandwiches”, Rosenthal 2020
“Measuring Hardware Overhang”, hippke 2020
“A Time Leap Challenge for SAT Solving”, Fichte et al 2020
“A Bayesian Approach to the Simulation Argument”, Kipping 2020
“Algorithms With Predictions”, Mitzenmacher & Vassilvitskii 2020
“BanditPAM: Almost Linear Time k-Medoids Clustering via Multi-Armed Bandits”, Tiwari et al 2020
“Lessons Learned from Bugs in Models of Human History”, Ragsdale et al 2020
“Funnel-Transformer: Filtering out Sequential Redundancy for Efficient Language Processing”, Dai et al 2020
“The Scaling Hypothesis”, Gwern 2020
“Measuring the Algorithmic Efficiency of Neural Networks”, Hernandez & Brown 2020
“Computation in the Human Cerebral Cortex Uses Less Than 0.2 Watts yet This Great Expense Is Optimal When considering Communication Costs”, Levy & Calvert 2020
“Bringing GNU Emacs to Native Code”, Corallo et al 2020
“Learning-based Memory Allocation for C++ Server Workloads”, Maas et al 2020
“The History of the URL”, Bloom 2020
“Zip Files: History, Explanation and Implementation”, Wennborg 2020
“Quantifying Independently Reproducible Machine Learning”, Raff 2020
“Taxonomy of Real Faults in Deep Learning Systems”, Humbatova et al 2019
“They Might Never Tell You It’s Broken”, Chevalier-Boisvert 2019
“Hyrum’s Law: An Observation on Software Engineering”, Wright 2019
“Co-dfns: A Data Parallel Compiler Hosted on the GPU”, Hsu 2019c
“Grandmaster Level in StarCraft II Using Multi-agent Reinforcement Learning”, Vinyals et al 2019
“Local-First Software: You Own Your Data, in spite of the Cloud [paper]”, Kleppmann et al 2019
“Neural Networks Are a Priori Biased towards Boolean Functions With Low Entropy”, Mingard et al 2019
“A Step Toward Quantifying Independently Reproducible Machine Learning Research”, Raff 2019
“Different Languages, Similar Encoding Efficiency: Comparable Information Rates across the Human Communicative Niche”, Coupé et al 2019
“A View on Deep Reinforcement Learning in System Optimization”, Haj-Ali et al 2019
“Moral Permissibility of Action Plans”, Lindner et al 2019
“ParPaRaw: Massively Parallel Parsing of Delimiter-Separated Raw Data”, Stehle & Jacobsen 2019
“Real-world Dynamic Programming: Seam Carving”, Das 2019
“Unraveling the JPEG: JPEG Images Are Everywhere in Our Digital Lives, but behind the Veil of Familiarity Lie Algorithms That Remove Details That Are Imperceptible to the Human Eye. This Produces the Highest Visual Quality With the Smallest File Size—but What Does That Look Like? Let’s See What Our Eyes Can’t See!”, Shehata 2019
“Local-first Software: You Own Your Data, in spite of the Cloud [web]”, Kleppmann et al 2019
“GAP: Generalizable Approximate Graph Partitioning Framework”, Nazi et al 2019
“Parsing Gigabytes of JSON per Second”, Langdale & Lemire 2019
“AutoPhase: Compiler Phase-Ordering for High Level Synthesis With Deep Reinforcement Learning”, Haj-Ali et al 2019
“Humans Store about 1.5 Megabytes of Information during Language Acquisition”, Mollica & Piantadosi 2019
“Test Driving 'Power of Two Random Choices' Load Balancing”, Tarreau 2019
“Meta-Learning Neural Bloom Filters”, Rae et al 2019
“Reinventing the Wheel: Discovering the Optimal Rolling Shape With PyTorch”, Wiener 2019
“Slow Software”, McGranaghan 2018
“Learning to Perform Local Rewriting for Combinatorial Optimization”, Chen & Tian 2018
“How to Shuffle a Big Dataset”, Hardin 2018
“Deterministic Implementations for Reproducibility in Deep Reinforcement Learning”, Nagarajan et al 2018
“Learning to Optimize Join Queries With Deep Reinforcement Learning”, Krishnan et al 2018
“Umineko: The Opium Of The Magics”, Gwern 2018
“Learning to Optimize Tensor Programs”, Chen et al 2018
“Optimizing Query Evaluations Using Reinforcement Learning for Web Search”, Rosset et al 2018
“Learning Memory Access Patterns”, Hashemi et al 2018
“Tensor Comprehensions: Framework-Agnostic High-Performance Machine Learning Abstractions”, Vasilache et al 2018
“Innovation and Cumulative Culture through Tweaks and Leaps in Online Programming Contests”, Miu et al 2018
“The Case for Learned Index Structures”, Kraska et al 2017
“Automatic Differentiation in PyTorch”, Paszke et al 2017
“From Punched Cards to Flat Screens: A Technical Autobiography”, Hazel 2017
“DAG Reduction: Fast Answering Reachability Queries”, Zhou et al 2017
“Stochastic Constraint Programming As Reinforcement Learning”, Prestwich et al 2017
“Learning to Superoptimize Programs”, Bunel et al 2017
“Web Bloat”, Luu 2017
“Resource-Efficient Machine Learning in 2 KB RAM for the Internet of Things”, Kumar et al 2017
“P≟NP § AI”, Aaronson 2017 (page 5)
“Machine Learning for Systems and Systems for Machine Learning”, Dean 2017
“Full Resolution Image Compression With Recurrent Neural Networks”, Toderici et al 2016
“Energy-Efficient Algorithms”, Demaine et al 2016
“A Discrete and Bounded Envy-Free Cake Cutting Protocol for Any Number of Agents”, Aziz & Mackenzie 2016
“Why WhatsApp Only Needs 50 Engineers for Its 900M Users: One of the (many) Intriguing Parts of the WhatsApp Story Is That It Has Achieved Such Enormous Scale With Such a Tiny Team”, Metz 2015
“Scalability! But at What COST?”, McSherry et al 2015
“Inferring Algorithmic Patterns With Stack-Augmented Recurrent Nets”, Joulin & Mikolov 2015
“The Ph.D. Grind: A Ph.D. Student Memoir”, Guo 2015
“The Mystery Machine: End-to-end Performance Analysis of Large-scale Internet Services”, Chow et al 2014
“How Inefficient Can a Sort Algorithm Be?”, Lerma 2014
“Core-Guided MaxSAT With Soft Cardinality Constraints”, Morgado et al 2014
“Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Soler-Toscano et al 2014
“Algorithmic Progress in Six Domains”, Grace 2013
“Intelligence Explosion Microeconomics”, Yudkowsky 2013
“Bounded Kolmogorov Complexity Based on Cognitive Models”, Strannegård et al 2013
“The Algebraic Combinatorial Approach for Low-Rank Matrix Completion”, Király et al 2012
“Evaluating the Design of the R Language: Objects and Functions for Data Analysis”, Morandat et al 2012
“The International SAT Solver Competitions”, Järvisalo et al 2012
“Uniform Random Generation of Large Acyclic Digraphs”, Kuipers & Moffa 2012
“National Cryptologic Museum Opens New Exhibit on Dr. John Nash”, NSA 2012
“A Brief History of NP-Completeness, 1954–2012”, Johnson 2012
“STEPS Toward Expressive Programming Systems: "A Science Experiment"”, Ohshima et al 2012 (page 2)
“Cutting the Pipe: Achieving Sub-Second Iteration Times”, Frykholm 2012
“A Cross-Language Perspective On Speech Information Rate”, Pellegrino et al 2011
“Why Philosophers Should Care About Computational Complexity”, Aaronson 2011
“Notes on a New Philosophy of Empirical Science”, Burfoot 2011
“New Strategy of Lossy Text Compression”, Al-Dubaee & Ahmad 2010
“You’re Doing It Wrong: Think You’ve Mastered the Art of Server Performance? Think Again.”, Kamp 2010
“Formal Theory of Creativity & Fun & Intrinsic Motivation (1990–2010)”, Schmidhuber 2010
“Coolex: The Coolest Way to Generate Combinations”, Ruskey & Williams 2009
“De-anonymizing Social Networks”, Narayanan & Shmatikov 2009
“The Gödel Letter”, Gödel 2009
“Producing Wrong Data Without Doing Anything Obviously Wrong!”, Mytkowicz et al 2009
“Is There A Fourth Futamura Projection?”, Glück 2009
“Dual-Pivot Quicksort Algorithm”, Yaroslavskiy 2009
“Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes”, Schmidhuber 2008
“Aggregating Inconsistent Information: Ranking and Clustering”, Ailon et al 2008
“Interview With Donald Knuth”, Binstock 2008
“Optimal Boarding Method for Airline Passengers”, Steffen 2008
“Harnessing Vision for Computation”, Changizi 2008
“Communication in Economic Mechanisms”, Segal 2006b
“How To Break Anonymity of the Netflix Prize Dataset”, Narayanan & Shmatikov 2006
“Oral History of Butler Lampson § WWW”, Lampson & Kay 2006 (page 36)
“History of Combinatorial Generation (The Art of Computer Programming: Volume 4: Pre-Fascicle 4B: §7.2.1.7) § Pg22”, Knuth 2005 (page 22)
“NP-complete Problems and Physical Reality”, Aaronson 2005
“Lower-Stretch Spanning Trees”, Elkin et al 2004
“Clustering by Compression”, Cilibrasi & Vitanyi 2003
“A Short History of Computational Complexity”, Fortnow et al 2003
“Least Effort and the Origins of Scaling in Human Language”, Cancho & Sole 2003
“Extended Comment on Language Trees and Zipping”, Goodman 2002
“Solving Real-World Linear Programs: A Decade and More of Progress”, Bixby 2002
“A Bit-Vector Algorithm for Computing Levenshtein and Damerau Edit Distances”, Hyyrö 2002
“Estimating and Comparing Entropy across Written Natural Languages Using PPM Compression”, Behr et al 2002
“Naked Objects: a Technique for Designing More Expressive Systems”, Pawson & Matthews 2001
“Language Trees and Zipping”, Benedetto et al 2001
“On Proebsting's Law”, Scott 2001
“Peopleware: Why Measure Performance”, DeMarco & Lister 2001
“Fast Text Compression With Neural Networks”, Mahoney 2000
“The Effects of Moore’s Law and Slacking on Large Computations”, Gottbrath et al 1999
“The Physical Limits of Communication”, Lachmann et al 1999
“Quantum Effects in Algorithms”, Jozsa 1998
“Applications of Randomness in System Performance Measurement”, Blackwell 1998
“Proebsting’s Law: Compiler Advances Double Computing Power Every 18 Years”, Proebsting 1998
“The Entropy of English Using PPM-Based Models”, Data Compression Conference 1996 (DCC '96 Proceedings)
“Entropy of Natural Languages: Theory and Experiment”, Levitin & Reingold 1994
“Building a Large Annotated Corpus of English: The Penn Treebank”, Marcus et al 1993
“On the Computational Complexity of the Jones and Tutte Polynomials”, Jaeger et al 1990
“Planning and Learning in Permutation Groups”, Mose et al 1989
“Separating Strings With Small Automata”, Robson 1989
“Optimal Nonlinear Approximation”, DeVore et al 1989
“Hypertext and the Oxford English Dictionary”, Raymond & Tompa 1988
“Three Scientists and Their Gods: Looking for Meaning in an Age of Information”, Wright 1988
“Average Case Complete Problems”, Levin 1986
“Probabilistic Counting Algorithms for Data Base Applications”, Flajolet & Martin 1985
“Randomness Conservation Inequalities; Information and Independence in Mathematical Theories”, Levin 1984
“The Competitive Allocation Process Is Informationally Efficient Uniquely”, Jordan 1982
“Epigrams on Programming”, Perlis 1982
“Procedural Reflection in Programming Languages”, Smith 1982
“On Holy Wars and a Plea for Peace”, Cohen 1981
“Mutation Analysis Of Program Test Data”, Budd 1980
“A Correct Preprocessing Algorithm for Boyer-Moore String-Searching”, Rytter 1980
“A Convergent Gambling Estimate Of The Entropy Of English”, Cover & King 1978
“Algorithms for Loop Matchings”, Nussinov et al 1978
“Fast Pattern Matching in Strings”, Knuth et al 1977
“Structured Programming With go to Statements”, Knuth 1974
“Interstellar Communication: Scientific Perspectives”, Ponnamperuma & Cameron 1974
“A Parallel Algorithm for the Efficient Solution of a General Class of Recurrence Equations”, Kogge & Stone 1973
“The Dangers of Computer-Science Theory”, Knuth 1973
“Universal Sequential Search Problems”, Levin 1973
“The Humble Programmer [EWD340]”, Dijkstra 1972
“A Simple Randomization Procedure”, Sandelius 1962
“Generation of Random Permutations of Given Number of Elements Using Random Sampling Numbers”, Rao 1961
“Possible Principles Underlying the Transformations of Sensory Messages”, Barlow 1961
“Coding Theorems for a Discrete Source With a Fidelity Criterion”, Shannon 1959
“The Codeless Code: Case 96: ‘Stateless’”
“Submission #6347: Chef Stef’s NES Arkanoid 'warpless' in 11:11.18”
“TSP Art”
Sort By Magic
Annotations sorted by machine learning into inferred 'tags'. This provides an alternative way to browse: instead of by date order, one can browse in topic order. The 'sorted' list has been automatically clustered into multiple sections & auto-labeled for easier browsing.
Beginning with the newest annotation, it uses the embedding of each annotation to attempt to create a list of nearest-neighbor annotations, creating a progression of topics. For more details, see the link.
compression
machine-learning, artificial-intelligence, algorithms, optimization, data-analysis
team
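The embedding-based ordering described above can be illustrated with a short sketch (hypothetical code, not the actual gwern.net implementation; it assumes each annotation has already been embedded as a vector by some text-embedding model, and greedily chains each annotation to its nearest unvisited neighbor by cosine similarity):

```python
# Minimal sketch of embedding-based topic ordering: starting from the newest
# annotation, repeatedly append the most similar unvisited annotation, so the
# resulting list reads as a gradual progression of topics.
import numpy as np

def sort_by_similarity(embeddings: np.ndarray, start: int = 0) -> list[int]:
    """Return annotation indices as a greedy nearest-neighbor chain."""
    # Normalize rows so that dot products are cosine similarities.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    order = [start]
    remaining = set(range(len(normed))) - {start}
    while remaining:
        current = normed[order[-1]]
        # Pick the unvisited annotation most similar to the current one.
        next_idx = max(remaining, key=lambda i: float(current @ normed[i]))
        order.append(next_idx)
        remaining.remove(next_idx)
    return order

# Toy 3-dimensional "embeddings"; index 0 plays the role of the newest annotation.
toy = np.array([[1.0, 0.1, 0.0],
                [0.9, 0.2, 0.1],
                [0.0, 1.0, 0.2],
                [0.1, 0.9, 0.3]])
print(sort_by_similarity(toy))  # prints [0, 1, 3, 2]
```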
Wikipedia
Miscellaneous
- /doc/cs/algorithm/2018-miu-figure1-progressofbestperformingprogramovertimeofcontest.jpg
- /doc/cs/algorithm/2018-mcgranaghan-inkandswitch-slowsoftware-inputlatencycascade.png
- http://www.scholarpedia.org/article/Applications_of_algorithmic_information_theory
- https://ahrm.github.io/jekyll/update/2022/04/14/using-languge-models-to-read-faster.html
- https://benkrause.github.io/blog/human-level-text-prediction/
- https://blog.research.google/2022/09/tensorstore-for-high-performance.html
- https://cacm.acm.org/magazines/2023/6/273222-the-silent-revolution-of-sat/fulltext
- https://github.com/tigerbeetle/tigerbeetle/blob/main/docs/DESIGN.md#architecture
- https://jacobbrazeal.wordpress.com/2023/07/09/computationally-optimal-sequences-of-barbell-plates/
- https://johnnysswlab.com/decreasing-the-number-of-memory-accesses-the-compilers-secret-life-2-2/
- https://maxhalford.github.io/blog/text-classification-by-compression/
- https://priceonomics.com/the-spectrum-auction-how-economists-saved-the-day/
- https://psyche.co/ideas/as-language-evolves-who-wins-out-speakers-or-listeners
- https://pub.towardsai.net/stable-diffusion-based-image-compresssion-6f1f0a399202
- https://robertheaton.com/2018/12/17/wavefunction-collapse-algorithm/
- https://web.archive.org/web/20100322192300/http://33bits.org/2010/03/15/open-letter-to-netflix/
- https://www.ageofinvention.xyz/p/age-of-invention-the-beacons-are
- https://www.eveonline.com/news/view/information-is-power-excel-release
- https://www.honeycomb.io/blog/hard-stuff-nobody-talks-about-llm
- https://www.johndcook.com/blog/2017/02/08/how-efficient-is-morse-code/
- https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-believe-about-names/
- https://www.lesswrong.com/posts/GveDmwzxiYHSWtZbv/shannon-s-surprising-discovery-1
- https://www.lesswrong.com/posts/aRxDLju75KXD6PCpB/wolf-incident-postmortem
- https://www.lesswrong.com/posts/no5jDTut5Byjqb4j5/six-and-a-half-intuitions-for-kl-divergence
- https://www.nuff.ox.ac.uk/users/klemperer/biggestpaper.pdf#page=2
- https://www.overcomingbias.com/p/office-by-combo-auctionhtml
- https://www.quantamagazine.org/how-lossless-data-compression-works-20230531
- https://www.quantamagazine.org/how-mathematical-curves-power-cryptography-20220919/
- https://www.quantamagazine.org/physicists-observe-unobservable-quantum-phase-transition-20230911/
- https://www.stavros.io/posts/compressing-images-with-stable-diffusion/
- https://xorshammer.com/2008/08/21/compute-definite-integral/
Link Bibliography
- https://dl.acm.org/doi/pdf/10.1145/3589246.3595371: “U-Net CNN in APL: Exploring Zero-Framework, Zero-Library Machine Learning”, Aaron W. Hsu, Rodrigo Girão Serrão
- 1986-hamming: “You And Your Research”, Richard W. Hamming
- https://arxiv.org/abs/2212.09410: “Less Is More: Parameter-Free Text Classification With Gzip”, Zhiying Jiang, Matthew Y. R. Yang, Mikhail Tsirlin, Raphael Tang, Jimmy Lin
- https://levelup.gitconnected.com/searching-for-cyclic-tv-reference-paradoxes-d125ff014279: “Searching for Cyclic TV Reference Paradoxes”, Jamie Pinheiro
- https://arxiv.org/abs/2204.00595: “Monarch: Expressive Structured Matrices for Efficient and Accurate Training”
- https://arxiv.org/abs/2112.13314: “Silent Bugs in Deep Learning Frameworks: An Empirical Study of Keras and TensorFlow”, Florian Tambon, Amin Nikanjam, Le An, Foutse Khomh, Giuliano Antoniol
- https://code.blender.org/2021/12/improving-real-time-rendering-of-dynamic-digital-characters-in-cycles/: “Improving Real-time Rendering of Dynamic Digital Characters in Cycles”, Kévin Dietrich
- https://arxiv.org/abs/2106.12372#nvidia: “Real-time Neural Radiance Caching for Path Tracing”, Thomas Müller, Fabrice Rousselle, Jan Novák, Alexander Keller
- https://www.ethanrosenthal.com/2020/08/25/optimal-peanut-butter-and-banana-sandwiches/: “Optimal Peanut Butter and Banana Sandwiches”, Ethan Rosenthal
- https://arxiv.org/abs/2006.06856: “BanditPAM: Almost Linear Time k-Medoids Clustering via Multi-Armed Bandits”, Mo Tiwari, Martin Jinye Zhang, James Mayclin, Sebastian Thrun, Chris Piech, Ilan Shomorony
- scaling-hypothesis: “The Scaling Hypothesis”, Gwern
- https://pointersgonewild.com/2019/11/02/they-might-never-tell-you-its-broken/: “They Might Never Tell You It’s Broken”, Maxime Chevalier-Boisvert
- https://www.hyrumslaw.com/: “Hyrum’s Law: An Observation on Software Engineering”, Hyrum Wright
- 2019-hsu-3.pdf: “Co-dfns: A Data Parallel Compiler Hosted on the GPU”, Aaron Wen-yao Hsu
- 2019-vinyals.pdf#deepmind: “Grandmaster Level in StarCraft II Using Multi-agent Reinforcement Learning”
- umineko: “Umineko: The Opium Of The Magics”, Gwern
- https://www.wired.com/2015/09/whatsapp-serves-900-million-users-50-engineers/: “Why WhatsApp Only Needs 50 Engineers for Its 900M Users: One of the (many) Intriguing Parts of the WhatsApp Story Is That It Has Achieved Such Enormous Scale With Such a Tiny Team”, Cade Metz
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4014489/: “Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines”, Fernando Soler-Toscano, Hector Zenil, Jean-Paul Delahaye, Nicolas Gauvrit
- https://www.informit.com/articles/article.aspx?p=1193856: “Interview With Donald Knuth”, Andrew Binstock
- https://archive.computerhistory.org/resources/text/Oral_History/Lampson_Butler/102658024.05.01.pdf#page=36: “Oral History of Butler Lampson § WWW”, Butler Lampson, Alan Kay
- 2001-pawson.pdf: “Naked Objects: a Technique for Designing More Expressive Systems”, Richard Pawson, Robert Matthews
- 1989-fiat.pdf: “Planning and Learning in Permutation Groups”, Shahar Mose, Adi Shamir, Ilan Shimshoni, Gabor Tardos
- 1980-budd.pdf: “Mutation Analysis Of Program Test Data”, Timothy Alan Budd
- 1961-barlow.pdf: “Possible Principles Underlying the Transformations of Sensory Messages”, H. B. Barlow