“New Report on How Much Computational Power It Takes to Match the Human Brain”, Joseph Carlsmith, 2020-09-11:

Open Philanthropy is interested in when AI systems will be able to perform various tasks that humans can perform (“AI timelines”). To inform our thinking, I investigated what evidence the human brain provides about the computational power sufficient to match its capabilities. I consulted with more than 30 experts, and considered four methods of generating estimates [simulating the brain’s neurons; comparing brain regions to similarly capable algorithms, scaling by relative size; limits from the laws of physics; and I/O bandwidth/latency], focusing on floating point operations per second (FLOP/s) as a metric of computational power.
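To give a flavor of the first of these methods (simulating neurons), here is a minimal back-of-envelope sketch in Python. The function name and all parameter values (synapse counts, firing rates, FLOP per synaptic event) are illustrative order-of-magnitude assumptions I am supplying, not figures quoted in this summary:

```python
# Back-of-envelope "simulate the neurons" estimate: FLOP/s needed to model
# every spike passing through every synapse. All parameter values below are
# illustrative order-of-magnitude assumptions, not the report's own numbers.

def mechanistic_estimate(n_synapses, avg_firing_hz, flop_per_spike_event):
    """FLOP/s = synapses x avg spikes/sec x FLOP per spike-through-synapse."""
    return n_synapses * avg_firing_hz * flop_per_spike_event

low  = mechanistic_estimate(1e14, 0.1, 1)    # optimistic corner: ~1e13 FLOP/s
high = mechanistic_estimate(1e15, 1.0, 100)  # pessimistic corner: ~1e17 FLOP/s
print(f"illustrative range: {low:.0e} to {high:.0e} FLOP/s")
```

The point of the sketch is only that the estimate is a product of three uncertain quantities, so the plausible range spans several orders of magnitude.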

The full report on what I learned is here. This blog post is a medium-depth summary of some context, the approach I took, the methods I examined, and the conclusions I reached. The report’s executive summary is a shorter overview.

In brief, I think it more likely than not that 10^15 FLOP/s is enough to perform tasks as well as the human brain (given the right software, which may be very hard to create). And I think it unlikely (<10%) that more than 10^21 FLOP/s is required. [The probabilities reported here should be interpreted as subjective levels of confidence or “credences”, not as claims about objective frequencies, statistics, or “propensities” (see Peterson 2009, Chapter 7, for discussion of various alternative interpretations of probability judgments). See Muehlhauser (2017a), §2, for discussion of some complexities involved in using these probabilities in practice.] But I’m not a neuroscientist, and the science here is very far from settled. [My academic background is in philosophy.] I offer a few more specific probabilities, keyed to one specific type of brain model, in the report’s appendix.
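As one purely hypothetical way to make these two credences concrete (the report itself does not commit to any parametric form), you could encode them as a log-normal distribution over required FLOP/s, pinned down by “median ≈ 10^15” and “P(more than 10^21 is required) ≈ 10%”:

```python
# Hypothetical illustration only: fit a log-normal over required FLOP/s to
# the two stated credences. This parametric choice is my assumption, not
# something the report endorses. Requires scipy.
from scipy.stats import norm

median_log10 = 15.0   # P(required <= 1e15 FLOP/s) = 0.5
p90_log10    = 21.0   # P(required >  1e21 FLOP/s) = 0.1
sigma = (p90_log10 - median_log10) / norm.ppf(0.9)  # ~4.68 decades

dist = norm(loc=median_log10, scale=sigma)  # distribution over log10(FLOP/s)
print(f"implied P(required <= 1e17 FLOP/s) ~= {dist.cdf(17.0):.2f}")  # ~0.67
```

Any distribution satisfying the two stated quantiles would do; the log-normal is just a convenient two-parameter choice for illustrating how wide the implied uncertainty is.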

For context: the Fugaku supercomputer (~$1 billion) performs ~4×10^17 FLOP/s, and a V100 GPU (~$10,000) performs up to ~10^14 FLOP/s. [Google’s TPU supercomputer, which recently broke records in training ML systems, can also do ~4×10^17 FLOP/s. NVIDIA’s newest SuperPOD can deliver ~7×10^17 FLOP/s of AI performance. The DGX A100, an 8-GPU system priced at ~$200,000, can do 5×10^15 FLOP/s.] But even if my best guesses are right, this doesn’t mean we’ll see AI systems as capable as the human brain anytime soon. In particular: actually creating/training such systems (as opposed to building computers that could in principle run them) is a substantial further challenge.
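For a rough sense of what these hardware figures imply, here is a quick sketch using only the prices and peak FLOP/s numbers quoted above (peak figures overstate realistically sustained throughput, so treat the results as loose):

```python
# Rough cost comparison built from the peak figures quoted above.
hardware = {
    # name: (price in USD, peak FLOP/s)
    "Fugaku":   (1e9, 4e17),
    "V100":     (1e4, 1e14),
    "DGX A100": (2e5, 5e15),
}

target = 1e15  # the report's "more likely than not sufficient" budget
for name, (price, flops) in hardware.items():
    units = target / flops  # units needed to reach 1e15 FLOP/s
    print(f"{name:10s} {flops/price:8.1e} FLOP/s per $; "
          f"~{units:.3g} unit(s) for 1e15 FLOP/s")
```

On these numbers, the 10^15 FLOP/s budget is roughly ten V100s, a fifth of a DGX A100, or a fraction of a percent of Fugaku, which is why the report's best-guess threshold sits within reach of existing hardware even though the software is the harder problem.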

[Figure: Estimates of FLOP/s budgets large enough to perform tasks as well as the human brain (given the right software).]