“How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-Term Memory”, 1986-10-01:
How much information from experience does a normal adult remember?
The “functional information content” of human memory was estimated in several ways. The methods depend on measured rates of input and loss from very long-term memory and on analyses of the informational demands of human memory-based performance.
Estimates ranged around 10⁹ bits.
It is speculated that the flexible and creative retrieval of facts by humans is a function of a large ratio of “hardware” capacity to functional storage requirements. …Thus, the estimates all point toward a functional learned memory content of around a billion bits for a mature person. The consistency of the numbers is reassuring…Computer systems are now being built with many-billion-bit hardware memories, but are not yet nearly able to mimic the associative memory powers of our “billion”-bit functional capacity. An attractive speculation from these juxtaposed observations is that the brain uses an enormous amount of extra capacity to do things that we have not yet learned how to do with computers. A number of theories of human memory have postulated the use of massive redundancy as a means for obtaining such properties as content and context addressability, sensitivity to frequency of experience, resistance to physical damage, and the like (e.g. 1975; 1982; et al 1985). Possibly we should not be looking for models and mechanisms that produce storage economies (e.g. 1972), but rather, ones in which marvels are produced by profligate use of capacity.
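The estimation logic described above (a net rate of input into very long-term memory, accumulated over a waking lifetime) can be sketched as a back-of-envelope calculation. The specific parameter values below (~1 bit/s retained, 70 years, 16-hour waking days) are illustrative assumptions for this sketch, not the paper's exact figures; the point is only that any net rate near 1 bit/s yields a total on the order of a billion bits:

```python
# Back-of-envelope sketch of a Landauer-style lifetime memory estimate.
# All parameter values here are illustrative assumptions, not the paper's.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def memory_bits(net_bits_per_sec: float, years: float,
                waking_fraction: float = 2 / 3) -> float:
    """Total retained information: net input rate (input minus loss)
    accumulated over the waking portion of `years` of experience."""
    return net_bits_per_sec * years * SECONDS_PER_YEAR * waking_fraction

# Assume ~1 bit/s net retention over 70 years of ~16-hour waking days.
estimate = memory_bits(net_bits_per_sec=1.0, years=70)
print(f"{estimate:.1e} bits")  # on the order of 10^9 bits
```

Varying the assumed net rate by a factor of a few (as the paper's different methods effectively do) moves the answer within roughly an order of magnitude of 10⁹ bits, which is why the convergence of the independent estimates is notable.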
See Also:
Biological limits to information processing in the human brain
The human brain in numbers: a linearly scaled-up primate brain
Nanoconnectomic upper bound on the variability of synaptic plasticity
Analog Versus Digital: Extrapolating from Electronics to Neurobiology
New Report on How Much Computational Power It Takes to Match the Human Brain