“On the Computational Power of Threshold Circuits With Sparse Activity”, Kei Uchizawa, Rodney Douglas, Wolfgang Maass (2006-12-01):

Circuits composed of threshold gates (McCulloch-Pitts neurons, or perceptrons) are simplified models of neural circuits with the advantage that they are theoretically more tractable than their biological counterparts. However, when such threshold circuits are designed to perform a specific computational task, they usually differ in one important respect from computations in the brain: they require very high activity. On average every second threshold gate fires (sets a 1 as output) during a computation. By contrast, the activity of neurons in the brain is much sparser, with only about 1% of neurons firing. This mismatch between threshold and neuronal circuits is due to the particular complexity measures (circuit size and circuit depth) that have been minimized in previous threshold circuit constructions.
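The threshold-gate model and the "every second gate fires" observation can be illustrated with a minimal sketch (the XOR circuit and gate names here are illustrative, not from the letter):

```python
def threshold_gate(weights, threshold, inputs):
    """McCulloch-Pitts neuron: fire (output 1) iff the weighted input sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

def xor_circuit(x, y):
    """Hypothetical 3-gate threshold circuit for XOR: (x OR y) AND NOT (x AND y).
    Returns the output bit and the number of gates that fired (the circuit's activity)."""
    g_or  = threshold_gate([1, 1], 1, [x, y])          # fires iff x + y >= 1
    g_and = threshold_gate([1, 1], 2, [x, y])          # fires iff x + y >= 2
    g_out = threshold_gate([1, -1], 1, [g_or, g_and])  # fires iff g_or - g_and >= 1
    return g_out, g_or + g_and + g_out

# Averaged over all four inputs, 1.5 of the 3 gates fire: about every
# second gate, matching the high activity typical of such constructions.
activity = [xor_circuit(x, y)[1] for (x, y) in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```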

In this letter, we investigate a new complexity measure for threshold circuits, energy complexity, whose minimization yields computations with sparse activity. We prove that all computations by threshold circuits of polynomial size with entropy 𝒪(log n) can be restructured so that their energy complexity is reduced to a level near the entropy of circuit states. This entropy of circuit states is a novel circuit complexity measure, which is of interest not only in the context of threshold circuits but for circuit complexity in general.
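For a small circuit, both measures can be estimated by exhaustive evaluation. The sketch below is illustrative only: it takes energy as the maximum number of firing gates over all inputs, and uses the simple "log of the number of reachable circuit states" reading of entropy (the letter's formal definition is stated in terms of the distribution of circuit states; `tiny_circuit` and `circuit_stats` are hypothetical names):

```python
from itertools import product
from math import log2

def tiny_circuit(x, y):
    """A hypothetical 3-gate threshold circuit; returns the tuple of all gate outputs (the circuit state)."""
    g1 = int(x + y >= 1)   # OR gate
    g2 = int(x + y >= 2)   # AND gate
    g3 = int(g1 - g2 >= 1) # output gate: fires iff g1 and not g2
    return (g1, g2, g3)

def circuit_stats(circuit, n_inputs):
    """Exhaustively evaluate a small circuit: return its maximum energy
    (largest number of gates firing on any input) and the log2 of the
    number of distinct circuit states reached."""
    states = set()
    max_energy = 0
    for bits in product([0, 1], repeat=n_inputs):
        state = circuit(*bits)
        states.add(state)
        max_energy = max(max_energy, sum(state))
    return max_energy, log2(len(states))

max_e, ent = circuit_stats(tiny_circuit, 2)  # 3 distinct states reached
```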

As an example of how this measure can be applied, we show that any polynomial size threshold circuit with entropy 𝒪(log n) can be simulated by a polynomial size threshold circuit of depth 3.

Our results demonstrate that the structure of circuits that result from a minimization of their energy complexity is quite different from the structure that results from a minimization of previously considered complexity measures, and potentially closer to the structure of neural circuits in the nervous system. In particular, different pathways are activated in these circuits for different classes of inputs.

This letter shows that such circuits with sparse activity have a surprisingly large computational power.

…The resulting circuits with sparse activity may help us to elucidate the way in which circuits of neurons are designed in biological systems. In fact, the structure of computations in the threshold circuits with sparse activity that were constructed in the proof of theorem 1 is reminiscent of biological results on the structure of computations in cortical circuits of neurons, where different pathways are selected (dynamic routing) depending on the stimulus (Olshausen et al., 1995). In addition, our constructions provide first steps toward the design of algorithms for future extremely dense VLSI implementations of neurally inspired circuits [e.g., GPUs], where energy consumption and heat dissipation become critical factors.

It is well known (see, e.g., Hajnal et al., 1993) that threshold circuits can be made robust against random failure of gates with a moderate increase in circuit size. Such methods can also be applied to the sparsely active threshold circuits constructed in this letter while maintaining their sparse activity. For example, one can replace each threshold gate by an odd number k of identical copies of this gate and take their majority vote with the help of another threshold gate. This increases the circuit size by a factor of only k + 1, but preserves sparse activity. Furthermore, the resulting circuit computes correctly as long as the majority of gates in each group of k gates computes without a fault. Additional noise suppression could exploit the fact that all legitimate activation patterns of gates in the circuit CT constructed in lemma 5 have a quite specific structure, since they simulate an activation path in a tree T.
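The replicate-and-vote construction above can be sketched in simulation (a minimal illustration under an assumed failure model, not the paper's construction; `noisy_gate` and `replicated_gate` are hypothetical names):

```python
import random

def noisy_gate(fn, p_fail):
    """Wrap a gate function; with probability p_fail the wrapped gate
    outputs a random bit instead of its correct value."""
    def g(*args):
        if random.random() < p_fail:
            return random.randint(0, 1)
        return fn(*args)
    return g

def replicated_gate(fn, k, p_fail):
    """Replace one threshold gate by k noisy copies plus a majority vote,
    where the vote is itself a threshold gate with threshold ceil(k/2).
    Correct whenever a majority of the k copies computes without a fault."""
    copies = [noisy_gate(fn, p_fail) for _ in range(k)]
    def g(*args):
        votes = sum(c(*args) for c in copies)
        return int(votes >= (k + 1) // 2)  # majority via one extra threshold gate
    return g
```

Note that sparse activity is preserved: when the original gate outputs 0, its fault-free copies also output 0, so a silent gate's group contributes almost no extra activity.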

The new concepts and results of this letter suggest a number of interesting open problems in computational complexity theory. At the beginning of §3, we showed that the energy complexity of a threshold circuit computing a function f cannot be less than the a priori bound given by the minimal circuit entropy required for computing that function. This suggests investigating the entropy of circuit states required for various practically relevant functions. Another interesting open problem is the trade-off between energy complexity and computation speed in threshold circuits, both in general and for concrete computational problems. Finally, both the energy complexity and the entropy of threshold circuits are concepts of interest in their own right. They give rise to complexity classes that have not previously been considered in computational complexity theory. In particular, it may be possible to develop new lower-bound methods for circuits with low entropy, thereby enlarging the reservoir of lower-bound techniques in circuit complexity theory.