Massive parallelism: up to $10^{11}$ neurons

Perhaps the most distinguishing feature of the brain in comparison to conventional computing is its massive parallelism. While essentially a quantitative difference, it has enormous implications for the character of processing in the brain. This can be illustrated by the example of recognizing a figure. This process normally occurs so quickly that, given the speed of neural interconnections, only something on the order of a hundred consecutive processing steps could have taken place [Ham03, p. 11]. When John Hopfield described the emergent phenomena that arise from a collective of neurons, he used the following analogy:

``Suppose you put two molecules in a box, every once in a while they collide and that's an exciting event. ... If we'd put ten or even a thousand more molecules in the box all we'd get is more collisions. But if we put a billion billion molecules in the box, there is a new phenomenon - sound waves.''

Hopfield's analogy implies that there may be a threshold quantity at which new phenomena, such as intelligence, can emerge.
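To make the hundred-step estimate above concrete, consider illustrative figures (assumed here for illustration, not taken from [Ham03]): if recognizing a figure takes on the order of $500$ ms and a single neuron needs on the order of $5$ ms to respond and pass its signal on, then at most

\[
\frac{t_{\mathrm{recognition}}}{t_{\mathrm{neuron}}} \approx \frac{500\ \mathrm{ms}}{5\ \mathrm{ms}} = 100
\]

consecutive processing steps fit into the available time, so the bulk of the computation must be carried out in parallel rather than serially.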

Currently, the largest reported scale for FPGA-based simulation appears to be around 65 million simulated neurons. The goal of one billion neurons is attainable; it is not a constraint of the architecture [dGKG+98, dGK02]. Importantly, an even larger number could be simulated on ordinary computers, but the simulation would run very slowly rather than in real time.
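As a rough illustration of why real-time operation is the bottleneck on conventional hardware, the following sketch estimates the slowdown factor of a software simulation of a billion-neuron network. All figures (operations per neuron per time step, simulation step size, CPU throughput) are assumptions made here for illustration and do not come from [dGKG+98, dGK02].

```python
# Back-of-the-envelope estimate of how much slower than real time a
# conventional computer would run a large neural-network simulation.
# All figures below are illustrative assumptions, not measured values.

neurons = 1_000_000_000        # target scale: 10^9 simulated neurons
ops_per_neuron_per_step = 100  # assumed operations per neuron per time step
steps_per_second = 1_000       # assumed 1 ms simulation time step
cpu_ops_per_second = 1e10      # assumed sustained throughput of an ordinary computer

# Work required to simulate one second of network activity, and the slowdown.
ops_per_simulated_second = neurons * ops_per_neuron_per_step * steps_per_second
slowdown = ops_per_simulated_second / cpu_ops_per_second

print(f"Simulating 1 s of activity takes roughly {slowdown:,.0f} s of wall-clock time")
```

With these assumed figures the simulation runs on the order of ten thousand times slower than real time, which illustrates why dedicated parallel hardware such as FPGAs is attractive for real-time operation.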
