``Learning does not just alter the knowledge base for a fixed computational engine, it alters the internal computational architecture itself.''
Andy Clark [Cla01]
Plasticity (in the sense of changing interconnection topology) is fundamental to learning in biology.
Currently, when manufacturing a processor, every effort is made to ensure that each piece of hardware is identical: the elimination of variation is a central concern in chip fabrication. A microprocessor with too much variability3.13 in its substrate would be useless, because its design depends on the deterministic behavior of its components. Biological neural structures, which are plastic, are far more robust and can even exploit the physical diversity of their micro-environment [CdZR93].

An enormous contribution of von Neumann to computing was to store data and executable code in the same memory, lifting the application-specificity of earlier computers3.14. But when operations and data share one medium, reliable execution demands that data is never mistaken for operations, or vice versa; this has been an important source of bugs and exploitability in software. The brain does not operate on such a clear distinction. To the brain, memory and experience are inherently meaningful (which data, by definition, is not) and greatly shape how we interpret and perceive; interpretation and perception can hardly be considered programs performing operations on data. This illustrates a fundamental difference between the architectures of the traditional computer and the brain: a computer's memory resides in distinct units, while the brain encodes every memory in its own structure.
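To make the shared-medium point concrete, the following toy sketch (an illustration of the general idea, not something from the cited sources; the opcodes, memory layout and run helper are all assumed) implements a two-instruction von Neumann machine in Python. Memory is one flat list of integers, and nothing in the memory itself distinguishes code words from data words; only the position of the program counter decides how a word is interpreted, so a single corrupted word turns data into a (bogus) instruction.

\begin{verbatim}
# Toy von Neumann machine (illustrative assumption, not from the
# sources): one memory holds opcodes and data as plain integers.
HALT, LOAD, ADD = 0, 1, 2          # opcodes look just like data

def run(mem):
    pc, acc = 0, 0
    while True:
        op = mem[pc]
        if op == HALT:
            return acc
        arg = mem[pc + 1]
        if op == LOAD:
            acc = mem[arg]
        elif op == ADD:
            acc += mem[arg]
        else:
            raise RuntimeError(f"data word {op} executed as an opcode")
        pc += 2

mem = [1, 8, 2, 9, 0, 0, 0, 0,     # program: LOAD 8; ADD 9; HALT
       40, 2]                      # data: the values 40 and 2
print(run(mem))                    # -> 42

mem[0] = 9                         # corrupt one word of memory
try:
    run(mem)                       # the machine now executes data
except RuntimeError as e:
    print(e)                       # -> data word 9 executed as an opcode
\end{verbatim}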
The foregoing compares traditional computing to the brain. Simulating neural behavior with ANNs is obviously closer to the brain's functioning, but the logical and physical separation of memory units and processing units places considerable demands on the I/O speed and bandwidth of traditional architectures. A possible way out is a different VLSI chip architecture: the FPGA3.15. An FPGA is essentially an array of memory that manifests as logic gates; alongside these logic areas sit (re)programmable interconnect blocks that constitute the switching areas. The discrete-state nature of digital hardware has also been considered a problem, since implementing multiplication in logic gates is computationally expensive; analog VLSI (aVLSI) is an alternative approach that avoids this cost [ZS03].
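The phrase ``an array of memory that manifests as logic gates'' can be made concrete with a small sketch (a hedged illustration; the LUT class and its interface are assumptions, not any vendor's API). A k-input lookup table (LUT), the basic FPGA logic element, is simply a 2^k-entry truth table held in configuration memory; rewriting that memory reconfigures the same physical cell into a different gate.

\begin{verbatim}
# Sketch of an FPGA lookup table (LUT): memory acting as a logic gate.
# The class and its methods are illustrative assumptions.
class LUT:
    def __init__(self, k, truth_table):
        assert len(truth_table) == 2 ** k
        self.k = k
        self.table = list(truth_table)    # the configuration memory

    def __call__(self, *bits):
        addr = 0                          # pack the input bits into an
        for b in bits:                    # address into the table
            addr = (addr << 1) | (b & 1)
        return self.table[addr]

    def reprogram(self, truth_table):
        assert len(truth_table) == 2 ** self.k
        self.table = list(truth_table)    # same cell, new gate

gate = LUT(2, [0, 0, 0, 1])               # configured as a 2-input AND
print(gate(1, 1), gate(1, 0))             # -> 1 0
gate.reprogram([0, 1, 1, 0])              # reconfigured into an XOR
print(gate(1, 1), gate(1, 0))             # -> 0 1
\end{verbatim}

A real FPGA adds reprogrammable routing between many such cells, which is what the switching areas mentioned above provide.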
Neural networks are characterized by parallelism, modularity and dynamic adaptation, which makes FPGAs well suited to them thanks to their concurrency and reconfigurability. The reconfigurability of FPGAs is exploited in several ways: neuroplasticity is, in a sense, achieved by adapting the topology, while learning is implemented by adapting the weights, as sketched below. FPGAs also provide a good basis for rapid prototyping of ANN designs [ZS03].
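This division of labour can be sketched in a few lines of Python (a toy under assumed mechanisms: delta-rule weight updates and magnitude-based pruning stand in for learning and topology adaptation; [ZS03] does not prescribe these particular rules). Weights on existing connections adapt during learning, while a separate connectivity mask plays the role of the reconfigurable topology.

\begin{verbatim}
# Toy ANN with separate weight adaptation (learning) and topology
# adaptation (plasticity). The update and pruning rules are assumed
# for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 8, 4
W = rng.normal(0.0, 0.5, (n_out, n_in))   # connection weights
mask = np.ones_like(W)                    # which links exist (topology)

def forward(x):
    return np.tanh((W * mask) @ x)        # pruned links carry no signal

def learn(x, target, lr=0.1):
    global W                              # weight adaptation: nudge
    y = forward(x)                        # live links toward the target
    W += lr * np.outer(target - y, x) * mask

def adapt_topology(threshold=0.05):
    mask[np.abs(W) < threshold] = 0.0     # prune near-zero connections

for _ in range(200):
    x = rng.normal(size=n_in)
    learn(x, np.sign(x[:n_out]))          # arbitrary toy target
adapt_topology()
print(int(mask.sum()), "of", mask.size, "connections remain")
\end{verbatim}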