Researchers at the Massachusetts Institute of Technology (MIT) have developed a computer chip that they claim can mimic the physical changes that occur in the brain as it learns.
The chip simulates the way the brain's neurons adapt in response to new information. This property is known as plasticity, and it is believed to underlie many brain functions, including learning and memory.
The chip contains around 400 transistors, which pales in comparison with the latest CPUs such as Intel's Core i7-920 processor, which has 731 million according to its technical specifications.
But bulk processing power is not its aim; rather, it attempts to simulate the activity of a single brain synapse, the connection between two neurons that allows information to flow from one to the other.
The researchers anticipate this chip will help neuroscientists learn more about the way in which the brain works, and could also be used in neural prosthetic devices such as artificial retinas.
There are about 100 billion neurons in the brain, each of which forms connections, or synapses, with other neurons. Neurons release neurotransmitters, which bind to receptors on synaptic cell membranes, activating ion channels.
Opening and closing those channels changes the cell's electrical potential. If the potential changes dramatically enough, the cell fires an electrical impulse called an action potential.
All of this synaptic activity depends on the ion channels, which makes them central to the brain's ability to learn.
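The threshold-and-fire behaviour described above can be sketched with a leaky integrate-and-fire model, a standard textbook simplification rather than the MIT team's actual circuit; all parameter values below are illustrative:

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential integrates
# input current, decays toward a resting level, and fires an action
# potential when it crosses a threshold. Parameters are illustrative only.

def simulate_lif(input_current, dt=0.1, tau=10.0,
                 v_rest=-70.0, v_threshold=-55.0, v_reset=-70.0):
    """Return the membrane potential trace and spike times (ms)."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: potential relaxes toward rest, driven by input.
        dv = (-(v - v_rest) + i_in) / tau
        v += dv * dt
        if v >= v_threshold:          # threshold crossed: fire a spike
            spikes.append(step * dt)
            v = v_reset               # reset the potential after the spike
        trace.append(v)
    return trace, spikes

# A constant input strong enough to drive the cell to fire repeatedly.
trace, spikes = simulate_lif([20.0] * 1000)
print(f"{len(spikes)} spikes in 100 ms")
```

With the constant 20 mV drive shown, the potential climbs from rest, crosses the threshold, resets, and repeats, producing a regular spike train over the 100 ms window.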
The MIT researchers designed their chip so that the transistors could mimic the activity of different ion channels. While most chips operate in a binary, on/off mode, current flows through the transistors on the new brain chip in analogue, not digital, fashion.
A gradient of electrical potential drives current to flow through the transistors just as ions flow through ion channels in a cell.
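The gradient-driven current described here is what conductance-based channel models capture: each channel type passes a current proportional to its conductance and the driving force, the gap between the membrane potential and that channel's reversal potential. A minimal sketch, using textbook-style Hodgkin-Huxley constants (not values from the chip) and treating the channels as fully open for simplicity:

```python
# Ohmic ion-channel model: current through each channel type equals its
# conductance times the driving force (membrane potential minus the
# channel's reversal potential). Values are textbook-style illustrations,
# and gating dynamics are omitted (channels are treated as fully open).

CHANNELS = {
    # name:       conductance (mS/cm^2), reversal potential (mV)
    "sodium":    {"g": 120.0, "e_rev": 50.0},
    "potassium": {"g": 36.0,  "e_rev": -77.0},
    "leak":      {"g": 0.3,   "e_rev": -54.4},
}

def channel_currents(v_membrane):
    """Current carried by each channel type at membrane potential v (mV)."""
    return {name: ch["g"] * (v_membrane - ch["e_rev"])
            for name, ch in CHANNELS.items()}

currents = channel_currents(-65.0)
for name, current in currents.items():
    print(f"{name:9s}: {current:9.1f} uA/cm^2")
```

Near the resting potential, the sodium current is large and inward (negative) while the potassium current is outward (positive), which is the tug-of-war the chip's analogue transistors are designed to reproduce.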
"We can tweak the parameters of the circuit to match specific ion channels," said Chi-Sang Poon, a principal research scientist in the Harvard-MIT Division of Health Sciences and Technology.
"We now have a way to capture each and every ionic process that's going on in a neuron."
According to the MIT website, the researchers plan to use their chip to build systems to model specific neural functions, such as the visual processing system.
They hope that it will prove faster than even high-capacity computer systems, which can take days to simulate a simple brain circuit; the analogue chip can run such a simulation faster than the biological system itself.
A further potential application is interfacing with biological systems. This could enable communication between neural prosthetic devices such as artificial retinas and the brain.
"Further down the road, these chips could also become building blocks for artificial intelligence devices," explained Poon.
In August this year, IBM unveiled chips whose design is inspired by the architecture of the human brain.