

News Service

July 12, 2011

Stanford engineers build a nanoscale device for brain-inspired computing

Researchers at the Stanford School of Engineering have developed a nanoelectronic synapse that could drive a new class of microchips able to learn, adapt and make probability-based decisions in complex environments. The work might one day lead to real-time brain simulators that deepen our understanding of neuroscience.

By Andrew Myers

"This development could lead to electronic devices that are so small and so energy efficient that we might be able to make nanoelectronic versions of certain parts of the brain to study how they work," said H.-S. Philip Wong, a professor of electrical engineering. (Photo by Linda A. Cicero / Stanford News Service)

If you doubt the power of the human brain, ponder this for a moment: It takes today's state-of-the-art supercomputer eight-and-a-half minutes to simulate just five seconds of normal human brain activity. Meanwhile, that supercomputer will consume 140,000 times as much electricity as the brain – 1.4 million watts versus 10, to be exact – to do the work. For sheer processing power and efficiency, nothing quite compares to the human brain.

In a recent paper in the online edition of the journal Nano Letters, a team of Stanford engineers has demonstrated a new nanoelectronic device that emulates human synapses, the brain's computing mechanism. It is a breakthrough that might one day lead to portable, energy-efficient, adaptable and interactive computer systems that can learn rather than merely respond to given programs.

The Stanford team, led by Professor H.-S. Philip Wong, post-doctoral scholar Duygu Kuzum and graduate students Rakesh Jeyasingh and Byoungil Lee, has been working in a new field known as "brain-inspired computing," which seeks to mimic in computer chips the neurological signaling mechanism of the human synapse.

The researchers are not the first to venture down this path, but they are the first to succeed in creating synaptic devices that are small enough, consume little enough energy and are built with a mature enough technology to anticipate commercial viability down the road.

"This development could lead to electronic devices that are so small and so energy efficient that we might be able to make nanoelectronic versions of certain parts of the brain to study how they work," said Wong, a professor of electrical engineering. "While you can't alter a biological brain, a synthetic device such as this would allow researchers to change the device parameters to reveal how real brains function."

How computers work

To understand why this device is such a departure from what came before, it is necessary to understand how computer systems store and compute information. Within the nano-scale circuitry of today's computer chips are billions of tiny electrical components – transistors – that convey information using binary logic. That is, their logic is based on two values, 1 or 0. In electrical terms, a transistor is either "on" or "off."

With enough transistors packed into each chip, programmers can manipulate electrical circuits, turning the billions of transistors on or off as necessary to store and process information – to "compute." The speed and size of computer chips has largely been determined by our ability to create faster transistors and to pack them into smaller spaces.

Synapses are the smallest computational units in the brain, but they differ from transistors in at least two very important ways. First, they can vary in strength, which means a synapse can convey far more information than a transistor. Second, synapses can change over time.

"Synapses change based on learning," said Jeyasingh, "something conventional computers cannot do. Once most computer chips are made, you cannot change them easily."

Practice makes perfect 

In neuroscience, these two advantages are combined in a concept known as "synaptic plasticity," one of the leading theoretical foundations for how our brains learn, remember and compute.

Like transistor circuits, neurons and synapses are small and packed tightly together, but their circuitry is based on the varying strength of the synapses. The repetition of electrochemical signals traveling the same path reinforces the synapses along that path, making them more likely to fire in the future. As neuroscientists like to say, "Neurons that fire together, wire together."

Synaptic plasticity explains why practice makes perfect. Repeating an electrical pattern through practice strengthens the pattern; therefore, the brain "learns."
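To make the rule concrete, here is a minimal numerical sketch of Hebbian-style plasticity in Python. It is an illustration only – the array sizes, learning rate and update rule are assumptions made for demonstration, not values or methods from the Stanford work:

    import numpy as np

    # Toy Hebbian update: "neurons that fire together, wire together."
    # All names and constants here are illustrative assumptions.
    rng = np.random.default_rng(0)
    weights = rng.uniform(0.1, 0.3, size=(3, 4))  # 3 post-, 4 pre-synaptic neurons
    learning_rate = 0.05

    def hebbian_step(weights, pre, post, lr=learning_rate):
        # Strengthen each synapse in proportion to how strongly its
        # pre- and post-synaptic neurons are active together.
        return np.clip(weights + lr * np.outer(post, pre), 0.0, 1.0)

    # "Practice": repeating the same input pattern reinforces the same path.
    pattern = np.array([1.0, 0.0, 1.0, 0.0])
    for _ in range(20):
        post = weights @ pattern  # simple linear response
        weights = hebbian_step(weights, pattern, post)

    print(weights.round(3))  # synapses on the repeated path have grown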

"The brain is an amazing machine. Its circuits are far more complex, far more capable, far more energy-efficient and far more powerful for performing certain tasks than even the very best computer chips based on binary logic," said Kuzum.

The Stanford team's device emulates synaptic plasticity using a technology known as "phase-change material," the same technology that allows DVDs and CDs to store information. When juiced with electricity, these materials change their physical characteristics and therefore their electrical conductivity in tiny increments – more electricity, more change.

Rather than the two states of a transistor, however, the Stanford team's synaptic device can be controlled in 1 percent increments – like a lightbulb on a dimmer – meaning each phase-change synapse can convey at least 100 values.
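As a rough illustration of the difference, the following Python sketch models such a multi-level device. The roughly 100 conductance levels come from the article; the pulse-by-pulse behavior is an assumption made for illustration, not the actual device physics:

    class PhaseChangeSynapse:
        # Toy model of a multi-level synaptic device: conductance moves
        # in small discrete steps rather than flipping between on and off.

        LEVELS = 100  # ~1 percent increments, per the article

        def __init__(self, level=0):
            self.level = level  # current state, 0 .. LEVELS

        def potentiate(self, pulses=1):
            # Each electrical pulse nudges the material, raising conductance.
            self.level = min(self.LEVELS, self.level + pulses)

        def depress(self, pulses=1):
            # Pulses of the opposite kind lower conductance again.
            self.level = max(0, self.level - pulses)

        @property
        def weight(self):
            # Normalized synaptic strength in [0, 1].
            return self.level / self.LEVELS

    synapse = PhaseChangeSynapse()
    synapse.potentiate(pulses=37)
    print(synapse.weight)  # 0.37 – one of ~100 values, not just 0 or 1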

The device can be manufactured using existing commercial equipment with readily available materials.

"Using well-understood manufacturing processes, we can construct a cross-point architecture allowing three-dimensional stacking of layers that could one day approach the density, compactness and massive parallelism of the human brain," said Kuzum.

The researchers do not, however, foresee their new chips replacing existing ones. Instead, they say, the technology will open promising and exciting new directions that are currently out of reach.

"Our long-term goal is not to replace existing chips, but to define a fundamentally distinct form of computational devices and architectures. These new devices and architectures will excel at distributed, data-intensive algorithms that a complex, real-world environment requires, the sort of algorithms that struggle through today's processing bottlenecks," said Kuzum. 

Thinking in parallel

Among the most intriguing possibilities of these synaptic devices is greater parallelism. The brain is very good at juggling many types of sensory information simultaneously, something computers do very poorly. A supercomputer, by comparison, does not owe its great power to the speed of its processors so much as to splitting up big problems among many processors, each working on a small part of the problem. A more brain-like architecture might allow much smaller chips to think in parallel on many things at the same time.
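That divide-and-conquer strategy is easy to sketch with Python's standard multiprocessing module; the task here (summing chunks of a large range) is purely illustrative:

    from multiprocessing import Pool

    def partial_sum(bounds):
        # Each worker handles one small piece of the big problem.
        lo, hi = bounds
        return sum(range(lo, hi))

    if __name__ == "__main__":
        chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
        with Pool(processes=4) as pool:  # four workers, side by side
            total = sum(pool.map(partial_sum, chunks))
        print(total)  # same answer as sum(range(1_000_000)), computed in parallel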

And where might this lead us? It could lead to real-time brain simulations for neuroscience that augment the understanding gained from biological measurements. Brain-inspired computers may also prove particularly adept at making decisions based on probability.

"This work is a promising step forward in our ability to emulate brain functions using nanoelectronic devices and circuits. We can now contemplate a new direction of research which utilizes nanoelectronics for the study of neuroscience," said Wong.

"Beyond neuroscience, more brain-like systems could find use at the intersection of sensing and computation," said Kuzum. "Such applications would be able to process huge amounts of sensory data in parallel, meaning computers that can process visual information, recognize images or aid in navigation."

On a more fundamental level, the work is likely to produce a deeper understanding of the physics behind the gradual control of phase-change materials, allowing further fine-tuning of the synaptic devices and even greater processing ability.

"This is a significant development," said Wong. "And we are excited to see where it leads."

This work is supported by DARPA SyNAPSE through a collaboration with IBM Research, the National Science Foundation and the Nanoelectronics Research Initiative of the Semiconductor Research Corporation.

Andrew Myers is associate director of communications at the Stanford School of Engineering.

-30-

Contact

H.-S. Philip Wong, Dept. of Electrical Engineering, (650) 725-0982, hspwong@stanford.edu

Andrew Myers, School of Engineering: (650) 736-2245, admyers@stanford.edu

Dan Stober, Stanford News Service: (650) 721-6965, dstober@stanford.edu
