
NIST Switches On New AI Technology

Superconducting synapse may power computer brains.
A new superconducting synapse developed by NIST researchers could mark a major step forward for neuromorphic computing, a form of artificial intelligence. Credit: geralt/Pixabay


Researchers at the National Institute of Standards and Technology (NIST) have built a superconducting switch that learns like a biological system and could connect processors and store memories in future computers, NIST officials intend to announce today. The switch in some ways outperforms the human brain that inspired it and offers a wide range of benefits for medical diagnoses, smart cars and intelligence analysis.

The NIST switch is called a synapse, after its biological counterpart, and it supplies a missing piece for neuromorphic computers. Envisioned as a new type of artificial intelligence, such computers could boost machine perception and decision making.

A synapse is a connection, or switch, between two brain cells. NIST’s artificial synapse—a squat metallic cylinder 10 micrometers in diameter—is like the real thing because it can process incoming electrical spikes to customize spiking output signals. This processing is based on a flexible internal design that can be tuned by experience or environment. The more firing between cells or processors, the stronger the synaptic connection becomes. Both the real and artificial synapses can maintain old circuits and create new ones.
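The strengthen-with-use behavior described above can be illustrated with a toy model. This is a minimal sketch of Hebbian-style plasticity, not the published device physics; the class name, weight variable, and update rule are all illustrative assumptions.

```python
# Toy synapse whose connection strength grows with repeated firing,
# loosely analogous to the tunable behavior described in the article.
# All names and constants are illustrative, not the NIST device model.

class ToySynapse:
    def __init__(self, weight=0.1, learning_rate=0.05, w_max=1.0):
        self.weight = weight              # connection strength (dimensionless)
        self.learning_rate = learning_rate
        self.w_max = w_max                # saturation limit on strength

    def transmit(self, input_spike: bool) -> float:
        """Scale an incoming spike by the current connection strength."""
        return self.weight if input_spike else 0.0

    def strengthen(self):
        """More firing -> stronger connection, saturating at w_max."""
        self.weight = min(
            self.w_max,
            self.weight + self.learning_rate * (self.w_max - self.weight),
        )

syn = ToySynapse()
for _ in range(10):          # repeated firing events
    syn.strengthen()         # each event nudges the weight upward
```

After ten firing events the weight has grown from its initial value toward (but not past) the saturation limit, mirroring the "the more firing, the stronger the connection" behavior in the text.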

The new synapse would be used in neuromorphic computers made of superconducting components, which can transmit electricity without resistance and therefore would be more efficient than other designs based on semiconductors or software. Superconducting devices mimicking brain cells and transmission lines have been developed, but until now, efficient synapses—a crucial piece—have been missing.

The brain is especially powerful for tasks such as context recognition because it processes data both sequentially and simultaneously, and it stores memories in synapses all over the system. A conventional computer processes data only in sequence and stores memory in a separate unit.

The artificial synapse can fire much faster than the human brain—1 billion times per second, compared with a brain cell’s 50 times per second—using about one ten-thousandth as much energy as a human synapse.

“We don’t know of any other artificial synapse that uses less energy,” says NIST physicist Mike Schneider. “The devices are roughly a million times more energy-efficient than the human brain for a single spiking event.”

The artificial synapse retains an advantage over its biological brethren even when cooled to extreme temperatures of about 4 kelvins (minus 452 degrees Fahrenheit). “If we include cooling, we are still roughly a thousand times more energy-efficient than the human brain,” Schneider says.

He describes that level of efficiency as an important benchmark. “Most deep neural network or artificial intelligence algorithms today are several orders of magnitude less efficient than the human brain. This gives us hopes of tackling even more complex problems than are currently possible in fields such as image recognition and language translation,” he says.

Schneider estimates that the emerging technology could be fielded within a decade. “We have demonstrated device-level performance that is very interesting, but a substantial amount of work remains to scale this technology for fielding. I would estimate that we are five to 10 years away from something in the field,” he says.

To make that happen, researchers must prove the technology works with much larger numbers of devices. “We have demonstrated single device operation on tens of devices, but something in the field would probably need roughly millions of devices. We have reason to be optimistic about this endeavor … but it is a long way from 10 to a million,” Schneider offers.

He adds that his team also is attempting to develop an architecture that takes advantage of the properties of the devices. “We have started doing this with physically realistic simulations, but this will also require a substantial effort in the next five to 10 years,” Schneider says.

The NIST synapse is a Josephson junction, long used in NIST voltage standards. These junctions are a sandwich of superconducting materials with an insulator as a filling. When an electrical current through the junction exceeds a level called the critical current, voltage spikes are produced. The synapse uses standard niobium electrodes but has a unique filling made of nanoscale clusters of manganese in a silicon matrix.

Synapse behavior also can be tuned by changing how the device is made and its operating temperature. By making the nanoclusters smaller, researchers can reduce the pulse energy needed to raise or lower the magnetic order of the device. Raising the operating temperature slightly from minus 456.07 degrees Fahrenheit to minus 452.47 degrees Fahrenheit (about 2 kelvins to 4 kelvins), for example, results in more and higher-voltage spikes.
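The spiking rule described in the two paragraphs above—a voltage spike whenever drive current exceeds the critical current, with a threshold that shifts with operating temperature—can be sketched as follows. The functional form and every constant here are illustrative assumptions, not the published device model.

```python
# Sketch of threshold spiking in a Josephson-junction-like element:
# a spike is emitted when drive current exceeds the critical current,
# and a higher operating temperature lowers the effective threshold,
# so the same drive produces more spikes. Constants are illustrative.

def critical_current(temperature_k, i_c0=100e-6, t_scale=8.0):
    """Assumed critical current (A) that falls linearly with temperature (K)."""
    return i_c0 * max(0.0, 1.0 - temperature_k / t_scale)

def count_spikes(drive_currents, temperature_k):
    """Count drive samples exceeding the junction's critical current."""
    i_c = critical_current(temperature_k)
    return sum(1 for i in drive_currents if i > i_c)

drive = [60e-6, 70e-6, 80e-6, 90e-6]       # drive current samples (A)
cold = count_spikes(drive, 2.0)            # colder operation: higher threshold
warm = count_spikes(drive, 4.0)            # warmer operation: more spikes
```

With these assumed numbers, warming the element from 2 K to 4 K lowers the threshold so more of the same drive samples cross it—the qualitative trend the article reports, not a quantitative model of the device.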

Crucially, the synapses can be stacked in 3-D to make large systems that could be used for computing. NIST researchers created a circuit model to simulate how such a system would operate.

The NIST synapse’s combination of small size, superfast spiking signals, low energy needs and 3-D stacking capability could provide the means for a far more complex neuromorphic system than other technologies have demonstrated, researchers say.

The Intelligence Advanced Research Projects Activity (IARPA) supported the research. The team’s work appears in the peer-reviewed scientific journal Science Advances.