•  IBM this summer will deliver to the Air Force Research Laboratory the 64-chip, brain-inspired IBM TrueNorth Neurosynaptic System, an artificial intelligence supercomputing system that will enable deep neural-network learning and information discovery. (Image courtesy U.S. Air Force)

AFRL Anticipates Arrival of Neuromorphic Supercomputer

January 29, 2018
By George I. Seffers

IBM is set to deliver the 64-chip TrueNorth System soon.


The U.S. Air Force Research Laboratory (AFRL) is set to receive a neuromorphic supercomputer from IBM this summer.

The AFRL and IBM announced last June that they were collaborating on the system, which an IBM press release described as a “first-of-a-kind brain-inspired supercomputing system” powered by IBM’s TrueNorth Neurosynaptic System. The scalable platform IBM is building for AFRL will feature an end-to-end software ecosystem designed to enable deep neural-network learning and information discovery. The system’s advanced pattern recognition and sensory processing power will equal 64 million neurons and 16 billion synapses, while the processor component will consume only 10 watts of power, roughly the equivalent “of a dim light bulb,” the IBM release states.

“We’ll be receiving a 64-chip neurosynaptic supercomputer IBM is developing in our collaboration with them this summer, which will provide us with a platform to further our research. This will be a key milestone in our research capabilities,” says Daniel Goddard, who directs AFRL’s Information Directorate.

IBM researchers believe the brain-inspired, neural network design of TrueNorth will be far more efficient for pattern recognition and integrated sensory processing than systems powered by conventional chips. AFRL is investigating applications of the system in embedded, mobile, autonomous settings where, today, size, weight and power are key limiting factors.

“The need to successfully conduct warfighting operations in contested environments requires size, weight and power-efficient, intelligent, onboard computing capabilities that can autonomously detect and classify targets, predict threats, recognize complex events and perform cognitive reasoning to enable robust decision making in the battle space,” Goddard explains. “Neuromorphic computing, hardware, software and systems inspired by the working mechanism of the human brain can achieve at least three orders of magnitude size, weight and power improvement over conventional computing architectures and algorithms.”

Goddard adds that the lab’s neuromorphic computing research team focuses on developing and fielding the game-changing technology through innovations in new, massively parallel computing, in-memory processing architectures, new nanoelectronic devices and circuits, hardware-optimized deep learning models, algorithms and applications.

While conventional supercomputers have proven capable of beating humans at chess, warfighters operate in a complex, confusing and imperfect domain and could use smarter machines to help them navigate the combat environment.

“With chess, you know every possible state of the game, so for any point in the game, you perfectly know every possible move one can make. You can calculate them,” Goddard points out. “With the game of poker, you bring into play imperfect information. There’s bluffing involved, and you’re not quite sure what cards everybody has and what’s still in the deck.”

Warfighters, he adds, also must deal with imperfect information, and vast amounts of it. What does it mean to operate in a multi-domain world? It means processing power.

“In order to power multi-domain command and control, in order to power cyber space superiority, you’re going to have to go through the mountains of data. You must have processing power. Whether it’s neuromorphic computing, whether it’s quantum computing or whether it’s some type of hybrid computing, we in the information directorate are pursuing extreme computing techniques to power the capabilities the Air Force will need in the future,” he states.

The IBM TrueNorth Neurosynaptic System can efficiently convert data, such as images, video, audio and text from multiple, distributed sensors into symbols in real time, says the company’s release. AFRL will combine a right-brain perception capability with the left-brain symbol processing capabilities of conventional computer systems. The large scale of the system will enable both data parallelism, where multiple data sources can be run in parallel against the same neural network, and model parallelism, where independent neural networks forming an ensemble can be run in parallel on the same data.
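The distinction between the two modes can be sketched in plain Python. This is an illustrative toy, not TrueNorth's actual API or software ecosystem; the stand-in "network" (a thresholded weighted sum) and all weights are hypothetical.

```python
def tiny_net(weights, x):
    """Hypothetical stand-in for a neural network: thresholded weighted sum."""
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1 if s > 0 else 0

# Data parallelism: the SAME network processes multiple sensor feeds in parallel.
net = [0.5, -0.25, 1.0]
sensor_feeds = [[1, 0, 1], [0, 1, 0], [1, 1, 1]]
data_parallel = [tiny_net(net, feed) for feed in sensor_feeds]

# Model parallelism: an ensemble of INDEPENDENT networks processes the SAME data.
ensemble = [[0.5, -0.25, 1.0], [-1.0, 0.5, 0.5], [0.2, 0.2, 0.2]]
feed = [1, 0, 1]
model_parallel = [tiny_net(w, feed) for w in ensemble]

print(data_parallel)   # one output per sensor feed, same network
print(model_parallel)  # one output per ensemble member, same data
```

On real hardware the per-feed and per-network computations would run concurrently across chips; the list comprehensions here only mark where that parallelism lives.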

The system will enable 512 million neurons per rack. A single processor in the system consists of 5.4 billion transistors organized into 4,096 neural cores creating an array of 1 million digital neurons that communicate with one another via 256 million electrical synapses.    
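The quoted figures are mutually consistent, as a quick back-of-the-envelope check shows. The per-chip numbers and the 64-chip total come from the article; the chips-per-rack count is inferred from the 512-million-neurons-per-rack figure, not stated by IBM.

```python
# Figures quoted in the article.
NEURONS_PER_CHIP = 1_000_000        # 4,096 neural cores per chip
SYNAPSES_PER_CHIP = 256_000_000
CHIPS_IN_AFRL_SYSTEM = 64

system_neurons = CHIPS_IN_AFRL_SYSTEM * NEURONS_PER_CHIP    # 64 million
system_synapses = CHIPS_IN_AFRL_SYSTEM * SYNAPSES_PER_CHIP  # 16.384 billion, "16 billion" in the release

# Inferred, not stated: 512M neurons per rack implies 512 chips per rack.
NEURONS_PER_RACK = 512_000_000
chips_per_rack = NEURONS_PER_RACK // NEURONS_PER_CHIP

print(system_neurons, system_synapses, chips_per_rack)
```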

The IBM TrueNorth Neurosynaptic System was originally developed under the auspices of Defense Advanced Research Projects Agency’s (DARPA) Systems of Neuromorphic Adaptive Plastic Scalable Electronics program in collaboration with Cornell University.

To learn more about AFRL's use of artificial intelligence technology, read the March edition of SIGNAL Magazine.


