Brainwaves Boost Intelligence

October 2011
By George I. Seffers, SIGNAL Magazine

 
The U.S. Defense Department may be on the brink of harnessing brain signals for intelligence analysts.

Researchers working for the U.S. Defense Department are nearing completion of a six-year project designed to harness brainwaves for imagery analysis, significantly improving the speed and accuracy of identifying critical information. The program brings operational neurotechnology into the realm of imagery analysis via advances in signal processing, human-computer interfaces and groundbreaking neuroscience, with the goal of providing new tools to warfighters.

Geospatial intelligence analysts face an enormous volume of imagery, especially in this age of advanced satellite data collection. Only a fraction of the data collected can be processed and reviewed in a timely manner, leaving work-weary analysts scrambling to keep up with a task that is both time-consuming and tedious. By tapping into the brain’s electrical signals, however, this project seeks to lighten the workload and help analysts better manage the deluge of incoming information.

The Neurotechnology for Intelligence Analysts (NIA) program is designed to record brain signals in an operational environment and process the signals in real time to select images that need further review. The process someday could apply to static, broad-area and video imagery. In the coming months, the Defense Advanced Research Projects Agency (DARPA) expects to complete the third and final phase of research and development on the NIA program and turn it over to the National Geospatial-Intelligence Agency (NGA) for potential fielding. The NGA is the transitional partner and has worked closely with DARPA from the beginning of the program. The NGA will decide which prototypes, if any, will be fielded. That decision will come after analysis of a competition, which was held last summer. The analysis could take as much as six months, according to an NGA official.

The competition included three prototype systems from Teledyne Technologies Incorporated, Thousand Oaks, California; Honeywell International, Morristown, New Jersey; and a team involving Columbia University and Neuromatters LLC, both located in New York.

The three prototypes were installed in a geospatial analysis testbed owned by the NGA, and experiments were conducted with image analysts. For comparison, participants analyzed images using the traditional method as well. Todd Hughes, DARPA’s NIA program manager, likens the traditional process of broad-area search to a dog owner searching for pet photographs. “Imagine you have a stack of photos on your hard drive, and you’re looking for photos of your dog. You flip through all those photos and pull out the ones of the dog and put those in a separate file,” Hughes says.

Except, of course, defense analysts are more likely to be searching for airplanes, tanks or ammunition stockpiles. “The way it’s done today is that an analyst will take an image of a big piece of real estate and zoom in and scroll across the terrain looking for different objects and then enter that into the record. That process takes a very long time, and it’s very intensive and, quite frankly, not very interesting,” Hughes explains.

Intelligence agencies can speed up the process by breaking down a larger image into smaller, more manageable pieces known as chips. When an image with target data flashes before the eyes, the viewer’s brain sends out a signal within 300 milliseconds, before the analyst even consciously realizes the image contains something interesting. Sensors in an electroencephalography (EEG) cap, of the kind traditionally used in hospitals for monitoring brainwaves, detect that response, known as the P300. “Instead of panning and scanning and scrolling across the landscape to look for particular objects, we’re creating a bunch of smaller images and then flashing them in front of image analysts,” Hughes says. “Every time one of those chips appears containing something an analyst is looking for, that P300 goes off, and that image is put into a smaller folder. We’re anticipating that this could at least double the rates at which an image analyst can research an area of terrain.”
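
The chipping step described here amounts to tiling a large image into fixed-size squares while remembering where each tile came from. A minimal sketch (the function name and tile size are illustrative assumptions, not the NIA software) might look like:

```python
def chip_image(image, chip_size=256):
    """Split a 2-D image (height x width array of pixel values) into square chips.

    Returns a list of ((top, left), chip) pairs; the offset is kept so that a
    chip flagged by the analyst's P300 response can be mapped back to its
    location in the original broad-area image. Edge chips may be smaller
    than chip_size.
    """
    height, width = len(image), len(image[0])
    chips = []
    for top in range(0, height, chip_size):
        for left in range(0, width, chip_size):
            chip = [row[left:left + chip_size] for row in image[top:top + chip_size]]
            chips.append(((top, left), chip))
    return chips
```

In a rapid-serial-presentation setup, these chips would then be flashed to the analyst one after another, with flagged chips routed to a smaller review folder.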

Doubling the image-processing rate would, of course, mean that an analyst could do twice the amount of work in the same amount of time, or the same amount of work in half the time. “Either way, it’s a significant improvement to that process, which is a very important task that image analysts have to perform,” Hughes says.

The human brain continually generates various kinds of electrical signals, or brainwaves, that indicate a person’s state of mind. Beta waves, for example, indicate an excited or stressed mind; alpha waves indicate a conscious but relaxed mind. Theta waves and delta waves indicate varying degrees of sleep. Beta waves fire at the fastest rate of the four, cycling up to 40 times per second. The brain can transmit more than one kind of wave simultaneously, but usually one kind will dominate. An EEG works by picking up these electrical signals through scalp electrodes and feeding them to a machine that amplifies them.
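
The bands mentioned above are conventionally told apart by frequency. A small illustrative classifier follows; the band boundaries are the commonly cited textbook approximations, not figures from the NIA program:

```python
def classify_band(freq_hz):
    """Map an EEG oscillation frequency in hertz to its conventional band name.

    Boundaries are approximate and vary slightly between sources:
    delta below 4 Hz, theta 4-8 Hz, alpha 8-13 Hz, beta above 13 Hz.
    """
    if freq_hz < 4:
        return "delta"   # deep sleep
    if freq_hz < 8:
        return "theta"   # light sleep, drowsiness
    if freq_hz < 13:
        return "alpha"   # relaxed but conscious
    return "beta"        # alert, excited or stressed
```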

The brain also is adept at rapidly and accurately identifying numerous objects each day. It brilliantly homes in on important features of an image while disregarding irrelevant elements. Modern neuroscience techniques have decoded much of the biology behind this capability, and the field of visual neuroscience has grown to include a wide array of experimental and analytical techniques, according to NIA program documentation. The converging neurophysiology, sensor and signal processing advances indicate that the human visual system is a top-notch target detector whose signals can be harnessed for operational purposes.

Analysts also must sometimes view images of the same geographical areas taken at various times and try to spot significant differences. Computer programs have seen limited success in identifying targets or changes in imagery, but so far, substantial investments by industry and the government have not produced technologies capable of keeping up with the demand for imagery analysis. For example, computer programs are not capable of the subtle analysis necessary to distinguish between a peaceful public gathering and a chaotic public protest. Computers may be improving their chess game, but humans so far have the advantage in identifying critical intelligence data, and the NIA seeks to press that advantage further.

The NIA began in 2005, building on P300 research performed by neuroscientists. “At DARPA, we decided to investigate how that might be applied to broad area search. P300 is one of the brain signals that has been more extensively studied, and it’s also easy to detect. One reason we are confident this will work is that P300 is strong and reliable in the presence of a recognized object,” Hughes adds.
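
As a rough illustration of why the P300 is easy to detect, a deliberately simplified check (a toy threshold test, not the classifiers the NIA teams actually built) can compare the average EEG amplitude in a window around 300 milliseconds after the stimulus against the pre-response baseline:

```python
def detect_p300(epoch_uv, sample_rate_hz, threshold_uv=5.0):
    """Very simplified P300 check on one stimulus-locked EEG epoch.

    `epoch_uv` holds microvolt samples starting at stimulus onset. Returns
    True if the mean amplitude in the 250-450 ms window exceeds the mean of
    the preceding baseline samples by `threshold_uv` microvolts. Real systems
    use trained classifiers over many channels, not a single threshold.
    """
    start = int(0.250 * sample_rate_hz)
    end = int(0.450 * sample_rate_hz)
    window = epoch_uv[start:end]
    baseline = epoch_uv[:start]
    if not window or not baseline:
        return False
    mean = lambda xs: sum(xs) / len(xs)
    return mean(window) - mean(baseline) > threshold_uv
```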

During phase two of the NIA program, 12 imagery analysts from the Defense Department conducted a test of the various NIA platforms against traditional search methods. The analysts searched imagery that ranged from 225 to 300 square kilometers for signs of surface-to-air missile sites and urban helipads. Using the NIA platform, they showed a minimum of 600 percent improvement, according to the documentation. Hughes says researchers will not know precisely how much is gained until the final tests are done and evaluated.

The three teams involved took significantly different approaches. Teledyne paired brainwave monitoring with eye tracking. The resulting target-detection brain patterns were used to identify a variety of targets in multiple imagery types. The Columbia University team developed a system that allowed users to jump to regions of imagery most likely to contain targets based on the analysts’ brain signals during previous viewings. Honeywell’s phase two system fused data from the analysts’ brain responses and button presses, with the results displayed as target probability maps overlaid on imagery, which the analysts used to verify targets.
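
The article does not describe how Honeywell combined the two evidence sources, but one standard way to fuse independent detector outputs into a single target probability is a naive Bayes-style combination of odds; the sketch below is purely illustrative:

```python
def fuse_probabilities(p_eeg, p_button, prior=0.5):
    """Combine two detection probabilities into one posterior probability.

    Assumes the EEG response and the button press are conditionally
    independent given the true target state, so their likelihood-ratio
    odds multiply. The prior is the assumed base rate of targets per chip.
    """
    odds = prior / (1 - prior)
    for p in (p_eeg, p_button):
        p = min(max(p, 1e-9), 1 - 1e-9)  # clamp to avoid division by zero
        odds *= p / (1 - p)
    return odds / (1 + odds)
```

Posteriors like this could then be rendered as a heat map overlaid on the imagery for the analyst to verify.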

The NIA is a strategic-level program, but DARPA is exploring neuroscience for a broad array of uses, including the control of robotic prostheses and enhancing training and tactical situational awareness. “One of the interesting concepts on the horizon is the interpretation of different kinds of signals for control interfaces. We’ve been looking at EEG signals as an indicator of motor skills,” Hughes says. He explains that DARPA envisions a device that will interpret neurosignals as a way of moving an avatar through a virtual environment, instead of using a controller, such as a joystick, a gamepad, or even a Kinect or similar device. “That, I think, has a number of potentially revolutionary applications to control virtual avatars but also to control virtual devices like robots, for example. That’s one area we’re investigating that I think has a lot of promise.”

Another related program, the Cognitive Technology Threat Warning System, seeks to combine brainwaves with advanced optics technologies to provide soldiers with next-generation binoculars designed to improve threat detection and situational awareness. “We certainly envision much more tactical applications of neurotechnology,” Hughes says.

RELATED ARTICLES: http://bit.ly/nYfcRQ, http://bit.ly/nWTbwP

WEB RESOURCES
DARPA Defense Sciences Office: http://1.usa.gov/p5DsoK
National Geospatial-Intelligence Agency: https://www1.nga.mil/Pages/Default.aspx