ARL Researches the Human Brain

Scientists at the Army Research Laboratory (ARL) monitor a soldier’s brainwaves as he operates systems in a simulated tank. The work seeks to understand thought patterns and physical states during combat, with the goal of teaming the soldier with artificial intelligence.
U.S. Army scientists are learning more about how the human brain functions so they can team its bearer with artificial intelligence (AI). The goal is for AI to understand a soldier’s moods and feelings and adjust its own actions accordingly.
Researchers aim for a future iteration of AI that would measure a soldier’s cognitive and physical state and trigger actions that would support, or even save, the individual in combat. These actions might redirect the human onto a different course or, ultimately, initiate activities that complete the soldier’s mission or protect the individual.
But first, scientists must understand these cognitive signs so they can cue the AI to respond accurately. That entails reading thoughts and discerning moods revealed by brainwaves and physical appearances.
“What we really want to try to do is have the AI be able to adapt to the real-time state changes of the human,” explains Jean Vettel, Ph.D., senior science lead at the Combat Capabilities Development Center (CCDC), Army Research Laboratory (ARL). This means changes in the human’s intent and what task the person is about to perform. The AI would glean this information from a change in the person’s response to the environment, she says.
She offers as an example a stressed-out soldier whose AI determines that a terrorist is in a nearby building. A human teammate might implement a lockdown on all the building’s doors. The advanced AI would take the place of that human teammate and perform the same lockdown autonomously.
The first step in this research has focused on physiological signals in the human body that can be measured to predict a person’s state, Vettel relates. The ARL’s research effort is exploring all types of physiological indicators, ranging from polygraph-like indicators to true brainwave readings. “We actually expect that we will likely use multiple physiological signals to increase our confidence of the state estimate,” Vettel says. “Any given signal from our bodies is ambiguous,” she notes, explaining that any one of many different variables may cause a change in a single indicator, and it’s hard to know how to interpret that signal. However, if soldiers are equipped with sensors for several different physiological signals, then incorporating knowledge across the signals will provide the necessary disambiguation, she says.
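Vettel’s point that any single physiological signal is ambiguous, while several together disambiguate the state, is essentially a sensor-fusion argument. The sketch below illustrates the idea with a naive-Bayes combination of three imagined channels; the sensor names and likelihood numbers are invented for illustration and are not drawn from ARL’s work.

```python
# Hypothetical per-sensor likelihoods: P(reading | stressed) and
# P(reading | calm) for three imagined physiological channels.
# All numbers are illustrative, not real sensor models.
readings = [
    ("heart_rate",   0.80, 0.30),  # elevated heart rate
    ("skin_conduct", 0.70, 0.40),  # raised skin conductance
    ("eeg_band",     0.65, 0.35),  # shifted EEG band power
]

def fuse(prior, readings):
    """Naive-Bayes fusion: each ambiguous signal nudges the odds;
    combined, the signals disambiguate the state estimate."""
    odds = prior / (1.0 - prior)
    for name, p_if_stressed, p_if_calm in readings:
        odds *= p_if_stressed / p_if_calm
    return odds / (1.0 + odds)

prior = 0.5
single = fuse(prior, readings[:1])    # one signal alone is ambiguous...
combined = fuse(prior, readings)      # ...three together raise confidence
print(f"one signal:  P(stressed) = {single:.2f}")
print(f"all signals: P(stressed) = {combined:.2f}")
```

With these made-up numbers, one signal alone yields a state estimate of about 0.73, while fusing all three pushes it to about 0.90, mirroring Vettel’s claim that incorporating knowledge across signals increases confidence.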
Vettel cites recent research into whether brain signals can account for past performances of an individual person—leading to predicting what that person might do. “Instead of constantly viewing the person as a nebulous creature that you could only learn about if you asked them questions, let’s instead focus on science that capitalizes on advancement and how we can image our physiology, our brain data, to be able to analyze and ask, ‘Can I use the signals in my brain to predict something about my performance?’” she suggests. “Because, if I can detect that relationship, then if I’m currently performing a task on whatever mission I’m doing, and brain signals can indicate that I’m not going to perform the task very well, that is a way we can start capitalizing on detecting the state I need, or where I could benefit from a teammate that could help me perform that task better.” That teammate could be another human or AI, she notes.
Recent research has shifted to examining how to collect sufficient data from a single person, removing the need to average statistics across numerous people to obtain enough statistical power for analysis, she says. Individual differences in each person’s brain fundamentally alter the way that brain’s dynamics unfold. As a result, research is focusing on predicting state at the individual level so that technology can adapt to a specific person.
That research also will help predict how an individual will act or react in a situation. Capturing a model of an individual would define how that person’s brain functions, as opposed to a one-size-fits-all definition that risks misevaluation.
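The contrast between a one-size-fits-all model and individual models can be made concrete with a toy example. In the sketch below, two imagined subjects show opposite relationships between a brain-signal feature and task performance; averaging across them washes the effect out, while per-person fits recover it. The subjects, features, and values are all hypothetical.

```python
# Illustrative only: two imagined subjects whose brain-signal feature
# relates to task performance with opposite slopes. A pooled fit
# cancels the effect; per-person models recover it.
subjects = {
    "A": [(1.0, 0.9), (2.0, 0.7), (3.0, 0.5)],  # higher signal -> worse
    "B": [(1.0, 0.5), (2.0, 0.7), (3.0, 0.9)],  # higher signal -> better
}

def slope(pairs):
    """Least-squares slope of performance vs. signal feature."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    num = sum((x - mx) * (y - my) for x, y in pairs)
    den = sum((x - mx) ** 2 for x, _ in pairs)
    return num / den

pooled = slope([p for pts in subjects.values() for p in pts])
print(f"pooled slope: {pooled:+.2f}")  # zero: the effect cancels out
for name, pts in subjects.items():
    print(f"subject {name}: slope {slope(pts):+.2f}")  # real per-person effect
```

Here the pooled slope is exactly zero even though each individual shows a clear relationship, which is the misevaluation risk the one-size-fits-all approach carries.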
Another research goal aims to “push into more complex tasks,” Vettel says. Most existing efforts have tended to associate particular brain mechanisms with specific actions. Laboratory studies of people sitting alone responding to pictures they are shown have revealed a lot about brain functions, she notes, but ARL research is exploring more nebulous areas. She relates that the laboratory recently collected data from two people driving in an instrumented car along I-95, recording brain data from the driver as the passenger shared previously unknown information. The lab wanted to know whether the brain signals that emerged while the two people discussed the information could predict what the driver would remember from the conversation—how well the information was communicated.
“We’re pushing into, ‘Can we use brain data to predict performance in these naturalistic settings,’ namely whenever there is risk involved,” Vettel continues. Driving safely along a busy multilane interstate highway qualifies as a risky endeavor, she notes. There is risk in the primary task, but the scientists also are looking at whether the physiological signals can predict a secondary task, such as communication.
More on the ARL’s effort to pair brains with AI, including what the lab ultimately hopes to accomplish, is in the August issue of SIGNAL Magazine, available online August 1.