• Panelists at the Defensive Cyber Operations Symposium discuss AI in the C2 domain.
• Deputy Assistant Secretary of Defense for Command, Control, Communications, Cyber and Business Systems George Duchak moderates a panel at DCOS.
• Col. Paul Craft, USA, director, Operations, J-3, JFHQ-DODIN, discusses AI and C2 during a panel at DCOS.
• Misty Blowers, director of research, development and strategy, Peraton Inc., discusses AI for defense at DCOS.
• Terry Carpenter, director and program executive officer, Services Development Directorate, DISA, discusses kinetic warfare at DCOS.
• Daniel Prieto, strategic executive, Google Cloud, discusses AI and machine learning during a panel at DCOS.

Artificial Intelligence Use in Command and Control

May 17, 2018
Kimberly Underwood

To use AI in the C2 domain, the military needs quality data and should look to commercial solutions, experts say.


Experts speaking at the AFCEA Defensive Cyber Operations Symposium in Baltimore agree that the use of artificial intelligence (AI) in warfighting, and in command and control (C2) applications in particular, could give the warfighter advantages in faster information processing, improved decision making and stronger cyber defense. The hitch, though, is that the quality of the data used to build algorithms and train machine learning models can vary. This affects the quality of AI-derived conclusions, which could put warfighters at great risk. The commercial sector also offers untapped resources, in terms of AI and blockchain functionality, that the military should examine, an expert said.

Deputy Assistant Secretary of Defense for Command, Control, Communications, Cyber and Business Systems George Duchak offered that use of AI in defensive cyber operations for C2 hinges directly on the quality of information beforehand. “AI systems are really only as good as their training data and we've seen fairly recently that adversarial learning shows that a small amount of corrupted training data could have huge impacts on the predictive ability of the AI expert system.”
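Duchak's warning about adversarial learning can be illustrated with a minimal, self-contained sketch. The data, threshold values and the deliberately simple centroid classifier below are all hypothetical, not any system discussed at the symposium; the point is only that mislabeling a single training sample can shift a model's decision boundary enough to misclassify a borderline event.

```python
# Sketch: a tiny nearest-centroid classifier, and how one poisoned
# (mislabeled) training sample changes its predictions.

def train_centroids(data):
    """Compute the per-class mean feature value from (value, label) pairs."""
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda y: abs(x - centroids[y]))

# Clean training set: "benign" events cluster near 1.0, "malicious" near 5.0.
clean = [(0.8, "benign"), (1.0, "benign"), (1.2, "benign"),
         (4.8, "malicious"), (5.0, "malicious"), (5.2, "malicious")]

# Poisoned copy: one mislabeled sample drags the benign centroid upward.
poisoned = clean + [(5.0, "benign")]

clean_model = train_centroids(clean)     # benign centroid: 1.0
bad_model = train_centroids(poisoned)    # benign centroid drifts to 2.0

# A borderline event at 3.4 is flagged by the clean model
# but slips past the poisoned one.
print(predict(clean_model, 3.4))  # -> malicious
print(predict(bad_model, 3.4))    # -> benign
```

One corrupted sample out of seven is enough to flip the classification of the borderline event, which is the scale of effect Duchak describes.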

Col. Paul Craft, USA, director, Operations, J-3, Joint Force Headquarters–Department of Defense Information Networks (JFHQ-DODIN), offered that AI could be a means to achieving a secure network. “Artificial intelligence will help us speed our ability to get to a secure network, to see bad things happening and be able to make decisions faster.” As someone who works to secure the Department of Defense’s information network, Craft knows how hard it is to defend against the 1 billion events a day coming in against the network, and this is where AI could help the department automate some of its network defense.

AI functionality needs to be adaptive to respond to a full spectrum of cyber attacks and to be effective on the battlefield for C2 and cyber defense-related uses, said Misty Blowers, director of research, development and strategy, Peraton Inc. The AI capability in use needs to be able to read what is happening on the battlefield, understand it and offer the best solution to the warfighter.

“I mean you've got disruption, destruction, denying services, deceiving, degrading your systems,” she said. “And to be able to fight through and be able to carry out the mission in the presence of these different attacks, it's important for these AIs to be able to negotiate and be able to realize which functionality is important—[to determine what is] most important for the current situation, for the current environment and for the current operational scenario that they're facing.”

Blowers is optimistic that this can be done, and done “in a way that is covert.” She advised the military to borrow some AI functionality from the commercial sector, especially in the convergence of AI and blockchain. These architectures could improve distributed communications, enabling effective C2, she noted. “In the commercial sector, there are extremely novel techniques of imparting distributed trustless capability and functionality, which in my opinion we don't pay enough attention to,” she stated.

Blowers pointed to the creative blockchain constructs of peer-to-peer networks, game theoretics and cryptography. “There are some advancements in smart contracts where the smart contracts aren't so much a contract as much as they are an artificial intelligence that enables secure communications and proof and verifiability of message trafficking,” she said. “These constructs allow for a negotiation of autonomy—autonomous distributed systems to be enabled through blockchain. They allow for multiple levels of encryption, even deceptive messaging to kind of throw things off, and untraceable communications.”

Terry Carpenter, director and program executive officer, Services Development Directorate, Defense Information Systems Agency (DISA), acknowledged that the warfighter is hindered by the sheer amount of data coming off of sensors and systems. “I listen to the warfighter and the folks trying to deal with this new domain and what I overwhelmingly hear is this volume of data and just moving that volume of data to the cloud [is difficult],” he said. AI can make a difference on the battlefield, but not without forethought.

With kinetic warfare, Carpenter said, leaders put a lot of energy into building the policies, procedures and concepts of operations (CONOPS). For the cyber domain, leaders need to do the same. “It's an opportunity but it's also a challenge,” he warned. “And the challenge is we need to take the time to get a handle on that data and what it's really telling us and tag it appropriately so that when we do get to AI, we can actually make heads or tails of it in a meaningful way faster, because we took the time to really understand it.” If not, he said, “When you're looking at the data [it could be indicating] all kinds of false positives.”

Daniel Prieto, strategic executive, Google Cloud, advised not to treat AI “separately as some shiny object” because doing so could lead to moving “too quickly” to fully autonomous activities. AI could make an impact in “augmenting the human element,” he said, cautioning that AI should not be “unleashed” independently. “It should always be learning from human analysts to always tune the AI and ML models,” he stated. “I think there's so much out there in terms of military use of AI, but we always need to tether it to the success of the men and women on the front lines. Our objectives should be to keep a human in the loop for a significant time [to come] before we even contemplate purely autonomous action.”

Prieto agreed that quality data is the key to AI. “It’s true, it’s sort of a garbage in garbage out if the data is flawed.” He also noted that training AI models is not easy. “It takes large volumes of data before you get confidence in it and it requires appropriate tagging of data elements to build the right models,” he said.
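The tagging Prieto and Carpenter describe can be sketched in miniature. The field names and the single-rule heuristic below are hypothetical; in practice, tagging would combine automated rules with analyst review before the labeled events feed a model.

```python
# Sketch: attaching labels ("tags") to raw network events so they
# can later serve as training data for a supervised model.

raw_events = [
    {"src": "10.0.0.5", "bytes_out": 120},
    {"src": "10.0.0.9", "bytes_out": 98000},
]

def tag_event(event, threshold=50000):
    """Label an event with a deliberately naive volume rule.
    Real pipelines would validate the label with analyst review."""
    tagged = dict(event)
    tagged["label"] = "suspicious" if event["bytes_out"] > threshold else "normal"
    return tagged

training_set = [tag_event(e) for e in raw_events]
print([e["label"] for e in training_set])  # -> ['normal', 'suspicious']
```

Taking the time to tag data this way up front, as Carpenter argued, is what lets a later model distinguish meaningful signals from false positives.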
