AI May Benefit Cyber Defense More Than Offense
Artificial intelligence (AI) capabilities offer greater benefits to cyber defenders than to adversaries, according to U.S. Army Cyber Command officials. And generative AI, in particular, could provide “overwhelmingly positive” effects for the command.
Mark A. “Al” Mollenkopf, Army Cyber Command’s science advisor to the commanding general, acknowledged that AI poses a threat. “There’s a lot of talk in the AI domain right now and a little bit of concern that bad actors are going to use AI. AI is going to really lower the bar to certain types of malicious behavior, especially phishing, malware generation and disinformation,” he said.
But for the most part, AI offers an upside for Army cyber forces, he added. “However, I also think that we’re going to see some advanced AI tooling that’s going to help us to detect disinformation very effectively. It’s going to help us detect advanced phishing and detect advanced forms of malware.”
Steven Rehn, the Army Cyber Command chief technology officer, noted that network complexity and integration, the amount of data, and the amount of knowledge needed to operate and defend the network continue to grow exponentially. AI and machine learning can help lower that complexity, enhancing decision-making for network operations and defense.
Ultimately, Rehn said, AI and machine learning shift the power dynamic in the cyber realm. “Right now, the old adage is the advantage goes to the attacker. Today, I think with AI and machine learning, it starts to shift that paradigm to giving an advantage back over to the defender. It’s going to make it much harder for the offensive side.”
The command is in the early stages of building an AI system for continuous cyber monitoring of systems, Mollenkopf revealed. “We’re thinking that we’re at the point now from an artificial intelligence perspective, where continuous monitoring can be plugged into existing data platforms and seems to enhance the visibility and security writ large. The idea is to have AI-driven systems that can automatically adjust their machine learning weights as functional drift occurs over time. That sort of thing is really critical to the effectiveness of continuous monitoring systems.” Machine learning models, especially neural networks, have weights that must be retuned over time to remain effective.
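Mollenkopf did not describe the command’s implementation, but the pattern he alludes to, an anomaly detector that re-tunes itself as the data drifts, can be sketched in a few lines. The sketch below is purely illustrative: it assumes scikit-learn’s IsolationForest as a stand-in model and a simple statistical drift check, none of which reflects Army Cyber Command tooling.

```python
# Illustrative sketch only: a continuous-monitoring loop that re-fits its
# anomaly model when the statistics of incoming telemetry drift.
# IsolationForest is a stand-in for whatever model a real system would use.
import numpy as np
from sklearn.ensemble import IsolationForest


class DriftAwareMonitor:
    def __init__(self, reference: np.ndarray, drift_threshold: float = 3.0):
        self.reference = reference              # baseline window of benign telemetry features
        self.drift_threshold = drift_threshold  # average z-score that triggers a refit
        self.model = IsolationForest(random_state=0).fit(reference)

    def _drift_score(self, batch: np.ndarray) -> float:
        # Crude drift signal: distance of the batch mean from the baseline mean,
        # measured in baseline standard deviations (per feature, then averaged).
        ref_mean = self.reference.mean(axis=0)
        ref_std = self.reference.std(axis=0) + 1e-9
        return float(np.abs((batch.mean(axis=0) - ref_mean) / ref_std).mean())

    def process(self, batch: np.ndarray) -> np.ndarray:
        if self._drift_score(batch) > self.drift_threshold:
            # "Functional drift" detected: fold the new data into the baseline
            # and re-fit so the model's weights track current behavior.
            self.reference = np.vstack([self.reference, batch])[-5000:]
            self.model = IsolationForest(random_state=0).fit(self.reference)
        # IsolationForest convention: -1 marks an anomaly, 1 marks normal traffic.
        return self.model.predict(batch)


# Example: baseline of benign telemetry, then a batch with a shifted distribution.
rng = np.random.default_rng(0)
monitor = DriftAwareMonitor(rng.normal(0, 1, size=(1000, 4)))
flags = monitor.process(rng.normal(4, 1, size=(50, 4)))
```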
AI could continuously monitor various types of Army systems, possibly including some weapon platforms. “I think a lot of our weapons systems are not technically able to be connected in real time, so increasing their security may take different methods of monitoring. But that is definitely going to be part of the research that we’re doing: to see how we can monitor some of these systems that are intermittently connected, how we can improve their security as well,” Mollenkopf elaborated.
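The “different methods of monitoring” for intermittently connected platforms were not specified, but one widely used pattern is store-and-forward telemetry: screen and queue events locally, then upload the backlog whenever a link is available. The sketch below illustrates only that generic pattern; the event fields and the hypothetical send_to_backend transport are assumptions.

```python
# Illustrative store-and-forward pattern for a platform that is only
# intermittently connected: events are queued locally and flushed to the
# monitoring backend whenever a link becomes available.
import json
import time
from collections import deque


class IntermittentTelemetry:
    def __init__(self, send_to_backend, max_buffer: int = 10_000):
        self.send_to_backend = send_to_backend   # hypothetical transport callable
        self.buffer = deque(maxlen=max_buffer)   # oldest events drop first if full

    def record(self, event: dict) -> None:
        event["recorded_at"] = time.time()
        self.buffer.append(json.dumps(event))

    def flush(self, link_up: bool) -> int:
        # Called periodically; only drains the queue when connectivity exists.
        sent = 0
        while link_up and self.buffer:
            self.send_to_backend(self.buffer.popleft())
            sent += 1
        return sent


# Example: record while disconnected, flush once the link comes back.
telemetry = IntermittentTelemetry(send_to_backend=print)
telemetry.record({"platform": "example", "signal": "integrity_check", "ok": True})
telemetry.flush(link_up=False)   # nothing sent yet
telemetry.flush(link_up=True)    # backlog drains
```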
Rehn added that Army Cyber Command is working with the office of the Assistant Secretary of the Army for Acquisition, Logistics and Technology (ASA(ALT)) on security and continuous monitoring of weapon systems. Securing a platform is one thing; monitoring and alerting operators of malicious behavior is another, he indicated. “ASA(ALT), I think with our advice, will figure out how to secure the platform and all weapon systems as appropriate. How do we alert to make sure that in real time somebody is aware, as appropriate, that there’s something wrong? We’re finding malicious behavior on this particular weapon or platform and then we can take action. So, the commander now has the visibility and understands that there is a potential risk and can make a risk determination on how to execute and operate.”
Penetration testing might be required to test AI systems. “I don’t want to go into too much detail here because this is really early. We haven’t gotten to the next level of clearly defining, but at a high level, we think that there’s some utility in having automated pen testing to make sure that the continuous monitoring AIs are actually catching things that resemble enemy or adversary activity,” Mollenkopf said.
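As Mollenkopf noted, the concept is still early, so the sketch below is only a generic illustration of the idea: replay synthetic adversary-like events through a detector and verify that a minimum fraction are flagged. The event generator, technique names and detector interface are all assumptions, not an Army test harness.

```python
# Illustrative check, not an actual test harness: replay synthetic
# adversary-like events through a detector and verify a minimum detection rate.
import random


def synthetic_adversary_events(n: int) -> list[dict]:
    # Toy stand-ins for adversary-like activity a red-team generator might emit.
    techniques = ["credential_dumping", "lateral_movement", "beaconing", "data_staging"]
    return [{"technique": random.choice(techniques), "severity": random.uniform(0.6, 1.0)}
            for _ in range(n)]


def test_monitoring_ai(detector, min_detection_rate: float = 0.5) -> bool:
    events = synthetic_adversary_events(200)
    detected = sum(1 for event in events if detector(event))
    rate = detected / len(events)
    print(f"detection rate: {rate:.1%}")
    return rate >= min_detection_rate


# Example with a trivial placeholder detector that flags high-severity events.
assert test_monitoring_ai(lambda event: event["severity"] > 0.7)
```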
The science advisor compared continuous monitoring AI to X-ray technology. “It’s going to be as transformational, I think, as X-ray technology was for medicine. We just have to think about ways to better structure how we do event logging and what kind of signals we actually need to emit so that we can see anomalies and malicious activity, like an X-ray across our system,” he suggested.
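The article does not specify how the command would structure its event logging, but the general idea, emitting every event with the same machine-readable fields so anomalies stand out, can be illustrated briefly. The field names below are assumptions, not an Army logging standard.

```python
# Illustrative only: one way to give event logs a consistent structure so an
# anomaly-detection model sees the same fields from every source.
import json
import time
from dataclasses import asdict, dataclass, field


@dataclass
class SecurityEvent:
    host: str
    process: str
    action: str             # e.g. "login", "file_write", "outbound_connection"
    outcome: str            # e.g. "success", "failure", "blocked"
    bytes_out: int = 0      # numeric signals make anomaly scoring easier
    timestamp: float = field(default_factory=time.time)

    def emit(self) -> str:
        # Consistent JSON lines are easy to correlate and to feed to a model.
        return json.dumps(asdict(self), sort_keys=True)


print(SecurityEvent("ws-042", "powershell.exe", "outbound_connection", "success",
                    bytes_out=48_000_000).emit())
```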
Command officials say that generative AI, which responds to prompts by providing content such as text, images and audio, may prove particularly useful for the service’s cyber forces. “I think generative AI is going to be a tool that’s going to really accelerate our ability to do complex tasks consistently, collectively, over time,” Mollenkopf offered. “For example, we’re starting to see in industry where we can use generative AI to generate some really complex queries that can be sent out to our data platforms to summarize correlated events or help us better see activity from a cybersecurity perspective.” He added that those are the near-term capabilities.
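Mollenkopf did not name the tools involved, so the following is only a hedged sketch of the idea: a generative model drafts a summarization query, and a guardrail validates it before it is sent to the data platform. The llm callable, the SPL-style query dialect and the validation rules are all assumptions.

```python
# Illustrative sketch of "generative AI writes the query, a guardrail checks it
# before it touches the data platform."
import re

PROMPT = (
    "Write a search query that summarizes failed logins per host over the "
    "last 24 hours, grouped by source address. Return only the query."
)

ALLOWED = re.compile(r"^\s*search\b", re.IGNORECASE)    # read-only queries only
FORBIDDEN = ("delete", "drop", "| outputlookup")        # crude deny-list


def generate_query(llm) -> str:
    query = llm(PROMPT).strip()
    # Never run generated text blindly: validate its shape before execution.
    if not ALLOWED.match(query) or any(token in query.lower() for token in FORBIDDEN):
        raise ValueError(f"generated query rejected: {query!r}")
    return query


# Example with a stub standing in for a real generative AI service.
def stub(prompt: str) -> str:
    return "search index=auth action=failure earliest=-24h | stats count by host, src_ip"


print(generate_query(stub))
```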
Benefits further in the future might include widespread document generation. “Using generative AI long-term as a mechanism to distribute knowledge and experience will be a game changer,” Mollenkopf asserted. “For example, if we have some highly tuned models that can be used by hundreds of personnel to generate documents—like statements of work or contracts—in a consistent way that saves us time and money, that’s going to be good for the government and the contracted partner.”
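No specific models or document formats were described; the consistency Mollenkopf describes typically comes from a fixed prompt template shared by every user. The template below, its section headings and the model callable are illustrative assumptions only.

```python
# Illustrative only: a fixed prompt template so many users get documents in the
# same structure from the same tuned model.
SOW_TEMPLATE = """Draft a statement of work with exactly these sections:
1. Scope
2. Deliverables
3. Period of Performance
4. Acceptance Criteria

Project: {project}
Deliverables: {deliverables}
Period of performance: {period}
Use formal contracting language and do not add sections."""


def draft_sow(model, project: str, deliverables: str, period: str) -> str:
    # The model callable stands in for a tuned generative AI service.
    return model(SOW_TEMPLATE.format(project=project,
                                     deliverables=deliverables,
                                     period=period))


# Example with a stub model that simply echoes the prompt it received.
print(draft_sow(lambda prompt: prompt,
                "Network sensor refresh", "Installed sensors, test report", "12 months"))
```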
However, the most compelling capability for AI will be at the warfighting edge where data and decision-making come together, he suggested. That could include someone in an office using an AI personal assistant or deployed soldiers on a tactical computer with an intermittent network using AI to monitor troop positions, intelligence or disruptive changes in the operating environment. “Where AI, including generative AI, is going to be the most impactful is at the edge. This is where we’ll be able to accelerate better decision-making. This is going to be critical for the Army as we move forward,” Mollenkopf said.
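What such edge tooling would look like was not detailed. As a rough illustration under stated assumptions, a compact on-device model could score incoming observations locally so decisions do not wait on a reachable network, keeping only the highest-priority alerts to forward when connectivity returns; the trivial scoring function below stands in for a real model.

```python
# Illustrative edge pattern only: score observations locally and keep just the
# highest-priority alerts to forward when a link returns.
def score_change(observation: dict) -> float:
    # Stand-in for a compact on-device model; real scoring would come from a
    # model trained on the relevant signals.
    return abs(observation["delta"]) * observation.get("confidence", 1.0)


def edge_monitor(stream, keep_top: int = 10) -> list[dict]:
    alerts = []
    for obs in stream:
        scored = dict(obs, score=score_change(obs))
        if scored["score"] > 0.5:        # act locally, immediately
            alerts.append(scored)
    # Bound what must be forwarded later: highest-priority alerts first.
    return sorted(alerts, key=lambda o: o["score"], reverse=True)[:keep_top]


observations = [{"unit": "A", "delta": 0.9, "confidence": 0.8},
                {"unit": "B", "delta": 0.1, "confidence": 0.9}]
print(edge_monitor(observations))
```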
The command has been assessing generative AI, and the pros may far outweigh the cons. It could enhance productivity, lower costs and prove to be a “combat multiplier” by assisting software developers, providing “boilerplate code for projects” and helping “troubleshoot problems and code,” according to Mollenkopf. “We’ve been thinking a lot about how generative AI is going to affect our workforce. And it’s overwhelmingly positive. AI is really becoming the revolutionary disrupter in a few areas that we thought it might turn out to be.”
The question, though, is whether the Army will constantly play catch-up, reacting to criminal and threat actor innovation, or stay ahead of adversaries. To do the latter, the service will need to retain knowledgeable, experienced cyber personnel—and may need to expand AI expertise. “That means what we need is to have the ability to generate AI experts with domain-specific signal, cyber, electromagnetic warfare expertise. And that’s really key. That means we’re also going to have to have new technologies to help us train and educate and confer that experience as efficiently as possible,” Mollenkopf explained.
Brig. Gen. Paul Craft, deputy commanding general, Joint Force Headquarters-Cyber, agreed. “I would consider AI advanced technology, technology we need to make better decisions and really parse data in a faster manner. There are a lot of things we do in cyber analytics. There’s a lot of things we certainly do in cyber defense or some things we do on the offense. Certainly, I agree that cyber, electronic warfare and Signal Corps personnel need to train.”
The deputy commander added, however, that some personnel come to the service already armed with AI expertise, and many pursue AI-related graduate degrees. Additionally, the Army offers some AI classes at its Cyber Center of Excellence, Fort Gordon, Georgia.
Mollenkopf also stressed the need to retain talented personnel. “We’ve got to look at smart ways to retain that knowledge and experience. We’re going to cost a little more and be a little bit inefficient, but it’s critical that industry helps the Army think about ways to optimize our training pipelines that we use to create the next generation of experts so we don’t become too dependent on tools that can be exploited or denied at the time of our adversaries’ choosing.”