
AI and Attack Radios Teach DARPA Unexpected Lessons

Electronic warfare systems behave smartly without artificial intelligence.
DARPA’s Squad X program has taught researchers that artificial intelligence offers advantages not related to faster decision-making, and that electronic warfare systems can behave smartly without being equipped with artificial intelligence. DARPA

Researchers have learned some surprising lessons from the technologies developed under the Defense Department’s Squad X program, which will end this year. For example, artificial intelligence may not help warfighters make faster decisions, but it does provide a planning advantage over adversaries. Furthermore, when it comes to detecting and electronically attacking enemy signals, systems can make smart decisions without artificial intelligence.

When first conceived in 2016, the Defense Advanced Research Projects Agency’s (DARPA’s) Squad X program was expected to explore four technology areas—precision engagement, nonkinetic engagement, squad sensing and squad autonomy—on behalf of dismounted soldiers and Marines in a squad formation. Since then, the program has evolved to focus on small units at multiple echelons, such as squads, platoons and Special Operations teams.

The program has developed two technologies: Lockheed Martin’s Augmented Spectral Situational Awareness and Unaided Localization for Transformative Squads (ASSAULTS) system and CACI’s family of radios known as BITS Electronic Attack Module (BEAM). The first was supposed to use autonomous robots with sensor systems to detect enemy locations, allowing small units to engage and target enemy forces without being detected first. It, too, has evolved and is now a testbed for assessing artificial intelligence (AI) technologies.

“What Lockheed has done is created an operating system that allows you to experiment with and plug and play with different components. I can’t tell you how hard that is,” says Philip Root, DARPA’s Squad X program manager. “Adding and subtracting AI components is even more difficult because they could contradict each other. They could fight amongst each other, these AI behaviors.”

The BEAM technology detects, locates and attacks specific threats in the radio frequency and cyber domains, including adversarial small unmanned air systems. Although the system is not enabled by AI, Root indicates the computer processing capabilities make it pretty smart. The radios communicate with one another to “find the best formation to grab the most information about the enemy,” he says.

Both Marines and Special Forces units seem to see the system as a team member rather than a tool, he asserts. “They would give the BEAM system the mission, and it would modify its behavior depending on the threat it saw and where they were in the mission, where it saw high-value targets,” he elaborates. “So, technically, it was not AI. It didn’t use the machine learning and deep learning necessary to have that technical term, but there are many aspects that reflected a form of intelligence.”

Lockheed’s ASSAULTS system taught researchers some valuable lessons. One of the first lessons is that AI can learn a lot from warfighters. For instance, researchers could use the wisdom and experience of commanders to teach and train AI and robotic systems.

“I didn’t see that coming, the thought that we could learn from squad leaders, company commanders, battalion commanders, regarding tactics and then use that to inform unmanned air systems and unmanned ground systems would be a completely different technical direction and one that we’re now beginning to explore,” Root says.

He adds that the experience with Lockheed Martin has taught him that the military may not want to attempt building AI systems that are better than humans. “That doesn’t mean we shouldn’t try to develop good AI. It just means that instead of trying to replace the wisdom and experience of the small unit commander, we should try to create AI that helps support the wisdom and experience of the most junior Marine on the team, and sometimes that junior Marine is a robot.”

Additionally, researchers learned that AI systems do not necessarily allow warfighters to make decisions faster but may help them to plan more effectively. Root says his team collected data to test the hypothesis that AI-equipped friendly forces, known as blue forces, would make decisions more rapidly than the enemy, or red forces.

“What we found is the opposite. What we found is that blue was able to plan in great depth with several decision points and courses of action. And red was not,” he explains. “Red was deciding really quickly but out of necessity. They were reacting. Blue was able to have superior situational awareness and then act with precision and real initiative to change the environment completely and dominate their local battlespace.”

Another unexpected lesson involves the process for collecting data to train AI systems. Normally, companies train systems using their own data or publicly available information. Root has concluded that AI systems designed for military use should be trained instead with military data. “Lockheed Martin helped me see that we need to consider different approaches for data curation, data stewardship and AI certification. When we collect this data, it likely should be owned by the department and provided to industry to train their systems,” he suggests.

The Defense Department, he points out, collects vast amounts of information in experimentation, training and operational environments. That information is more relevant for military AI or robotic systems than the data industry can easily access. Root says his team collected nearly a terabyte of data in its final experiment alone. “If we own the data, then every time we do an experiment or a training exercise, we will collect more data, steward it, and consider using that for additional training data. What that leads you to then is a place where the AI that a unit uses—whether that’s a squad, platoon, company or battalion … will learn and mature as the unit continues to train with it.”

Therefore, the department could rethink the process for testing and evaluating AI technologies. “We might need a data certification and a unit certification where the unit and the AI are certified at the same time, meaning that the unit is capable and effective using the AI, and the AI is providing valid feedback,” Root offers.

Former Secretary of Defense James Mattis, Root recalls, was keen on supporting close combat units and instituted the Close Combat Lethality Task Force. Mattis’ mantra was that a soldier or Marine needed to experience 20 gunfights through realistic training before engaging in combat.

The Squad X team experimented with battalions planning and executing their missions in simulation before passing mission orders to squads. Those squads would then plan and execute their missions in simulation before live training.

“If I have battalions, companies, platoons and squads that could all train with the same mission-type orders and then implement those missions in simulation, that provides some value certainly. But then with the opportunity to jump into the training range and execute that same mission and do it live, we start seeing the ability to train much more quickly and comprehensively and have AIs that are multi-echelon,” Root says.

He cites one Squad X experiment with help from the department’s Test Resource Management Center that involved having AI mounted on drones identify Marines in different environments and wearing different types of camouflage uniforms. In some cases, the uniforms worked well and the Marines blended into the background.

But Root’s point is that the service collected reams of relevant data in the process and could put that data to use for AI training. “We collected gigabytes of data. It would be really expensive for industry to recreate that each time. You would need access to Marine uniforms. You would need access to all the same environments. And there would be a lot of duplication if every vendor tried to do the same,” he explains. “We have the most training-relevant data in the government.”

The family of BEAM radios also demonstrated innovation. CACI documentation says the BEAM system surveys the environment to enable deployed units to counter small drones; cellular, digital, or analog push-to-talk radios; data links; Wi-Fi signals; and digital or analog video signals. BEAM can scale by operating in a cluster and can also operate autonomously to deliver distributed attacks and provide rapid, responsive force protection capability in hostile environments.

“There are some signals like threat unmanned aerial systems that have too wide a signal in terms of bandwidth for one node to be able to pick up and monitor,” Root explains. “So, CACI developed an ability—this is part of their unique capability—to link these separate software-defined radios together to monitor these wide signals and then perform a geolocation calculation to be able to triangulate into these signals.”
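The article does not disclose the math behind CACI’s geolocation capability, but the basic idea of fixing an emitter’s position from multiple sensor nodes can be illustrated with a simplified, hypothetical sketch. The sketch below assumes each node reports only a compass bearing to the emitter on a flat x/y plane; the function name, coordinates, and approach are illustrative assumptions, not a description of BEAM.

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Hypothetical two-node bearing intersection on a flat x/y plane.

    p1, p2 are (x, y) sensor positions; bearings are in degrees
    clockwise from north (x axis points east, y axis points north).
    Returns the estimated emitter position, or None if the bearing
    lines are parallel and no unique fix exists.
    """
    b1, b2 = math.radians(bearing1_deg), math.radians(bearing2_deg)
    # Convert compass bearings to unit direction vectors.
    d1 = (math.sin(b1), math.cos(b1))
    d2 = (math.sin(b2), math.cos(b2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 using a 2x2 determinant.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # Parallel bearings: the lines never cross.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two nodes 10 units apart, each sighting the same emitter.
fix = triangulate((0.0, 0.0), 45.0, (10.0, 0.0), 315.0)
```

A real system would fuse more than two nodes and account for measurement noise, terrain and timing, which is part of why linking separate software-defined radios, as Root describes, is a distinct engineering capability.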

The system initially was designed to be small and lightweight so that it could be carried in a backpack, but CACI then developed a larger version for ground vehicles and another version for AeroVironment’s hand-launched drone known as Puma. The flexibility of the radios could protect ships, smaller boats that carry Marines from ship to shore, landing forces and fixed locations. The 31st Marine Expeditionary Unit in Okinawa, Japan, has been using the system. Special forces detachments also have experimented with the technology.

Furthermore, the system has been deployed to combat zones. “One of the reasons we know this works is that we sent this downrange to Afghanistan and Iraq and had great effects. Obviously, I can’t say much about it, but two thumbs up from the customers with whom we collaborated and supported,” Root reports.

Root describes the BEAM technology as very mature and says it could be adopted by the military services or other departments or agencies, such as border patrol units. While there is currently no planned transition path for the Lockheed Martin or CACI technologies, he is discussing both with multiple parties.

The Lockheed Martin system will not be used in combat any time soon, Root indicates. Instead, the system will be used to experiment with AI. “It allows us to do some incredible experiments, and it really helps us understand what we need. It lets us collect a lot of data, and in the world of AI, data is king,” Root says. “The nation that collects the most tactical data has the greatest advantage, and the Lockheed solution undoubtedly lets us collect more data than any other experimental system that I’ve seen.”

Root describes the pending end of the program as bittersweet and notes that others will judge DARPA’s work. “I don’t get to decide if we did enough. Some future Marine in harm’s way will decide if we did enough.”