Researchers Advance Autonomous ISR Technology

New subsystems enable greater unmanned vehicle independence.

Future conflicts will require smart, autonomous unmanned platforms capable of delivering critical information to warfighters at blinding speed, enabling faster, more effective battlefield decisions to win wars and save lives. Researchers at the Johns Hopkins University Applied Physics Laboratory may have been ahead of their time in creating the infrastructure required for autonomous systems to rapidly provide data to warfighters.

A 2012 article in the laboratory’s Technical Digest describes the Organic Persistent Intelligence, Surveillance and Reconnaissance (OPISR) system as a “visionary, game-changing approach to ISR.” David Scheidt, a member of the principal professional staff, Research and Exploratory Development Department, Johns Hopkins University Applied Physics Laboratory, served as the OPISR principal investigator and authored the article. He calls the system a “novel combination of distributed image processing, information management and control algorithms that enable real-time, autonomous coordination between ad hoc coalitions of autonomous unmanned vehicles, unattended ground sensors and front-line users.”

OPISR is a software and communications subsystem that, when added to an ISR asset such as an unmanned vehicle or unattended sensor, supports the rapid, autonomous movement of information across a tactical force. It provides intelligence directly to the warfighter without requiring that warfighter to personally direct, or even know about, the OPISR assets gathering the information. The system seeks relevant intelligence, pushing key tactical data directly to affected soldiers in real time.
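The article does not publish OPISR’s interfaces, but the behavior it describes — sensors pushing only relevant intelligence to affected soldiers, without those soldiers tasking or even knowing about the sensors — is essentially a relevance-filtered publish/subscribe pattern. The sketch below illustrates that pattern under stated assumptions; all class and field names are hypothetical and are not taken from OPISR.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str   # e.g. "vehicle", "person" -- illustrative categories
    x: float    # position on a shared grid (units arbitrary)
    y: float

@dataclass
class Subscriber:
    name: str
    x: float
    y: float
    radius: float   # area of interest around the subscriber
    inbox: list

class InfoBroker:
    """Push each new detection only to subscribers whose area of
    interest contains it; subscribers never direct the sensors."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, sub):
        self.subscribers.append(sub)

    def publish(self, det):
        # Deliver to every subscriber within whose radius the
        # detection falls; everyone else never sees it.
        for sub in self.subscribers:
            if (det.x - sub.x) ** 2 + (det.y - sub.y) ** 2 <= sub.radius ** 2:
                sub.inbox.append(det)

broker = InfoBroker()
squad = Subscriber("squad-1", x=0.0, y=0.0, radius=5.0, inbox=[])
broker.subscribe(squad)
broker.publish(Detection("vehicle", 3.0, 4.0))    # inside radius: delivered
broker.publish(Detection("vehicle", 30.0, 40.0))  # outside radius: filtered out
```

The design point this sketch makes is the one the article emphasizes: the filtering decision lives with the information layer, not with the soldier, so intelligence arrives without anyone having to ask for it.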

OPISR was developed under a two-year internal research and development program that ended in 2012. The program has led to several follow-on efforts. OPISR technology was included in last year’s Office of Naval Research demonstration of a swarm of unmanned boats known as rapid intervention vehicles.

“The way those vehicles shared information and collaborated was the technology that came out of OPISR,” Scheidt explains. The individual decisions were made by the Control Architecture for Robotic Agent Command and Sensing (CARACaS) system from NASA’s Jet Propulsion Laboratory.

Scheidt also is working with the Navy Science of Autonomy program to improve the reasoning engines for enhanced unmanned system decision making. This summer, the Navy will conduct flight tests involving the next-generation reasoning engine known as the Adaptive Autonomy Controller, the successor to the Autonomy Toolkit. The new controller improves methods for humans to instruct and interact with autonomous vehicles; allows decision making under diverse, unexpected operating conditions; and adds the capability to continue working for long periods in a denied environment, such as when communications are jammed.

In another follow-on program, the Defense Department’s Test Resource Management Center is developing technology for testing autonomous systems at military ranges, which will help warfighters gain trust in the unmanned vehicles. “This is building the tools the ranges will need when OPISR or successors to OPISR actually get delivered to them, and the military asks for operational testing of autonomous systems,” Scheidt adds.

In addition, the Office of Naval Research is using the technology to manage radio frequency systems and the platforms they rely on. The idea is for unmanned vehicles to establish communications with a cell tower, for example, in case of interference.

OPISR has not yet been adopted into a program of record, possibly because it is ahead of its time. “It has been a struggle to adopt autonomous vehicles for the last decade,” Scheidt says. “In part, the user community is just starting to catch up. Before you can have groups of vehicles performing as a team to support you, you need to convince people that a single vehicle can be autonomous. We’re just starting to get some of the precursor programs ... to the point they’re close to being fielded, both in industry and in the military.”

He cites the self-driving cars being developed by Google and Tesla, as well as the Navy’s Autonomous Aerial Cargo/Utility System and the Army’s unmanned tractor-trailers, as examples of how far autonomous systems have come. In addition, the Defense Advanced Research Projects Agency’s (DARPA’s) Collaborative Operations in Denied Environment (CODE) program is developing algorithms and software to improve the ability of unmanned aircraft to conduct operations in denied or contested airspace. Scheidt says his team provided insights to DARPA’s CODE program manager in that program’s early stages.

With recent advances, the time may have come for OPISR or a similar capability. Scheidt projects that OPISR, or at least some pieces of it, will transition into a program of record within the next three years and will be fielded within the next five to eight years. “I have little doubt this is coming. Depending on who you believe, it’s coming in the next five years or the next 30 years. I’m in the camp that says the next five to 10 years,” Scheidt offers. “A lot of people overestimate how far away this capability is.”

Paul Scharre, a Center for a New American Security fellow who is familiar with some of Scheidt’s work, agrees. The vision Scheidt presents for a cloud of persistent, on-demand surveillance to support warfighters on the ground is unquestionably the type of surveillance architecture needed for the future, he says. “It’s fair to say that the technology is moving quickly, and things that weren’t possible five years ago are definitely possible today. I don’t see anything in this architecture that looks impossible or requires technology that doesn’t exist yet,” Scharre says. “There would be some work to do in integrating all of these components, but it’s worth pursuing because the payoff is huge.”

Scheidt indicates that challenges must be solved before autonomous systems can work effectively in the ISR arena. “The limiting factor is the vehicle’s ability to see and understand its environment. You want to have the vehicles look for certain things: bad guys, people who are lost, downed pilots, for example,” he says.

Finding humans who are not camouflaged is fairly easy for machines, but their human recognition capabilities go only so far. “If you and I were talking on a street corner, distinguishing the two of us with an unmanned system in the sky is a little more challenging. We’re still a ways from doing that,” Scheidt offers.

However, machines can detect targets effectively. “We have target recognition capabilities that are quite mature. If that’s what you’re looking for, and what you need to trail, track, interact and interdict, then you can field systems now that look for easily identifiable things,” he observes.

The decision-making technology for autonomous systems also is fairly mature, Scheidt says. “We can have effective decisions made by cooperative unmanned autonomous vehicles, particularly the air and [naval] surface vehicles, but they’ve made good headway with the ground [systems] as well,” he states.

Ground vehicles have the most obstacles—literally—including other vehicles and tough terrain. “You don’t want your autonomous system running into a ditch,” Scheidt deadpans. He also notes that Google’s car will have to cope with “stupid people” but not enemy forces.

In addition, machines struggle to understand context. Military ISR systems need not be capable of complex thoughts, but they do need to make some decisions on their own because they can do so much faster than humans. Scheidt compares OPISR’s intelligence level to that of a sheepdog. “It doesn’t understand why you want the sheep to go into the pen. It’s capable of accepting very simple commands and doing simple things, but it doesn’t require constant supervision,” he explains.

With rudimentary autonomy, the systems require humans to act as supervisors, understanding the context of a situation and assigning certain tasks. “But why you would want to do that [task] is something you don’t want the machines thinking about,” Scheidt adds.

OPISR’s decision-making skills have improved since 2012, and Scheidt’s team is exploring “deep learning” for enhanced perception, which shows “great promise in terms of classification and understanding the environment,” he says. “It has the potential for revolutionizing the field. It’s really starting to do that.”

In addition, the cognitive technology has improved so that unmanned systems armed with OPISR can learn and make decisions on their own. “They are starting to recognize the current situation and adapt the way they make decisions in response,” Scheidt reports.

The technology offers multiple benefits. OPISR-enabled unmanned systems could save money by “not having to pay a pilot to babysit” the unmanned vehicle, he says, but that is not the greatest benefit. The more compelling use case is when the mission requires faster-than-human thinking and reaction time. “The more complex a problem is, the longer it takes a human to figure out what is going on. If the time it takes for an analyst to sort out the data ... hurts mission effectiveness, then it pays to have the machine make the decision and act on the fly,” Scheidt asserts.

Additionally, warfighters on the ground may need an unmanned system to perform a specific task, but they may not have the time to actually pilot the system. “People running around in Humvees have better things to do than to try to figure out exactly what path the unmanned vehicle should be on or exactly what the vehicle should do. Rather, they need to—at a very abstract level—task the vehicle and receive useful, actionable intelligence from it. That’s really the killer app,” Scheidt asserts.
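Scheidt’s “killer app” — tasking a vehicle at a very abstract level rather than piloting it — can be illustrated with a minimal sketch. Here the operator’s entire input is one “observe this area” task, and the vehicle decomposes it into a flyable route on its own. The lawnmower search pattern and every function name below are illustrative assumptions, not anything published about OPISR.

```python
def plan_search(x0, y0, x1, y1, spacing):
    """Decompose an abstract 'observe this box' task into a
    back-and-forth (lawnmower) waypoint list the vehicle can
    follow without further supervision."""
    waypoints = []
    y = y0
    left_to_right = True
    while y <= y1:
        # Sweep one row, alternating direction to avoid dead transits.
        if left_to_right:
            waypoints.append((x0, y))
            waypoints.append((x1, y))
        else:
            waypoints.append((x1, y))
            waypoints.append((x0, y))
        left_to_right = not left_to_right
        y += spacing
    return waypoints

# The operator's entire involvement: one abstract task, no piloting.
route = plan_search(0, 0, 100, 100, spacing=50)
```

The division of labor mirrors the sheepdog analogy: the human decides that the area should be watched and why; the machine handles the routine of how, freeing the people “running around in Humvees” to act on the intelligence instead of generating it.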