Cybersecurity for Unmanned Aerial Vehicle Missions
Cyber threats to unmanned aerial vehicles can come from signals beamed into a control stream or from embedded software containing a Trojan horse. Researchers are addressing this challenge from traditional and innovative directions as the use of unmanned aerial vehicles continues to expand into new realms. But the issues that must be accommodated are growing as quickly as the diversity of the threats.
In December 2014, at Early County Airport in Blakely, Georgia, a team from the University of Virginia (UVA) and the Georgia Tech Research Institute (GTRI), operating under a Federal Aviation Administration (FAA) Certificate of Authorization, conducted flight tests that evaluated a new class of cybersecurity solutions on an unmanned aerial vehicle (UAV) performing a video surveillance mission. The goal was to protect computer-controlled remote systems from cyber attacks. The specific solutions that were assessed are part of a UVA-developed concept for a new layer of cybersecurity referred to as System-Aware. The System-Aware nomenclature is based on the fact that this class of solutions depends on detailed knowledge of the design of the system being protected. This layer of security serves to both complement network and perimeter security solutions as well as protect against supply chain and insider attacks that may be embedded within a system.
The team launched experimental cyber attacks to prevent the UAV from observing specific ground-based locations. Attacks included changing navigation waypoints, embedding errors in the Global Positioning System (GPS) and taking control of camera direction. In addition, cyber attacks were conducted onboard the aircraft, including corrupting aircraft-provided camera direction data being sent to the ground to support video exploitation, thereby disabling ground-based interpretation of streaming video.
Some of the attacks were initiated through the UAV pilot’s support system, some via embedded Trojan horses within the aircraft and some from third-party locations using the aircraft’s air-ground communication system to access onboard electronic systems. To ensure safe operation during the experiments, a dual command system allowed a safety-focused operational team to take immediate control of the aircraft if needed. Technical results from the flight tests were positive, demonstrating that the System-Aware concept can significantly improve the cybersecurity of physical systems.
In correlation with the Georgia flight tests, a team from UVA and the MITRE Corporation at Creech Air Force Base in Nevada designed and conducted a set of tabletop simulation-based experiments with active military UAV pilots. The experiments involved the pilots being supported by System-Aware solutions that could automatically detect cyber attacks that were similar to those launched in the Georgia flight tests. The pilots were presented with suggested aircraft reconfigurations to restore operations, such as resetting a waypoint or switching from GPS-based navigation to less accurate—but more trusted—inertial navigation.
While the pilots found attack detection to be useful, some of their reactions differed from what was anticipated when designing the experiments. For example, one pilot indicated that the system’s response to an attack was not enough to keep him from terminating the operation, because he remained concerned about residual elements of the attack that had not yet manifested. Another pilot suggested that real-time access to a cybersecurity expert would greatly reduce worries about making decisions with insufficient knowledge. A third pilot raised concerns about the possibility of the monitoring system itself being the target of attack, potentially leading the pilot to make counterproductive decisions. These results highlighted the importance of operator training for addressing rare, unpredictable cyber attack situations that require confident decision making. Researchers have begun to better understand and address this important issue.
The System-Aware cybersecurity concept developed at UVA is pertinent to a wide range of computer-controlled systems, such as UAVs, cars, radars, turbines and weapon systems. System-Aware implementations involve connecting the dedicated cybersecurity monitoring system, known as Sentinel, to the system being protected. The Sentinel monitor is designed to collect information to detect illogical behaviors that can be categorized as likely cyber attacks. The low-power, small-footprint prototype electronics package implemented for the UAV flight evaluations consisted of sensors, microprocessors and communication devices. This technology collected and analyzed data to detect potential cyber attacks, and it disseminated the results.
For example, if the Sentinel observed a change in waypoint occurring within the navigation subsystem but no message from the pilot directing this change, the monitor could conclude that a cyber attack was underway. Similarly, if a camera-pointing command was received onboard the aircraft and the Sentinel observed that the command differed from the camera’s response, it could deduce that a cyber attack was the likely cause.
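To make the logic concrete, the sketch below shows the kind of consistency checks described above in simplified form. The data structures, field names and tolerance values are illustrative assumptions, not the project’s actual interfaces.

```python
# Illustrative sketch of Sentinel-style consistency checks (assumed data
# structures; not the actual UVA/GTRI implementation).
from dataclasses import dataclass
from typing import Optional

@dataclass
class WaypointChange:
    waypoint_id: int
    new_lat: float
    new_lon: float

@dataclass
class PilotCommand:
    waypoint_id: int
    new_lat: float
    new_lon: float

def waypoint_change_is_suspicious(observed: WaypointChange,
                                  authorizing_cmd: Optional[PilotCommand]) -> bool:
    """Flag a waypoint change that has no matching pilot command."""
    if authorizing_cmd is None:
        return True  # the change appeared in the navigation subsystem unprompted
    return (authorizing_cmd.waypoint_id != observed.waypoint_id or
            abs(authorizing_cmd.new_lat - observed.new_lat) > 1e-6 or
            abs(authorizing_cmd.new_lon - observed.new_lon) > 1e-6)

def camera_pointing_is_suspicious(commanded_azimuth_deg: float,
                                  reported_azimuth_deg: float,
                                  tolerance_deg: float = 2.0) -> bool:
    """Flag a mismatch between the commanded and reported camera direction."""
    return abs(commanded_azimuth_deg - reported_azimuth_deg) > tolerance_deg
```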
Although system restoration can be automated, operators themselves still may want to respond to an attack to sustain operations. For this capability, planners can create specially protected locations for storing critical flight information, and the Sentinel can draw on that information to restore a waypoint or camera direction. With critical aircraft subsystems, in particular, the operational community gains confidence when pilots are involved in initiating these commands.
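A minimal sketch of how such a protected store might be consulted, with pilot confirmation gating the restoration, appears below; the names and values are hypothetical.

```python
# Hypothetical sketch of restoring a trusted waypoint from a specially protected
# store, gated on pilot confirmation as described above (illustrative values).
from typing import Dict, Optional

TRUSTED_FLIGHT_PLAN: Dict[str, Dict[str, float]] = {
    "waypoint_3": {"lat": 31.383, "lon": -84.933},
}

def restore_waypoint(waypoint_key: str,
                     pilot_confirmed: bool) -> Optional[Dict[str, float]]:
    """Return the trusted waypoint only after pilot confirmation, keeping the
    operator in the loop for critical aircraft subsystems."""
    if not pilot_confirmed:
        return None
    return TRUSTED_FLIGHT_PLAN.get(waypoint_key)
```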
Recognizing that different systems require their own unique solutions, the System-Aware concept includes reusable design patterns for monitoring and restoration. For example, monitoring for unfounded parameter changes that significantly affect system performance is a reusable response to a broad set of potential cyber attacks designed to make such changes.
Parameters in a radar surveillance system determine the system’s performance regarding false and missed detections. Criteria for automated collision-avoidance systems initiating their warnings are determined through parameter settings in the automated collision-detection algorithms. An entirely different design pattern under development involves monitoring to ensure that the chain of command’s doctrinal requirements for implementing critical system configuration changes—such as switching modes in a multimode system—are met. This type of design pattern would force an attacker who wants to change a system’s operating configuration to concurrently attack the computers that are engaged in the process that supports the chain of command’s structure. The Sentinel research effort has developed a significant set of reusable design patterns that were implemented for the experiments.
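As a rough illustration of the unfounded-parameter-change pattern, the sketch below compares a system’s current parameter settings against the last known state and a record of authorized changes. The parameter names and authorization record are assumptions made for the example.

```python
# A minimal sketch of the reusable "unfounded parameter change" design pattern.
# Parameter names and the authorization record are illustrative assumptions.
from typing import Dict, Set

def detect_unfounded_changes(previous: Dict[str, float],
                             current: Dict[str, float],
                             authorized_params: Set[str]) -> Set[str]:
    """Return performance-critical parameters that changed without authorization."""
    changed = {name for name, value in current.items()
               if previous.get(name) != value}
    return changed - authorized_params

# Example: a detection-threshold change in a radar-like system with no
# corresponding authorization would be flagged.
flagged = detect_unfounded_changes(
    previous={"detection_threshold": 0.7, "scan_rate_hz": 10.0},
    current={"detection_threshold": 0.4, "scan_rate_hz": 10.0},
    authorized_params=set(),
)
# flagged == {"detection_threshold"}
```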
One risk associated with applying a Sentinel-based solution is the potential for a cyber attack to affect the Sentinel’s performance, causing false or missed attack detections or corrupting automated reconfiguration decisions. Accordingly, a critical element of the System-Aware concept is paying careful attention to the Sentinel’s own cybersecurity. Implied in this concept is that securing the Sentinel is more readily accomplished and less expensive than adding security to the system the Sentinel protects.
Still, Sentinel designs must satisfy a number of technical requirements. First, they must be significantly less complex than the protected systems. Second, the amount of software required for Sentinels must be a fraction of the software embedded in the controller for the protected system. Third, the deterrence of supply chain and insider attack risks must be addressed.
The research prototype used onboard the UAV employed triple diverse redundancy, providing a high level of security while satisfying economic and physical constraints related to airborne systems. The prototype Sentinel used three different manufacturers’ computer boards, three separate operating systems and three versions of the Sentinel software. Comparing the outputs from the diverse implementations provided a real-time capability to detect and eliminate a corrupt Sentinel element. Furthermore, the Sentinel prototype design included a moving target cybersecurity solution; on a randomized basis, every few seconds, a switch would occur between the three diverse implementations to select the portion of the Sentinel that would be in control versus operating in a hot shadow mode.
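The following sketch illustrates, under assumed interfaces, how majority voting across three diverse implementations and randomized selection of the in-control element might be expressed; it is not the prototype’s actual code.

```python
# Sketch of majority voting across three diverse Sentinel implementations and
# randomized rotation of the in-control element (a "moving target" defense).
# The interfaces stand in for the three diverse hardware/OS/software stacks
# described above and are assumptions for illustration.
import random
from collections import Counter
from typing import List

def vote(outputs: List[bool]) -> bool:
    """Majority vote over the implementations' attack-detection outputs."""
    return Counter(outputs).most_common(1)[0][0]

def find_disagreeing(outputs: List[bool]) -> List[int]:
    """Identify any implementation whose output departs from the majority,
    which may indicate a corrupted Sentinel element."""
    majority = vote(outputs)
    return [i for i, out in enumerate(outputs) if out != majority]

def select_in_control(num_implementations: int = 3) -> int:
    """Randomly pick which implementation is in control for the next interval;
    the others run in a hot-shadow mode."""
    return random.randrange(num_implementations)
```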
Sentinel’s diverse components were sufficiently low in cost to make the hardware portion of the solution economically viable. And the quantity and complexity of the monitoring software supported achieving the desired Sentinel attributes. For example, the software-based monitoring functions for detecting a cyber attack required 300 to 500 lines of code and were not intertwined. This made them far more manageable in quantity and complexity than the software being protected in the UAV.
An important aspect of the UAV research project involved developing an approach for selecting which system functions to monitor and protect. The Sentinel needs to secure the functions considered to be most critical by the system owners and operators as well as the activities deemed to be most desirable and easiest for cyber attackers to disrupt. Satisfying this requirement calls for an integrated design team with expertise on the system to be protected, the motivations of cyber attackers and the technical approaches for conducting cyber attacks. In addition, cost analysis is necessary to determine the most cost-effective subset of System-Aware solutions.
Early in the design phase of the prototype, the UVA/GTRI team carried out the needed assessments. But as the project moved into the implementation phase, newly discovered information emerged, forcing the team to revisit earlier decisions. Furthermore, early predictions about attack complexity and cost needed refinement. Finally, as the project unfolded, an important Sentinel design factor emerged that had not been considered: the need to assess the alternative attack paths available for achieving a desired disruption. This entails determining how much harder the alternatives are than the easiest attack that accomplishes the same outcome, and how strongly defending against that easiest attack would affect the cyber attacker’s motivation.
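One way to think about such trade-offs is a simple scoring model that weighs operational criticality, attacker desirability, ease of attack and defense cost. The sketch below is a hypothetical illustration of this idea, not the decision-support tool described next; the fields and weighting are assumptions.

```python
# Hypothetical scoring sketch for prioritizing which functions a Sentinel
# should monitor. All fields, scales and the scoring formula are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class CandidateFunction:
    name: str
    criticality: float     # importance to owners/operators, 0..1
    attacker_value: float  # desirability of disrupting it, 0..1
    attack_ease: float     # ease of disruption for an attacker, 0..1
    defense_cost: float    # relative cost of Sentinel coverage, > 0

def priority(c: CandidateFunction) -> float:
    """Higher scores suggest stronger candidates for Sentinel protection."""
    return (c.criticality * c.attacker_value * c.attack_ease) / c.defense_cost

def rank(candidates: List[CandidateFunction]) -> List[CandidateFunction]:
    """Order candidate functions from most to least attractive to protect."""
    return sorted(candidates, key=priority, reverse=True)
```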
The recognition of the complexity of deriving Sentinel design requirements led to a decision-support tool research effort to aid system-specific determinations of the most desirable Sentinel defense capabilities. This activity currently is advancing toward a prototype-based set of experiments involving system planners. The U.S. Defense Department, through the Systems Engineering Research Center (SERC), is supporting the work. The SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology in Hoboken, New Jersey.
Based upon the outcomes of these experiments and the growing awareness of cyber attack risks, several additional System-Aware projects have been initiated. These include a Virginia State Police endeavor focused on automobiles; a National Institute of Standards and Technology project on 3-D printers; and Defense Department work addressing radar and weapon fire-control systems.
With the high visibility of automation innovations such as autonomous vehicles, robots and the Internet of Things, people are becoming more aware of the need to address cybersecurity for physical systems. However, because of the rapid emergence of these innovations, the nation’s work force and education system are not ready to fully support this new need. Engineering schools, for example, do not integrate the curriculum for electromechanical systems-related departments, such as mechanical engineering, with the curriculum for the departments that teach cybersecurity, such as computer science. The same separations occur in industrial organizations.
Furthermore, cybersecurity experience related to information systems does not address the need for immediate response to disruptions that physical system solutions require—nor does it include training to respond to rare events that can be life-threatening. Important cybersecurity investments will be required to enable safe deployment of the innovations for automated physical systems that are being developed.
Barry Horowitz is the Munster Professor of Systems and Information Engineering and chair, University of Virginia Department of Systems and Information Engineering. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the U.S. Department of Defense.