Approach keeps analysts in the information security loop.
The U.S. Air Force is researching an information assurance system that incorporates the human factor into protecting data. The system would help analysts charged with monitoring networks identify potential breaches more easily by removing clutter and presenting them with a clear assessment of the danger level.
Experts agree that an operator’s ability to recognize cues and identify critical points along the decision path is crucial. Although hardware and software are vital information assurance components, the human security analyst continues to be a key element in the decision superiority military leaders seek. To assist in this mission, the Air Force is examining a mixed-initiative system based on research of both technology and human behavior.
Historically, information protection has been an important factor in winning both military and commercial wars. The Internet creates more access points, which complicates information protection and forces organizations to find ways to control access to proprietary information.
The U.S. Defense Department includes information assurance—both passive protection and active defense—as part of the core competency of information superiority. According to the Joint Chiefs of Staff, information superiority is the first objective in any military operation.
The Air Force describes information assurance as operations that protect and defend data and information systems by ensuring their availability, integrity, authenticity, confidentiality and nonrepudiation. It also includes the ability to restore information systems by incorporating protection, detection and reaction capabilities. Computer network defense (CND), which the Air Force defines as protecting information, computers and networks from disruption, denial, degradation or destruction, enables decision superiority.
To date, many network operators have focused on intrusion detection as the primary activity of CND. Suspicious traffic from among the billions of packets flowing daily across the network must be identified and categorized. Network intrusion detection systems analyze local network traffic to spot widely known computer attack commands. In addition, they recognize varying degrees of anomalous behaviors on the network. Experts agree that the sheer volume of this network traffic makes the identification of attacks and attackers a daunting task.
Network intrusion detection systems themselves are not perfect. Security analysts are bombarded with false positive warnings, while false negatives allow real attacks to slip past unnoticed. As a result, the very nature of the analyst’s job requires constant vigilance, a role very similar to that of the radar operator waiting for an enemy aircraft to cross into friendly air space. The attacker must make a first move before the surveillance system registers an event on the screen.
Intrusion detection research has sought to solve these problems by concentrating on the bits and bytes of data as they flow between networks. Two paradigms dominate this research. The first, anomaly detection, is aimed at identifying intrusions by comparing current system activity to a programmed baseline. When a significant statistical deviation from the baseline occurs, an alert is triggered. Misuse detection is the second paradigm. A system built on this paradigm looks for known patterns or attack signatures.
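A minimal sketch can make the two paradigms concrete. The statistical test, packet rates, and signature strings below are illustrative assumptions, not part of any fielded Air Force system:

```python
from statistics import mean, stdev

def is_anomalous(observed: float, baseline: list[float], threshold: float = 3.0) -> bool:
    """Anomaly detection: flag activity that deviates significantly
    from the programmed baseline (here, a simple z-score test)."""
    mu = mean(baseline)
    sigma = stdev(baseline)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

def matches_signature(packet: str, signatures: list[str]) -> bool:
    """Misuse detection: look for known attack patterns in the traffic."""
    return any(sig in packet for sig in signatures)

# Anomaly detection: normal traffic averages roughly 100 packets per second.
baseline = [98, 102, 97, 101, 99, 103, 100, 96]
print(is_anomalous(450, baseline))   # large spike -> True
print(is_anomalous(101, baseline))   # within normal range -> False

# Misuse detection: a trivially recognizable signature.
print(matches_signature("GET /../../etc/passwd", ["/etc/passwd"]))  # True
```

Real systems use far richer models and signature languages, but the division of labor is the same: the first function asks "is this unusual?" while the second asks "is this known to be bad?"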
While developers have consistently improved the capabilities of systems based on these paradigms, they still fall short of total protection. Focusing on how information flows between networks leaves out an important component of the intrusion detection system—the human security analyst. The cognitive tasks the analyst accomplishes are equally important to the technical tasks required of the automated portion of the system.
In an article titled “Situation Awareness in NCM” published in Naturalistic Decision Making, 1997, Dr. Mica R. Endsley maintains that recognizing cues to develop situational awareness and identifying critical points along the decision path are two cognitive tasks operators perform as they interact with their systems. Designing a system around the operator’s cognitive needs, one that synergizes the strengths of humans with the strengths of automation, can help lead to decision superiority in network defense scenarios. Such a system is specifically designed to provide the information that the operator needs to make decisions.
To enhance the relationship between man and machine, the Air Force is examining a mixed-initiative system. To begin their work, researchers studied user tasks, both physical and cognitive. At the Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, they used the applied cognitive task analysis method that Klein Associates, Dayton, Ohio, developed to study the cognitive tasks of computer intrusion detection experts. The study identified four major decision steps that took place after an alert was received. These entail answering the following questions: What is the source of the alert? Is the alert an intrusion attempt? Was the system compromised? What was the depth of the compromise? Each step was analyzed according to the situational awareness cues, evaluation strategies, information sources and difficulties encountered during the intrusion detection process.
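The four decision steps identified by the cognitive task analysis can be sketched as an ordered checklist that an analyst, or a supporting tool, walks through after each alert. The function and field names below are hypothetical:

```python
# The four post-alert decision steps from the applied cognitive task analysis.
DECISION_STEPS = [
    "What is the source of the alert?",
    "Is the alert an intrusion attempt?",
    "Was the system compromised?",
    "What was the depth of the compromise?",
]

def triage(alert: dict) -> list[tuple[str, str]]:
    """Walk the four decision steps in order, recording an answer for each.
    The alert fields (source_ip, matches_signature, ...) are illustrative."""
    return [
        (DECISION_STEPS[0], alert.get("source_ip", "unknown")),
        (DECISION_STEPS[1], "yes" if alert.get("matches_signature") else "undetermined"),
        (DECISION_STEPS[2], "yes" if alert.get("host_compromised") else "no evidence"),
        (DECISION_STEPS[3], alert.get("compromise_depth", "n/a")),
    ]

findings = triage({"source_ip": "203.0.113.7", "matches_signature": True})
for question, answer in findings:
    print(f"{question} -> {answer}")
```

The point of the ordering is that each step narrows the next: an analyst who cannot identify the source rarely proceeds to judging compromise depth.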
This analysis led to the development of five cognitive requirements for network intrusion detection systems: recognition of nonlocal Internet protocol (IP) addresses, identification of source IP addresses, development of a mental image of normalcy, creation and maintenance of analyst situational awareness, and facilitation of knowledge sharing.
The study found that all of these are essential to successful intrusion detection. The conceptual mixed-initiative network intrusion detection system (MiNIDS) is the result of the study.
MiNIDS suggests ways to improve analyst-automation interaction during the intrusion detection decision process. These improvements stem from incorporating the principles of naturalistic decision making and from applying the situational awareness cues discovered during the data analyses.
The MiNIDS interface helps the analyst evaluate each alert quickly and accurately through a stoplight metaphor. Red represents the most dangerous incidents; amber signifies incidents that need attention, but not immediately; yellow signals incidents that are worthy of note but can be evaluated at a later time; and green signifies low-threat or no-threat incidents. The color helps the analyst reach a conclusion about the nature of the alert: it makes nonlocal IP addresses easier to spot while the source of the alert is being determined, and it speeds the decision about whether the alert is an intrusion attempt.
Observations during the case studies showed that the analysts tended to evaluate each alert individually before performing any correlation analyses. Fewer alerts on the screen reduce clutter and make the individual assessment easier to perform. The interface is designed to display up to five alerts at a time. It provides easy access to a database query interface via a button on the right side of the screen.
The first column the analyst sees is the real-time threat indicator. Because it appears first, it plays a key role in setting the security analyst’s mental expectation during evaluation of the alert by displaying a yellow, amber or red square with a secondary cue of a 1, 2 or 3 in the middle of the square.
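A minimal sketch of the stoplight threat indicator follows. The color-to-severity mapping mirrors the description above; the function name, numeric scale, and rendered string format are illustrative assumptions:

```python
# Stoplight threat indicator: color plus a secondary numeric cue.
# Level 0 (green) has no numeric cue, matching the low/no-threat case.
THREAT_COLORS = {
    0: ("green", None),   # low- or no-threat incident
    1: ("yellow", 1),     # worthy of note, can be evaluated later
    2: ("amber", 2),      # needs attention, but not immediately
    3: ("red", 3),        # most dangerous, act now
}

def threat_indicator(level: int) -> str:
    """Render the colored square with its secondary numeric cue."""
    color, cue = THREAT_COLORS[level]
    return color if cue is None else f"{color} [{cue}]"

print(threat_indicator(3))  # "red [3]"
print(threat_indicator(0))  # "green"
```

The redundant coding, color plus number, is a common human-factors choice: the number survives poor displays and color-vision deficiencies while the color supports rapid scanning.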
Next, cross-case analysis showed that the security analysts were quite concerned with determining the source of the alert information and knowing the destination of the alert. To support this step, the interface provides the source IP address as well as the domain name to which that address belongs. Similar information is furnished for the destination, including the port and the service requested there.
To increase the analyst’s situational awareness of a particular alert, data transfer between the source and the destination is displayed at the right of the destination information section. A red square with a “Y” in the middle signifies the dangerous situation of data transfer across the Internet. A green square with an “N” in it means that no data was transferred across the connection. The time and date that the intrusion detection system issued the alert provide the analyst with a chronology of events. This information will help in correlation analysis.
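The display elements described above suggest a single alert record. The sketch below gathers them into one hypothetical data structure; the class, field, and method names are assumptions for illustration, not the actual MiNIDS design:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Alert:
    """One alert as the MiNIDS display presents it (fields are illustrative)."""
    threat_level: int          # 1-3, drives the stoplight indicator
    source_ip: str
    source_domain: str         # domain the source IP belongs to
    dest_ip: str
    dest_port: int
    dest_service: str          # service requested at the destination
    data_transferred: bool     # data moved across the connection?
    issued_at: datetime        # supports chronology and correlation analysis

    def transfer_cue(self) -> str:
        """Red square with 'Y' if data crossed the Internet, green 'N' if not."""
        return "Y" if self.data_transferred else "N"

alert = Alert(3, "203.0.113.7", "example.net", "192.0.2.10", 22, "ssh",
              True, datetime(2003, 5, 1, 14, 30))
print(alert.transfer_cue())  # "Y"
```

Keeping the timestamp on the record is what makes the later correlation analysis possible: alerts can be sorted and grouped into a chronology of events.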
The MiNIDS interface exploits the mixed-initiative approach. Through cognitive task analysis, the thought processes of security experts can be used to better harness the information an automated intrusion detection system provides. While traditional intrusion detection systems have been criticized as difficult and monotonous to use, the MiNIDS interface offers a user-efficient approach that better synergizes the strengths of the operator and the technology.
Maj. David P. Biros, USAF, is an assistant professor of information resource management at the Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Capt. Todd Eppich, USAF, is a communications analyst at the Air Force Communications Agency Technology Directorate, Scott Air Force Base, Illinois.