Study proposes replacing obsolete methods with new tools based on modeling and simulation.
Many U.S. government facilities rely on outdated security procedures for their computer networks. A recent report by the U.S. Department of Energy, compiled and edited by researchers at the Argonne National Laboratory, cites the need for applying scientific methods and modeling to develop a new generation of computer tools to defend against cyberattack.
Modeling initiatives and new research into predictive systems are required to thwart the increasingly aggressive, ever-evolving cyberattacks on both equipment and data. These efforts are among the recommendations of the report, which calls for a more scientific approach to cybersecurity.
Authored by experts from across the department’s national laboratories, academia and the private sector, A Scientific Research and Development Approach to Cyber Security advocates a “game changing” transformation of network security based on a deep understanding of network dynamics and experimentation. The document notes that network and computer defense traditionally have been reactive rather than anticipatory. The report was edited by Charles Catlett, chief information officer (CIO) at the Argonne National Laboratory.
Catlett compares the current practice of operational cybersecurity to practicing medicine in the late 19th century. At that time, doctors had begun to lay the foundations of modern medicine, but it was a trial-and-error process that involved the sharing of best practices. These historical efforts were not based on underlying theories or models. He notes that for the last several decades, doctors have used models and simulations as opposed to trial and error. “We would really like to move cybersecurity practice forward, but that requires that it develop some tools and models that we can use to base decisions on,” he says.
The application of modeling and simulation for cybersecurity is one of the report’s underlying themes. The study also focuses on architecture for systems and components, distributed applications and computers. Although great scientific effort is focused on the design and development of electronic components and software, Catlett notes that many of the core operational assumptions behind hardware platforms are remnants from the days of mainframe computers. He cites the example of the implicit trust between devices plugged into a machine and the machine itself. “That’s left over from the fact that you used to get the machine and all the peripherals from one vendor. That’s not the case anymore. You plug in devices into a personal computer today that have microprocessors and operating systems within them. If they misbehave, you’re in big trouble if the operating system automatically trusts them,” Catlett shares.
The report also asks CIOs and information technology professionals to think differently about data and information. Catlett says that during the last several years, experts have come to consider data as active instead of passive. This new thinking views data as an interactive object that is aware of its surroundings and any changes made to its information. He notes that he had outside experts from industry and academia review the report for accuracy. What most interested the industry experts was the notion of active data control. “If you’re a software vendor or an artist who has a piece of a movie or a song that you want to release, the value in that software or media is in its distribution. But once you distribute it, you lose control over how it’s used,” he says.
Industry’s approaches to controlling media are heavy-handed and not very effective, Catlett observes. These security methods do not work because the data is treated as a passive object. The report emphasizes a different approach to data that seeks to make information protect itself.
One approach to self-protecting data is to give data sets active, lifelike properties. These properties are analogous to DNA in biological systems: they would identify data sets and allow them to maintain information about identity, provenance and integrity. When sets are combined, the result would inherit the genetics of its parent data sets, enabling users to determine the ultimate origins of the information.
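The lineage-inheritance idea can be sketched in a few lines of Python. The `DataSet` class and its `combine` method below are hypothetical illustrations of the DNA analogy, not an API described in the report.

```python
# Hypothetical sketch of DNA-like lineage tracking for data sets.
# The DataSet class and its methods are illustrative assumptions,
# not a design taken from the report.

class DataSet:
    def __init__(self, name, payload, lineage=None):
        self.name = name
        self.payload = payload
        # A data set's "DNA": the set of original sources it descends from.
        self.lineage = frozenset(lineage) if lineage else frozenset({name})

    def combine(self, other, name):
        # A derived set inherits the lineage of both parents,
        # so its ultimate origins stay traceable.
        return DataSet(name,
                       self.payload + other.payload,
                       self.lineage | other.lineage)

a = DataSet("sensor_a", [1, 2])
b = DataSet("sensor_b", [3, 4])
c = a.combine(b, "merged")
print(sorted(c.lineage))  # ['sensor_a', 'sensor_b']
```

However deep the chain of combinations grows, every descendant carries the full set of original sources, which is the traceability property the report attributes to DNA-like data.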
As the report was being put together, Catlett says that he challenged the authors to imagine technologies that were not in use today but that were just over the horizon in terms of feasibility and practicality. The authors looked at ongoing work in self-correcting software systems. He notes that organizations such as the Defense Advanced Research Projects Agency (DARPA) have undertaken extensive research into active data technologies.
The traditional method for defending networks is to build a series of electronic moats with firewalls to provide a defense in depth. Catlett maintains that the report does not advocate discontinuing the use of passive defenses. However, he adds that passive defense becomes difficult when it must support scientific and military collaboration across international boundaries. “It’s not that we would advocate taking walls down. We are just pointing out that maybe we should take a step back and think if there’s a better way to do this. If we get it right, it doesn’t mean that the moats and walls have to go away, but maybe the area that they sit around is much smaller,” he says.
One way to reduce the dependence on passive defense is to strengthen authentication technologies. The report cites DARPA research into putting code inside data objects that would prevent two pieces of data from being combined into a third piece. Catlett explains that much classified information consists of pieces that are not individually sensitive unless they are combined. Because the government worries about such combinations, current data defense focuses on preventing unauthorized parties from bringing the pieces together. “If you can prevent it [data] from being combined, then you might loosen up how you might protect it,” he maintains.
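A combination guard of this kind can be sketched as a policy check run before any merge. The tag names, policy set and `combine` function below are assumptions made for illustration; the report describes the goal, not this mechanism.

```python
# Hypothetical sketch of a combination guard: each piece of data is
# individually unclassified, but certain combinations of tags are
# forbidden. Tag names and the policy set are illustrative assumptions.

FORBIDDEN_COMBINATIONS = {
    frozenset({"launch_schedule", "site_location"}),
}

def combine(piece_a, piece_b):
    tags = piece_a["tags"] | piece_b["tags"]
    for forbidden in FORBIDDEN_COMBINATIONS:
        if forbidden <= tags:  # merge would cover a forbidden tag set
            raise PermissionError(
                "combination would produce sensitive data: %s"
                % sorted(forbidden))
    return {"tags": tags, "data": piece_a["data"] + piece_b["data"]}

schedule = {"tags": {"launch_schedule"}, "data": ["2024-06-01"]}
site = {"tags": {"site_location"}, "data": ["Pad 39A"]}
# combine(schedule, site) raises PermissionError; either piece alone
# can be handled under looser protection.
```

The design choice matches Catlett's point: if the merge itself is what creates sensitivity, policing the merge lets each individual piece be protected less heavily.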
Fixed cyberdefenses such as firewalls limit information sharing across international boundaries, such as multinational military operations.
The Energy Department report cites the need to develop new techniques for verifying authentication and for actively protecting data.
Technologies such as applying biological techniques to data sets would prevent “mosaic” situations in which unclassified data is combined to produce classified results. Because the data’s origins are traceable through workflow or use, and with distributed data storage, it may be possible in the near future to re-create a data set from a single sample of its DNA. The report states that such a capability would produce a living data set that would be self-organizing and able to recognize a user’s right to access information in specific combinations.
Catlett says that the report seeks to highlight steps to reinvent aspects of cybersecurity that can be changed over several years. One of the document’s goals is to make the work of network administrators and CIOs much easier. He maintains that CIOs should not entrust their cybersecurity entirely to outside experts. While these specialists may be very good, they do not have the tools necessary to do the job. “They are like really good country doctors,” he adds.
Federal agencies and private companies instead must strengthen their own personnel. To make an organization’s cybersecurity team more effective, managers must challenge employees to examine their approach from a high-level point of view. Catlett notes that security teams often operate at a tactical level, applying patches and firewalls and attending to short-term issues rather than viewing the overall security picture. When discussing security issues with experts, he notes, it is important to step back and understand why a growing array of patches and firewalls is necessary before drilling down to individual steps.
Another security consideration is philosophical. “We tend to move against diversity in the information technology space because it’s cheaper to maintain systems to the degree that you get lots of uniformity. It’s also arguably better for users in terms of predicting how to use a system if there’s uniformity,” he says.
However, the downside to uniformity is that monolithic systems are more susceptible to catastrophic failure from a range of attacks. Catlett explains that CIOs tend toward uniformity out of cost consciousness, but he says they must also weigh the potential cost of cybersecurity incidents. After managers have asked their personnel what security models they are running, the next consideration must be how much diversity exists in a system or network. “How diverse are we, and what can we do to guard against catastrophic failure?” he says.
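The diversity question can be made concrete with a simple monoculture check: what fraction of hosts run the single most common platform? The function, fleet inventory and threshold below are assumptions for illustration, not metrics proposed in the report.

```python
# Hypothetical sketch of a fleet-diversity check: if too large a
# fraction of hosts share one platform, a single exploit could cause
# catastrophic, fleet-wide failure. The 0.5 threshold is an assumption.
from collections import Counter

def monoculture_fraction(host_platforms):
    # Fraction of hosts running the single most common platform.
    counts = Counter(host_platforms)
    return max(counts.values()) / len(host_platforms)

fleet = ["macos", "win7", "win7", "win7", "linux", "win7"]
fraction = monoculture_fraction(fleet)
if fraction > 0.5:  # assumed alert threshold
    print("warning: %d%% of hosts share one platform" % (fraction * 100))
```

A perfectly uniform fleet scores 1.0 and a maximally diverse one approaches 1/n, giving managers a number to track alongside cost when weighing uniformity against resilience.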
The report concludes that although computer networks have grown considerably more complex over the decades, current cybersecurity policies remain largely reactive. The document notes that few models exist to verify the effectiveness of security policies, and it adds that there are few adequate methods to extract knowledge and awareness from situational data. “Current approaches to protecting and controlling digital information effectively disable its digital nature in order to reduce the problem to one of physical access, rather than exploiting that digital nature to create self-protective mechanisms,” the report states.
Network security architecture is another weakness because it has not fundamentally changed in 20 years. The study concludes that hardware and firmware continue to be implicitly trusted regardless of their sources, while administrators continue to erect firewalls and gaps to defend passive data with decreasing effectiveness and increasing cost.
But by applying advances in mathematics and computational science, the report maintains, the nature of cybersecurity can be changed. A scientific approach would allow the development of model-based tools that enable platforms and networks to anticipate and avoid attacks and threats. The report concludes by calling for new mathematical capabilities to develop predictive awareness for secure systems, for self-protective data and software, and for designing greater security and trust into platforms.