Securing Information Vexes Defense Planners

November 2008
By Charlotte Adams

John Grimes, the assistant secretary of defense for networks and information integration and Defense Department chief information officer, gives the opening address at the AFCEA Solutions Series’ Information Assurance conference.
Too many hazards, too many vulnerabilities, too valuable a target.

Information assurance, encompassing such objectives as data availability, integrity and confidentiality, is a growing concern in the enormous data processing and communications enterprise run by the U.S. Defense Department and the underlying commercial infrastructure. The Defense Department’s network, known collectively as the Global Information Grid, is powerful but fragile. It also is under constant attack.

These attacks, growing more sophisticated by the day, require a renewed focus on risk assessment, realistic training and continuity of operations under duress, according to government and industry officials who addressed AFCEA’s recent information assurance conference. The latest in the Solutions Series of events, the conference was held September 9-10 in Washington, D.C.

John Grimes, the assistant secretary of defense for networks and information integration and Defense Department chief information officer, indicated the scope of the problem. The Global Information Grid (GIG), he said, embraces 21 satellite communications gateways, 120,000 commercial telecom circuits, some 15,000 networks, 7 million computers worldwide and thousands of warfighting and support software applications. The GIG’s enterprise computing core includes 34 mainframes, 6,100 servers and 1,700 terabytes of storage.

In other words, the Pentagon or the Defense Department “has a big bull’s-eye” on it, Grimes said. The GIG is under surveillance or attack around the clock, including more than 6 million probes or scans a day. And the physical infrastructure is fragile. Grimes described how, six months ago, a ship anchored in the wrong place near Alexandria, Egypt. Its anchor then cut two cables, affecting 90 percent of the communications going into the theater. The Defense Information Systems Agency (DISA) and its partners rectified this situation, but two days later a cable in Indonesia was severed. Shortly thereafter, a military communications cable in Doha lost power.

Worse than the fragile physical infrastructure is the changing threat. Grimes asserted that the threat has evolved from curious teenage hackers to industrial spies and cyberwarriors. He cited denial-of-service cyberattacks in Estonia and Georgia in 2007 and 2008, respectively. His presentation raised the question: Were these episodes hacking incidents, crimes or acts of war—and who decides? “If you want to bring a nation down without firing a kinetic weapon, you will soon find that a nonkinetic [weapon] can bring you down just as quickly,” Grimes emphasized.

He cited statistics from the U.S. Computer Emergency Readiness Team (U.S.-CERT), which claimed that security incidents were up 55 percent in fiscal year 2007 compared to fiscal year 2006. U.S.-CERT describes itself as a partnership between the Department of Homeland Security and the public and private sectors to protect the Internet infrastructure and coordinate defense against and responses to cyberattacks across the nation. Grimes also asserted that 1.3 percent of all Web searches link to “infected” sites—more than 59 million Web pages. In 2008 alone, moreover, 72 percent of networks with more than 100 PCs were infected, he said.

John W. Thompson, chairman of the board and chief executive officer of Symantec Corporation, warns of attacks on information sources.
John W. Thompson, chairman of the board and chief executive officer of Symantec Corporation, presented similar evidence, pointing to a semiannual survey his company publishes. According to the survey, more than 70 percent of observed Internet attacks were going after the information sources, rather than individual personal computers or network pipes. “It was clearly targeted at a very specific information source where there is opportunity for financial reward,” Thompson said. In the past few years, he stressed, more than 3 billion records have been exposed. You can buy not only credit cards, but “complete identities” online, Thompson claimed.

More malicious code is created every day than good code, Thompson added. If this trend continues, techniques such as “white listing” may have to be employed. This management-by-denial strategy allows only code that has been demonstrated or is known to be good to execute on a device, he explained. The bottom line, he said, is that organizations have to increase their understanding of what data they have, where it exists in the infrastructure and what precautions should be taken to protect it.
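The management-by-denial idea Thompson described can be illustrated with a minimal sketch. Everything here—the function names, the allow-list contents and the sample binaries—is hypothetical; real white-listing products operate at the operating-system level, but the core logic is simply a lookup of a cryptographic fingerprint against a list of known-good code:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Fingerprint a binary by its SHA-256 digest."""
    return hashlib.sha256(data).hexdigest()

# Allow-list of digests for code known to be good (hypothetical entries).
ALLOWED = {sha256_of(b"trusted-binary-v1")}

def may_execute(binary: bytes) -> bool:
    """Management by denial: run only code whose digest is on the allow-list."""
    return sha256_of(binary) in ALLOWED

# A known-good binary passes; anything unrecognized is denied by default.
print(may_execute(b"trusted-binary-v1"))  # True
print(may_execute(b"unknown-malware"))    # False
```

The contrast with a traditional blacklist is the default: unknown code is refused rather than permitted, which is why the approach becomes attractive once malicious code outnumbers good code.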

According to Symantec, university networks are particularly vulnerable, as these institutions seek to foster more open interchange. Last year, 24 percent of information-based attacks were focused on the educational environment, Thompson said. But a close second was government, which was targeted in 20 percent of these attacks.

Vice Adm. Nancy Brown, USN, director for command, control, communications and computer systems (J-6), the Joint Staff, described the challenges of the current conglomeration of systems and services, which she called the “GIG 1.0.” It is a collection of service stovepipes, or intranets, tied together through the Defense Information System Network (DISN) backbone, she said. Among the challenges with the GIG 1.0 are its lack of a common network architecture and policies, multiple “netops” constructs and information-sharing requirements.

Panelists in a town hall session on securing the Web are (l-r) panel facilitator Bob Lentz, deputy assistant secretary of defense for information and identity assurance; Vice Adm. Nancy Brown, USN, director, command, control, communications and computer systems (J-6), the Joint Staff; Dr. Christopher Kubic, chief architect for the National Security Agency’s Information Assurance Architecture and Systems Security Engineering Group; and David Mihelcic, chief technology officer, Defense Information Systems Agency.
The GIG today is really more like version 0.2 than 1.0, said Dr. Christopher Kubic, chief architect for the National Security Agency’s Information Assurance Architecture and Systems Security Engineering Group. He advocated the need to “get away from the system-high environment” and move to a “transactional enterprise information assurance” protection model. On his to-do list also are the strengthening of demilitarized zones (DMZs) between the Secret Internet Protocol Router Network (SIPRNET) and the Nonsecure Internet Protocol Router Network (NIPRNET), improving the computer network defense architecture, enhancing access control to information on the SIPRNET and increasing the ability to assess information assurance compliance.

DISA, which deals with the GIG on a daily basis, deployed tiger teams from the agency, intelligence groups, the services and the Joint Staff to better understand the issues with Internet protocol (IP) networks, transport communications networks and end-user systems, said David Mihelcic, the agency’s chief technology officer. “We found—no matter how bad you think it is—it’s worse,” he said.

The unclassified NIPRNET, which handles essential functions such as air transport scheduling and other logistics information, has 16 connections to the global Internet. “Like it or not, our future is completely bound to the global Internet,” Mihelcic said. Among the many NIPRNET problems are the lack of a well-established firewall perimeter, absence of DMZs where outward-facing servers can reside, and users who “still are not doing basic hygiene things,” he said. The NIPRNET also has been subject to “exfiltrations,” the Defense Department’s Grimes added. This would not be so serious except that, when aggregated, these leaks of unclassified/sensitive information escalate to Top Secret in classification.

Nor is the SIPRNET problem-free. Mihelcic described it as “hard and crunchy on the outside but soft and chewy on the inside.” It relies on a combination of physical security and either end-to-end or hop-by-hop encryption. But the physical security is uneven, and one breach in physical security could lead to a breach in the network. The networks basically “happened,” he said. They grew from the bottom up rather than being designed from the top down. But now it is time to “move from this ad hoc collection of stuff” to something that is designed for a particular purpose, he declared. Terminals need to be hardened, servers need to be moved to more secure enclaves, and a common and joint methodology needs to be used for handling identity and access control.

The U.S. economy also is a major target, and much proprietary information could be at risk. The typical organization experiences approximately 50-percent growth in digital content per year, according to Symantec’s Thompson. “Unstructured data,” moreover, the bits and bytes existing outside of the relational databases that drive much critical decision making, is growing by 60 percent to 70 percent per year. So great is this avalanche that if information continues to grow at the current rate, soon there will be more of it than there are individual grains of sand on every beach on all the oceans of the world, he predicted.

Information also is becoming more distributed and mobile than ever, Thompson said. He stressed the need to strike a balance between infrastructure and information protection and to “move protection closer to the information.” He called for one common way to look at evaluating breaches and one common set of laws to understand and identify the consequences. Right now, Thompson said, 30 states have their own views about how data breaches ought to be managed.

On the other hand, industry needs to adopt a risk-based approach to information sharing, he said. “An encryption application is like a carpenter with a hammer—to him every problem is a nail,” Thompson said. But this approach can be self-defeating because of the high back-end infrastructure cost. There already is an issue with redundancy, he noted, adding that users should “start to de-dupe some of the data.” They may not know it, but there are half a dozen to a dozen copies of every e-mail somewhere in their systems. A risk-based approach would tackle this problem by limiting its scope. But it would require understanding the data users have, knowing where it exists in the infrastructure and then deciding what should be protected.
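The de-duplication Thompson recommends can be sketched in a few lines. This is an illustration only—the mailbox contents are invented and real storage systems de-duplicate at the block or attachment level—but the principle is the same: key each message by a hash of its content and keep a single copy.

```python
import hashlib

def dedupe(messages):
    """Keep one copy of each distinct message, keyed by its content hash."""
    seen = {}
    for msg in messages:
        digest = hashlib.sha256(msg.encode()).hexdigest()
        seen.setdefault(digest, msg)  # first copy wins; later duplicates are dropped
    return list(seen.values())

# Hypothetical mailbox holding several copies of the same e-mail.
mailbox = ["quarterly report", "quarterly report", "meeting notes", "quarterly report"]
print(dedupe(mailbox))  # ['quarterly report', 'meeting notes']
```

Applied across an enterprise, collapsing the half-dozen to a dozen copies of every e-mail directly reduces the back-end infrastructure cost that Thompson warned can make blanket encryption self-defeating.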

If data is so crucial to today’s military and commercial operations, it is important to understand the situation with respect to adversaries in cyberspace. “We have to stop fooling ourselves that we can keep adversaries out of our cyberspace” and look at the “integrity aspect—how to reconstitute trust,” said Anthony Bargar, a senior policy analyst who handles GIG mission assurance in the Office of the Deputy Assistant Secretary of Defense for Information and Identity Assurance. He emphasized the task of ensuring that the GIG works under fire, deflects attacks, recovers and restores trust in the information. Other aspects of recovery include figuring out what decisions were made against bad data and enabling the continuance of a necessary minimum of essential functions. “We cannot secure everything at all times. We cannot make everything resilient and redundant at all times,” he said.

Among the key initiatives Bargar stressed was increasing the Defense Department’s ability to plan, simulate and execute more realistic exercises under “cyberduress” in order to fulfill the mantra, “train as we fight.” This will involve writing better playbooks, creating more realistic scenarios and focusing on lessons learned, as well as creating “live fire areas, with digital bits flying, to be able to look at the pros and cons of how we do business.” He wants to “get past these notions that the network will always be there and that we’ll be able to trust it” and to increase understanding of the “physics of a sophisticated cyberattack.” The military has to start planning for what happens in a worst-case scenario, when the adversaries change tactics and techniques “from exfiltrating data to manipulating [it].”

Bargar further suggested that managers may be underestimating the scope of the problem, “making risk decisions with blinders on.” Certification and accreditation processes often end at the fence line, he said. Network centricity has made it very difficult to know all the components within the GIG. “It’s difficult to know where the other end of the wire goes.” But the department needs to do a better job at that, he added.

Risk decisions need to be made with a view toward implications beyond one’s immediate area. “Cyberspace is not an AOR [area of responsibility] issue,” Bargar said. Though they are difficult to understand, the cross-cutting “cascade effects” of a cyberattack cannot be ignored. Echoing calls for increased government involvement and funding, Bargar said, “We’ve got to be able to get past the notions that we’ll be able to keep cyber on the cheap and have it there when we really need it.” The amount of funding devoted to information assurance with resiliency and diversity is very small, he said. And resilience is a means of cyberdeterrence. People are thinking now about the concept of mutually assured destruction as it applies to cyberdeterrence, he added, although it is a lot more difficult than nuclear deterrence because there are so many adversaries involved.

Two other areas in need of more attention are “increasing situational awareness throughout the OSI [Open Systems Interconnection] stack” and transforming business processes by using “mission-driven risk management” in network operations, Bargar stated. He cited a 2008 study by the Defense Department inspector general that said that some 60 percent of mission-critical systems lack contingency plans. They “haven’t felt the pain of a degraded environment,” he said. It is important to “relook at the risk management models and protect just enough,” he said, but the hard part will be defining what is just enough. Adm. Brown also emphasized that “not everything needs protection.” Building a “moat around everything” does not provide the necessary agility, she said.

Russell Rochte, a professor at the Joint Military Intelligence College, posed some difficult dilemmas. One needs to ask what capability is most important to preserve, which governmental functions and corporate services should be maintained and at what cost, he said. The political conversation is ongoing, Rochte noted, including privacy rights and the obligation for public defense. But the risk of being wrong is “our national structure in the event of a catastrophic attack.” Another fraught issue is the Defense Department’s decision in the 1980s to use leased lines. The decision was driven by financial concerns, Rochte said. But the business case for redundant, separate networks might be national survival.

It also is important to understand where the adversary is coming from, Rochte stated. What if the U.S. military engages in a set-piece cyberbattle while the adversary adopts political warfare? What if an adversary takes down Visa’s processing capability the day after Thanksgiving? Or, he suggested, what if the U.S. infrastructure experiences an Estonia-type denial-of-service attack when quarterly financial statements are due?

The picture is no prettier on the industry side, according to Don O’Neill, president of the Center for National Software Studies. The current critical infrastructure protection model is insufficient to ensure continuity of operations for critical missions, he said. It is an “accidental” collection of legacy systems that are becoming increasingly integrated, interoperable and interdependent in ways that are difficult to understand.

While the Department of Homeland Security is “driving sector-oriented activity,” O’Neill said, the government is not addressing “cross-sector [critical infrastructure] issues.” If the government has any oversight role, it is time for it to exert that oversight before the system becomes too complex to understand and control, he said. An audience member took issue with O’Neill’s cross-sector viewpoint, citing the example of the IT Sector Coordinating Council. A partnership for critical infrastructure security has been established, the attendee said, and sector leaders meet regularly to discuss national security issues.

The current “protection model” nevertheless is insufficient to ensure continuity of operations for critical missions, O’Neill said. Managers need to focus on resilience and move from the “static combination lock” approach to a dynamic strategy more like a chessboard. Industry sectors are not coordinating on matters such as recovery time objectives, he claimed. In fact, executives do not even want to know too much about problems they cannot control because of the “moral hazard” that would create.

Bill Neugent, a MITRE fellow, echoed this warning. “The critical infrastructure is about as secure as beach cottages on a sandbar,” he said. “To a great extent, we’re naked in cyberspace.” And the United States is probably the “most IP-dependent country” in the world.

“We need to assume the bad guys are inside,” Neugent continued. This means that managers need to think about “how to deal with resilience assurance, situations when protection fails.” As did Bargar, he stressed the need “to exercise for real” and “to stop pretending that the adversary has no cyberforce.” For one thing, “we need to start thinking like the bad guys,” he said. The goal should be not to destroy their networks but to “own” them. And commercial business cannot afford this, he asserted. “It has to be a public/private partnership.”

Photography by Michael Carpenter
