Low-Tech Humans Subvert High-Tech Information Assurance
The tragic events of September 11 provide ghastly substance to the metaphor of asymmetric warfare. And they add credence to prescient but nebulous warnings of threats to homeland security and concomitant vulnerabilities of critical infrastructures.
While public switched networks (PSNs), cellular telephones, wireless networks and the Internet—the backbone and heart of the U.S. information infrastructure—were not prime targets, the cascading consequence of collateral damage to information systems was laid bare. The information infrastructure was found wanting in support to intelligence collection, law enforcement, disaster mitigation and recovery efforts.
A shortfall in network capacity, single-node sensitivity and the lack of interoperability among police, fire and first responder networks exposed gaps in the road to information assurance that cannot be filled solely with new technology, firewalls, anti-virus patches or cryptography.
A once indifferent but now belatedly aroused public clamors for government action, so money will be provided—perhaps thoughtlessly. The Information Technology Association of America proposes spending $10 billion for information technology (IT) security, and Senator Joseph Lieberman (D-CT) proposes a $1 billion IT fund to jump-start some of the more pressing security requirements in government and the private sector. But these resources may be wasted in fruitless quest of a technical silver bullet if we overlook problems created by humans who misuse available technology.
The United States has yet to conduct a comprehensive national threat assessment of its information systems. Nevertheless, the ability to transact business, operate government and respond to physical, chemical, biological or nuclear attacks will be constrained by the capacity, accessibility, reliability and security of the information infrastructure.
The groundwork for a serious effort to improve information assurance was laid on October 16, 2001, by the presidential executive order “Critical Infrastructure Protection in the Information Age.” This directive elevated Richard Clarke to the position of special adviser to the president for cyberspace security, where he will chair a Critical Infrastructures Protection Board operating under the new cabinet-level Office of Homeland Security. Also, the significant resources of the National Communications System (NCS) have been brought to the fore, with its Committee of Principals redesignated as the Committee on National Security and Emergency Preparedness.
A first step would be to heed the calls for caution expressed by Bruce Schneier in his book Secrets & Lies: “If you think that technology can solve your security problems, then you don’t understand the problems and you don’t understand the technology.”
History is replete with examples of how human apathy, ignorance, carelessness, greed or disloyalty has subverted the best security that technology could provide at the time. So, proposals for reform should first confront that weakest link—the human operator and the predisposition to abuse technology.
The information security needs of the national defense establishment once were provided quietly, efficiently and almost exclusively with tools, techniques, policies and processes fashioned by the National Security Agency and its progenitors. They were aided by the laboratories of a chosen set of contractors, and their measures were used by trained and properly vetted personnel.
But those devices and measures were expensive, often complicated, difficult to install and maintain, relatively slow in operation and easily bypassed. Also, their keys were readily accessible to adversaries who purchased access from disloyal employees.
Frustrated to find that commercial information technology in their administrative offices outperformed military systems in their command posts, senior commanders argued for more off-the-shelf solutions. Their pleas to permit commercial information hardware and software in command and control and weapons applications were heeded, resulting in a sea change in acquisition policy. Military systems increasingly rely on commercial products and services where functionality and cutting-edge applications routinely take precedence over security. The result is an increase in opportunities for penetration, theft and corruption of data, and denial of service.
Analyses attribute more than 70 percent of security breaches to the insider. Technology consistently has produced techniques that could reduce vulnerability, but the user just as consistently has discovered ways to ignore, bypass or misuse these tools.
It need not sully the heroic efforts of the wizards at England’s Bletchley Park—they broke the German Enigma cipher in World War II—to admit the seminal role played by Reichswehr Cipher Center employee Hans-Thilo Schmidt, who sold the Enigma operating manuals to France. Nor should we overlook the Poles, who built and then turned over a working model of this marvelous “unbreakable” device to the English, or the Luftwaffe signalmen who used the names of their girlfriends in key settings. As David Kahn notes in The Codebreakers, “The [effort] succeeded only with the help of stolen or otherwise compromised materials.”
During the Korean conflict, combat forces had a very secure one-time-tape encipherment system to encrypt radio teletype traffic. However, classified messages routinely were transmitted in the clear by operators who failed to follow simple operating procedures.
A voice encryption device was underutilized during the Cold War allegedly because Gen. Curtis LeMay, USAF, said, “It sounds like Donald Duck.” So his senior officers often “talked around” classified subjects over unprotected high frequency radio circuits.
Even though encryption devices were vastly improved by the 1960s, security monitors regularly reported military plans being handed to Viet Cong and North Vietnamese forces by personnel who casually planned forthcoming missions over unprotected voice circuits.
The Cold War found U.S. military forces equipped with an array of highly secure encryption devices. However, keys to the entire U.S. Navy tactical cryptologic structure were sold to the Soviet Union by John A. Walker Jr. and Jerry Whitworth.
Finally, as two agencies in the U.S. intelligence community recently learned, a trusted insider easily can exploit a dedicated and secured virtual Internet protocol (IP) network.
We find ourselves engaged today in a new type of war that threatens the information infrastructure. Yet, an apparent indifference toward information security persists. Richard Clarke says that attacks on the nation’s critical IT infrastructure could potentially cause “catastrophic damage to the economy.”
Dependence on information that once was temporal is now absolute. However, access to that information is routinely denied by security holes in the commercial software that runs servers and in communications protocols that were designed for cooperative ease of use, not security.
Data are now just as vulnerable to theft or to corruption in storage as they once were only during transit. Yet, effective asymmetric encryption techniques, such as Pretty Good Privacy (PGP), are underutilized, and software routinely is purchased from unknown and unknowable third parties in foreign lands.
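The idea behind asymmetric techniques such as PGP is that data encrypted with a public key can be recovered only with the matching private key, so files are protected in storage as well as in transit. The following toy sketch illustrates the principle with deliberately tiny primes; real systems use keys hundreds of digits long and wrap a fast symmetric cipher, and nothing here reflects PGP’s actual implementation.

```python
# Toy RSA with tiny primes -- illustration only. Production keys are
# 1024 bits and up, and asymmetric encryption normally protects a
# symmetric session key rather than the data itself.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi)

def encrypt(m, e, n):
    """Anyone holding the public key (e, n) can encrypt."""
    return pow(m, e, n)

def decrypt(c, d, n):
    """Only the holder of the private exponent d can decrypt."""
    return pow(c, d, n)

message = 42
ciphertext = encrypt(message, e, n)
assert ciphertext != message              # stored form is unreadable
assert decrypt(ciphertext, d, n) == 42    # private key recovers it
```

The point for stored data is that the decrypting key never needs to reside on the machine where the ciphertext sits, which is what makes theft of the stored copy far less damaging.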
Amateurs breach information systems and cause losses estimated in the billions of dollars, but security patches to detect or block intrusions either are not installed or are not kept current. We continue to buy technology so high in vulnerability that it literally invites assault.
Both industry and the military now are rushing to embrace wireless networks and mobile computing where, advertising hubris to the contrary, encryption is still weak. With these, vulnerability has been taken to even higher levels by threats such as drive-by hackers and the loss of laptops and personal digital assistants.
It is the policy of both the Clinton and Bush administrations that the federal government should lead by example in information security in a “voluntary public-private partnership involving corporate and nongovernmental organizations.” But the mantle of federal leadership is not enhanced by an October 2001 congressional panel assessment of protection in federal computer networks that reports a drop in grade from “D-minus to F.”
Perhaps it is time to rethink policy and use public funds to selectively ruggedize parts of this privately owned information infrastructure upon which homeland security rests. But what is the proper focus for such investment? The widespread use of poor security practices by people points to structural faults in the technical baseline.
How well did information systems work on September 11? Despite enormous damage to a Verizon central office sited near the World Trade Center that left millions of callers with no dial tone (300,000 voice and 3 million data lines, according to an article in Wired magazine), the PSN functioned as it was designed to. This was because controls for priority access and circuit allocation had been put into place by the NCS during the Cold War. Accredited officials had priority access, and incoming calls to Manhattan were blocked to prevent trunk overload.
Cellular telephone networks were saturated in damaged areas, which made a case for a PSN-type priority and pre-emption capability in mobile networks to ensure network access to first-responders.
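The priority-and-pre-emption idea can be sketched as a simple admission model. This is a hypothetical illustration, not any carrier’s actual scheme: when every channel at a cell site is busy, a new high-priority call pre-empts the lowest-priority call in progress, so a first responder is never blocked by ordinary traffic.

```python
import heapq

class CellSite:
    """Toy cell site that admits calls by priority.

    Lower numbers mean higher priority (0 = first responder).
    When all channels are busy, a new call pre-empts the
    lowest-priority call in progress, if one exists.
    """

    def __init__(self, channels):
        self.channels = channels
        # Min-heap on (-priority, id): the lowest-priority active
        # call (largest priority number) sits at the root.
        self._active = []

    def request(self, priority, call_id):
        """Try to admit a call; return True if it gets a channel."""
        if len(self._active) < self.channels:
            heapq.heappush(self._active, (-priority, call_id))
            return True
        worst_neg, worst_id = self._active[0]
        if -worst_neg > priority:  # an active call is lower priority
            heapq.heapreplace(self._active, (-priority, call_id))
            return True            # pre-empted worst_id
        return False               # site saturated, caller blocked

site = CellSite(channels=2)
assert site.request(5, "citizen-1")      # admitted: free channel
assert site.request(5, "citizen-2")      # admitted: free channel
assert site.request(0, "fire-chief")     # admitted: pre-empts a citizen
assert not site.request(5, "citizen-3")  # blocked: nothing lower priority
```

The design choice mirrors the PSN controls described above: scarcity is resolved by accreditation, not by arrival order.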
Those denied access to the PSN and cellular nodes rushed to e-mail and instant messaging on the Internet, demonstrating saturation-avoidance features inherent in the IP network that ensure routing around damage and optimal use of long distance bandwidth. But packet networks also can be forced into overload, message traffic will back up, and voice over IP then becomes impractical.
If the homeland is now a battlefield bereft of combat-capable information systems, and a war is to be waged by a host of first-responders, what changes must be made to put peacetime information systems on a wartime footing? The answer, says Frank Rose writing in Wired, is “a ground-up redesign of the entire local telecom system, with massive bandwidth and enough routers to handle huge loads of traffic.”
A more realistic approach begins by adopting the attributes of any tactical military information system. That means investment in hardened, redundant, dispersed switching and router nodes employing dissimilar software, multiple alternative transmission paths, and an order-wire that does not go down with the mission channel. Since redundancy is viewed by business as an undesirable cost, not a benefit, this necessarily will mean federal funding.
Clarke has invited industry to evaluate a government Internet, dubbed GovNet, to be shared only by authorized users and having no connections to the Internet or other networks. GovNet would use dedicated, encrypted, fiber optic channels to help isolate it from viruses, worms and other attacks moving through the public Internet.
In such a network, tight control over the router software must be maintained, and servers must be operated by a highly trained and well-paid core of administrators using the same personnel reliability procedures as are used for nuclear weapons.
User terminals must be “dumb” or “thin.” That is, they must provide authenticated users with nothing more than a keyboard, screen and dial tone to a secure server—not the Pentium 4 terminals routinely used and abused by today’s work force.
Structural changes must be accompanied by continual security awareness training sessions and enforced by harsh sanctions for all infractions. This architecture also lends itself to identification of a mole, since all transactions can be scanned, identified and traced with full assurance of authentication.
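One common way to make every transaction traceable and tamper-evident—a sketch of the general technique, not a description of GovNet’s actual design—is a hash-chained audit log: each entry binds an authenticated user and action to the digest of the previous entry, so an insider who later edits the trail breaks the chain.

```python
import hashlib
import json

def append_entry(log, user, action):
    """Append an audit record chained to the previous entry's digest."""
    prev = log[-1]["digest"] if log else "0" * 64
    record = {"user": user, "action": action, "prev": prev}
    payload = json.dumps(record, sort_keys=True).encode()
    record["digest"] = hashlib.sha256(payload).hexdigest()
    log.append(record)

def verify(log):
    """Recompute every digest; any edited entry breaks the chain."""
    prev = "0" * 64
    for entry in log:
        record = {"user": entry["user"], "action": entry["action"],
                  "prev": prev}
        payload = json.dumps(record, sort_keys=True).encode()
        if (entry["prev"] != prev or
                entry["digest"] != hashlib.sha256(payload).hexdigest()):
            return False
        prev = entry["digest"]
    return True

log = []
append_entry(log, "analyst7", "read file A")
append_entry(log, "analyst7", "copy file A to removable media")
assert verify(log)

log[1]["user"] = "someone-else"   # a mole rewrites the trail...
assert not verify(log)            # ...and the tampering is detected
```

In practice the chain would be anchored by signing periodic checkpoints and storing them off the server, beyond the administrator’s reach.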
Paul Strassmann, former director of defense information, U.S. Defense Department, suggests the way to quickly and cheaply test GovNet is by carving out a “domain already operating under central control; installing a secure net that performs a few but most-often-used critical tasks such as e-mail, text and presentation slides; and labeling ‘insecure’ the existing devices and nets of questionable security, permitting them to atrophy gradually.”
Many in the work force will be upset with changes that sever unfettered and misused access to the Internet, but activities of the malicious insider will be markedly constrained.
This new architecture demands little from research and development beyond that necessary to maintain parity with hacker threats. A thin-user architecture will improve security and will substantially lower acquisition and seat maintenance costs.
The most compelling argument for a new approach to information assurance is that it can be implemented incrementally with minimum perturbation to an ongoing enterprise.
Col. Alan D. Campen, USAF (Ret.), is a contributing editor to SIGNAL, adjunct faculty member of the National Defense University School of Information Warfare and Strategy, and a contributing editor to four books on information and cyberwar.