The Earthly Realities Of Cyberspace

Information technology is widely and often wildly heralded as the key to rearming the U.S. military for networked conflict, marshaling a host of disparate and dispersed bureaucracies to secure the homeland, and exporting American principles of liberty and justice. But, cloaked by hubris is the indisputable fact that the worth of information technology is established not by how much it costs, but how intelligently it is employed and how well it satisfies user needs.
By Col. Alan D. Campen, USAF (Ret.)

Why information technology can’t connect the dots.

Recent events have stressed the nation’s information networks, revealing vulnerabilities and implicit but ignored vital relationships among culture, organization and management, all of which combine to determine the performance of information systems. Surveys of the utility of technology programs in both the public and private sectors yield assessments ranging from disappointing to dismal—the metric of merit being how well system performance matches operational requirements, costs and schedules.

The future shock that Alvin Toffler predicted in his 1970 book of the same name has struck broadly and deeply, startling inventors, investors, providers and consumers alike, and it challenges assumptions about the availability, reliability, integrity and security of information technology products and services—the essence of information assurance.

Some earthly realities of cyberspace will both point and pace transformation.

The information industry and its exuberant investors were astonished to discover that insanely great technology does not itself a market make, and that Metcalfe’s law of exponential network growth is not a sound foundation for business plans. Grossly overbuilt and undersubscribed, 90 percent of fiber optic cables are dark, and all but a few long-haul communications carriers have failed—a shocking unintended consequence of the Telecommunications Act of 1996 enacted to spur competition.
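The arithmetic behind that optimism is easy to state. Metcalfe’s law takes the value of a network to be proportional to the number of possible pairwise connections among its nodes, n(n-1)/2, which grows roughly as the square of n while build-out costs grow only roughly linearly. A brief sketch (illustrative only, not from the article) shows how quickly that projected value outruns node count:

```python
def metcalfe_value(n: int) -> int:
    """Number of distinct pairwise connections among n nodes,
    the quantity Metcalfe's law treats as a proxy for network value."""
    return n * (n - 1) // 2

# Value grows quadratically while the node count grows linearly --
# the projection that encouraged overbuilding when subscribers
# failed to materialize at the assumed rate.
for n in (10, 100, 1000):
    print(f"{n:>5} nodes -> {metcalfe_value(n):>7} possible connections")
```

The flaw the article identifies is not in the formula but in treating every possible connection as realized, paying revenue: dark fiber represents capacity built for connections that never came.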

Also, except where serving limited, highly specialized needs, communications satellite technology has been trumped by cellular telephony which, in turn, is being overtaken by multimedia messaging, which may be aced out by ultrawideband, fourth-generation, local mesh (WiFi) networks, or an as-yet-unknown and equally short-lived technical marvel.

Venture capital is hibernating, abandoning the information technology market to confused individual investors reeling in shock from Global Crossing, WorldCom and others, and paralyzed by fear of shoes yet to drop.

The U.S. Defense Department, which eagerly embraced commercial off-the-shelf (COTS) products, now has only a fuzzy view of information technology products and services that actually will be available to satisfy its specialized and rapidly changing needs. Further, the department has so little buying power that it is powerless to leverage investment decisions in the private sector.

Adm. Dennis C. Blair, USN, former commander in chief of the U.S. Pacific Command, has characterized the defense acquisition process as “fundamentally broken” and unable to keep up with rapidly changing military needs, so it cannot take advantage of emerging technologies (SIGNAL, April, page 67). But there are some hopeful spots in his dismal assessment. Changes are underway in defense acquisition strategy that could more closely couple developers and operators. Field experiments of COTS products could lead to 80 percent solutions—build a little, test a little, buy a little—with products, doctrine and training fielded in months rather than decades.

The U.S. Joint Forces Command has the charter to capture early meaningful results from the Joint Warrior Interoperability Demonstration, and advanced concept technology demonstrations may finally give substance to what many fruitlessly advocated almost two decades ago as evolutionary, now renamed spiral, acquisition.

In his 1996 Declaration of the Independence of Cyberspace—bellowed in protest of the 1996 Telecommunications Act—John Perry Barlow rudely invited the “governments of the industrial world … to butt out of cyberspace. … No elected government [has a] moral right to rule us nor do you possess any methods of enforcement we have true reason to fear.” Now, even the most impassioned netizen must acknowledge that those who produce information content deserve compensation for their labors and protection of intellectual property rights. Inventors are entitled to fair return on investment. Internet travelers should be able to journey securely, and citizens rightfully expect the privacy and security promised by the Constitution.

However, the preconditions for stability, security and predictability in the information age will be satisfied best through protocols, regulations and laws enforced by governments. Perhaps critics eventually will applaud what some now condemn as obnoxious, officious meddling of government regulators.

Ever escalating hacker attacks notwithstanding, fears of debilitating effects on the nation’s information infrastructure do not seem to be widely shared. In a Washington Post article by Barton Gellman, Richard Clarke, the president’s point man on cybersecurity, is quoted as accusing corporate chief executive officers of spending more on their coffee mess than on information security.

While most authorities agree that a cyberattack alone would not bring this nation to its knees, there is growing recognition that even minor disruptions to information networks would amplify the effects of physical, chemical or biological attacks and grievously impede the labors of first responders in damage control and consequence management.

Paul Strassmann, NASA’s acting chief information officer, once proclaimed software to be “one of the most poorly constructed, unreliable and least maintainable technological artifacts ever invented by man,” with quality so poor that it could only get better. In his article in the Massachusetts Institute of Technology’s Technology Review, Charles C. Mann reports that software experts think quality is worse today and that “some engineers argue the only solutions are litigation and legislation”; otherwise, they warn, the government will step in and regulate the industry. Cybersecurity expert Bruce Schneier agrees that technology firms are not likely to improve the security of their products until they begin to face product liability lawsuits or more stringent laws.

In a June 2002 article titled “Buggy Whipped,” The Economist weighs in on the debate over software quality, reporting that while most respondents to its survey (drawn mainly from the software industry itself) agreed on the magnitude of the problem, most felt that product liability laws were not the answer. Why? This immature industry has no quality or reliability baseline upon which to evaluate performance and render judgments.

The Economist acknowledges that at least the software industry is seeking to change things for the better—hardly a confidence-building response for the military’s come-as-you-are warfighters.

“On the Internet, nobody knows you’re a dog,” is a famous cartoon caption about anonymity on the Internet and should temper expectations of those who view the undisciplined World Wild Web as an effective vehicle for conducting international public diplomacy. Once hailed as the purveyor of peace through understanding, the Internet has become, in the words of New York Times commentator Thomas Friedman, “at its ugliest, just an open sewer, an electronic conduit for untreated, unfiltered information … anything but a vehicle for understanding.” Quoting another author, Friedman continues, “The world is being wired together technologically, but not socially, politically or culturally. It’s the message, not the media that counts.”

Reporter Megan Lisagor of Federal Computer Week quotes Rep. Curt Weldon (R-PA) as saying, “Our communications system in this country is a total, dismal failure.” Probably an overstatement, but the first responders who arrived at the World Trade Center with incompatible communications systems, saturated radio frequencies and jammed cellular telephones might agree.

But communications outages on September 11, 2001, were localized and of short duration. Essential connectivity was quickly restored, and valuable lessons were learned about how and where to buttress the nation’s infostructure. Co-location of switching hubs trades off effectiveness for economy. Storage, routing and switching centers should not be limited to mid-town urban areas, and tall buildings are poor locations for antennas that serve essential public needs.

Further, local law enforcement and fire-fighting units will be hampered without assured access to radio frequencies, and September 11 delivered a wake-up call on management of the radio frequency spectrum. The United States must not treat spectrum as a source of revenue. It is a finite, precious resource to be preserved, protected and allocated for societal as well as industrial purposes.

On a positive note, the Cellular Telecommunications and Internet Association admits that the 1755-megahertz to 1850-megahertz band “is a valuable, necessary resource for military and homeland defense” and has reduced its demands for access to those frequencies for “a decade and probably more.”

Finally, it is instructive to note that the public switched network (PSN) performed under stress as it was designed to do, providing a model for the wireless component of the information infrastructure as society goes mobile. That the PSN performed well was no accident: The criticality of telecommunications to national purposes was recognized early in the Cold War. The organization, process and procedures for centralized, coordinated control of switched networks have been in place since the National Communications System was formed in 1963 and its National Security and Emergency Preparedness component created by executive order in 1981.

Dots are data that represent facts. Information technology will not connect facts, but it can connect trained analysts and foreign area specialists and people skilled in languages and cultures who can make sense out of isolated snippets of data. However, as Joint Staff Director for Intelligence, J-2, Rear Adm. Lowell E. Jacoby, USN, cautions, “There are no future facts.” The terrorist attacks exposed a void of imagination and collaboration more than an absence of facts.

As the intelligence community pursues a Darwinian transformation, it must shed itself of the Green Door and other excess Cold War baggage and shift from deductive reasoning of facts about a well-known adversary to inductive assessment of the unknown and often unknowable.

Law enforcement also confronts a cultural change that information technology can do little to ease. The Federal Bureau of Investigation is a highly decentralized, reactive, paper-based, criminal investigative organization with limited skills in analysis of intelligence, a culture nurtured by laws and a tradition that constrains collaboration and data sharing. This will be, in the words of an editorial in the Washington Post, “an enormous undertaking that cuts against its basic organizational culture.”

U.S. military forces have proved remarkably adept in exploiting multiple sensors, high-data-rate airborne and satellite communications and precise geolocational accuracy to dramatically increase efficiency and effectiveness of kinetic warfare while reducing the risk to warriors and noncombatants. But information operations reach beyond the battlefield, striving for a much more expansive and elusive goal. The military is challenged to integrate information operations fully into deliberate and crisis action planning as well as to support combat operations.

Information operations someday may become a military core competency, but this will not happen until vacuous phrases such as information superiority and knowledge dominance give way to specific, measurable objectives articulated in terms such as persistent surveillance or time-critical targeting. These terms provide the technical and operational specificity that is so essential to wedding information technology to military organization, doctrine and tactics.

Finally, information will not become a weapons system until a trained career force is in place to convert broad philosophy into operational art. Defense Planning Guidance 2004 begins that process, but results are years away.

Technologies like ground- and space-based lasers, dense wavelength division multiplexing and a wideband Internet protocol network will expand bandwidth and data capacity by orders of magnitude. This will momentarily shift the burden from communicators—long criticized for failing to provide adequate system capacity—to the backs of the users, who now must figure out how to function and survive in an information-rich battlespace.

Assistant Secretary of Defense for Command, Control, Communications and Intelligence John P. Stenbit has a vision of a 10-gigabit-per-second Global Information Grid based on a “pull” operational concept, where the customer plucks relevant but unprocessed data posted from arrays of sensors (SIGNAL, May, page 19). This is a controversial paradigm shift to an approach that differs from today’s operational concept, which “pushes” analyzed information outward in anticipation of expressed or assumed consumer needs.

The notion of network-centric warfare being supported by networks carrying unevaluated sensor data was not received warmly by attendees at the Joint Military Intelligence College Conference 2002, convened in June to consider challenges to and responses from the intelligence community. “If,” some conferees asked, “the Global Information Grid functions like an Internet—no ombudsman, no editor, no analyst—what confidence will the customer have that data are correct?”

While a “pull” Global Information Grid need not necessarily eliminate the analyst, it is not surprising that the intelligence community has concerns about any operational concept that does not explicitly place a skilled human somewhere in the chain between sensor and trigger. Here again, the issue is not with information technology, but how it is integrated into military operational art.
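The push-versus-pull distinction described above is a familiar data-distribution pattern. A minimal sketch (the class and method names here are invented for illustration, not drawn from any Global Information Grid specification) makes the trade-off concrete: under push, an analyst filters before distribution; under pull, raw postings accumulate and the filtering burden falls on each consumer.

```python
class PushGrid:
    """Push model: analyzed products are distributed to subscribers,
    so a skilled human sits between sensor and consumer."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, consumer):
        self.subscribers.append(consumer)

    def publish(self, analyzed_product):
        # Only evaluated information reaches consumers.
        for consumer in self.subscribers:
            consumer(analyzed_product)


class PullGrid:
    """Pull model: sensors post raw, unprocessed data, and each
    customer plucks what it judges relevant."""
    def __init__(self):
        self.posted = []

    def post(self, raw_data):
        self.posted.append(raw_data)

    def pull(self, is_relevant):
        # The burden of evaluation shifts to the consumer.
        return [d for d in self.posted if is_relevant(d)]
```

In this framing, the conferees’ objection is that `PullGrid` ships no `is_relevant` expertise with the data: confidence in what is pulled depends entirely on the skill of whoever writes the filter.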

Although technology is available to all, it bestows benefit only on those who acquire new skills and when culture, organization and processes adapt to constantly changing technology. The essence of the challenge facing transformers is captured in a comment by a senior chief information officer who said, “All the information technology in the world would not enable the Washington Redskins to play soccer.”

 

Col. Alan D. Campen, USAF (Ret.), is a contributing editor to SIGNAL, an adjunct faculty member of the National Defense University School of Information Warfare and Strategy and a contributing editor to four books on information and cyberwar.