
Playing Truth or Consequences With the Year 2000 Problem

By Col. Alan D. Campen, USAF (Ret.)

We don’t know what we don’t know, and the truth is not out there!

Ready or not, the year 2000 soon will dawn over remote islands astride the international date line in the far Pacific, and what has been called the “first scheduled, non-negotiable, global disaster” will unfold, revealing which of many wildly differing year 2000 scenarios will play out. While no one can foretell which, if any, are accurate, woe to any who have failed the “due diligence test.”

This test requires, in the face of potentially huge operational and fiscal liabilities, implementing contingency plans that at least isolate or insulate private, public and business enterprises from computers that cannot count beyond 1999. A rancorous, greedy and litigious society is poised to wreak vengeance on those who fail, at a bare minimum, to exercise just-in-case consequence management. This attitude results from the recognition that the year 2000 problem (Y2K) is more a business than a technical issue.

Global disruptions may strike societies in their most vulnerable areas. These could range from cascading power grid failures to a delay in implementation of Europe’s new union-wide currency. Indeed, some Y2K fixes could generate unique, unforeseen problems of their own.

The U.S. Y2K czar, John A. Koskinen, states, “There may be a power brownout or blackout. There may be telecommunications problems. There may be transportation problems in some areas. … We don’t expect, at this stage, that the government will need to exercise those contingency plans, but it would be a mistake not to have them.”

People are pummeled daily with articles, radio and television programs and Internet chats about the Y2K problem that warn of either a senseless panic foisted on the ignorant by the greedy, or an unprecedented disaster that will set the global economy back decades. Another potential scenario is that all defenses will fail, and people should stock a bunker in the Oklahoma wilderness and hunker down for TEOTWAWKI, an acronym identifying “the end of the world as we know it.” Or, it could all be just a cruel trick of technology that actually presents a timely challenge for change.

There is growing agreement among commentators that the Y2K impact most likely will draw from each scenario. However, the impact will vary significantly among users based on several factors. These include dependency—whether the enterprise can function despite date-related computer disruptions; exposure—reliance on external information systems whose health cannot be assessed; and discipline—whether customers and stakeholders will tolerate discontinuous or erroneous service. Another factor is due diligence—the management steps taken that, in the eyes of the courts, demonstrate earnestness and endeavor in exercising consequence management. Finally, the most important factor may be a hefty bankroll to fend off the tidal wave of litigation.

The truth about Y2K remains elusive. However, an irrefutable fact is that some software and embedded chips—whose purpose is to detect, calculate or otherwise act upon dates—will fail or malfunction in unpredictable ways when 1999 becomes 2000. Beyond that, there is agreement on only a few points.
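The underlying defect is easy to illustrate. The hypothetical C fragment below shows how arithmetic on a two-digit year field gives correct answers through 1999 and then misfires when the field rolls over to “00”; the function and the dates used are illustrative, not drawn from any particular system.

    #include <stdio.h>

    /* Hypothetical illustration of the classic two-digit-year defect.
       Many legacy programs stored only the last two digits of the year
       and assumed an implicit "19" century. */
    int years_elapsed(int start_yy, int current_yy)
    {
        return current_yy - start_yy;   /* valid only while both dates fall in 1900-1999 */
    }

    int main(void)
    {
        /* A record opened in 1997 and checked in 1999: correct. */
        printf("1999 - 1997 = %d years\n", years_elapsed(97, 99));

        /* The same record checked in 2000: the field reads "00",
           so the result is -97, and any age, interest or expiry
           calculation built on it goes wildly wrong. */
        printf("2000 - 1997 = %d years\n", years_elapsed(97, 0));
        return 0;
    }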

In terms of economic impact, the effect is real, not conjectural. It is being felt now, and its repercussions have absolutely nothing to do with any of the threat scenarios, nor the efficacy of remedial measures. While experts disagree as to the extent of financial penalties from remediation or litigation, none can deny that many billions of dollars will be involved. This will affect the bottom line of corporations, many of which could take hits of as much as 7 percent. In addition, untold human resources that otherwise would have gone into research and development and product improvement will have been diverted to combat the Y2K dilemma.

Some projections of the global impact run into the trillions of dollars, and these escalate with each surprising new discovery of vulnerability and consequences. Ernst & Young and Computerworld predict that the cost of Y2K fixes will defer implementation of secure electronic commerce, and it could delay the introduction of the European currency, the euro.

The China Economic Times reports that the State Council has decreed that all Chinese government agencies and government-owned enterprises must complete millennium bug testing by September 1999, and those who fail to meet the deadline will be punished.

The U.S. Defense Department will spend up to $6 billion on Y2K, and most of this is being siphoned from urgently needed new information technology programs intended to enhance military capabilities. Secretary of Defense William Cohen, worried that his department is more than a year behind its remedial schedule, has directed military commanders and program managers to take personal responsibility for ensuring Y2K compliance and has threatened to halt all information investments other than remediation unless he sees demonstrable and measurable progress.

Unfortunately, an inscrutable barrier deflects even the most aggressive seeker of the truth about Y2K vulnerabilities and remedies. This situation is compounded in no small measure by the tendency to “shoot the messenger” bearing ill tidings. Myth, mystery, secrecy, fear, uncertainty, complexity, ignorance, apathy, denial, greed and avarice all stand in the way of remediation. In some cases, poorly informed or incompetent management has delayed corrective measures to the point where contingency plans now are the only reasonable remaining recourse.

Any so-called “safe harbor” is too late and too shallow. John Petersen, president of the Virginia-based Arlington Institute, which specializes in research and policy on the changing nature of security, notes a prevailing “powerful dynamic of secrecy.” The unwillingness to share information about defects in the national information infrastructure—and Y2K could be the first major trial—has hampered the construction of defenses, delayed remediation and vastly increased costs. Government proffers to provide a protective enclave for sharing facts about what does and does not work were hooted from the stage. This is not an entirely astonishing rebuff given the government’s spotty record on empathy for the privacy of its citizens. In the words of two very senior former defense officials, “The private sector is reluctant to work with the government on this issue [infrastructure protection] because of the high cost, unclear risk and the prospect of heavy-handed government action.”

“Let the private sector police itself and report cyberincidents,” was the rallying cry of the Manhattan Cyber Project (MCP), a propitious but short-lived model for future public/private partnerships. MCP quickly folded, the public was told, largely because while corporate information specialists could retrieve empirical evidence of defects, their corporate lawyers and chief executive officers feared the consequences of data sharing.

In a rare display of nonpartisanship, the 105th Congress passed the Year 2000 Information and Readiness Disclosure Act. This provided limited antitrust exemption to make it easier for firms to cooperate with one another to solve the Y2K problem. Some members of the 106th Congress plan to introduce legislation limiting the liability for defective remedies and the onslaught of lawsuits that already has begun. Business Week has referred to Y2K as “2000 reasons to celebrate.” Everyone, it is said, is “either a potential litigant or a potential target—or both.” If unchecked by law, regulation or reason, legal costs will greatly exceed those of remediation.

Another potential problem is that a fix is not always a fix. Preparing hubs and terminals for the millennium transition is an essential, but far from adequate, step. One writer criticizes this approach as “an insane response in a systems world.” Even if most information systems are modified and then faithfully tested, they still could fail because of interconnection with systems that have not been fixed. An information chain links an enterprise with suppliers, movers, storers and consumers, each an unpredictable, oft-forgotten but essential node that might not have been repaired or that was altered in some incompatible way. This creates the potential for a rolling wave of multiple, parallel failures. As another expert has written, “simply solving most of these problems is not enough.”
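The interconnection risk can be sketched in a few lines. In the hypothetical C fragment below, one node on the chain has been remediated with a common “windowing” technique while its partner still assumes an implicit 19xx century, so the same two-digit field is read as two different years; the pivot value and the sample date are assumptions made for illustration.

    #include <stdio.h>

    /* Hypothetical sketch: two partners on the same information chain
       read the same two-digit year field differently after one has been
       remediated by "windowing" and the other has not been touched. */

    /* Remediated partner: pivot window, 00-29 is taken as 2000-2029. */
    int windowed_year(int yy) { return (yy < 30) ? 2000 + yy : 1900 + yy; }

    /* Unremediated partner: still assumes an implicit 19xx century. */
    int legacy_year(int yy) { return 1900 + yy; }

    int main(void)
    {
        int delivery_yy = 1;    /* a delivery date recorded as "01" */
        printf("Remediated system reads the year as %d\n", windowed_year(delivery_yy));
        printf("Legacy partner reads the year as    %d\n", legacy_year(delivery_yy));
        return 0;   /* 2001 versus 1901: the fixed node is still fed bad data */
    }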

Everything is interconnected through global networks and, as John Petersen has written, people “seem oblivious to the networks in which they participate, or the systems and interconnections of modern life ... [but] networks mean that no one system can protect itself from Y2K failures by just attending to its own internal systems.”

Embedded systems are the most critical. Analysis of embedded chip problems reported by the British government’s Millennium Bug Campaign reveals that, of some 7 billion embedded microchips that sustain the world’s manufacturing and engineering base, 15 percent on average are expected to fail in unpredictable and inconsistent ways. No one knows for what purposes or where these billions of chips have been implanted. There is no audit trail.

Another problem is that repairs induce errors. While experts may differ over the number of Y2K problems that will not be corrected in time, as well as the severity of resultant damage, all acknowledge that remedial actions will be less than 100 percent effective—in some instances, less than 85 percent. One expert predicts that as many as 10 percent of repairs will create new defects, yet the issue of bad fixes is not widely discussed in the year 2000 literature. Software expert Don Estes writes that “whenever you change anything in the code of a program, something else will break” and that the best-case scenario for a company with 10 million lines of code is 1,000 undetected errors.
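A rough calculation shows how bad fixes can accumulate. The C sketch below uses the 10 percent bad-fix rate and the 10-million-line portfolio cited above, together with two assumed figures, the number of repairs made and the share of bad fixes caught in testing, both of which are illustrative only.

    #include <stdio.h>

    /* Back-of-envelope arithmetic on bad fixes. The bad-fix rate and the
       10-million-line portfolio come from the text; the number of repairs
       and the detection rate are illustrative assumptions. */
    int main(void)
    {
        double repairs_made   = 50000.0;  /* assumed number of date repairs      */
        double bad_fix_rate   = 0.10;     /* "as many as 10 percent" go wrong    */
        double caught_in_test = 0.80;     /* assumed share of bad fixes detected */

        double new_defects     = repairs_made * bad_fix_rate;
        double escaped_defects = new_defects * (1.0 - caught_in_test);

        printf("Repairs made:             %.0f\n", repairs_made);
        printf("New defects introduced:   %.0f\n", new_defects);
        printf("Escaping into production: %.0f\n", escaped_defects);
        /* Under these assumptions roughly 1,000 defects escape per
           10 million lines, the order of Estes's best-case figure. */
        return 0;
    }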

Capers Jones, chairman of Software Productivity Research Incorporated, Burlington, Massachusetts, writes in Probabilities of Year 2000 Damages, “Only a combination of inspections and very careful testing can approach or exceed 99 percent, which is the level needed by the year 2000 problem.” It is naive and risky to assume that all the Y2K errors will be found and repaired, he continues. With possibly 5 percent to more than 20 percent of the year 2000 problems still remaining in software after 1999 ends, Jones concludes that the probability of significant damages is alarmingly high and that emergency response teams should be available in every company and government agency to deal with the impact of problems that are discovered too late.
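Jones’s 99 percent figure describes cumulative defect removal efficiency, which compounds across successive inspection and test stages. The C sketch below shows that compounding; the per-stage efficiencies are purely illustrative assumptions, not figures from his study.

    #include <stdio.h>

    /* How removal steps compound toward the 99 percent level Jones cites.
       The per-stage efficiencies below are illustrative assumptions. */
    int main(void)
    {
        double stage_efficiency[] = { 0.65, 0.60, 0.55, 0.50, 0.55 };
        /* e.g. code inspection, unit test, integration test,
           system test, dedicated year 2000 test */
        int stages = (int)(sizeof stage_efficiency / sizeof stage_efficiency[0]);

        double remaining = 1.0;     /* fraction of Y2K defects still present */
        for (int i = 0; i < stages; i++)
            remaining *= 1.0 - stage_efficiency[i];

        printf("Cumulative removal efficiency: %.1f%%\n", 100.0 * (1.0 - remaining));
        printf("Defects still remaining:       %.1f%%\n", 100.0 * remaining);
        return 0;
    }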

It is probably too late to finish, but not too late to start. Peter de Jager, an internationally recognized authority on Y2K and one of the earlier voices to sound the alarm, now laments the apparent tendency on the part of the media to “leap from denial to despair in one fell swoop.” The public knows, de Jager writes, that the problem is serious and remedies were applied too late, but, “we are not helpless children lost at sea ... [but] grownups capable of fixing things when they break.” Organizations that start late, says Koskinen, “must make very hard decisions about what are the most critical systems ... and focus resources on those.”

As far as truth and consequences go, the truth is that society knows so little about its potentially fatal dependency on fragile information systems that it cannot be expected to ponder hypothetical consequences. Society does not yet comprehend how much of human endeavor depends upon near-perfect service from software-driven, richly interconnected and easily destabilized digital networks.

In his chapter in the AFCEA International Press book Cyberwar 2.0: Myths, Mysteries and Reality, Robert-John Garigue describes “threats to humanity in a cybersociety ... where we [may] have lost control of our destiny, subjugating ourselves to someone else’s order of things, to someone else’s belief system ... [through] using software as the medium to convey and express knowledge and to make sense of the world.” Societal belief systems, Garigue continues, “are presently being instantiated into protocols, computers and software ... not open to public debate and social criticism.”

On a positive note, Y2K may be a painful but timely wake-up call, one that shakes society’s collective faith in technology and silver bullets and reveals the consequences of excessive dependency upon an unverifiable art form—software. It also obliges people to prepare long-range strategies for dealing efficiently with an information-rich society at the dawn of a new millennium, as well as with the problems that are yet to be discovered.

 

Col. Alan D. Campen, USAF (Ret.), is manager of AFCEA International Press, a SIGNAL contributing editor, co-editor of Cyberwar 2.0: Myths, Mysteries and Reality, and adjunct professor at the National Defense University School of Information Warfare and Strategy.