
TECHNOLOGY AND SECURITY

Friday, March 12, 2010
Bill Nolte

I awoke one Saturday earlier this year planning to write on cybersecurity and related issues, to put down some thoughts on the complexities of our technology, the vulnerabilities inherent in that technology, and the subsequent risks we face as a society.


But first, a more immediate requirement needed to be addressed: I had to go to the local donut shop to surprise my grandson with a breakfast of blueberry donuts. As it turned out, I had plenty of time to think about technology. Or the coming baseball season. Or when to stock up on crabgrass preventer. The store’s machinery was malfunctioning, and on a January morning in a one-donut-shop beach town, alternatives were not in the cards. Over the next hour, batch after batch of overcooked donuts dropped into several large trashcans, enough to cause Homer Simpson serious angst. Finally, I left, supplied with donuts that could have lubricated several large pieces of industrial machinery. Not that a seven-year-old noticed.


The incident gave me an opportunity to consider, not for the first time, how dependent we are on technology, especially the technologies associated with computing in various forms. Without question, the donut machine’s problems could have reflected a purely industrial, mechanical failure. A few years ago, that would have been the case. Something would have gone wrong with this gear or that lever, with the possibility that the offending part could be pounded or bent into submission. Attached to this machine, however, alongside a vat of oil and devices for extracting the finished product, was – you guessed it – a control panel, almost certainly equipped with more storage than a 1960s mainframe or the space shuttle. This appeared to be the source of the problem, and no amount of rebooting seemed to solve it.


That is, in the end, our collective problem. At no point in human history has a society ever entrusted its prosperity, its order, even its survival, so completely to a single technology, or in this case to a system of intimately related technologies. In 1851, in the midst of the “industrial revolution,” the census in Britain, the most industrialized country in the world, reported that the majority of the British workforce was no longer engaged in agriculture, though agricultural workers remained the largest single portion of it. Had “the world’s workshop” suddenly discovered it was out of coal, the result would have been catastrophic, but Britain would have survived, albeit frightfully poorer. And colder.


What of us? Whether it’s the risk of automating medical records as part of an effort to cut costs, or the shortfalls demonstrated by the Detroit bombing attempt, or the daily efforts of governments, illegal organizations, and adolescents with time on their hands to hack into public and corporate information systems, we have placed our lives in the care of technologies we know to be extraordinarily vulnerable. For the most part, we simply accept this as reality, piling up record internet sales even though we have been warned about identity theft and other dangers. To a degree, this represents prudent cost/benefit and risk management analysis. After all, we continue to use the highways even though we know that thousands die every year in auto accidents. Dangerous? Yes. Irrational? Probably not. (Irrationality approaches when, no longer able to escape a wrecked car by cranking down the window, we purchase a nifty hammer most of us could not locate on a scavenger hunt, let alone in an emergency.)

We have simply not paid adequate attention to the vulnerabilities of our information and information technology systems. Information about the problem is available to the public, of course. Stories about hacking and other cyber insecurities appear in the papers and on the internet every day, with the China/Google issue taking things to a new level.


But the concern remains. We no doubt have not heard the end of congressional outrage over the failure to “connect the dots” in the Detroit incident. (Just when there was hope that horrible metaphor would go away.) We will all adjust to increased scrutiny at airports, some of it real, some of it in the form of “Potemkin processing.” Less will be done at railroads, shopping malls, and other locations, for a variety of reasons, reasonable and unreasonable, and we will hear more about Google and China. Nevertheless, barring ever more dramatic news, public attention will drift and fade.


The current discussion of cybersecurity is uncomfortably reminiscent of the period before September 11, 2001. It is not as if we did not hear about homeland security (or homeland defense, as it largely was) or the risks of terrorism before that day. The Hart-Rudman Commission’s final report had appeared only a few months before, buried on the inside pages of major newspapers. We had experienced attacks on American cities, ships, and embassies. We continued to see the destructive capacity of modern explosives around the world. But what was to be done?


Here’s hoping that after the obligatory posturing over the Detroit incident we can have a serious national conversation about information security and the tradeoffs associated with it. Even more, we should hope this conversation can take place without the stimulus of a national tragedy. The recent Cyber ShockWave exercise (shown on CNN with the tag line “We Were Warned”) should be an important part of that conversation. Among others, former DNI Mike McConnell has pressed the case in every forum from 60 Minutes to Senate testimony to a Washington Post op-ed. Others have challenged McConnell’s statements, and that’s fine. It’s exactly the way we should engage on important public issues. With any luck, this discussion should help inform everything from personal decisions about swiping credit cards or boarding airplanes to public decisions regarding the balance between security and privacy, or how we build oversight mechanisms to monitor that balance. AFCEA can be an important, nonpartisan, nonideological contributor to this conversation. We have, as noted above, been warned.


William Nolte is chairman of the AFCEA Intelligence Committee.