Uneasy Sleep in a Golden Age

September 2010
By Linton Wells II, SIGNAL Magazine

This summer I attended a series of thought-provoking conferences, ranging from business technology to clean energy to cybersecurity and network integration. Collectively, they suggest that we’re living in a “golden age” of technological innovation, but they also highlight a growing gap between increasingly interactive capabilities and the ability to provide security at several levels, from individual privacy to critical infrastructure protection. The bottom line is that nothing I heard makes me sleep better at night.

Our present era may be considered one of the great technological opportunities in history, across several dimensions. The information domain is being shaped by the convergence of cloud computing, mobility, Internet connectivity and transmedia applications—such as integrated video, data, print and physical-virtual links. New editions of several publications are being written for the iPad.

The rate of change in biogenetics is outpacing Moore’s law. It cost $3 billion to sequence the human genome in 2000. Now it costs about $10,000, and the cost will be $1,000 in three years—a decrease of about one order of magnitude every two years, at least at this stage of the innovation curve.
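The claimed rate of decline can be checked with a few lines of arithmetic. The figures below are the article's own ($3 billion in 2000, about $10,000 a decade later); the calculation simply converts them into orders of magnitude per year:

```python
import math

# The article's figures for the cost of sequencing a human genome.
cost_2000 = 3e9   # ~$3 billion in 2000
cost_2010 = 1e4   # ~$10,000 a decade later
years = 10

orders_dropped = math.log10(cost_2000 / cost_2010)  # ~5.5 orders of magnitude
years_per_order = years / orders_dropped            # ~1.8 years per order

print(f"{orders_dropped:.1f} orders of magnitude in {years} years")
print(f"about one order of magnitude every {years_per_order:.1f} years")
```

The result, roughly one order of magnitude every 1.8 years, is consistent with the "every two years" figure in the text.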

Startup companies are generating commercial nanomaterials, and serious forecasters suggest that the combination of genetics, nano and robotics can lead to a “molecular age,” with—eventually—atomic-level assembly of materials.

Research on alternative energy is proceeding on many parallel paths, despite relatively low oil and gas prices and difficult political hurdles at the moment.

Brain researchers describe this time as their “renaissance.”

So, with all these trends, what’s not to like?

From a knowledge security perspective, innovative approaches can enhance security in the cloud, but the way people are using their mobile systems is undercutting the best efforts at protecting the center. As someone said: In this environment, leadership only has the illusion of control.

Consider three cases: the seductiveness of mobile capability, unintended functionality in applications, and a general lack of continuity plans if access to the cloud is lost.

In the first case, corporate leadership may consider certain data very sensitive, worthy of extreme protection in local storage or within their private clouds. But, if an authorized user e-mails the data to a mobile device, even for legitimate access on the road, such protection may be replaced by a four-digit personal identification number (PIN)—if that.
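The weakness of that four-digit PIN is easy to demonstrate. A sketch, with a hypothetical target PIN chosen purely for illustration: the entire keyspace is only 10,000 combinations, so absent lockout or rate-limiting on the device, exhaustive guessing succeeds almost instantly.

```python
import itertools
import time

def all_pins():
    # Generate every four-digit PIN, "0000" through "9999".
    return ("".join(d) for d in itertools.product("0123456789", repeat=4))

target = "7294"  # hypothetical PIN, for illustration only
start = time.perf_counter()

guesses = 0
for pin in all_pins():
    guesses += 1
    if pin == target:
        break

elapsed = time.perf_counter() - start
print(f"PIN found after {guesses} guesses in {elapsed:.4f} seconds")
```

On any modern machine the full space enumerates in well under a second, which is why a PIN alone is no substitute for the protections the data enjoyed inside the enterprise.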

At DEFCON 18—the hacker convention in July—a briefing on mobile apps presented cases where sensitive user data could be purloined through a smart phone wallpaper app. Most of us do not understand what is going on behind the seductive functionality that apps provide.

Most companies pay serious attention to business continuity planning, but few at the conferences seemed to have thought much about what to do if access to the cloud itself is lost for a prolonged period. One possible cause of a prolonged outage could be cyberattacks; another could be a widespread power outage. This consideration needs to be a core part of resilience and robustness in business continuity planning. The danger is compounded by the interactions between information and communications technologies, or ICTs, and critical infrastructures. DEFCON included at least eight sessions devoted to hacking supervisory control and data acquisition (SCADA) systems, smart grid vulnerabilities and other links between the virtual and physical worlds.
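One minimal continuity pattern the paragraph above implies is keeping a local copy of cloud-held data that can carry operations through an outage. A hedged sketch, in which `fetch_from_cloud` is a hypothetical stand-in for a real cloud call:

```python
import json
import os
import tempfile

# Local cache file that survives a cloud outage (illustrative path).
CACHE = os.path.join(tempfile.gettempdir(), "records_cache.json")

def fetch_from_cloud():
    # Stand-in for a real cloud call; here it simulates a prolonged outage.
    raise ConnectionError("cloud unreachable")

def load_records():
    try:
        records = fetch_from_cloud()
        with open(CACHE, "w") as f:   # refresh the local copy on success
            json.dump(records, f)
        return records
    except (ConnectionError, OSError):
        if os.path.exists(CACHE):     # fall back to the last good copy
            with open(CACHE) as f:
                return json.load(f)
        return []                     # degraded, but still operating

# Seed a cache as if from an earlier successful sync, then simulate the outage.
with open(CACHE, "w") as f:
    json.dump([{"id": 1}], f)
print(load_records())
```

The point is not the specific mechanism but the planning discipline: the fallback path must exist, be tested, and be exercised before the outage, not improvised during it.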

This leads to several key issues. On a technical level, key elements of security need to be redefined. How are information objects shared? The paradigm has to shift to data-centric strategies vice perimeter-centric—from “keeping bad things out” to “keeping good things in.” In turn, this requires more transparency into what users are doing in the network.
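The data-centric shift described above can be sketched in code: the access policy and the audit trail travel with the information object itself, so a copy that leaves the perimeter is still governed, and leadership gains the transparency into usage that the paragraph calls for. The class and method names here are illustrative, not a real product's API.

```python
from dataclasses import dataclass, field

@dataclass
class ProtectedObject:
    """An information object that carries its own protection and audit trail."""
    payload: str
    allowed_roles: set = field(default_factory=set)
    audit_log: list = field(default_factory=list)

    def read(self, user: str, role: str) -> str:
        self.audit_log.append((user, role))      # every access attempt is recorded
        if role not in self.allowed_roles:
            raise PermissionError(f"{user} ({role}) denied")
        return self.payload

doc = ProtectedObject("Q3 acquisition plan", allowed_roles={"executive"})
print(doc.read("alice", "executive"))            # permitted access
try:
    doc.read("mallory", "contractor")            # denied, even off-perimeter
except PermissionError as err:
    print(err)
```

Contrast this with the perimeter model, where the same payload e-mailed outside the firewall would carry no policy at all.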

From an operational perspective, resilience and robustness issues need re-thinking. It may not be enough to have offsite backup if you cannot connect in an emergency.

Security policy is critical: Is it clear and comprehensive? Do your employees understand and follow it?

In the end, it all comes down to people. When Lou Gerstner was chief executive officer of IBM, he asked how he would know if his organization had a good information assurance program. The answer was: “Walk down the hall. Find a random employee. Ask them three questions: ‘Would you know if your computer was being interfered with?’ If yes, ‘Would you know whom to call to get support?’ If yes, ‘Would you care enough to call?’” Unless you can answer “yes” to all three of these questions for each of your employees, you can spend all you want on technology and still fail on the people side.

For all the excitement over technology, the gap between functionality and security continues to grow. Very little I heard this summer suggests there is any reason for comfort.

Linton Wells II is the director of the Center for Technology and National Security Policy (CTNSP) in the Institute for National Security Studies and a distinguished research professor at the National Defense University in Washington, D.C. The views expressed are his own and not those of the U.S. Defense Department or of SIGNAL Magazine.

