In government, industry, the military and society as a whole, technology reigns. Change is coming faster than words can be written to describe it. In virtually every corner of the world, information systems are remaking governments, re-engineering economies, restructuring militaries and redefining societies. Not even the industrial revolution had as far-reaching an effect when it sprang upon the world less than two centuries ago.
Many enabling technologies that have fueled the information age emerged from directed research and development initiated barely 20 years ago. Government funding for much of this research evaporated long ago, and now the information field is ripe for a new generation of development. Failure to fund new research will adversely affect each of the areas where information technologies are bringing about revolutions, and it also promises to upset the global balance of power, politically and economically.
Research and development funding tends to have two sources: corporate and national government. In the two countries most clearly identified as economic technocracies—the United States and Japan—the two sectors played different roles in supporting basic research. Japan, contrary to many Americans’ perceptions, funded most of its research and development in the 1970s and 1980s through corporate spending. When that country’s bubble economy burst in the early 1990s, the resulting financial losses stripped companies of most of their fuel for advanced research. As a result, the past few years have seen Japan fighting to keep up with technological change, instead of leading the way.
The United States took a different approach to high-technology research funding. In the early 1980s, the Reagan Administration’s defense buildup included increased spending on basic research disciplines that advanced many military technologies. The introduction of the Strategic Defense Initiative (SDI) in particular furthered research into a host of microprocessor and computing technologies. Much of this SDI work played key roles in the ongoing information revolution as it found its way into the commercial marketplace. Concurrently, commercial semiconductor technology was viewed as a national economic asset that needed government support for research. Accompanied by new tax laws and deregulation that freed the entrepreneurial spirit among existing and emerging information technology companies, the United States quickly assumed the lead in defining the direction of the information revolution.
However, with the overwhelming success of computing and communications systems, both government officials and the public seem to have forgotten the roots of these ever-burgeoning technology crops. Large companies continue to pursue independent research, and academia—in many cases teamed with government and industry—still serves as a font of new ideas in the communications and electronics arena. Nonetheless, the directed government research and development effort continues to shrink relative to the economy as a whole.
And the military? The United States and its North Atlantic Treaty Organization (NATO) allies are working determinedly to incorporate commercial information system advances into their military infrastructures. Today’s technology-savvy forces are evolving into what ultimately will be an information organism capable of projecting force under all potential contingencies anywhere in the world.
Now it is up to Western democracies to turn their attention to research and development funding. Where possible, nations should increase support for basic research in their areas of academic and commercial strength. And, their governments should ensure that mechanisms exist, or are put in place, to encourage private sector spending in this arena.
This is not a call for industrial policy. Funding basic research is different from attempting to pick winners in the commercial marketplace. Successful high-technology research and development will support existing applications and spawn new ones. When advances enter the marketplace, commercial competition shows the way. The commercial sector will continue to provide rapid advances through its independent research and development. However, NATO governments should not take for granted that this alone will provide all the necessary fuel for the technology pipeline. The two-track approach of government and industry funding complementary research must be supported.
The United States, for example, can—and should—increase funding for several future generations of military technology innovations. This can be accomplished through service laboratories, agency-specific programs and the Defense Advanced Research Projects Agency (DARPA). That agency’s new director, Frank Fernandez, is reorganizing DARPA to improve efficiency and to reflect the cross-disciplinary nature of many ongoing research programs. Just as in the recent past, many of these advances would find their way into the commercial marketplace.
Timing is crucial here because change can come from a number of different directions. Every nation with a standing military force noted the effectiveness of information technologies used during the Gulf War. Most are seeking to incorporate these capabilities into their own forces, but a few also are focusing their efforts on denying Western militaries the advantages wrought by these systems. History is rife with examples of underestimated adversaries abruptly emerging with surprising strength.
Simply put, information technologies are too dynamic for successful purveyors and providers to sit on their research and development laurels. National governments must increase funding appropriately and effectively. The Free World is built on a security infrastructure of economic, political and military vitality, and information technologies are becoming the linchpin that holds this together. Governments must act now to ensure the viability of their innovation machines or be prepared to play catch-up when the technological world changes—again.