Rapid Changes Lie Ahead for Computing
A U.S. Marine Corps corporal uses his command and control personal computer to track vehicles with the Enhanced Position Location Reporting System. Military computing will change significantly as new processor capabilities allow local computers such as laptops to perform functions previously limited to centralized facilities. Also, changes in the military human-machine interface will spawn transformations that go far beyond command and control.
Businesses, the military and consumers have never seen the pace of change in computing that may be just around the corner, according to a leading technologist at the world’s largest software company. Craig Mundie, chief research and strategy officer at Microsoft Corporation, predicts that new hardware and software architectures will open up a host of revolutionary capabilities and applications—but they also will tax information system developers and managers who must stay abreast of advances without sacrificing the integrity of their systems.
Mundie relates that the changes in the computing world since the early 1990s have been steady and relatively predictable, a period he describes as one of comparative calm and stability. Computing's continuous pace of change goes on unabated, but the transformations now approaching are more profound than anything in recent history.
He sees the computing sector as following a traditional S-curve of evolution, in which a fairly stable period is followed by one of rapid transformation. As the current stable period comes to an end, the next 10 years will bring dramatic and dynamic variations in hardware, software and architectures as experts scurry to keep up with their counterparts in each discipline.
New processor technologies, quantum improvements in aggregate complexity, the increasing scale of data centers and the emergence of ubiquitous and even mobile broadband together will push computing into another unknown period of accelerating transformation. Those factors will coalesce to create a data environment from which something new will emerge—possibly in the next five years.
Mundie foresees a number of shifts taking place over the next decade. The nature of hardware advances will be altered as new technological capabilities come into play. New programming methods and tools will emerge as hardware becomes more complex. Formal composition in software development will become more important than ever because computer capabilities will be improved vastly.
Foremost will be a transformation of the microprocessor itself. Mundie contends that existing technology cannot continue to increase clock rates at the same pace that has occurred over the past 20 years. This will place a great premium on new microprocessor architectures that favor low power and a higher core count. Changes also may come in the form of revisions to machines’ memory hierarchy because the relative ratios of processor performance to various types of memory performance also have changed dramatically over the past two decades.
More specialty cores may emerge. Just as mainstream computing companies have worked with commercial game consoles, similar specialty capabilities may come into play in defense and homeland security applications, Mundie suggests.
The tremendous amount of computational ability that will be available in low-power silicon devices presents a good opportunity for the military, Mundie offers. Functions that used to be computationally intractable, or that could be performed only in a centralized facility, will be computable locally. Examples could include vision capabilities or advanced speech recognition at levels of quality and reliability far in excess of anything now available. He suggests that this may lead to changes in the human-machine interface in a military environment—changes that go far beyond affecting command and control.
Ad hoc mesh networking capabilities likely will benefit. Sophisticated low-level local area networks could be meshed to create connectivity where it is unavailable. This capability will become pervasive, Mundie predicts. Local connectivity for increased computational capacity will open up a host of useful applications in the military’s self-contained tactical environment, he suggests.
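The relaying idea behind such meshes can be pictured with a minimal sketch: a node with no direct link to its destination still delivers a message by flooding it hop by hop through whatever neighbors are in range. The topology, node names and breadth-first flood below are invented for illustration and do not represent any particular military networking protocol.

```python
# Minimal sketch of ad hoc relaying: the message reaches "D" from "A"
# only because intermediate nodes pass it along. Topology is invented.
NEIGHBORS = {
    "A": {"B"},
    "B": {"A", "C"},
    "C": {"B", "D"},
    "D": {"C"},
}

def flood(source: str, destination: str) -> list[str]:
    """Return the hop sequence from source to destination, if one exists."""
    frontier = [(source, [source])]
    visited = {source}
    while frontier:
        node, path = frontier.pop(0)
        if node == destination:
            return path
        for neighbor in NEIGHBORS[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append((neighbor, path + [neighbor]))
    return []

print(flood("A", "D"))  # ['A', 'B', 'C', 'D'], delivered via relays
```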
“[The changes] represent a rich set of tools with which to try to move military applications of computing and information warfare to another level,” he states.
This computing transformation will pose a challenge to application writers, Mundie observes. Their efforts will require a greater focus on how people express concurrent applications or parallel execution models. These concerns historically have been reserved for experts focused on supercomputing or other scientific applications, but that parallel engineering approach now must be applied to any future application that requires acceleration.
“There will have to be greater attention paid to the methods with which [applications] are designed and the tools that are used to construct them in order to improve reliability, scalability and security in these classes of applications,” Mundie declares.
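As a minimal illustration of expressing work as a parallel execution model rather than a sequential loop, the sketch below fans independent, CPU-bound tasks out across processor cores. The workload and function names are invented for the example and are not drawn from any Microsoft tooling.

```python
# Minimal sketch: a batch of independent tasks expressed for parallel
# execution. The workload (checksumming data blocks) is purely illustrative.
from concurrent.futures import ProcessPoolExecutor
import hashlib
import os

def checksum(block: bytes) -> str:
    """CPU-bound work performed on one block of data."""
    return hashlib.sha256(block).hexdigest()

def checksum_all(blocks):
    """Fan the blocks out across the available processor cores."""
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(checksum, blocks))

if __name__ == "__main__":
    blocks = [os.urandom(1 << 20) for _ in range(8)]  # eight 1 MB blocks
    digests = checksum_all(blocks)
    print(f"computed {len(digests)} checksums in parallel")
```

The point is the shape of the program: the work is stated as independent units, and the runtime decides how many cores to apply, which is the kind of expression Mundie says must spread beyond the supercomputing community.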
These changes will require new software design methodologies, many of which Microsoft has been implementing under its trustworthy computing initiative over the past five years, Mundie says. The techniques will need to be adopted by designers building mission-critical applications, not just by the vendors of critical system software.
Combining service-oriented components will become more important in many of these applications, he adds. Very-large-scale data centers will be able to provide much of this capability. But some deployment differences may evolve between the general consumer, commercial enterprise and small business environments on one hand and military applications on the other.
The first three environments will experience more free-form combinations of cloud-based services with locally executing capabilities—or even the ability to have cloud-based applications that users can click and run, Mundie continues. This architecture may be applicable for solving many scale problems in a military context, but the challenges of maintaining connectivity in a broadband environment pose tactical risks that could preclude adapting the architecture’s commercial model for military use.
The military will be adjusting its approaches to these developments in light of its own specific requirements. One of the most important, Mundie offers, is the military’s requirement that autonomous action continue in the absence of network connectivity or amid equipment damage. This autonomy requirement exceeds that of the commercial sector. “Mission-critical for the military has a threat model that’s different from that of the rest of us,” he comments.
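One way to picture that mix of cloud-based services and local autonomy is the fallback pattern sketched below: prefer the remote service while the network is reachable, and continue with a degraded local capability when it is not. The endpoint, payload fields and functions are hypothetical placeholders, not part of any product or system the article describes.

```python
# Sketch of the hybrid pattern: use a remote service when reachable,
# keep operating on a local fallback when connectivity is lost.
import json
import urllib.request
from urllib.error import URLError

CLOUD_ENDPOINT = "https://example.invalid/api/translate"  # placeholder URL

def translate_remotely(text: str, timeout: float = 2.0) -> str:
    """Call the (hypothetical) cloud-hosted service."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        CLOUD_ENDPOINT, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)["translation"]

def translate_locally(text: str) -> str:
    """Degraded but autonomous local capability (stand-in logic)."""
    return text.upper()

def translate(text: str) -> str:
    try:
        return translate_remotely(text)
    except (URLError, OSError):
        # No connectivity: continue autonomously with the local capability.
        return translate_locally(text)

print(translate("proceed to checkpoint bravo"))
```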
A major change is in store for higher level computer connectivity, in which applications talk to other applications without any human intermediation. Mundie believes that Web-based connectivity will spawn more ad hoc interaction between applications. In effect, this process first took place when humans sitting at their computers learned to interact with Web sites through their Web browsers. In the future, computer applications will have to carry out the same ad hoc process without human help.
“When you start moving higher level software into a similar regime, it gets to be more challenging because you don’t have human intuition or common sense associated with it,” he points out. “And, the more that program-to-program interaction is enabled, then the more that care will have to be exercised in the construction of those Web services and the attendant applications.”
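The care Mundie calls for can be pictured as strict validation at the boundary where one program accepts requests from another, since no human judgment sits between caller and callee. The sketch below uses invented actions and field names; it illustrates the principle rather than any API named in the article.

```python
# A service that accepts machine-generated requests validates them
# strictly against a narrow, published contract. Fields are hypothetical.
import json

ALLOWED_ACTIONS = {"status", "locate", "report"}

def handle_request(raw: bytes) -> dict:
    """Parse and validate one request sent by another program."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return {"error": "malformed request"}

    action = msg.get("action")
    unit = msg.get("unit_id")

    # Reject anything outside the contract this service publishes.
    if action not in ALLOWED_ACTIONS:
        return {"error": f"unsupported action: {action!r}"}
    if not isinstance(unit, str) or not unit.isalnum():
        return {"error": "invalid unit identifier"}

    return {"ok": True, "action": action, "unit": unit}

# A caller (another program) exercising the contract.
print(handle_request(b'{"action": "status", "unit_id": "alpha7"}'))
print(handle_request(b'{"action": "shutdown", "unit_id": "alpha7"}'))
```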
Mundie cites past iterations as examples of software that proved easily exploitable. Microsoft’s own software has been exploited in the past by worms and viruses such as the “I love you” virus, which spread through millions of users’ e-mail inboxes. He warns that these types of threats could appear on a much larger scale when many more applications are exposed directly to a Web service environment.
“There will have to be a lot more architecture wrapped around how these programs are structured and how you control their interactions,” Mundie says. “Where there are some systematic mechanisms for doing that, I think that you are going to see network connectivity evolve in quite a different way in order to make this easier.”
Mechanisms will allow network managers to control the interactions or information exchanges among systems. “The connectivity that people desire at a logical level is too difficult to express statically in the form of traditional network or file access controls,” he continues. “So, you need to move the whole thing up a semantic level to where people are doing these things under policies that can be expressly stated and analyzed as opposed to administered to death by traditional IT people with access control mechanisms—or [through] the sequestration of information through traditional multilevel security systems.
“These are some of the more profound changes that have to be wrought, particularly in the military and government environments, in order to make those systems operate in the new environment,” Mundie emphasizes.
Previously, separation or access control was achieved by controlling network access, Mundie relates. That approach will not work in a future where anyone must be able to talk to anyone else, he points out. The same holds true for applications interacting with other applications. Higher level policy constraints will be necessary, and this will lead to a repartitioning between expectations of a physical network and how policy will be applied.
And, this will require uniform identity mechanisms for applications as well as for people and platforms. Policy will have to be specified against that triad, he notes.
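A small sketch of what it might look like to state policy against that triad, rather than as low-level network or file access controls, appears below. The identities, rules and field names are hypothetical examples, not a description of any actual military or Microsoft policy system.

```python
# Policy stated against the identity triad: person, application, platform.
# Anything not expressly permitted is denied. All rules are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    user: str         # who is asking
    application: str  # which program acts on the user's behalf
    platform: str     # which device or system the program runs on

POLICY = [
    {"user": "analyst", "application": "map-viewer", "platform": "workstation"},
    {"user": "analyst", "application": "map-viewer", "platform": "laptop"},
    {"user": "operator", "application": "logistics", "platform": "workstation"},
]

def is_permitted(req: Request) -> bool:
    """Evaluate the request against the expressly stated policy."""
    return any(
        rule["user"] == req.user
        and rule["application"] == req.application
        and rule["platform"] == req.platform
        for rule in POLICY
    )

print(is_permitted(Request("analyst", "map-viewer", "laptop")))   # True
print(is_permitted(Request("analyst", "spreadsheet", "laptop")))  # False
```

Because the rules are stated declaratively, they can be reviewed and analyzed as policy, which is the shift Mundie contrasts with administering traditional access control mechanisms.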
If these approaches are implemented correctly, users should see a more uniform and transparent availability of the resources that they hope to tap, Mundie suggests. Today’s boundaries—such as those encountered when moving from a laptop to a desktop—prevent consistent access to the information that a user needs. “These techniques will go a long way toward allowing ‘do anything from anywhere’ without subverting policy or security of the underlying systems,” he says.
Another key change will be a more natural integration into applications of components based on the Web or on high-scale data centers, he says, adding that this will support the fully distributed nature of application design. It will affect the amount of data applications can work with, along with the ability to retrieve that data and stream it down to people. Mundie emphasizes that this encompasses not just the textual or static images typical of today’s Web but increasingly the image- and video-oriented content that can be discovered or accessed by a user on the Web.
Graphical interfaces will continue to evolve. Graphical user interfaces and communications facilities are becoming integrated as the functions of different devices are united in a single communications paradigm. This will allow more natural extension to different device types, both mobile and fixed, as well as tighter integration through the graphical presentation of information.
“The intrinsic power of what can be done in the microprocessor will go up by a really dramatic amount,” Mundie declares. “The way in which people interact with the machine at the local level will be dramatically different.”
The computing industry will need to phase in new techniques to take advantage of these new capabilities, Mundie maintains. Some initially will be used in subsets of overall computing such as items related to the natural user interface and the movement toward more video- and speech-based communications.
“The ability to mine information out of higher scale amounts of data—both in a sensor-driven environment and in a data processing environment—will start out as the way in which we harness the power of these increasingly powerful systems,” he declares.
Web Resource
Microsoft reorganization announcement: www.microsoft.com/presspass/press/2006/jun06/06-15CorpNewsPR.mspx