Computer Productivity Defies Definition and Confirmation
Just how the ubiquitous desktop devices are changing business practices remains a topic for heated debate.
Now that computers have established themselves as the main driver of socioeconomic change for the foreseeable future, many analysts are questioning whether their perceived productivity gains are largely illusory. Issues such as software complexity and time efficiency weigh against a prevailing mindset that businesses and individuals cannot succeed, or even survive, without their new silicon-based mentors.
This new topic is the subject of several traditional two-handed analyses. On one hand, productivity may be hampered; on the other, some experts believe gains may be hidden in ways that defy measurement. Similarly, the U.S. economy may be reaping the benefits of leadership in the information revolution, or the unparalleled economic growth of the past 10 years may have resulted from factors entirely unrelated to computerization.
This debate rages against the backdrop of a change in computer technology. The World Wide Web has shifted emphasis from desktop computing power toward network access and utility. Similarly, new application-specific units are offering users greater flexibility in how they exploit information technologies. Yet, as machines become more powerful and software more elaborate, analysts are stepping back to assess the real productivity gains the technology is generating--if any.
"In the aggregate, I think the computer is not improving productivity, and it might even have a negative effect," declares Dr. Clinton W. Kelly III, senior vice president, advanced technology programs at SAIC, McLean, Virginia.
Kelly differentiates between computers operating as independent units and those that mainly serve as interfaces to the Web. He maintains that the Web has transformed commerce and is having a major effect on productivity, although no hard statistical proof exists yet to illustrate the degree of change. The most widespread capability in use is e-mail, followed by Web surfing.
However, desktop units, which have transformed the way people do business, may not have truly improved productivity. Kelly states that claims of success in improving productivity rest on anecdotal evidence rather than on studies. In fact, productivity may actually be hampered because computers demand greater human effort for particular functions.
Computers applied to factory automation and to calculation tasks such as scientific and technical computing have had an "impressive impact" on productivity, he allows. They are able to generate faster and more far-reaching results in science and reduce the number of personnel in factory operations. Beyond calculation tasks, however, computers are being used to augment people rather than replace them or automate their functions. Very few studies, either before or after the introduction of computers, have produced metrics to quantify productivity in their augmentation role.
University of Colorado psychology professor Dr. Thomas K. Landauer echoes this sentiment. Landauer, former director of cognitive science research at Bellcore and author of the book The Trouble With Computers, suggests that a genuine question remains as to whether computers have yet been very good for productivity. He doubts strongly that there has been nearly as positive an effect as most people assume.
"I am not a Luddite," he emphasizes. "Computers do wonderful things. They offer many magical functions that are extremely valuable for businesses. It's a mixture of these things with characteristics causing trouble that is the problem," he allows.
Landauer blames the process for designing computers intended for individual use, which he describes as "badly broken." This process does not ensure that the device helps a user perform tasks better and more quickly than would be done otherwise. Any improvements that have occurred over the life span of personal computers have come "nowhere near fast enough to be worth what we have been paying for them," he states.
Kelly warns that another problem hindering computer productivity is the complexity of the software that resides on the desktop unit. For example, in 1992 Microsoft Office provided 311 commands. In 1997, the same program came with 1,033 commands. Few people could know all 311 commands, let alone learn more than three times that amount just five years later, he says. Many of these commands were added to suit an increasing and diverse marketplace. However, they do not apply to each customer. Consequently, users must proceed more slowly and with more caution because they stand to make more mistakes with unfamiliar functions.
This complexity is the leading reason personal computers have let down their users in productivity, Kelly warrants. He believes that many users may have been most productive using early icon-based machines such as the original Macintosh computer, where it was not as easy to make mistakes.
Landauer agrees, saying that "the complexity of the software is the villain." One cause he cites is the temptation to have a program perform fascinating functions that actually are antiproductive. In many cases, computer software tries to guess a user's actions or intent, then insists on performing the function regardless of the operator's wishes while also making overrides difficult.
He blames this state on current market dominance by one software company. "If there were at least three or four other strong companies writing software, they would give us a choice of products, and there would be a cure for this," Landauer declares. Customers would opt for easier programs, and this "voting with money" would drive the market.
The psychologist notes that testing has begun on human-computer interfaces and interactions, but it is "too little, too late." Tests that actually examine productivity are rare and are not performed even by major software manufacturers, he charges. "The market dynamic is to offer more and more clever things that show off the ability of the computer to do something new and interesting, but the valuation of whether what they are doing is actually helpful in the productivity sense is just not done often enough," he says.
This software complexity also is linked to computer hardware problems. Landauer describes a ping-pong game in which software gains ever more features, which in turn demand more capacity and processing speed from the hardware. This growing complexity makes the hardware increasingly unreliable. He describes the result as "a rather large productivity hit" for users who must frequently contend with software bugs and computer crashes and must sacrifice time learning to master the machine.
Kelly adds, "What I find surprising is that everybody just assumes that having this big box on your desk is a good thing. It is rarely questioned. Every year or so, you feel like ... if you don't upgrade that box on your desk with new hardware or software, you are going to lose a competitive advantage. I think that there is no evidence that it is going to be the case," he offers.
Landauer believes that people tend to choose this complexity when buying software, especially in the business arena. Customers often think that more is better when it comes to features and functions, and they use this quantitative measure to make their choices. Not all business applications are problematic, however. Landauer notes that bookkeeping software and inventory databases, for example, are highly valuable to business practices. However, he reports that some estimates state that the average company worker spends about two weeks each year doing nonproductive, but job-related, tasks with the computer.
Not all experts believe that computers are failing the productivity test. Dr. Howard Frank of the University of Maryland takes issue with much of the computer productivity criticism. Frank, dean of the Robert H. Smith School of Business, is a former director of the information technology office at the Defense Advanced Research Projects Agency. He believes that computers are providing substantial productivity advances that may not show up in traditional metrics. The proof, he says, is in the booming U.S. economy.
"We are just going through the very beginning of an economic revolution that is then going to become a social revolution, and it is all happening in parallel," he states. "All of those issues [of productivity] have been put to bed." Frank notes that he does not differentiate between the computer and the network in this assessment.
One of the greatest reasons that the U.S. economy has continued its tremendous growth with little inflation has been the impact of networks and computing productivity, he offers. Many mundane tasks that used to take significant amounts of time--official record research, for example--now can be performed in minutes or even seconds on the Web.
Ways of doing business have changed rapidly. Networks and connected computing capabilities now allow large organizations that produce goods or services to shorten their supply chain cycle times and increase the efficiency of individual processes. These companies are able to improve their monitoring of these processes and generate real-time financial management data. "On the operational line, your job has become much more efficient; on the management end, you actually can see what's going on instead of waiting weeks to take the pulse of the business," Frank relates.
On the output end, product tracking has been revolutionized, and marketing has entered the electronic arena. A person can save the time spent traveling to, and shopping in, a mall by rapidly ordering the desired product directly off the Web. "Every part of the cycle, from creation and innovation all the way through delivery and customer service, has been impacted," Frank notes. For the U.S. Defense Department, information technology is the differentiating factor that will continue to have a significant effect on military superiority.
Frank offers that traditional metrics do not necessarily measure the positive effects of computer technologies because business has transitioned from a linear world of productivity increases to a world of nonlinear increases. This trend may continue for some time. "When you look at the United States versus its [international] competitors, our lead looks like it is accelerating," he adds.
Even classical measures of productivity at the macro level are strong indicators, Frank argues. Those productivity figures are up, and he attributes the rise to the computing and network environment. "We don't need new measures. We need to step back a bit and say 'yes indeed, we waste a particular part of our time in working with these things.' But think about what it would be like if you did not have them," he suggests. "The fact that we spend time mastering the use of these things is certainly a negative. But even partial mastery or use provides much more of an upside than the cost in terms of time because you can do so much more."
One expert who believes that evidence is lacking to link computers to the healthy U.S. economy is Paul A. Strassmann, a professional computer executive who has held key positions managing large computer complexes. Strassmann, a former principal deputy assistant secretary of defense for command, control, communications and intelligence, warrants that there is a lack of factual backup to support many findings on computers and productivity. This is especially true in discussions on the technology's impact on the U.S. economy as a whole, he states.
The author of the book The Squandered Computer, Strassmann maintains that there is no direct correlation between spending on computers and corporate profitability. His sampling of 2,395 U.S. industrial corporations failed to establish any statistically reliable correlation. In some cases, companies that spent very little on computers had high profitability, while others that spent extensively on the technology had little or none.
"Spending on computers is not a variable that explains why these corporations have such radically different performance characteristics," he notes. Some successful corporations are able to leverage computers, while others are not. Conversely, some corporations have found their computer technologies to be great enhancers of performance, while the reverse also is true.
Analyzing U.S. productivity growth since 1990 does reveal an increase, Strassmann allows. However, he charges that this does not mean that computers were the cause. This growth of productivity can be explained by other factors. "The dominant and almost overwhelming--95 percent--contribution to the growth of productivity in the United States was the dramatic lowering of the cost of capital and the favorable currency position that we enjoyed," he declares. Every year, more than half a trillion dollars has been transferred to the United States. Economic modeling using the policies that have been in effect since early 1990 simulates an economy in which productivity growth is explained without resorting to computerization as the cause.
"In itself, by itself, a computer is just glass, plastic and metal. What is forgotten is that it's not computers that are productive. It is human beings--with or without computers--that are productive," he relates. "This is where training, organization, morale and other factors appear to be totally dominating." The cost of desktop units now represents less than 8 percent of total information technology spending, and that percentage is shrinking. He acknowledges that computers are enabling people to do things that they could never do before. The quality, complexity and reach of organizations have been greatly enhanced with computerization.
However, this complexity can be a double-edged sword, especially with regard to enabling software. Business school Dean Frank acknowledges that many business users become enmeshed in the web of software complexity. Yet, purchasing custom-designed software is much less cost-effective than buying off the shelf. An 80 or 90 percent solution may be good enough, he suggests.
A move toward specialized devices is now underway, and this is part of the evolution of computer technology, Frank states. These mobile, convenient, function-oriented devices represent a natural step on the heels of the plummeting cost of embedded computing. As these chips continue to increase in power and decrease in cost, more function-specific devices will emerge to free the user from the desktop. He notes that each of these specialized devices poses its own complexity issues.
Frank emphasizes that it is not yet clear whether another exponential productivity leap will emerge from this specialization development, or whether productivity improvements will be a linear continuation of ongoing advances. No "nuclear application" is in sight that would generate another leap in the productivity wave, he admits. However, another breakthrough application could lie just around the corner.
"I am a devout believer in the positive impact that [information technology] has had and the potential for even greater impact in the future," Frank declares. "It has been a fabulous rollout over the past eight years, and we are only seeing the tip of the iceberg."
For the future, SAIC's Kelly foresees a "divide and conquer" strategy taking shape that will spawn more specialized devices over the next few years. The complicated box is being divided into many smaller, application-specific devices. This began with the introduction of information appliances, which are tailored toward specific functions or applications. Palm computers, for example, primarily provide calendar, address book and other personal reference functions. Kelly describes these application-based devices as agents. With wireless links, they can provide e-mail capabilities and limited Web surfing. A large, expensive desktop box is no longer required for these functions.
Command inputs will be simplified as well. Keystrokes and mouse clicks will be replaced with speech, but higher level commands will predominate. "Because of network resources, devices, in the aggregate, will be so intelligent that you simply indicate intent, and the device knows the command structure and figures out how to marshal that complicated command structure to get as close to realizing your intent as possible. You never have to know it," Kelly predicts.
He believes that this spells the beginning of the end of the all-encompassing desktop computer. This is a pattern that has recurred in the past with many multifunctional appliances and tools giving way to specialized devices operating more simply and effectively. Eventually, Kelly suggests, intelligent appliances connected to networks will serve as personal assistants that know their user's wants and needs.
The University of Colorado's Landauer maintains that necessary changes to benefit productivity will come only when customers demand them. The solution will be UCD--which he alternately translates as user-centered design, development or deployment. The business community, which has the greatest amount of leverage with software manufacturers, must determine how to demand that its purchases are demonstrably productive.
Landauer suggests the creation of an Underwriters Laboratories listing or the equivalent of the Good Housekeeping seal of approval to certify software for productivity. A product carrying this designation would be guaranteed to improve productivity, or the purchaser would be able to obtain a refund. Similarly, contract writers for large companies could codify these productivity terms in a contract under threat of refunds and damage compensation.
As customers exert more influence on software development, the market will see many more useful computer applications that grow with increasing computational power. Landauer also believes that more alternatives offered over the Internet will give consumers greater choice.
If Landauer's UCD approach does not bear fruit, steadily increasing software complexity at some point will impel the public to reject these products, the professor predicts. "The bloom will leave the rose, and people will not be so attracted to things that are only showy," he warrants.
The Smith School of Business's Frank believes that the dynamic information technology wave guarantees change in products. "The only time you get perfection is when a device is obsolete," he declares. The innovation phase will see the constant introduction of new devices, options and capabilities into the marketplace--faster than human beings can be trained to accommodate it. Rather than a weakness, this is a sign of strength. "The difficulties and complexities, rather than being drawbacks, are derivative of the richness of the environment.
"We're fortunate--and unfortunate--to be sitting in a revolution and to be aware of the revolution at the beginning of it," Frank says. "Right now, we are in the beginning of a business and social revolution, with a cultural revolution just ahead. We have a new wave of small children growing up in this natural environment where they have the tools of success in their hands all the time. When they reach the stage where they are able to contribute in a productive way, things will emerge from this that are different than anything that we can imagine. We are about 10 to 15 years away from this true cultural revolution."
Strassmann questions whether the current booming economic growth will even continue. "Ultimately we are dealing with the wealth of society," he offers. U.S. productivity growth over the past 200 years has doubled per capita real income approximately every 70 years. "The question is, will the real average income in the United States go from the current number of about $30,000 to about $60,000 in the next 70 years--and if so, by what means?"
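As a back-of-the-envelope check on that closing arithmetic, a fixed doubling time of 70 years corresponds to an annual real growth rate of roughly one percent:

$$(1+g)^{70} = 2 \quad\Longrightarrow\quad g = 2^{1/70} - 1 \approx 0.0099 \approx 1\%\ \text{per year},$$

so holding to the 200-year trend would indeed carry per capita real income from about $30,000 to about $60,000 over the next 70 years.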