Any aggregation of computers, software and networks can be viewed as a “cloud.” The U.S. Defense Department is actually a cloud consisting of thousands of networks, tens of thousands of servers and millions of access points. The department’s fiscal year 2012 spending for information technologies is $38.4 billion. This figure does not include the costs of civilian and military payroll, nor most information technology spending on intelligence. With those added, the total Defense Department cloud could be more than $50 billion, which is 10 times larger than the budget of the 10 largest commercial firms. So, the question is: How efficient is the Defense Department in making good use of its information technology?
The efficiency of any system is defined as the ratio of outputs to inputs—also known as the productivity ratio of an enterprise. If only a small fraction of inputs is converted to outputs, then information technology can be labeled as inefficient. The productivity ratio is always evaluated in dollars. These numbers are readily available for the Defense Department because the Office of Management and Budget (OMB) publishes analyses of information costs every year.
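The output/input definition above reduces to a single division. The sketch below illustrates it with placeholder dollar amounts; the function name and the example figures are illustrative, not OMB data.

```python
# A minimal sketch of the output/input productivity ratio described above.
# The example dollar figures are hypothetical placeholders, not OMB data.

def productivity_ratio(output_dollars: float, input_dollars: float) -> float:
    """Efficiency as the fraction of input dollars converted to output."""
    if input_dollars <= 0:
        raise ValueError("input_dollars must be positive")
    return output_dollars / input_dollars

# Example: if $18.7 billion of a $38.4 billion budget is attributable
# to outputs, the enterprise converts roughly 48.7% of inputs to outputs.
print(round(productivity_ratio(18.7, 38.4), 3))  # 0.487
```

Any ratio well below the commercial benchmarks discussed later in the article would mark the enterprise as inefficient by this measure.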
Making these output/input evaluations requires finding out how much of the total available information technology budget is consumed in management, which is defined as, “… costs incurred in the general upkeep or running of a plant, premises, or business, and not attributable to specific products or results.”
The OMB lists the “information and technology management” function for information technology. This includes all planning, administrative, management and acquisition costs as well as communications costs that cannot be attributed to any specific output.
The accompanying table shows that only 48.8 percent of Defense Department functions are related to the costs of output. The remaining 51.2 percent is attributed to “management,” which includes expenditures for Defense Department chief information officers and component staffs. The information technology output/input ratio for the entire Defense Department can therefore be estimated at less than half.
Is the 48.8 percent ratio a good measure of information technology performance? Can it be compared with the best commercial practices?
Functional Area                           2012 Budget ($ billions)   % of Total
Information and Technology Management     $19.7                      51.2%
Defense and National Security             $10.8                      28.1%
Supply Chain Management                   $3.0                       7.7%
Human Resource Management                 $1.7                       4.4%
Administrative and Financial Management   $1.1                       2.7%
Health                                    $1.0                       2.7%
Intelligence Operations, Legal            $0.9                       2.3%
Planning and Budgeting                    $0.2                       0.5%
General Science and Innovation            $0.1                       0.3%
Environmental Management                  $0.0                       0.1%
Total                                     $38.4                      100.0%
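The 51.2 percent management share and 48.8 percent output-related share can be recomputed directly from the table’s line items. Note that the rounded line items sum to $38.5 billion rather than the stated $38.4 billion total; the percentages below use the line-item sum.

```python
# Recomputing the table's shares from its line items (in $ billions).
# Figures are as printed in the table; line items sum to 38.5 due to rounding.
budget = {
    "Information and Technology Management": 19.7,
    "Defense and National Security": 10.8,
    "Supply Chain Management": 3.0,
    "Human Resource Management": 1.7,
    "Administrative and Financial Management": 1.1,
    "Health": 1.0,
    "Intelligence Operations, Legal": 0.9,
    "Planning and Budgeting": 0.2,
    "General Science and Innovation": 0.1,
    "Environmental Management": 0.0,
}

total = sum(budget.values())
management_share = budget["Information and Technology Management"] / total
output_share = 1.0 - management_share

print(f"management: {management_share:.1%}, output-related: {output_share:.1%}")
# management: 51.2%, output-related: 48.8%
```

This reproduces the article’s split: roughly half the information technology budget is consumed by the management function rather than by output-producing functions.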
I have worked with productivity numbers for more than 30 years, have published more than 100 articles and books on this topic, and own a registered trademark on Information Productivity. In terms of information technology spending per capita, the Defense Department is most comparable to the financial services sector because of its large volume of purchase transactions and its huge assets. At major banks, where the information technology budget of the top firms averages $2 billion, information technology productivity has consistently shown a ratio between 70 percent and 80 percent, in contrast with less than 50 percent for the Defense Department.
The primary reasons for the difference between commercial firms and the Defense Department are expenditures for information technology infrastructure maintenance ($7.7 billion) and for information security ($2.8 billion). These two items account for more than half of the communications expense that is included in the Defense Department management costs.
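As a rough sanity check on the figures above: the article does not state the communications expense itself, but the two items can at least be compared against the full $19.7 billion management line from the table (an illustrative comparison, not the article’s exact denominator).

```python
# Comparing the two named cost items against the $19.7B management line.
# The denominator is an assumption for illustration; the article's own
# comparison is against the (unstated) communications expense within it.
infrastructure_maintenance = 7.7  # $ billions
information_security = 2.8        # $ billions
management_total = 19.7           # Information and Technology Management, $B

share = (infrastructure_maintenance + information_security) / management_total
print(f"{share:.1%} of the management line")  # 53.3% of the management line
```

Even against the entire management line, the two items exceed half, which is consistent with the article’s claim about their weight within the communications expense.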
With an estimate of 15,000 defense networks in operation, the first priority for any future cost reductions should be the consolidation of communications. According to a 2006 Government Accountability Office report, the Global Information Grid (GIG) was supposed to achieve major reductions in the number of networks. That has not happened.
The Defense Department must restructure its information technology communication operations away from an environment in which it is vulnerable to multiple cyber attacks. Cutting down on the number of networks requires shifting computing to a much smaller number of enterprise clouds. This will reduce costs as well as increase security.
Paul A. Strassmann is the distinguished professor of information sciences at George Mason University and teaches AFCEA’s online cyber ops course. The views expressed are his own and not necessarily those of SIGNAL Magazine.