Autonomic Resources recently announced that it has been awarded one of the General Services Administration's (GSA's) first blanket purchase agreements under the first government-wide contract for cloud computing. Under this agreement, Autonomic Resources will offer public cloud services that provide U.S. government customers with computing power, storage and networking infrastructure that can be acquired and used on demand, all from certified data centers protected by multifactor authentication. Autonomic Resources is one of only a few vendors to have met the technical requirements necessary to win a GSA cloud computing contract.
Apps Tap into Cloud Computing
The hard-hitting storms that beleaguered parts of the United States this year taught the East Coast a valuable lesson: sometimes you just can't get to work. But with immovable deadlines, tasks still must be accomplished. One way offices can continue to function with personnel in disparate locations (assuming they all have power) is by storing documents somewhere other than on the organization's own computer drives. Using the Internet as a storage medium enables people to keep work moving forward, even if they can't get out the front door.
Terry Halvorsen, the Defense Department’s acting chief information officer, is expected very soon to release a new policy revising the role the Defense Information Systems Agency (DISA) plays in brokering cloud services. The changes are designed to speed cloud service acquisitions by preventing bottlenecks created by having only one agency act as broker. DISA no longer will be the sole acquisition agency, but it will continue to ensure network access to cloud service providers is secure and reliable, agency officials say.
There are no do-overs when it comes to safeguarding the U.S. military’s sensitive data. With that concise, blunt notion in mind, defense leaders say they are taking a slow, methodical, multipronged approach as the Defense Information Systems Agency develops a cloud security model for the whole of the Defense Department.
With current security controls too strict and limiting, agency personnel are sleuthing for the ideal balance that would let a greater number of commercial cloud service providers compete for billions in federal funding, while still safeguarding national security. Their goal is to determine what might be safe—and what might be safe enough.
The U.S. government is adopting changes to the cloud computing certification program that will better protect against potential insider threats. The improvements include additional penetration testing, more thorough testing of mobile devices, tighter controls over systems being carried from a facility and more stringent scrutiny of systems connecting from outside the network.
When cloud computing revolutionized the way businesses stored, processed and transmitted data, the rapid transformation—as with a lot of technological advances—left U.S. government agencies behind the times. The government’s hurried effort to align itself with the paradigm shift from traditional stand-alone computers, workstations and networks to the not-quite-understood cloud computing technology left a policy aperture fraught with challenges that caught some agencies unprepared—particularly adjuncts in inspector general and general counsel offices.
As organizations migrate more data into public clouds, demands for a different type of security are emerging. A specialized option is available now for Amazon Web Services that aims to mitigate threats more quickly by finding them faster and suggesting methods of remediation.
Known as the Evident Security Platform for Amazon Web Services (ESP for AWS), the technology is designed expressly for the Amazon environment. It deploys in five minutes or less and provides a dashboard view of identified threats. In the first week after launch, 50 companies of various sizes signed on for the platform, including several large multinational corporations.
Explosive amounts of data and the strains on limited financial resources have prompted corporations and governmental agencies alike to explore joint tenancy in the cloud for storing, processing and transmitting data. But while good fences—or in this case isolation mechanisms—make good neighbors, in the virtual world of cloud security the idiom might not ring entirely true. In the public cloud arena, risks arise when organizations place their data in a cloud system but cannot control who their neighbors might be.
The U.S. Army’s current tactical network delivers a wide range of capabilities for warfighters, including unprecedented communications on the move. But the complexity can overwhelm commanders who have countless critical tasks to complete and soldiers’ lives in their hands. Future tactical networks will automate many processes and may be smart enough to advise commanders, similar to JARVIS, Iron Man’s computerized assistant.
Virtualization and cloud implementation are critical components of information technology planning, acquisition and management going forward. Cloud implementations are important to security, efficiency, effectiveness, cost savings and more pervasive information sharing, particularly among enterprises. Cloud architectures also are extremely important for more effective use of mobile technologies. Mobility increasingly is important, particularly for the military, which needs a full range of information technology services while on the move. Yet increased movement to the cloud, along with traditional uses of spectrum, is putting unprecedented demands on every part of the spectrum.
The global market for cloud-based architecture and related services and applications is expected to surge through 2017, analysts say. Demand for a variety of virtualized “as a service” capabilities such as infrastructure, software and security also will increase.
Worldwide spending on cloud-related technologies and services will reach an estimated $174.2 billion in 2014, a 20 percent increase from the $145.2 billion spent in 2013, states a recent report by IHS Technology. According to IHS, by 2017 the cloud market will be worth $235.1 billion, triple the market’s $78.2 billion in 2011.
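The IHS figures above are internally consistent, which can be verified with simple arithmetic; a minimal check in Python (all values in billions of U.S. dollars, taken directly from the report as cited):

```python
# Consistency check of the IHS Technology cloud spending figures.
spend_2011 = 78.2
spend_2013 = 145.2
spend_2014 = 174.2
forecast_2017 = 235.1

# 2013-to-2014 growth: (174.2 - 145.2) / 145.2 ≈ 0.1997, i.e. about 20 percent.
growth_2014 = (spend_2014 - spend_2013) / spend_2013
print(f"2013-2014 growth: {growth_2014:.0%}")  # → 20%

# 2017 forecast versus 2011 market: 235.1 / 78.2 ≈ 3.0, i.e. roughly triple.
multiple_2017 = forecast_2017 / spend_2011
print(f"2017 vs. 2011 multiple: {multiple_2017:.1f}x")  # → 3.0x
```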
Researchers working on multiple projects in Europe and the United States are using cloud computing to teach robotic systems to perform a multitude of tasks ranging from household chores to serving hospital patients and flipping pancakes. The research, which one day could be applied to robotic systems used for national defense, homeland security or medical uses, lowers costs while allowing robots to learn more quickly, share information and better cooperate with one another.
The U.S. Air Force Space Command (AFSPC) is helping the service put its joint modernization plans into place. As the command responsible for handling cyberspace, communications and information missions, it is the Air Force’s instrument in meeting major Defense Department technology goals, such as establishing the Joint Information Environment (JIE).
Software developed by university researchers accurately predicts cloud computing issues before they occur, enhancing reliability, cutting costs, potentially improving cybersecurity and saving lives on the battlefield.
Infrastructure-as-a-service clouds are prone to performance anomalies because of their complex nature. But researchers at North Carolina State University (NCSU) have developed software that monitors a wide array of system-level data in the cloud infrastructure—including memory used, network traffic and computer power usage—to define normal behavior for all virtual machines in the cloud, detect deviations and predict anomalies that could create problems for users.
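The general approach described above—learning a per-metric baseline of normal virtual machine behavior and flagging deviations from it—can be sketched in a few lines. This is an illustrative simplification, not the NCSU software; the metric names, sample values and three-standard-deviation threshold are assumptions for the example:

```python
import statistics

def build_baseline(history):
    """Learn per-metric (mean, standard deviation) from historical samples.

    history: list of dicts mapping metric name -> observed value.
    """
    baseline = {}
    for metric in history[0]:
        values = [sample[metric] for sample in history]
        baseline[metric] = (statistics.mean(values), statistics.stdev(values))
    return baseline

def detect_anomalies(baseline, sample, threshold=3.0):
    """Return metrics deviating more than `threshold` standard deviations
    from their learned mean."""
    flagged = []
    for metric, value in sample.items():
        mean, stdev = baseline[metric]
        if stdev > 0 and abs(value - mean) / stdev > threshold:
            flagged.append(metric)
    return flagged

# Hypothetical per-VM samples: memory (MB), network traffic (KB/s), power (W).
history = [
    {"memory": 512, "network": 100, "power": 150},
    {"memory": 520, "network": 110, "power": 152},
    {"memory": 508, "network": 95,  "power": 149},
    {"memory": 515, "network": 105, "power": 151},
]
baseline = build_baseline(history)
print(detect_anomalies(baseline, {"memory": 900, "network": 102, "power": 150}))
# → ['memory']
```

A production system would use rolling windows and more robust statistics, but the structure—baseline, deviation score, threshold—is the same.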
The U.S. Defense Department now is advancing into the third generation of information technologies. This progress is characterized by migration from an emphasis on server-based computing to a concentration on the management of huge amounts of data. It calls for technical innovation and the abandonment of primary dependence on a multiplicity of contractors.
The move to the cloud that is gripping all elements of government and industry offers great potential for the U.S. Navy, according to its chief information officer. Terry Halvorsen told the breakfast audience on the final day of TechNet Asia-Pacific 2013 in Honolulu, Hawaii, that the move to the cloud is one of the best areas for gaining effect in Navy information technology.
However, other elements must fall into place for this move to be successful. Halvorsen said the move to the cloud must be coupled “with how you look at and structure applications,” adding that the Navy has too many applications.
Another Overhyped Fad
By Mark M. Lowenthal
Director of National Intelligence Lt. Gen. James R. Clapper, USAF (Ret.), once observed that one of the peculiar behaviors of the intelligence community is to erect totem poles to the latest fad, dance around them until exhaustion sets in, and then congratulate itself on a job well done.
Recent insider security breaches have put increased scrutiny on the U.S. intelligence community’s cloud computing plans. But cloud computing initiatives remain unchanged as the technology is expected to enhance cybersecurity and provide analysts with easier ways to do their jobs in less time.
The first step toward an enterprisewide information environment is taking place on desktops belonging to personnel with the National Geospatial-Intelligence Agency (NGA) and the Defense Intelligence Agency (DIA). Deployment has begun for the Intelligence Community Information Technology Enterprise, or ICITE, which aims to provide a common computing environment based on cloud technology (see SIGNAL Magazine articles “Managing Change in the Intelligence Community” and “Intelligence CIOs Teaming for Change” from October 2012).