International Business Machines Corporation, Global Government Industry, Bethesda, Maryland, is being awarded a $12,171,809 firm-fixed-price and cost-reimbursement contract for the Enterprise Information Services Production Environment, a cloud-like "platform as a service" information technology hosting environment used to host classified and unclassified Air Force data and software programs. Electronic Systems Center, Wright-Patterson Air Force Base, Ohio, is the contracting activity.
It's been slow going for Defense Department IT since the Clinger-Cohen Act of 1996 mandated creating the Information Technology Architecture. In 1999, the Federal Chief Information Officers Council defined the Federal Enterprise Architecture (FEA). It's now 2011, and according to a Government Accountability Office report, the enterprise architecture methodology still has not been deployed. In his viewpoint article "About Face" in this issue of SIGNAL Magazine, Paul A.
Industry leaders are working hard to identify and create the Internet of the future, and News Editor Rita Boland digs in with an examination of this virtual "ground breaking" in cyberspace in her article, "Upcoming Online Experiences," in this issue of SIGNAL Magazine. The piece is the first in a four-part SIGNAL semaphore series: The Future of the Internet. Kevin Orr, Cisco Corporation's vice president of U.S.
The computing device shouldn't matter, nor its provider: Defense Department personnel just want their information securely, by authorized channels, in a timely manner. Department customers want personal information assistants (PIAs), adapted to their position, training level and necessary connections. Paul A. Strassmann discusses the potential way forward in his article, "A Culture Shock Is Coming," in this issue of SIGNAL Magazine. Information sources must include data received from people, sensors or public websites.
With the thousands of applications running on U.S. Defense Department networks, programmers have been dream weavers, pulling together the pieces necessary to make these systems fully functional. Hundreds of contracting organizations are tied up in these networks, making it a monumental challenge to pool all resources into an efficient, future "whole." But as with any evolution, it cannot take place overnight. In his second installment in a series of articles covering defense information technology, Paul A.
Autonomic Resources recently announced that it has been awarded one of the first blanket purchase agreements under the General Services Administration's first government-wide contract for cloud computing. Under this agreement, Autonomic Resources will offer public cloud services to provide U.S. government customers with simplified computing power, storage, and networking infrastructure that can be acquired and utilized on demand, all from certified data centers with enhanced multi-factor authentication access. Autonomic Resources is one of only a few vendors to have met the technical requirements necessary to be awarded a GSA contract for cloud computing.
Apps Tap into Cloud Computing
The hard-hitting storms that beleaguered parts of the United States this year taught the East Coast a valuable lesson: sometimes you just can't get to work. But with immovable deadlines, tasks still must be accomplished. One way offices can continue to function with personnel in disparate locations (assuming they all have power) is by storing documents in locations other than organizations' computer drives. Using the Internet as a storage device enables people to continue to move work forward, even if they can't get out the front door.
Terry Halvorsen, the Defense Department’s acting chief information officer, is expected very soon to release a new policy revising the role the Defense Information Systems Agency (DISA) plays in brokering cloud services. The changes are designed to speed cloud service acquisitions by preventing bottlenecks created by having only one agency act as broker. DISA no longer will be the sole acquisition agency, but it will continue to ensure network access to cloud service providers is secure and reliable, agency officials say.
There are no do-overs when it comes to safeguarding the U.S. military’s sensitive data. With that key, concise and blunt notion in mind, defense leaders say they are taking a slow, methodical, multipronged approach as the Defense Information Systems Agency develops a cloud security model for the whole of the Defense Department.
With current security controls too strict and limiting, agency personnel are sleuthing for the ideal balance that would let a greater number of commercial cloud service providers compete for billions in federal funding, while still safeguarding national security. Their goal is to determine what might be safe—and what might be safe enough.
The U.S. government is adopting changes to the cloud computing certification program that will better protect against potential insider threats. The improvements include additional penetration testing, more thorough testing of mobile devices, tighter controls over systems being carried from a facility and more stringent scrutiny of systems connecting from outside the network.
When cloud computing revolutionized the way businesses stored, processed and transmitted data, the rapid transformation—as with a lot of technological advances—left U.S. government agencies behind the times. The government’s hurried effort to align itself with the paradigm shift from traditional stand-alone computers, workstations and networks to the not-quite-understood cloud computing technology left a policy aperture fraught with challenges that caught some agencies unprepared—particularly adjuncts in inspector general and general counsel offices.
As organizations migrate more data into public clouds, demands for a different type of security are emerging. A specialized option is available now for Amazon Web Services that aims to mitigate threats more quickly by finding them faster and suggesting methods of remediation.
Known as the Evident Security Platform for Amazon Web Services (ESP for AWS), the technology offers a solution expressly designed for the Amazon environment. It deploys in five minutes or less and gives a dashboard view of identified threats. In the first week after it launched, 50 companies of various sizes signed on for the platform, including several large, multinational corporations.
Explosive amounts of data and the strains on limited financial resources have prompted corporations and governmental agencies alike to explore joint tenancy in the cloud for storing, processing and transmitting data. But while good fences—or in this case isolation mechanisms—make good neighbors, in the virtual world of cloud security the idiom might not ring entirely true. In the public cloud arena, risks arise when organizations place their data in a cloud system but cannot control who their neighbors might be.
The U.S. Army’s current tactical network delivers a wide range of capabilities for warfighters, including unprecedented communications on the move. But the complexity can overwhelm commanders who have countless critical tasks to complete and soldiers’ lives in their hands. Future tactical networks will automate many processes and may be smart enough to advise commanders, similar to JARVIS, Iron Man’s computerized assistant.
Virtualization and cloud implementation are critical components of information technology planning, acquisition and management going forward. Cloud implementations are important to security, efficiency, effectiveness, cost savings and more pervasive information sharing, particularly among enterprises. Cloud architectures also are extremely important for more effective use of mobile technologies. Mobility increasingly is important, particularly for the military, which needs a full range of information technology services while on the move. Yet the increased movement to the cloud, along with traditional uses of spectrum, is putting unprecedented demands on every part of the spectrum.
Researchers working on multiple projects in Europe and the United States are using cloud computing to teach robotic systems to perform a multitude of tasks ranging from household chores to serving hospital patients and flipping pancakes. The research, which one day could be applied to robotic systems used for national defense, homeland security or medical uses, lowers costs while allowing robots to learn more quickly, share information and better cooperate with one another.
The global market for cloud-based architecture and related services and applications is expected to surge through 2017, analysts say. Demand for a variety of virtualized “as a service” capabilities such as infrastructure, software and security also will increase.
Worldwide spending on cloud-related technologies and services will be in the range of $174.2 billion in 2014, a 20 percent increase from the $145.2 billion spent in 2013, states a recent report by IHS Technology. According to IHS, by 2017 the cloud market will be worth $235.1 billion, triple the market’s $78.2 billion in 2011.
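The growth figures above can be checked with simple arithmetic. A minimal sketch, using the dollar values (in billions) from the IHS report as quoted:

```python
# Spending figures in billions of dollars, as quoted from the IHS report.
spend_2011, spend_2013, spend_2014, spend_2017 = 78.2, 145.2, 174.2, 235.1

# Year-over-year growth from 2013 to 2014 -- the report's "20 percent increase."
growth_2014 = (spend_2014 - spend_2013) / spend_2013
print(f"2013 -> 2014 growth: {growth_2014:.0%}")

# 2017 projection relative to 2011 -- the report's "triple" claim.
multiple_2017 = spend_2017 / spend_2011
print(f"2017 vs. 2011 multiple: {multiple_2017:.1f}x")
```

Both quoted characterizations hold: the 2014 figure is about 20 percent above 2013, and the 2017 projection is almost exactly three times the 2011 market.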
The U.S. Air Force Space Command (AFSPC) is helping the service put its joint modernization plans into place. As the command responsible for handling cyberspace, communications and information missions, it is the Air Force’s instrument in meeting major Defense Department technology goals, such as establishing the Joint Information Environment (JIE).
Software developed by university researchers accurately predicts cloud computing issues before they occur, enhancing reliability, cutting costs, potentially improving cybersecurity and saving lives on the battlefield.
Infrastructure-as-a-service clouds are prone to performance anomalies because of their complex nature. But researchers at North Carolina State University (NCSU) have developed software that monitors a wide array of system-level data in the cloud infrastructure—including memory used, network traffic and computer power usage—to define normal behavior for all virtual machines in the cloud, detect deviations and predict anomalies that could create problems for users.
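The general idea described above—learning what "normal" looks like for each virtual machine from system-level metrics, then flagging deviations—can be sketched in a few lines. This is a minimal illustration only; the metric names, the z-score method and the threshold are assumptions for the example, not details of the NCSU researchers' actual tool.

```python
# Illustrative sketch: baseline-and-deviation anomaly detection on VM metrics.
# Metric names (mem, net, power) and the 3-sigma rule are assumptions.
import statistics

def baseline(samples):
    """Learn per-metric mean and standard deviation from routine readings."""
    metrics = samples[0].keys()
    return {m: (statistics.mean([s[m] for s in samples]),
                statistics.stdev([s[m] for s in samples]))
            for m in metrics}

def is_anomalous(observation, model, z_threshold=3.0):
    """Flag an observation if any metric strays beyond z_threshold sigmas."""
    for metric, value in observation.items():
        mean, stdev = model[metric]
        if stdev > 0 and abs(value - mean) / stdev > z_threshold:
            return True
    return False

# Training window: routine readings for one VM
# (memory in MB, network traffic in KB/s, power draw in watts).
normal = [{"mem": 512, "net": 120, "power": 85},
          {"mem": 520, "net": 118, "power": 86},
          {"mem": 508, "net": 125, "power": 84},
          {"mem": 515, "net": 122, "power": 85}]
model = baseline(normal)

print(is_anomalous({"mem": 514, "net": 121, "power": 85}, model))   # routine reading
print(is_anomalous({"mem": 2048, "net": 9000, "power": 240}, model))  # sharp spike
```

A production system would use far richer models than per-metric thresholds, but the pipeline—collect system-level data, fit a normal profile per virtual machine, alert on deviations—follows the shape of the approach the article describes.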