
cloud computing

U.S. Army Innovates on Cloud Computing Front

March 1, 2013
By George I. Seffers

Officials work to provide a new cloud approach across the service as well as the Defense Department.

U.S. Army officials estimate that by the end of the fiscal year, they will go into production on a new cloud computing solution that could potentially be made available across the Defense Department and could eventually be used to expand cloud capabilities on the battlefield. The platform-as-a-service product incorporates enhanced automation, less expensive software licensing and built-in information assurance.

During the past year, officials with the Army’s Communications-Electronics Command (CECOM) Software Engineering Center (SEC), Aberdeen Proving Ground, Maryland, have been working on a cloud computing approach known as Cloud.mil. A four-person team took about four months to deliver the first increment, which is now in the pre-production phase and is being touted to Army leaders, as well as to Defense Department and Defense Information Systems Agency (DISA) officials, as a possible Army-wide and Defense-wide solution.

U.S. Nuclear Agency Enhances Cybersecurity With Cloud Computing


March 1, 2013
By George I. Seffers

Officials aim to have a solution in place by year's end.

The U.S. agency responsible for the management and security of the nation’s nuclear weapons, nuclear nonproliferation and naval nuclear reactor programs is racing to put unclassified data on the cloud this year. Cloud computing is expected to provide a wide range of benefits, including greater cybersecurity, lower costs and networking at any time and from anywhere.

Officials at the National Nuclear Security Administration (NNSA), an agency within the Department of Energy, expect to have a cloud computing capability this year. The solution, known as Yourcloud, will provide the NNSA with its own cloud computing environment to manage data more securely, efficiently and effectively. It is part of an overall effort to modernize the agency’s information infrastructure. Yourcloud replaces an aging infrastructure that resulted in too many data centers and an inability to refresh equipment as often as necessary.

The Yourcloud infrastructure will be built and owned by industry, while the NNSA will control the data residing in the cloud. “We’ll be using a commercial data center space with a managed cloud provider as well as a managed security provider to offer us fee as a service back to our customer base,” says Travis Howerton, NNSA chief technology officer. “I don’t want to own my own infrastructure on the unclassified side, but I do want to own my own data. That’s why we’ve been pushing the innovation agenda around security, taking advantage of the lower-cost industry options while not compromising our security posture. What we really have to do is figure out how to insource security and outsource computing, to keep the keys of the kingdom inside, to protect the crown jewels, to make sure we own the security of our data, but then to take advantage of low-cost computing wherever it may be. We are evolving to that model.”

Top Information Technology Officials Peer into the Future

February 28, 2013
By George I. Seffers

Top information technology officials from a variety of government agencies identified cloud computing, mobile devices and edge technologies as the technologies that will be critical for accomplishing their missions in the future.

Luke McCormack, chief information officer, Justice Department, cited cloud-as-a-service as vital to the future. He urged industry to continue to push the barriers of stack computing, and he mentioned edge technology as an emerging technology. “Edge is going to be really critical to perform missions,” he said. He cited the Google Glass project as an indicator of what the future will bring.

Mobility could be the future of training and simulation, suggested Sandy Peavy, chief information officer for the Federal Law Enforcement Training Center within the Department of Homeland Security (DHS). She revealed that her office is putting together a pilot program introducing tablet computers into the training environment, and ideally, she would like trainees to be able to access simulations on the mobile devices of their choice. Peavy also reported that the Bureau of Alcohol, Tobacco, Firearms and Explosives is providing special agents with iPhones and experimenting with other devices. “If I’m going to be able to provide just-in-time training, then mobile technology is the key.”

Richard Spires, chief information officer for DHS, also cited mobility as a key future trend, but he also brought up metadata tagging, saying it helps agencies understand the data itself and establish rules for who sees what information. Metadata tagging is especially important as the department grapples with privacy concerns.
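The metadata-tagging idea can be made concrete with a small sketch. The record schema, role names and clearance rules below are hypothetical illustrations, not DHS’s actual system; the point is only that tags attached to data can drive who-sees-what rules.

```python
# Hypothetical example: records carry metadata tags, and a viewer's role
# determines which privacy levels of tagged records are visible.

RECORDS = [
    {"id": 1, "text": "travel manifest",
     "tags": {"category": "operations", "privacy": "low"}},
    {"id": 2, "text": "applicant biographic data",
     "tags": {"category": "immigration", "privacy": "high"}},
]

# Each role maps to the highest privacy level it is cleared to view.
ROLE_CLEARANCE = {"analyst": "low", "privacy_officer": "high"}
PRIVACY_ORDER = {"low": 0, "high": 1}

def visible_records(role, records=RECORDS):
    """Return only the records whose privacy tag the role may see."""
    limit = PRIVACY_ORDER[ROLE_CLEARANCE[role]]
    return [r for r in records
            if PRIVACY_ORDER[r["tags"]["privacy"]] <= limit]
```

Under these rules an analyst sees only the low-privacy record, while the privacy officer sees both; the same tags could equally feed auditing or retention policies.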

Customs and Border Protection Agency Eyes the Cloud

February 1, 2013
By George I. Seffers

The U.S. agency responsible for customs and border protection has suffered from an unreliable infrastructure and network downtime but already is seeing benefits from a fledgling move to cloud computing. Those benefits include greater reliability, greater efficiency and lower costs.

Customs and Border Protection’s (CBP’s) priorities include moving the agency to cloud computing and adopting greater use of mobile devices. The CBP Cloud Computing Environment (C3E) moves the agency away from a number of stovepipe platforms. “In the past, we’ve run about every kind of platform that’s out there. We are a large IBM mainframe legacy shop. We use a lot of AIX Unix and also Solaris Unix, so we’ve even got different flavors of Unix out there, and then obviously, big Windows farms,” reveals Charlie Armstrong, CBP chief information officer and assistant commissioner for the office of information and technology. “This new environment that we’re moving to collapses a lot of that down into a single environment and loses all of the mainframe, and it gets us out of building environments from scratch.”

Armstrong describes CBP as being in the early stages of its move to the cloud, but the agency already is seeing benefits, he says. He compares creating a computing environment to building cars. “Building an environment with yesterday’s approach was like going to the car dealership, buying all the parts and having to put the car together yourself. Now, what we’re trying to do is to buy a fully integrated product that allows us to stand up environments quicker and also improve performance,” he explains.

Researchers Organize to Share Data, Speed Innovation

February 1, 2013
By Max Cacas

To meet the challenge of implementing big data, a new international scientific organization is forming to facilitate the sharing of research data and speed the pace of innovation. The group, called the Research Data Alliance, will comprise some of the top computer experts from around the world, representing all scientific disciplines.

Managing the staggering and constantly growing amount of information that constitutes big data is essential to the future of innovation. The U.S. delegation to the alliance’s first plenary session, being held next month in Switzerland, is led by Francine Berman, a noted U.S. computer scientist, with backing from the National Science Foundation (NSF).

Meeting the challenges of how to harness big data is what makes organizing and starting the Research Data Alliance (RDA) so exciting, Berman says. “It has a very specific niche that is very complementary to a wide variety of activities. In the Research Data Alliance, what we’re aiming to do is create really tangible outcomes that drive data sharing, open access, research data sharing and exchange,” all of which, she adds, are vital to data-driven innovation in the academic, public and private sectors. The goal of the RDA is to build what she calls “coordinated pieces of infrastructure” that make it easier and more reasonable for people to share, exchange and discover data.

“It’s really hard to imagine forward innovation without getting a handle around the data issues,” emphasizes Berman, the U.S. leader of the RDA Council, who, along with colleagues from Australia and the European Union, is working to organize the alliance. Ross Wilkinson, executive director of the Australian National Data Service, and John Wood, secretary-general of the Association of Commonwealth Universities in London, are the other members of the council.

Big Data in Demand for Intelligence Community

January 4, 2013
By George I. Seffers

The National Security Agency is poised to deliver an initial cloud computing capability for the entire intelligence community that will significantly enhance cybersecurity and mission performance, and unleash the power of innovation for intelligence agencies, Lonny Anderson, NSA chief information officer, says.

U.S. Government Bets Big on Data

January 1, 2013
By George I. Seffers

A multi-agency big data initiative offers an array of national advantages.

U.S. government agencies will award a flurry of contracts in the coming months under the Big Data Research and Development Initiative, a massive undertaking involving multiple agencies and $200 million in commitments. The initiative is designed to unleash the power of the extensive amounts of data generated on a daily basis. The ultimate benefit, experts say, could transform scientific research, lead to the development of new commercial technologies, boost the economy and improve education, all of which makes the United States more competitive with other nations and enhances national security.

Big data is defined as datasets too large for typical database software tools to capture, store, manage and analyze. Experts estimate that in 2013, 3.6 zettabytes of data will be created, and the amount doubles every two years. A zettabyte is equal to 1 billion terabytes, and a terabyte is equal to 1 trillion bytes.
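The unit arithmetic in those estimates is easy to verify, and the two-year doubling claim implies a simple growth curve. A quick sketch (the function name and parameters are illustrative, using the article’s 2013 figure of 3.6 zettabytes):

```python
# Decimal (SI) units as used in the article.
TERABYTE = 10**12   # 1 trillion bytes
ZETTABYTE = 10**21  # 1 billion terabytes, as the article states

assert ZETTABYTE // TERABYTE == 10**9  # one billion

def projected_zettabytes(year, start_zb=3.6, start_year=2013,
                         doubling_years=2):
    """Global data created per year if volume doubles every two years
    from the 2013 estimate of 3.6 zettabytes."""
    return start_zb * 2 ** ((year - start_year) / doubling_years)
```

On these assumptions the 2013 estimate of 3.6 zettabytes would reach 7.2 zettabytes by 2015 and 14.4 by 2017.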

When the initiative was announced March 29, 2012, John Holdren, assistant to the president and director of the White House Office of Science and Technology Policy, compared it to government investments in information technology that led to advances in supercomputing and the creation of the Internet. The initiative promises to transform the ability to use big data for scientific discovery, environmental and biomedical research, education and national security, Holdren says.

Currently, much of the data generated is available only to a select few. “Data are sitting out there in labs—in tens of thousands of labs across the country, and only the person who developed the database in that lab can actually access the data,” says Suzanne Iacono, deputy assistant director for the National Science Foundation (NSF) Directorate for Computer and Information Science and Engineering.

Reading, Writing and Big Data Basics

January 1, 2013
By Max Cacas

An industry-supported online school provides a good grounding in the science and application of very large datasets.

A virtual school, developed by a team of leading software and hardware companies, is providing readily accessible education in the use of large information datasets. The classes range from entry-level sessions on the essentials of big data for managers to practical instruction for veteran programmers who are accustomed to managing more traditional relational databases.

The mission of BigData University is to provide training and broaden the expertise of the big data community, explains Ben Connors, director of BigData University and worldwide head of alliances with Jaspersoft Incorporated, San Francisco. The uses of big data are expanding, whether for improving the health of children, facilitating the search for clean sources of energy or analyzing intelligence data from unmanned aerial vehicles. As a result, managers are realizing the potential that may be hidden within large information files whose size is measured by petabytes and exabytes, Connors explains.

Implementing the Defense Department Cloud Computing Strategy Poses New Challenges

December 1, 2012
By Paul A. Strassmann

A few staff experts can formulate new strategies in a short time. Over the years, the U.S. Defense Department has accumulated a large collection of long-range planning documents. However, none of the plans was ever fully implemented, as new administrations kept changing priorities.

The just-announced Defense Department Cloud Computing Strategy presents a long list of radically new directions. Ultimately, it will take hundreds of thousands of person-years to accomplish what has been just outlined. Several points stand out.

Under one, individual programs would no longer design and operate their own infrastructures to deliver computing services; users would develop only applications. This approach will require tearing apart more than 3,000 existing programs. A pooled environment will be supported by cloud computing that depends on different processing, storage and communications technologies. Small application codes then can be managed separately, relying exclusively on standard interfaces. The challenge will be managing more than 15 years’ worth of legacy software, worth about half a trillion dollars, in completely different configurations. Making such changes will require huge cost reductions to an infrastructure that currently costs $19 billion per year.

Another point is that cloud computing will reduce the costs of the existing computing infrastructure. The Defense Department will have to virtualize close to 100,000 servers and integrate that construct with 10,000 communication links. The department will end up with a small number of enterprise-level pooled and centrally managed operations. This is a short-term multibillion-dollar effort that can be financed only from rapid savings, because no new funding will be available.

Cyber Committee Shares Expertise

November 15, 2012
By Maryann Lawlor

AFCEA’s Cyber Committee has published five white papers on topics ranging from cloud computing to supply chain management. Available on the committee’s website, the papers range from the basics to high-level recommendations that will be useful not only to organizations’ information technology personnel but also to leaders planning strategies for the future.
