cloud computing

Securing Critical Infrastructure Through Nontraditional Means

February 1, 2013
BY Rita Boland

A cloud project takes advantage of emerging concepts to protect energy against disruptive threats.

Researchers at Cornell University and Washington State University have teamed to create GridCloud, a software-based technology designed to reduce the time and difficulty involved with creating prototypes of smart-grid control paradigms. The system will help overcome hurdles of cloud computing in complex settings. The effort combines Cornell’s Isis2 platform, designed for high-assurance cloud computing, with Washington State’s GridStat technology for smart grid monitoring and control. The advent of this technology promises to boost both the security and the reliability of electrical services.

Developers aim to build a scalable software structure that is secure, self-healing and inexpensive to operate. They believe that by combining Isis2 and GridStat, a cloud-based grid can have all of those properties while also guaranteeing consistency. Infrastructure owners motivated by economies of scale and the desire to deploy new smart-grid solutions end up with a system that also is more resistant to attack and more likely to survive other disruptions.
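Neither the Isis2 nor the GridStat API is shown here; as a rough, hypothetical sketch of the consistency idea the researchers describe, the Python below merges readings gathered by redundant cloud replicas and keeps only values that a quorum of replicas agrees on, so one failed or lagging replica cannot corrupt the grid snapshot. The function, sensor names and quorum rule are invented for illustration.

```python
from collections import Counter

def consistent_snapshot(reports_by_replica, quorum):
    """Merge sensor readings gathered by redundant cloud replicas.

    reports_by_replica: one dict of sensor_id -> reading per replica.
    A reading is kept only if at least `quorum` replicas agree on it,
    so the snapshot stays consistent even when a replica fails or lags.
    """
    sensor_ids = set()
    for report in reports_by_replica:
        sensor_ids.update(report)

    merged = {}
    for sensor_id in sensor_ids:
        votes = Counter(report[sensor_id]
                        for report in reports_by_replica
                        if sensor_id in report)
        reading, count = votes.most_common(1)[0]
        if count >= quorum:
            merged[sensor_id] = reading
    return merged

# Three replicas watch the same (hypothetical) sensors; one is stale.
replicas = [
    {"bus_14_voltage": 1.02, "line_7_flow": 340},
    {"bus_14_voltage": 1.02, "line_7_flow": 340},
    {"bus_14_voltage": 0.97},  # failed or lagging replica
]
print(consistent_snapshot(replicas, quorum=2))
# both sensors keep the majority value; the stale 0.97 is outvoted
```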

Dr. Ken Birman, a professor at Cornell and co-principal investigator on the project, explains that several motivations drive the effort. One involves trying to find a solution to control a power grid when multiple organizations own and have access to the infrastructure. “A second challenge that’s emerged is that people have studied the power grid and found that we don’t operate it very efficiently,” Birman says. Power suppliers often are producing extra power, for example, or finding it difficult to take advantage of renewable sources. Sometimes renewable energy—such as the type that comes from solar panels on homes—is blocked from entering the power grid because officials lack the knowledge to access and use it safely.

Cloud Industry Group Issues Mobile Computing Guidelines

March 1, 2013
By Max Cacas

When it comes to popular smartphones and tablets, security can be a many-layered and necessary endeavor.

The growing use of advanced mobile devices, coupled with the increase in wireless broadband speed, is fueling demand by employees to bring their own devices to the job. This situation has opened a new set of security challenges for information technology staff, especially when it comes to the use of apps.

As the popularity and capability of mobile devices expands, standards are necessary to ensure that personal devices can function securely on enterprise networks. To address this need, the Cloud Security Alliance (CSA) organized its Mobile Working Group last year. The group recently released guidance to members on how enterprise administrators can successfully integrate smartphones and tablets into their work environment. The CSA is a not-for-profit organization of industry representatives focused on information assurance in the cloud computing industry.

U.S. Army Innovates on Cloud Computing Front

March 1, 2013
By George I. Seffers

Officials work to provide a new cloud approach across the service as well as the Defense Department.

U.S. Army officials estimate that by the end of the fiscal year, they will go into production on a new cloud computing solution that could potentially be made available across the Defense Department and could eventually be used to expand cloud capabilities on the battlefield. The platform-as-a-service product incorporates enhanced automation, less expensive software licensing and built-in information assurance.

During the past year, officials with the Army’s Communications-Electronics Command (CECOM) Software Engineering Center (SEC), Aberdeen Proving Ground, Maryland, have been working on a cloud computing approach known as Cloud.mil. A four-person team took about four months to deliver the first increment, which is now in the pre-production phase and is being touted to Army leaders, as well as to Defense Department and Defense Information Systems Agency (DISA) officials, as a possible Army-wide and Defense-wide solution.

U.S. Nuclear Agency Enhances Cybersecurity With Cloud Computing

March 1, 2013
By George I. Seffers

Officials aim to have a solution in place by year's end.

The U.S. agency responsible for the management and security of the nation’s nuclear weapons, nuclear nonproliferation and naval nuclear reactor programs is racing to put unclassified data on the cloud this year. Cloud computing is expected to provide a wide range of benefits, including greater cybersecurity, lower costs and networking at any time and from anywhere.

Officials at the National Nuclear Security Administration (NNSA), an agency within the Department of Energy, expect to have a cloud computing capability this year. The solution, known as Yourcloud, will provide the NNSA with its own cloud computing environment to manage data more securely, efficiently and effectively. It is part of an overall effort to modernize the agency’s information infrastructure. Yourcloud replaces an aging infrastructure that resulted in too many data centers and an inability to refresh equipment as often as necessary.

The Yourcloud infrastructure will be built and owned by industry, while the NNSA will control the data residing in the cloud. “We’ll be using a commercial data center space with a managed cloud provider as well as a managed security provider to offer us fee as a service back to our customer base,” says Travis Howerton, NNSA chief technology officer. “I don’t want to own my own infrastructure on the unclassified side, but I do want to own my own data. That’s why we’ve been pushing the innovation agenda around security, taking advantage of the lower-cost industry options while not compromising our security posture. What we really have to do is figure out how to insource security and outsource computing, to keep the keys of the kingdom inside, to protect the crown jewels, to make sure we own the security of our data, but then to take advantage of low-cost computing wherever it may be. We are evolving to that model.”
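Howerton does not describe Yourcloud’s internal mechanics, but the “insource security, outsource computing” pattern he outlines is commonly realized by encrypting data before it leaves the organization and keeping the keys in-house. The Python sketch below is a minimal, hypothetical illustration using the open-source cryptography package; the upload callable and bucket are stand-ins for a commercial provider, not part of Yourcloud.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The data key is generated and held in-house; it never goes offsite.
data_key = Fernet.generate_key()
cipher = Fernet(data_key)

def store_offsite(record: bytes, upload) -> bytes:
    """Encrypt a record locally, then hand only ciphertext to the
    provider's upload callable. Returns the ciphertext token."""
    token = cipher.encrypt(record)
    upload(token)  # the provider stores bytes it cannot read
    return token

def retrieve(token: bytes) -> bytes:
    """Decrypt a token fetched back from the provider, using the key
    that stayed inside the organization."""
    return cipher.decrypt(token)

# Stand-in for a commercial object store's put() call.
offsite_bucket = []
store_offsite(b"unclassified inventory report", offsite_bucket.append)
print(retrieve(offsite_bucket[0]))  # b'unclassified inventory report'
```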

Top Information Technology Officials Peer into the Future

February 28, 2013
By George I. Seffers

Top information technology officials from a variety of government agencies identified cloud computing, mobile devices and edge technologies as critical to accomplishing their missions in the future.

Luke McCormack, chief information officer, Justice Department, cited cloud-as-a-service as vital to the future. He urged industry to continue to push the barriers of stack computing, and he mentioned edge technology as an emerging technology. “Edge is going to be really critical to perform missions,” he said. He cited the Google Glass project as an indicator of what the future will bring.

Mobility could be the future of training and simulation, suggested Sandy Peavy, chief information officer for the Federal Law Enforcement Training Center within the Department of Homeland Security (DHS). She revealed that her office is putting together a pilot program introducing tablet computers into the training environment, and ideally, she would like trainees to be able to access simulation on the mobile device of their choice. Peavy also reported that the Bureau of Alcohol, Tobacco, Firearms and Explosives is providing special agents with iPhones and experimenting with other devices. “If I’m going to be able to provide just-in-time training, then mobile technology is the key.”

Richard Spires, chief information officer for the Department of Homeland Security, also cited mobility as a key future trend, and he brought up metadata tagging, saying that it helps in understanding the data itself and in establishing rules for who sees what information. Metadata tagging is especially important as the department grapples with privacy concerns.
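Spires does not detail DHS’s tagging scheme; as a hedged, hypothetical sketch of how metadata tags can drive “who sees what” rules, the short Python example below attaches tags to a record and checks a role’s permitted values before releasing it. The role names, tags and rule table are invented for illustration.

```python
# A record carries metadata tags; a simple rule table decides who may
# see it based on those tags. Names and values are hypothetical.
record = {
    "payload": "incident report",
    "tags": {"category": "pii", "sensitivity": "high"},
}

# Each role lists the tag values it is cleared to view.
access_rules = {
    "analyst":         {"sensitivity": {"low", "medium"}},
    "privacy_officer": {"sensitivity": {"low", "medium", "high"}},
}

def can_view(role, tags):
    """Grant access only if every tagged attribute the role's rules
    cover falls within that role's permitted values."""
    allowed = access_rules.get(role)
    if allowed is None:
        return False  # unknown roles see nothing
    return all(tags[attr] in permitted
               for attr, permitted in allowed.items() if attr in tags)

print(can_view("analyst", record["tags"]))          # False
print(can_view("privacy_officer", record["tags"]))  # True
```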

Customs and Border Protection Agency Eyes the Cloud

February 1, 2013
By George I. Seffers

The U.S. agency responsible for customs and border protection has suffered from an unreliable infrastructure and network downtime but already is seeing benefits from a fledgling move to cloud computing. Those benefits include greater reliability and efficiency and lower costs.

Customs and Border Protection’s (CBP’s) priorities include moving the agency to cloud computing and adopting greater use of mobile devices. The CBP Cloud Computing Environment (C3E) moves the agency away from a number of stovepipe platforms. “In the past, we’ve run about every kind of platform that’s out there. We are a large IBM mainframe legacy shop. We use a lot of AIX Unix and also Solaris Unix, so we’ve even got different flavors of Unix out there, and then obviously, big Windows farms,” reveals Charlie Armstrong, CBP chief information officer and assistant commissioner for the office of information and technology. “This new environment that we’re moving to collapses a lot of that down into a single environment and loses all of the mainframe, and it gets us out of building environments from scratch.”

Armstrong describes CBP as being in the early stages of its move to the cloud, but the agency already is seeing benefits, he says. He compares creating a computing environment to building cars. “Building an environment with yesterday’s approach was like going to the car dealership, buying all the parts and having to put the car together yourself. Now, what we’re trying to do is to buy a fully integrated product that allows us to stand up environments quicker and also improve performance,” he explains.

Researchers Organize to Share Data, Speed Innovation

February 1, 2013
By Max Cacas

To meet the challenge of implementing big data, a new international scientific organization is forming to facilitate the sharing of research data and speed the pace of innovation. The group, called the Research Data Alliance, will comprise some of the top computer experts from around the world, representing all scientific disciplines.

Managing the staggering and constantly growing amount of information that composes big data is essential to the future of innovation. The U.S. delegation to the alliance’s first plenary session, being held next month in Switzerland, is led by Francine Berman, a noted U.S. computer scientist, with backing from the National Science Foundation (NSF).

Meeting the challenges of how to harness big data is what makes organizing and starting the Research Data Alliance (RDA) so exciting, Berman says. “It has a very specific niche that is very complementary to a wide variety of activities. In the Research Data Alliance, what we’re aiming to do is create really tangible outcomes that drive data sharing, open access, research data sharing and exchange,” all of which, she adds, are vital to data-driven innovation in the academic, public and private sectors. The goal of the RDA is to build what she calls “coordinated pieces of infrastructure” that make it easier and more reasonable for people to share, exchange and discover data.

“It’s really hard to imagine forward innovation without getting a handle around the data issues,” emphasizes Berman, the U.S. leader of the RDA Council, who, along with colleagues from Australia and the European Union, is working to organize the alliance. Ross Wilkinson, executive director of the Australian National Data Service, and John Wood, secretary-general of the Association of Commonwealth Universities in London, are the other members of the council.

Big Data in Demand for Intelligence Community

January 4, 2013
By George I. Seffers

The National Security Agency is poised to deliver an initial cloud computing capability for the entire intelligence community that will significantly enhance cybersecurity and mission performance, and unleash the power of innovation for intelligence agencies, Lonny Anderson, NSA chief information officer, says.

U.S. Government Bets Big on Data

January 1, 2013
By George I. Seffers

A multi-agency big data initiative offers an array of national advantages.

U.S. government agencies will award a flurry of contracts in the coming months under the Big Data Research and Development Initiative, a massive undertaking involving multiple agencies and $200 million in commitments. The initiative is designed to unleash the power of the extensive amounts of data generated on a daily basis. The ultimate benefit, experts say, could transform scientific research, lead to the development of new commercial technologies, boost the economy and improve education, all of which would make the United States more competitive with other nations and enhance national security.

Big data is defined as datasets too large for typical database software tools to capture, store, manage and analyze. Experts estimate that in 2013, 3.6 zettabytes of data will be created, and the amount doubles every two years. A zettabyte is equal to 1 billion terabytes, and a terabyte is equal to 1 trillion bytes.
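To make the scale of those figures concrete, the short Python sketch below projects the cited 3.6-zettabyte figure forward at the stated doubling rate; the growth model is simply a restatement of the article’s numbers, not an independent estimate.

```python
# Back-of-the-envelope restatement of the figures above: 3.6 zettabytes
# created in 2013, with volume doubling every two years.
BYTES_PER_ZETTABYTE = 10 ** 21  # a billion terabytes of 10**12 bytes each

def projected_zettabytes(year, base_year=2013, base_zb=3.6):
    """Data created in `year`, assuming a doubling every two years."""
    return base_zb * 2 ** ((year - base_year) / 2)

for year in (2013, 2015, 2017, 2019):
    zb = projected_zettabytes(year)
    print(f"{year}: {zb:4.1f} ZB  (~{zb * BYTES_PER_ZETTABYTE:.1e} bytes)")
# 2013: 3.6 ZB, 2015: 7.2 ZB, 2017: 14.4 ZB, 2019: 28.8 ZB
```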

When the initiative was announced March 29, 2012, John Holdren, assistant to the president and director of the White House Office of Science and Technology Policy, compared it to government investments in information technology that led to advances in supercomputing and the creation of the Internet. The initiative promises to transform the ability to use big data for scientific discovery, environmental and biomedical research, education and national security, Holdren says.

Currently, much of the data being generated is available only to a select few. “Data are sitting out there in labs—in tens of thousands of labs across the country, and only the person who developed the database in that lab can actually access the data,” says Suzanne Iacono, deputy assistant director for the National Science Foundation (NSF) Directorate for Computer and Information Science and Engineering.

Reading, Writing and Big Data Basics

January 1, 2013
By Max Cacas

An industry-supported online school provides a good grounding in the science and application of very large datasets.

A virtual school, developed by a team of leading software and hardware companies, is providing readily accessible education in the use of large information datasets. The classes range from entry-level sessions on the essentials of big data for managers to practical instruction for veteran programmers who are accustomed to managing more traditional relational databases.

The mission of BigData University is to provide training and broaden the expertise of the big data community, explains Ben Connors, director of BigData University and worldwide head of alliances with Jaspersoft Incorporated, San Francisco. The uses of big data are expanding, whether for improving the health of children, facilitating the search for clean sources of energy or analyzing intelligence data from unmanned aerial vehicles. As a result, managers are realizing the potential that may be hidden within large information files whose size is measured by petabytes and exabytes, Connors explains.
