Cloud Computing

Researchers Organize to Share Data, Speed Innovation

February 1, 2013
By Max Cacas

To meet the challenge of implementing big data, a new international scientific organization is forming to facilitate the sharing of research data and speed the pace of innovation. The group, called the Research Data Alliance, will comprise some of the top computer experts from around the world, representing all scientific disciplines.

Managing the staggering and constantly growing amount of information that composes big data is essential to the future of innovation. The U.S. delegation to the alliance’s first plenary session, being held next month in Switzerland, is led by Francine Berman, a noted U.S. computer scientist, with backing from the National Science Foundation (NSF).

Meeting the challenges of how to harness big data is what makes organizing and starting the Research Data Alliance (RDA) so exciting, Berman says. “It has a very specific niche that is very complementary to a wide variety of activities. In the Research Data Alliance, what we’re aiming to do is create really tangible outcomes that drive data sharing, open access, research data sharing and exchange,” all of which, she adds, are vital to data-driven innovation in the academic, public and private sectors. The goal of the RDA is to build what she calls “coordinated pieces of infrastructure” that make it easier and more reasonable for people to share, exchange and discover data.

“It’s really hard to imagine forward innovation without getting a handle around the data issues,” emphasizes Berman, the U.S. leader of the RDA Council, who, along with colleagues from Australia and the European Union, is working to organize the alliance. Ross Wilkinson, executive director of the Australian National Data Service, and John Wood, secretary-general of the Association of Commonwealth Universities in London, are the other members of the council.

Big Data in Demand for Intelligence Community

January 4, 2013
By George I. Seffers

The National Security Agency is poised to deliver an initial cloud computing capability for the entire intelligence community that will significantly enhance cybersecurity and mission performance, and unleash the power of innovation for intelligence agencies, Lonny Anderson, NSA chief information officer, says.

U.S. Government Bets Big on Data

January 1, 2013
By George I. Seffers

A multi-agency big data initiative offers an array of national advantages.

U.S. government agencies will award a flurry of contracts in the coming months under the Big Data Research and Development Initiative, a massive undertaking involving multiple agencies and $200 million in commitments. The initiative is designed to unleash the power of the extensive amounts of data generated on a daily basis. Ultimately, experts say, it could transform scientific research, lead to the development of new commercial technologies, boost the economy and improve education, all of which would make the United States more competitive with other nations and enhance national security.

Big data is defined as datasets too large for typical database software tools to capture, store, manage and analyze. Experts estimate that in 2013, 3.6 zettabytes of data will be created, and the amount doubles every two years. A zettabyte is equal to 1 billion terabytes, and a terabyte is equal to 1 trillion bytes.
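As a quick check on those figures, the short Python sketch below (an illustration only; the projection function simply extrapolates the estimate above and is not part of the initiative) confirms the unit arithmetic and carries the two-year doubling curve forward:

```python
# Illustrative check of the unit arithmetic and growth estimate above.
TERABYTE = 10**12   # 1 trillion bytes
ZETTABYTE = 10**21  # 1 billion terabytes
assert ZETTABYTE // TERABYTE == 10**9  # a zettabyte is a billion terabytes

def data_created_zb(year, base_year=2013, base_zb=3.6):
    """Zettabytes created per year, assuming the amount doubles every two years."""
    return base_zb * 2 ** ((year - base_year) / 2)

for year in (2013, 2015, 2019):
    print(year, f"{data_created_zb(year):.1f} ZB")  # 3.6, 7.2, 28.8
```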

When the initiative was announced March 29, 2012, John Holdren, assistant to the president and director of the White House Office of Science and Technology Policy, compared it to government investments in information technology that led to advances in supercomputing and the creation of the Internet. The initiative promises to transform the ability to use big data for scientific discovery, environmental and biomedical research, education and national security, Holdren says.

Currently, much of the data generated is available only to a select few. “Data are sitting out there in labs—in tens of thousands of labs across the country, and only the person who developed the database in that lab can actually access the data,” says Suzanne Iacono, deputy assistant director for the National Science Foundation (NSF) Directorate for Computer and Information Science and Engineering.

Reading, Writing and Big Data Basics

January 1, 2013
By Max Cacas

An industry-supported online school provides a good grounding in the science and application of very large datasets.

A virtual school, developed by a team of leading software and hardware companies, is providing readily accessible education in the use of large information datasets. The classes range from entry-level sessions on the essentials of big data for managers to practical instruction for veteran programmers who are accustomed to managing more traditional relational databases.

The mission of BigData University is to provide training and broaden the expertise of the big data community, explains Ben Connors, director of BigData University and worldwide head of alliances with Jaspersoft Incorporated, San Francisco. The uses of big data are expanding, whether for improving the health of children, facilitating the search for clean sources of energy or analyzing intelligence data from unmanned aerial vehicles. As a result, managers are realizing the potential that may be hidden within large information files whose size is measured by petabytes and exabytes, Connors explains.

Implementing the Defense Department Cloud Computing Strategy Poses New Challenges

December 1, 2012
By Paul A. Strassmann

A few staff experts can formulate new strategies in a short time. Over the years, the U.S. Defense Department has accumulated a large collection of long-range planning documents. However, none of those plans was ever fully implemented, as new administrations kept changing priorities.

The just-announced Defense Department Cloud Computing Strategy presents a long list of radically new directions. Ultimately, it will take hundreds of thousands of person-years to accomplish what has just been outlined. Several points stand out.

In one, individual programs would not design and operate their own infrastructures to deliver computer services; users would develop only applications. This approach will require tearing apart more than 3,000 existing programs. A pooled environment will be supported by cloud computing that depends on different processing, storage and communications technologies. Small application code modules then can be managed separately, relying exclusively on standard interfaces, as the sketch below illustrates. The challenge will be managing more than 15 years’ worth of legacy software, valued at about half a trillion dollars, in completely different configurations. Making such changes will require huge reductions in the cost of an infrastructure that currently runs $19 billion per year.
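The division of labor described here, applications written only against standard interfaces with pooled infrastructure behind them, can be sketched in a few lines of Python. The names below are hypothetical illustrations; nothing here comes from the strategy document itself:

```python
from abc import ABC, abstractmethod

class StorageService(ABC):
    """A standard interface; the only thing an application depends on."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class PooledCloudStorage(StorageService):
    """One interchangeable back end from the pooled environment."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

def application(storage: StorageService) -> bytes:
    # The application owns no infrastructure; it codes only to the
    # interface, so the back end can change without touching this code.
    storage.put("report", b"mission data")
    return storage.get("report")

print(application(PooledCloudStorage()))
```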

Another point is that cloud computing will reduce the costs of the existing computing infrastructure. The Defense Department will have to virtualize close to 100,000 servers and integrate that construct with 10,000 communication links. The department will end up with a small number of enterprise-level pooled and centrally managed operations. This is a short-term multibillion-dollar effort that can be financed only from rapid savings, because no new funding will be available.

Cyber Committee Shares Expertise

November 15, 2012
By Maryann Lawlor

AFCEA’s Cyber Committee has published five white papers on topics ranging from cloud computing to supply chain management. Available on the committee’s website, the papers range from the basics to high-level recommendations that will be useful not only to organizations’ information technology personnel but also to leaders planning strategies for the future.

Defense Board Computing Recommendations Lack Strength

November 1, 2012
By Paul A. Strassmann

The Defense Business Board is the highest-level committee advising the U.S. Secretary of Defense. Its report on “Data Center Consolidation and Cloud Computing” offers advice on what directions the Defense Department should follow.

However, the Defense Business Board (DBB) report is incomplete. It does not offer actionable solutions; it only raises policy-level questions. As components are formulating budget requests through fiscal year 2018, they will find nothing in this report to guide them on what type of realignments are needed to advance the Defense Department toward cloud computing.

The department’s fiscal year 2012 budget for information technology is reported to be $38.5 billion, $24 billion of which is dedicated to infrastructure. Those numbers are incomplete because they do not include the payroll of 90,000 military and civilian employees, which amounts to more than $10 billion. Nor do they include the administrative, support, training and idle time associated with more than 3 million online users, which comes to at least $3,000 per capita per year, or $9 billion. In terms of potential cost reduction targets, the total direct information technology budget should therefore be at least $50 billion. In addition, there are collateral management costs, such as excessive purchasing due to long procurement cycles, high user support costs to maintain separate systems and high labor costs resulting from inefficient staff deployment.
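Strassmann’s arithmetic can be reproduced directly; the sketch below simply tallies the components quoted above (a back-of-the-envelope illustration, not an official accounting):

```python
# Tally of the cost components cited above (figures from the article).
reported_it_budget = 38.5e9        # fiscal year 2012 information technology budget
payroll = 10.0e9                   # 90,000 employees, "more than $10 billion"
user_overhead = 3_000_000 * 3_000  # 3 million users x $3,000 per capita = $9 billion

total_direct = reported_it_budget + payroll + user_overhead
print(f"total direct IT budget: ${total_direct / 1e9:.1f} billion")
# Prints $57.5 billion, consistent with the article's "at least $50 billion" floor.
```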

Managing Change in the Intelligence Community

October 1, 2012
By Max Cacas

A new computing architecture emphasizes shared resources.

The nation’s intelligence community has embarked on a path toward a common computer desktop and a cloud computing environment designed to facilitate both timely sharing of information and cost savings. The implementation could result in budget savings of 20 to 25 percent over existing information technology spending within six years, but the ramifications could include large cultural changes, including the loss of both jobs and business for industry partners.

Al Tarasiuk, chief information officer for the Office of the Director of National Intelligence (ODNI), explains that the changes will be difficult. Agency employees, and the vendors who help operate and manage information technology for the 17 agencies composing the nation’s intelligence apparatus, will feel the effects of the cost cuts.

“Right now, technology is not our biggest risk. The culture change is our biggest risk, and that extends to our industry partners. We have a lot of industry employed in the community through service contracts and other things. They could help, or they could choose not to help,” Tarasiuk emphasizes, candidly describing the pivotal role of these firms in a transition that could spell the loss of both business and jobs. “They know, and I’ve been very open with them, that we’re not going to need the pool of resources of people that we have today to manage what we have in the future.”

Defense Department Unveils Cloud Computing Strategy

July 12, 2012

On Wednesday, the Defense Department (DOD) issued its long-awaited cloud computing strategy. Officials also announced in a memo from Teri Takai, chief information officer for the DOD, that the Defense Information Systems Agency (DISA) will oversee the new strategy as "enterprise cloud service broker."

DARPA Modifies Cloud Computing Contract

July 11, 2012
By George Seffers

Terremark Federal Group Incorporated, Miami, Florida, is being awarded a $9,116,831 modification to a firm-fixed-price contract to provide cloud-based computing, infrastructure, data, and analytical support under the General Services Administration Special Item Number (SIN) 132-51 and SIN 132-52. The Defense Advanced Research Projects Agency is the contracting activity.
