big data

June 28, 2017
By Maryann Lawlor
DIA Director Lt. Gen. Vincent Stewart, USMC, who was recently tapped to become U.S. Cyber Command’s deputy commander, says analyzing and distributing the growing amount of data the intelligence community collects is a constant challenge.

Senior intelligence officials identified the increasing amount of data and how to handle it as one of the largest challenges the intelligence community faces today. “We are collecting more data than we can effectively process,” said Defense Intelligence Agency (DIA) Director Lt. Gen. Vincent Stewart, USMC. “What we process, we struggle to make sense of, and what we understand, we can’t effectively disseminate across a global enterprise to ensure it helps drive critical decision making.”

June 29, 2017
By George I. Seffers
U.S. Army officials assessing cutting-edge cyber and electronic warfare capabilities during Cyber Quest 2017 tout the ingenuity of soldiers participating in the exercise.

U.S. Army officials expect that by this fall, they will have formal approval of a rapid prototyping process for acquiring cyber and electronic warfare prototypes assessed during the just-completed Cyber Quest 2017 exercise at Fort Gordon, Georgia.

Army officials describe Cyber Quest as an annual cyber and electronic warfare exploration and collaboration event hosted by the Cyber Center of Excellence. This is the second year for the event.

May 2, 2017

The U.S. Defense Department has initiated a market research effort to identify industry sources for a potential five-year, $325 million acquisition program for technical support services. The effort could lead to a contract award this fiscal year.

April 28, 2017

Forecasting data collected by team Good Judgment during the Intelligence Advanced Research Projects Activity's (IARPA’s) Aggregative Contingent Estimation (ACE) program is now available for use by the public and the research community via

April 20, 2017

Organizations today must deal with an avalanche of big data and the advanced computing requirements that so much data drives. To meet the accelerating speed and throughput demands they face, their information systems require faster networks and upgrades as well as improved security and monitoring tools.

April 17, 2017
By Sandra Jontz

The swarm of small satellites orbiting the Earth is creating a dilemma for government and industry alike: how to process the enormous amounts of data they send to the ground.

Collecting information isn’t the hard part, nor is transmitting it, experts say. What vexes intelligence analysts most is making sense of petabyte upon petabyte of data, and the government is looking to the commercial world for help.

February 13, 2017
By George I. Seffers
Sandia scientists Marlene and George Bachand show off their new method for encrypting and storing sensitive information in DNA. Digital data storage degrades and can become obsolete, and old-school books and paper require lots of space. (Photo by Lonnie Anderson)

Behind the Science is an occasional series of blogs focusing on the people advancing science and technology.

George and Marlene Bachand, a married couple working at Sandia National Laboratories, have partnered on more science projects than they can recall.

February 1, 2017
By George I. Seffers
Researchers have developed a technique for encoding text within synthetic DNA that they say would take an infinite number of random, brute-force attacks to break.

Scientists at Sandia National Laboratories are searching for partners to apply technology for encrypting text within synthetic DNA. The encryption is far stronger than conventional technology and practically impossible to break, researchers say.

In September, the Sandia team wrapped up a three-year effort titled Synthetic DNA for Highly Secure Information Storage and Transmission. The project developed a new way of storing and encrypting information using DNA. The work was funded through Sandia’s internal Laboratory Directed Research and Development program. 
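The Sandia scheme itself is not spelled out here, but the basic idea of storing encrypted text in DNA can be sketched: each of the four bases carries two bits, so every byte of ciphertext becomes four nucleotides. The base mapping and the toy XOR keystream below are illustrative assumptions, not the laboratory's actual method.

```python
# A minimal sketch of two-bits-per-base text storage in DNA, with a toy
# repeating-key XOR standing in for encryption. The mapping and cipher
# are illustrative assumptions, not Sandia's published scheme.
from itertools import cycle

BASES = "ACGT"                      # A=00, C=01, G=10, T=11
BITS = {b: i for i, b in enumerate(BASES)}

def encode(text: str, key: bytes) -> str:
    """XOR the text with a repeating key, then map each 2-bit pair to a base."""
    cipher = bytes(b ^ k for b, k in zip(text.encode(), cycle(key)))
    strand = []
    for byte in cipher:
        for shift in (6, 4, 2, 0):  # four bases per byte, high bits first
            strand.append(BASES[(byte >> shift) & 0b11])
    return "".join(strand)

def decode(strand: str, key: bytes) -> str:
    """Map each run of four bases back to one byte, then undo the XOR."""
    cipher = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BITS[base]
        cipher.append(byte)
    return bytes(b ^ k for b, k in zip(cipher, cycle(key))).decode()

if __name__ == "__main__":
    strand = encode("SIGNAL", b"secret")
    print(strand)                      # 24 bases for 6 characters
    print(decode(strand, b"secret"))   # SIGNAL
```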

February 2, 2017

Organizations constantly are seeking new ways to address workload-specific storage demands in terms of performance and capacity while also meeting service-level agreements, response-time objectives and recovery-point objectives.

Many information technology operations are inspired by successful hyperscale organizations such as Facebook, Google and Amazon. However, most enterprises lack the scale and substantial development and operations commitment necessary to deploy software-defined storage infrastructure in the same ways. Hyperscale economics also typically don’t work out at smaller scale, resulting in poor utilization or unacceptable reliability issues.
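One way to see the utilization penalty is a toy model: assume the software-defined storage layer keeps three replicas of every object and reserves one node as failure headroom. Both numbers are assumptions for illustration, but they show how fixed overhead consumes a far larger share of a small cluster.

```python
# A toy model of why hyperscale storage economics can break down at
# small scale: with three-way replication and one node held back as
# failure headroom, a small cluster gives up much more raw capacity.
def usable_fraction(nodes: int, replicas: int = 3, spare_nodes: int = 1) -> float:
    """Fraction of raw capacity usable after replication and spare headroom."""
    working = nodes - spare_nodes
    return (working / nodes) / replicas

for n in (4, 10, 100, 1000):
    print(f"{n:5d} nodes -> {usable_fraction(n):.1%} usable")
```

In this model a four-node cluster keeps only 25 percent of its raw capacity usable versus roughly a third at a thousand nodes, and cutting the replica count to recover capacity trades directly into the reliability issues the paragraph mentions.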

January 1, 2017
By Stephanie Domas and Dr. Nancy McMillan

Advances in genomics, medical sensors and data-driven health care increasingly are enabling doctors and patients to make personalized and targeted care decisions. But the effectiveness of these precision medicine capabilities depends on critical cybersecurity components to protect patient privacy and the integrity of patient data.  

December 1, 2016
By George I. Seffers
A U.S. Marine Corps corporal fires a GAU-17/A gun during a Valiant Shield exercise over Farallon de Medinilla, Northern Mariana Islands. Machine-learning software may help predict which warfighters will be best suited for specific missions.

Researchers are developing an open source machine-learning framework that allows a distributed network of computers to process vast amounts of data as efficiently and effectively as supercomputers and to better predict behaviors or relationships. The technology has a broad range of potential applications, including commercial, medical and military uses.

Anyone who needs to analyze a few trillion datasets can use a supercomputer or distribute the problem among processors on a large network. The former option is not widely available, and the latter can be complicated. 
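The article does not detail the framework's internals, but the underlying pattern is familiar: shard the data, compute partial results in parallel, and merge them. Below is a minimal sketch of that pattern using Python's standard multiprocessing module, with a mean-and-variance statistic standing in for a real learning workload; the dataset and chunking are placeholders.

```python
# A minimal sketch of distributing one large analysis across worker
# processes: shard the data, compute partials in parallel, merge them.
from multiprocessing import Pool

def partial_stats(chunk):
    """Compute partial sums for one shard of the data."""
    n = len(chunk)
    s = sum(chunk)
    ss = sum(x * x for x in chunk)
    return n, s, ss

def distributed_mean_variance(data, workers=4):
    """Split the data, process shards in parallel, then merge the partials."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partials = pool.map(partial_stats, chunks)
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    ss = sum(p[2] for p in partials)
    mean = s / n
    variance = ss / n - mean * mean
    return mean, variance

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(distributed_mean_variance(data))
```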

November 29, 2016

The Kill Chain Integration Branch at Hanscom Air Force Base has begun an experimentation campaign, known as Data-to-Decisions, to explore ways to deliver data to warfighters as quickly and efficiently as possible. The campaign is in its early stages but, according to officials, is already showing the potential for favorable results.

November 1, 2016
By Robert K. Ackerman

A cluster of macrotechnologies offers the potential for a new wave of innovation that revolutionizes all aspects of government, military and civilian life. Many of these technologies are familiar, and their effects are well-known. What may not be common knowledge is that the more these technologies advance, the more their synergies increase.

July 6, 2016
By Sandra Jontz

Cybersecurity today is less about stopping adversaries from breaching networks and more about damage control once they get in, an adjustment that has government and businesses embracing a new trend that merges security and big data.

This confluence gives rise to a growing practice called threat hunting, the act of aggressively going after cyber adversaries rather than waiting to learn they have breached security perimeters.

While the practice is growing in popularity, a recent survey of security experts notes that a significant portion of threat hunting is still performed ad hoc, negating the benefits of a repeatable process and wasting resources on unverified methods that provide minimal value.
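The difference between ad hoc and repeatable hunting is essentially the difference between a one-off query and a versioned, parameterized check that can be rerun and refined. A minimal sketch of that idea follows; the log fields and threshold are hypothetical.

```python
# A minimal sketch of codifying a hunt hypothesis as a repeatable,
# parameterized check. Field names and threshold are hypothetical.
from collections import Counter

def hunt_failed_logins(events, threshold=10):
    """Hypothesis: an account hammered by failed logins may be under attack.
    Returns accounts whose failure count meets the threshold."""
    failures = Counter(
        e["user"] for e in events if e.get("action") == "login_failed"
    )
    return {user: n for user, n in failures.items() if n >= threshold}

if __name__ == "__main__":
    sample = [{"user": "alice", "action": "login_failed"}] * 12 + [
        {"user": "bob", "action": "login_ok"}
    ]
    print(hunt_failed_logins(sample))  # {'alice': 12}
```

Codified this way, a hunt can be tested, tuned and shared rather than reinvented from scratch each shift.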

June 29, 2016
By Marcella Cavallaro

Big data is prevalent across the federal government, which is particularly drawn to the policy-shaping power of new data streams and better constituent information. Two years ago, President Barack Obama signed into law the Digital Accountability and Transparency Act (DATA Act), which prioritized making government data public in an effort to bring transparency to federal processes and harness the collective power of information. The law gives the public access to data about the nation’s populations, regions, services, economic opportunities and more that can be used by everyone—from local governments and citizens to private business and state agencies.

June 1, 2016
By Frank Christian Sprengel and Sebastian Leinhos

Despite plenty of indicators pointing to an impending refugee crisis in Europe, policy makers failed to see it coming. They neglected to set in motion judicious humanitarian response plans and bungled strategic decisions to prepare for the people in need, worsening this catastrophe. Government leaders lacked the foresight that might have avoided many of the issues the continent and its authorities now face. 

Inadequate preparation and a deficit of upfront collaboration have been toxic for all stakeholders. The crisis should elicit a fervent search for answers to this question: How can European decision makers act in a more appropriate, comprehensive and sustainable way?

May 12, 2016
By Sandra Jontz

The Defense Department's continued collaboration to streamline the whole of the military's information technology networks and systems, known as the Joint Information Environment, tops leaders' agendas and fiscal spending plans, though the effort now comes with a caveat for decision makers, officials said.

May 11, 2016
By George I. Seffers
LinQuest's cyber solution allows analysts to view data in 3-D.

LinQuest Corporation officials are now offering a new game-based technology that allows cyber analysts to view data in an immersive 3-D environment, which the company says enables quicker understanding of the data, saving users both time and money.

The 3-D Cyber Immersive Collaboration Environment (ICE) allows analysts to create a 3-D virtual world in which users are represented as avatars able to interact with big data analytics and real-time systems. The virtual world includes video feeds, data feeds, web interfaces, data visualizations and analytical tools. Once a crisis is over, the virtual world and its metadata can be archived in the cloud.

April 5, 2016
By Sandra Jontz

Government conversations related to safeguarding cyberspace spin around policy as much as technology, particularly when it comes to sluggish efforts to modernize networks.

Federal information assurance security policies and standard operating procedures (SOPs) were penned in the late 1990s and early 2000s, and today’s threats have long since rendered them obsolete, to say nothing of the challenges posed by the emerging Internet of Things (IoT).

March 7, 2016
By Al Krachman, Esq., and Brian Friel

The use of big data in source selection decisions for contract awards is growing. But big data also is shaping acquisition policy. One recent Defense Department report on the performance of the defense acquisition system began, “In God we trust; all others must bring data.”

Federal acquisition policy has been cobbled together by Congress and executive branch policy makers over the past few decades with little actual data to back up the policies they’ve written. Big data is coming to federal procurement, and it will change the way policy is made.
