Behavioral analytic tools could open new horizons for cybersecurity, letting experts better prioritize alerts and collect actionable intelligence for more rapid responses to breaches. Or might they open new doors for hackers?
Many information technology organizations are taking a different approach to cybersecurity that radically reduces the time to detect and respond to attempted cyber attacks.
Article updated December 3, 2014.
With a number of uncertainties coloring their activities, officials at the U.S. Army’s Communications-Electronics Research, Development and Engineering Center are preparing their program objective memorandum, laying out several key projects and goals for the coming years. The leaders are calibrating efforts to align with expected congressional funding as well as with the capabilities soldiers require for mission success.
The U.S. Air Force is using big data analysis tools to create a picture of a battlefield or area of interest that can be monitored in real time as well as stored and replayed. By merging sensor streams with data tagging and trend detection software, this capability will allow analysts and warfighters to observe, track and potentially predict enemy force operations based on their observed behavior.
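The merging of tagged sensor streams with trend detection can be illustrated with a minimal Python sketch. The stream format, source tags, and moving-average trend rule below are invented for illustration only; they are not the Air Force's actual system.

```python
from statistics import mean

def merge_streams(*streams):
    """Merge time-tagged sensor readings into one chronological list.

    Each reading is a (timestamp, source_tag, value) tuple, so data
    from different sensors can be fused into a single timeline.
    """
    return sorted((r for s in streams for r in s), key=lambda r: r[0])

def detect_trend(readings, window=3, threshold=1.0):
    """Flag timestamps where the moving average of the fused values
    rises by more than `threshold` from one window to the next."""
    values = [v for _, _, v in readings]
    flags = []
    for i in range(window, len(values) + 1):
        if i > window:
            curr = values[i - window:i]
            prev = values[i - window - 1:i - 1]
            if mean(curr) - mean(prev) > threshold:
                flags.append(readings[i - 1][0])
    return flags

# Hypothetical readings from two tagged sensor sources
radar = [(0, "radar", 10.0), (2, "radar", 12.0), (4, "radar", 15.0)]
acoustic = [(1, "acoustic", 11.0), (3, "acoustic", 13.0)]

merged = merge_streams(radar, acoustic)
print(detect_trend(merged))  # timestamps where an upward trend was flagged
```

Real systems would of course operate on streaming data at far larger scale; the sketch only shows the fuse-then-analyze pattern the article describes.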
The emergence of big data combined with the revolution in sensor technology is having a synergistic effect that promises a boom in both realms. The ability to fuse sensor data is spurring the growth of large databases that amass more information than previously envisioned. Similarly, the growth of big data capabilities is spawning new sensor technologies and applications that will feed databases ever-increasing amounts and more diverse types of information.
Defense and technology analysts rarely agree, but for 2014 they concur that big data and cybersecurity are the two drivers of planning and investment in information technology, both in government and in industry. Nearly everything else will serve to enable these two key capabilities. While much attention has focused on the threats and the work being done globally on cybersecurity, I want to focus on big data.
Big data is critical because, unless it is collected, analyzed, managed and made ubiquitously available, many analysts and decision makers will be buried in information they cannot use effectively in a timely fashion. It also is the starting and ending point for many of the technologies and capabilities we care about: networks, data centers, cloud initiatives, storage, search, analytics and secure access.
The increasing presence of news sources on the Internet offers an unprecedented opportunity to access open-source intelligence for a variety of purposes. Researchers from several U.S. universities have collaborated to take advantage of these resources, creating a big data collection and distribution process applicable to disciplines ranging from social research to national security.
Big data increasingly is viewed as the future of knowledge management, aided and abetted by the cloud. And it would seem to be a perfect fit in the field of intelligence. But two longtime experts in intelligence take opposing views on the utility of big data for intelligence.
What do modern intelligence agencies run on? They are internal combustion engines burning pipelines of data, and the more fuel they burn, the better their mileage. Analysts and decision makers are the drivers of these vast engines, and to keep them from hoofing it, we need big data.
Another Overhyped Fad
By Mark M. Lowenthal
Director of National Intelligence Lt. Gen. James R. Clapper, USAF (Ret.), once observed that one of the peculiar behaviors of the intelligence community is to erect totem poles to the latest fad, dance around them until exhaustion sets in, and then congratulate oneself on a job well done.
In considering how best to manage the challenges and opportunities presented by big data in the U.S. Defense Department, Dan Doney, chief innovation officer with the Defense Intelligence Agency (DIA), says the current best thinking on the topic centers on what he calls "the five Vs."
Current efforts to deal with big data, the massive amounts of information resulting from an ever-expanding number of networked computers, storage and sensors, go hand-in-hand with the government’s priority to sift through these huge datasets for important data. So says Simon Szykman, chief information officer (CIO) with the U.S. Department of Commerce.
Effectively dealing with datasets measured in terabytes and petabytes sometimes takes an ecosystem. And at times, that ecosystem depends on metadata, descriptive data about a dataset that allows it to be analyzed quickly.
That’s according to Todd Myers, a big data specialist with the National Geospatial-Intelligence Agency (NGA), who spoke at the AFCEA SOLUTIONS Series - George Mason University Symposium, "Critical Issues in C4I," on Tuesday.
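The value of metadata here is that an analyst can decide whether a multi-terabyte dataset is even worth loading by querying a small descriptor instead of the data itself. A minimal Python sketch of that idea follows; the dataset names, regions, and fields are invented for illustration and do not reflect NGA holdings.

```python
# Hypothetical metadata records: small descriptors standing in for
# datasets far too large to scan directly.
datasets = [
    {"name": "imagery_2014_q3", "rows": 4_200_000, "region": "EUCOM",
     "fields": ["lat", "lon", "timestamp", "sensor_id"]},
    {"name": "sigint_2014_q3", "rows": 9_800_000, "region": "PACOM",
     "fields": ["freq", "timestamp", "bearing"]},
]

def find_datasets(metadata, region=None, field=None):
    """Select candidate datasets by querying metadata only,
    never touching the underlying terabytes."""
    hits = []
    for m in metadata:
        if region and m["region"] != region:
            continue
        if field and field not in m["fields"]:
            continue
        hits.append(m["name"])
    return hits

print(find_datasets(datasets, region="EUCOM"))
print(find_datasets(datasets, field="timestamp"))
```

The same pattern underlies real catalog systems, where metadata queries narrow petabyte-scale holdings to the few datasets worth pulling into an analytic pipeline.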