
Multiple Thrusts Define Geospatial Agency Big Data Efforts

This burgeoning capability may alter the foundational approach to intelligence.

Technology innovations, new roles and expanding missions are shaping the move toward big data in the National Geospatial-Intelligence Agency. A blend of new tradecraft and new technology is emerging as the agency evolves from an organization that always has worked with voluminous imagery files to one in which big data represents a goal that promises to change many aspects of intelligence.

David Bottom is the director of the information technology services directorate at the National Geospatial-Intelligence Agency (NGA). He explains that, with its imagery library, the NGA has been generating and using large data files for some time. Imagery resolution, file complexity and the number of files continue to increase. Bottom allows that the agency must transition from dealing in large data files to incorporating the concept of big data. “There is a lot of information in those large data files that you could consider to be big data,” he offers. “So how do we actually transition the agency—not just to being a large data file provider, but to that big data environment where there is a lot going on in those image files?”

Big data is not fundamentally changing the NGA’s mission, Bottom states. The capability does allow the agency to function as a foundation for integrated intelligence. It also provides increased capabilities in terms of being able to deliver a better product more quickly. “If those data points—and their relationships—are portrayed in time and space in a way that enables the user to quickly make sense of something, that is the power,” he declares.

The NGA has been doing this for some time, but on paper, he continues. Mobile connectivity has spurred users to want to consume information digitally, and even to interact with data in real time rather than with static products, so the agency will “continue to tune” its online delivery without changing its underlying mission.

Ultimately, NGA big data would allow the agency to move past persistent intelligence into a realm in which analysts and decision makers “live within the data,” Bottom maintains. “This means they are not interacting with products,” he explains. “They are able to work with the data in real time. The data is updated in real time, and the speed of understanding is going to be updated in real time as well.

“The goal of all of this is to give our decision makers more time to make decisions,” he summarizes. “The more we can be anticipatory as an intelligence community, the more time decision makers have to make decisions—whether at the policy-maker level or warfighters or first responders.”

As the NGA moves toward big data, it is focusing its investments in several areas. One, Bottom says, is a tradecraft known as structured observation management. In geospatial intelligence, earth observation involves looking for changes on the ground below. Each change represents an observation, Bottom notes, with each observation representing an artificial or natural object. The agency is taking a much more deliberate approach to recording those observations instead of writing imagery reports, and this leads to more “sense making” that strives to explain why events are taking place.
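To make the idea concrete, here is a minimal sketch in Python of what a structured observation record might look like. The field names, types and values are assumptions for illustration; the article does not describe the NGA's actual schema. The point is that each detected change becomes discrete, queryable fields rather than prose in an imagery report.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical structured observation record (illustrative, not the
# NGA's schema): one detected change, captured as structured fields.
@dataclass
class Observation:
    object_type: str        # e.g. "vehicle", "structure", "flooding"
    is_manmade: bool        # artificial vs. natural object
    latitude: float         # position of the observation
    longitude: float
    observed_at: datetime   # when the change was seen
    source: str             # sensor or collection discipline
    confidence: float = 0.5 # analyst or algorithm confidence, 0..1
    notes: str = ""         # optional free-text amplification

obs = Observation(
    object_type="structure",
    is_manmade=True,
    latitude=38.85,
    longitude=-77.05,
    observed_at=datetime(2015, 6, 1, 14, 30, tzinfo=timezone.utc),
    source="electro-optical",
    confidence=0.8,
)
```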

One way the agency is addressing this goal is by developing analytics, or algorithms, that examine data first and then cue the analyst to the elements he or she might want to examine immediately. One of the NGA’s key capabilities is the ability to “tease out” bits of data in its large imagery files, Bottom maintains. This applies whether the data is earth observation, radar, hyperspectral or LIDAR information, he says.
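As one hedged illustration of such cueing, the sketch below scores incoming observations, reusing the hypothetical Observation record above, and ranks them so the analyst sees the likeliest items of interest first. The weights and watch list are invented for the example, not the agency's algorithms.

```python
# Illustrative cueing pass, not the NGA's actual analytics: score each
# observation, then surface the highest-priority ones to the analyst.
WATCH_TYPES = frozenset({"vehicle", "structure"})  # assumed watch list

def cue_score(obs: Observation) -> float:
    score = obs.confidence
    if obs.object_type in WATCH_TYPES:
        score += 1.0    # boost object types the analyst is watching for
    if obs.is_manmade:
        score += 0.25   # artificial objects often merit an earlier look
    return score

def cue_queue(observations: list) -> list:
    # Highest-scoring observations first, so analyst time goes to the
    # data most likely to matter.
    return sorted(observations, key=cue_score, reverse=True)
```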

Structured observation management, activities-based intelligence and other NGA initiatives such as next-generation collection are tied together by the concept of analytic models. These constitute a hypothesis of what analysts expect to see, and while the concept is not new, what is innovative is that these models would be machine-readable, Bottom says. This will require coding them in a way that permits automating the examination of large data sets to extract what is important, he explains.
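A hedged sketch of what “machine-readable” could mean in practice follows; the model structure and threshold are assumptions for illustration. The hypothesis is expressed as data, so software can test it against each new batch of observations and flag outcomes the model did not expect, which is exactly the refinement trigger Bottom describes next.

```python
# Hypothetical machine-readable analytic model: the analyst's hypothesis
# is data, so software can evaluate it automatically as new observations
# arrive.
model = {
    "name": "port-activity-baseline",
    "expects": {"object_type": "vehicle", "is_manmade": True},
    "max_per_day": 40,  # assumed upper bound the hypothesis predicts
}

def evaluate(model: dict, observations: list) -> dict:
    matches = [
        o for o in observations
        if o.object_type == model["expects"]["object_type"]
        and o.is_manmade == model["expects"]["is_manmade"]
    ]
    # An unexpected result signals that the model itself may need refining.
    return {"matches": len(matches),
            "unexpected": len(matches) > model["max_per_day"]}
```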

“This is a continuous exercise,” Bottom points out. “So, as things unfold, we are going to continue to tweak those analytic models because there may be an outcome we didn’t expect. We may learn something that may cause us to go back and change or refine a model or come up with new models. That will be the core in terms of our automation—to actually figure out what is important.

“Of course, we always are going to continue to automate the things that people are doing,” he continues. “The trick for us is to make sure that, as we move from people in the loop, it helps us become more efficient.”

Bottom says the structured observation management tradecraft that the NGA is working to put in place has generated outcomes that have been “hugely impactful from a mission perspective.” The adoption of the tradecraft itself is a huge success story, he adds. With this tradecraft, along with related disciplines and tools, particularly those developed for activities-based intelligence, the NGA has been able to play a foundational role in enabling the high-level directive for intelligence integration. Bottom offers that these efforts constitute the fundamental underpinnings of the integrated intelligence environment.

The NGA still has a way to go toward achieving its goals in big data, Bottom admits, offering that the agency is only about 20 percent along the path to big data effectiveness. One of the key issues is figuring out exactly what the tradecraft will be, he allows. The agency also must determine which investments to make in tools and other capabilities, including human capital.

These developments occur on different time horizons, Bottom states, so success will not come at once. “We have a lot of work to do in each of those areas,” he warrants, adding that the agency is at least 12 to 16 months away from where he would want it to be. Now that the NGA has its next-level strategy in place, its progress toward big data should accelerate, he offers.

Skill sets remain the biggest challenge facing the NGA’s big data efforts, Bottom declares. “[We need to] make sure we have the right folks in the right places at the right time to move us forward,” he says. His directorate is well-known for its technical and programmatic expertise, but those capabilities alone will not be enough. The skill sets that the agency needs are in areas such as data science, visualization and search capabilities.

Data science is the core enabler of big data. Visualization is essential because, while the agency always has been involved with visual information, the increasing amount of data added to imagery products serves the user best if presented visually. For search, Bottom points out that anyone can find information by Googling it online; the challenge is to make sense of those results, and achieving that will require new skill sets.

Another challenge is not particular to the NGA: limited resources. “Any investment we make not only has to deliver mission value, it also has to make us less expensive,” Bottom says. The need to increase capabilities with the same or fewer resources has added a sense of urgency to the agency’s work.

Meeting both challenges will require trade-offs, Bottom admits. The agency’s overall strategy outlines the areas where it will take risks and where it will invest in the future. Some training is a long-term investment, he points out.

Mobile technologies have posed a new challenge to the NGA. “From a mobile standpoint, it’s not really scalable,” Bottom says, adding that the Intelligence Community Information Technology Enterprise (ICITE) addresses the demand for mobile technology through its cloud infrastructure. ICITE allows the agency to place one copy of data in one environment, which permits bringing analysts and their applications to the data—a fundamental departure from the past. This will provide the architectural underpinnings that will enable wider use of mobile applications.
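The contrast Bottom draws can be pictured with a toy sketch; the function names are invented, not ICITE interfaces. In the old pattern every site pulls its own copy of the imagery; in the ICITE-style pattern one authoritative copy stays put and only the query, and its small result, travels.

```python
# Toy contrast of the two data architectures; names are illustrative only.

def copy_data_to_sites(tile_id: str, sites: list) -> dict:
    # Old pattern: N sites end up holding N copies of the same imagery,
    # each of which can drift out of date and must be reconciled later.
    return {site: f"copy of {tile_id}" for site in sites}

def bring_app_to_data(store: list, query) -> list:
    # ICITE-style pattern: the application (here, a filter function) runs
    # against the single authoritative copy; the bulk data never moves.
    return [obs for obs in store if query(obs)]
```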

“When you take a look at what NGA’s business is, it is to move imagery around,” Bottom points out. “We spend a lot of resources moving data to users in such a way that they can exploit it with the human eye. The downside to that is you end up with a lot of different copies of data in a lot of different locations; and it becomes terribly difficult to integrate because you have different copies of data—and from a timing perspective, it may not be the most current.”

From industry, the NGA would like help in dealing with two fundamental speed challenges. Bottom describes the first as speed of information integration. The agency needs to integrate information from various sources quickly, and it must perform this integration in time and space. “Quickly is going to be different every day because every day you must be faster,” he offers.
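A rough sketch of integration “in time and space,” again reusing the hypothetical Observation record and with thresholds that are pure assumptions, is to correlate observations from different sources that fall close together in both dimensions:

```python
from datetime import timedelta

def correlated(a: Observation, b: Observation,
               max_km: float = 1.0,
               window: timedelta = timedelta(minutes=30)) -> bool:
    # Crude planar distance (about 111 km per degree) suffices for a sketch.
    dist_km = 111.0 * ((a.latitude - b.latitude) ** 2 +
                       (a.longitude - b.longitude) ** 2) ** 0.5
    return (a.source != b.source                    # different sources
            and dist_km <= max_km                   # close in space
            and abs(a.observed_at - b.observed_at) <= window)  # close in time

def integrate(observations: list) -> list:
    # Cross-source pairs that plausibly describe the same event on the ground.
    return [(a, b) for i, a in enumerate(observations)
            for b in observations[i + 1:] if correlated(a, b)]
```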

The second challenge is to prevent overwhelming the user with information. Bottom is seeking capabilities that help the user understand more quickly. “Speed of understanding is the other speed issue. How do we convey information in such a way that it is easily understandable?” he asks.

“We’re there to enable the user,” Bottom continues. “Approaching things from the user’s perspective … it would be good to get back to basics in terms of human factors engineering—how humans will interact with capabilities, and what we want that user experience to be, is something that is going to continue to be important.”
