• Credit: Panchenko Vladimir/Shutterstock

Open Source to Dominate Intelligence

February 12, 2021
By Robert K. Ackerman

Conventional collection methods largely will focus on the data this new material leaves out.


The tsunami of information that will hit with the full exploitation of 5G cellular will create a wealth of open source intelligence that will define the art in coming years. New sensor systems, artificial intelligence (AI) processing and expanded information delivery methods will produce new types of intelligence available in greater detail for a range of customers.

These points were discussed by a panel of experts during an AFCEA Intelligence webinar titled “Intelligence Analysis Tradecraft in an Open and Rapid World, Part 2.” Open source is likely to become more detailed and thus more valuable. And, its nature will change as new AI-based methods of processing it will bring it to the forefront of intelligence collection.

“Starting with open source is absolutely the way to go,” said David Gordon, senior advisor for geo-economics and strategy at the International Institute for Strategic Studies. “We are in this information overload situation, and all of the tools out there, and the big data sources and what you can do with them, allow you to create a mechanism through open source to really identify the gaps that we can’t fill on this.

“For 75 percent of the intelligence questions that our analysts are asked, the place to start will be in open source,” he continued. “Then you can target the collectors much more proficiently and much more particularly.”

Terry Roberts, founder, president and CEO of WhiteHawk Inc., described how the Office of Naval Intelligence used this approach when she was deputy director there. “We always used globally available public knowledge and body of work,” she related. “It was foundational to everything we did.

“So, I want us to flip everything on its ear and make publicly available data analytics and knowledge where we start,” she declared.

Eric Haseltine, former chief technology officer for national intelligence at the Office of the Director of National Intelligence, pointed out that the Internet and mobile communications already are allowing interpretations of statistical data. “We can move from inferential statistics to descriptive statistics where we can know pretty much what whole populations are going to do—or are doing,” he said.

Panel moderator Fran Moore, former director for analysis at the CIA, offered that a recent task force finding was that the commercialization of space, along with the integration of AI and data analytics, will allow the private sector to produce multisource intelligence that rivals traditional intelligence in terms of relevance. Roberts cited a confluence of high-end computing, machine learning (ML) and publicly available datasets that has allowed commercial companies to add their own tradecraft and generate vital information.

Haseltine noted that the key technologies are not in sensing. “The most important technologies are not the ones that help you find a needle in a haystack or find big trends,” he offered. “[It’s] those that make it easy for normal humans to find all that stuff and understand it. We have information overload in open source exponentially greater than the worst that we have in the worst part of the intelligence community. The volume, velocity and variety of stuff is just staggering.”

He continued that the real issue is to have humans who are skilled and smart without necessarily being information technology experts. “Those things that help humans find all that stuff and then understand it are going to be important,” he predicted.

Haseltine also endorsed the importance of AI and ML. Most ongoing efforts involve supervised learning, where a human tells a computer whether something is right or wrong. Instead, the future will be in unsupervised learning in which people learn the answers to questions they never thought to ask. The AI/ML system would see patterns in the data that it determines are not random, and it would call them to the attention of the operator. His own experience is that serendipitous findings are far more useful than those he asked for.
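The distinction Haseltine draws can be illustrated with a minimal sketch. Supervised learning needs a human to label examples as right or wrong; an unsupervised system instead decides on its own which observations look non-random and surfaces them to the operator. The toy function below (a hypothetical illustration, not any system discussed in the webinar) flags statistical outliers in unlabeled data using only the data's own distribution:

```python
import statistics

def flag_anomalies(values, threshold=1.5):
    """Surface observations that deviate sharply from the bulk of the data.

    No human labels anything as right or wrong; the routine itself
    decides which points look non-random and calls them out, a crude
    stand-in for the unsupervised approach described above.
    """
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > threshold * stdev]

# Mostly routine readings, plus two anomalies no analyst asked about.
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 42.0, 10.1, -15.0, 10.0]
print(flag_anomalies(readings))
```

Real systems use far richer pattern models than a deviation test, but the workflow is the same: the machine proposes the serendipitous findings, and the human decides which ones matter.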

But open source intelligence could open up a new vulnerability to adversaries. “If we’re not looking at what’s openly discoverable about our bases, our operations, our cyber architecture—if we’re not using commercial commodity capabilities that our adversaries absolutely are using … then we’re not seeing what they’re seeing of us,” Roberts maintained. She added that many of those commercial commodity capabilities are less expensive, accessible and easy to use, and thus attractive to foes. “That is a whole other aspect of open source that is falling on the cutting floor because we’re not doing it end to end.”

A recording of the “Intelligence Analysis Tradecraft in an Open and Rapid World, Part 2” webinar is available as part of the AFCEA Intelligence Committee Channel Webinar Series.



