Intelligence

A Longtime Tool of the Community

October 1, 2013
By Lewis Shepherd

What do modern intelligence agencies run on? They are internal combustion engines burning pipelines of data, and the more fuel they burn the better their mileage. Analysts and decision makers are the drivers of these vast engines; but to keep them from hoofing it, we need big data.
 
The intelligence community necessarily has been a pioneer in big data since inception, as both were conceived during the decade after World War II. The intelligence community and big data science always have been intertwined because of their shared goal: producing and refining information describing the world around us, for important and utilitarian purposes.

Let’s stipulate that today’s big-data mantra is overhyped. Too many technology vendors are busily rebranding storage or analytics as “big data systems” under the gun from their marketing departments. That caricature rightly is derided by both information technology cognoscenti and non-techie analysts.

I personally understand the disdain for machines, as I had the archetypal humanities background and was once a leather-elbow-patched tweed-jacketed Kremlinologist, reading newspapers and human intelligence (HUMINT) for my data. I stared into space a lot, pondering the Chernenko-Gorbachev transition. Yet as Silicon Valley’s information revolution transformed modern business, media, and social behavior across the globe, I learned to keep up—and so has the intelligence community.

Twitter may be new, but the intelligence community is no Johnny-come-lately in big data. U.S. government funding of computing research in the 1940s and 1950s stretched from World War II’s radar/countermeasures battles to the elemental electronic intelligence (ELINT) and signals intelligence (SIGINT) research at Stanford and MIT, leading to the U-2 and OXCART (ELINT/image intelligence platforms) and the Sunnyvale roots of the National Reconnaissance Office.

Another Overhyped Fad

October 1, 2013
By Mark M. Lowenthal

Director of National Intelligence Lt. Gen. James R. Clapper, USAF (Ret.), once observed that one of the peculiar behaviors of the intelligence community is to erect totem poles to the latest fad, dance around them until exhaustion sets in, and then congratulate oneself on a job well done.
 
One of our more recent totem poles is big data. Big data is a byproduct of the wired world we now inhabit. The ability to amass and manipulate large amounts of data on computers offers, to some, tantalizing possibilities for analysis and forecasting that did not exist before. A great deal of discussion has taken place about big data, which in essence means the possibility of gaining new insights and connections from the reams of new data created every day.

Or does it?

Some interesting assumptions about big data need to be probed before we dance some more around this totem pole. A major problem is the counting rules. Eric Schmidt, the chairman of Google, has said, “We create as much information in two days now as we did from the dawn of man through 2003.” He quantifies this as five exabytes of data (5 x 10^18 bytes). Schmidt admits that this count includes user-generated content such as photos and tweets. All of this may be generated; but is it information, and more importantly, is it intelligence?

This data clearly is information—to someone—but very little of it would qualify as intelligence. It does qualify as a very large haystack in which there are likely to be very few needles that will be of use to anyone engaged in intelligence. To cite a more relevant example, the National Security Agency (NSA) programs lately in the news went through millions of telephone metadata records, which led to 300 further inquiries. The argument can be made that without the NSA metadata program, these leads might not have existed at all; but a means-and-ends argument remains over the larger big data claims.
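
To put Schmidt's figure and the NSA example side by side, here is a minimal back-of-the-envelope sketch in Python. It assumes one exabyte equals 10^18 bytes and takes "millions of records" at an illustrative low end of one million, since the article gives no exact count:

    # Back-of-the-envelope arithmetic for the figures quoted above.
    # Assumptions (not from the article): 1 exabyte = 10**18 bytes, and
    # "millions of records" is taken at an illustrative low end of 1,000,000.

    EXABYTE = 10**18                     # bytes in one exabyte

    data_two_days = 5 * EXABYTE          # Schmidt's figure: 5 exabytes every two days
    data_per_day = data_two_days / 2     # implied rate: 2.5 exabytes per day

    records_searched = 1_000_000         # illustrative low end of "millions"
    leads = 300                          # further inquiries cited for the NSA program

    print(data_per_day / EXABYTE, "exabytes per day")
    print(f"{leads / records_searched:.4%} of records led to a further inquiry")

Even at the low end of "millions," fewer than one record in 3,000 led anywhere further, a concrete measure of how large the haystack is relative to its needles.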

Is Big Data the Way Ahead for Intelligence?

October 1, 2013

Two perspectives on the question appear in full above: “Another Overhyped Fad,” by Mark M. Lowenthal, and “A Longtime Tool of the Community,” by Lewis Shepherd.

Understanding the Written Foreign Language

October 1, 2013
By Robert K. Ackerman

A transliteration tool developed jointly by the intelligence community and a commercial firm is helping eliminate the problem of misidentified foreign names and places in databases. These types of errors can allow a potential terrorist or plot to slip through security if analysts cannot identify common proper nouns and establish valuable links.

The new system helps avoid misidentification arising from differing transliterations of names from languages that do not use Western-style Roman lettering. Such mismatches have allowed terrorists’ names to go unmatched across databases because each database interprets their spelling differently, leaving analysts unable to put together the pieces of the puzzle and develop an accurate picture of a potential threat.
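
The matching problem the tool addresses can be sketched in a few lines of Python using the standard library's difflib; the names below are invented for illustration, and this is not the transliteration system described in the article:

    # Minimal illustration: two romanizations of the same name fail an exact
    # match, but a simple similarity score can still link them. Names are
    # invented; this is not the joint intelligence/commercial tool itself.
    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        """Return a 0-to-1 similarity ratio between two lower-cased names."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    watchlist_name = "Muhammad al-Husayni"    # spelling used in one database
    database_name  = "Mohammed el-Husseini"   # same person, transliterated differently
    control_name   = "Piotr Nowak"            # unrelated name for comparison

    print(watchlist_name == database_name)                       # False: exact match misses the link
    print(round(similarity(watchlist_name, database_name), 2))   # variant spellings: high score
    print(round(similarity(watchlist_name, control_name), 2))    # unrelated control: much lower score

A production system would add normalization, phonetic encoding and language-specific rules, but the core idea is the same: score candidate matches rather than demand identical strings.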
 

Information Sharing Takes on New Shape

October 1, 2013
By Henry S. Kenyon

U.S. intelligence agencies soon will be able to share information with each other in a single common computing environment. This effort will increase interagency cooperation and efficiency while cutting information technology operating costs through the use of shared services.

The Intelligence Community Information Technology Enterprise (ICITE) is part of a broad strategy led by the Office of the Director of National Intelligence (ODNI) and supported by the chief information officers (CIOs) of the five major intelligence agencies. ICITE replaces the old agency-based information technology model with one using a common architecture operating as a single enterprise across the intelligence community.

Launched in 2012, ICITE is not a formal program of record but a development effort directed by the ODNI and agency CIOs. ICITE has five major goals: create a single, standards-based interoperable enterprise architecture for the intelligence community; provide seamless and secure collaboration tools for person-to-person and data-to-data information sharing; establish a standardized, consolidated business process to support agency missions; set up a governance and oversight process; and create partnerships across the intelligence community, the wider U.S. government, international partners and industry.

Committed to Cloud Computing

October 1, 2013
By George I. Seffers

Recent insider security breaches have put increased scrutiny on the U.S. intelligence community’s cloud computing plans. But cloud computing initiatives remain unchanged as the technology is expected to enhance cybersecurity and provide analysts with easier ways to do their jobs in less time.

With cloud computing, reams of data reside in one location rather than in a variety of repositories. Combining data leads to greater efficiencies for intelligence analysts, but in the view of some, it also means greater vulnerabilities. “There’s a school of thought that says if you co-locate data, you actually expose more of it in case of an insider threat than if you keep it all in separate repositories by data type,” explains Lonny Anderson, National Security Agency (NSA) chief information officer. “The onus is on us to convince the rest of the community, the rest of the Defense Department, that we can secure their information in the cloud in a way that they simply can’t secure it today.”

Anderson acknowledges that the recent insider leaks have increased doubts within the intelligence community about cloud computing, but he expresses confidence that the agency and the intelligence community are on the right path. “I think everybody is a little more nervous and a little more security conscious.

“Everything we’ve learned so far of [NSA leaker Edward Snowden’s] activities has reinforced for us that the path we’re already on is the right path. The lesson we’ve learned is the need to share information but to share selectively, only with those with a need to know,” Anderson says. “The leaks actually reinforced the need to move to the cloud and move there more quickly.”
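
Anderson's point about sharing "selectively, only with those with a need to know" amounts to compartmented, tag-based access control over co-located data. A minimal sketch of that idea in Python follows; the compartment names and records are invented, and this is not a description of the NSA's actual architecture:

    # Illustrative sketch of need-to-know access over co-located records.
    # Compartment names and records are invented, not drawn from any real system.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Record:
        body: str
        compartments: frozenset   # tags a reader must hold to see this record

    @dataclass(frozen=True)
    class User:
        name: str
        clearances: frozenset     # compartments this user is read into

    def visible(user: User, store: list) -> list:
        """Return only the records whose compartment tags the user holds."""
        return [r for r in store if r.compartments <= user.clearances]

    store = [
        Record("routine logistics summary", frozenset({"GENERAL"})),
        Record("sensitive source reporting", frozenset({"GENERAL", "COMPARTMENT-A"})),
    ]

    analyst = User("analyst", frozenset({"GENERAL"}))
    print([r.body for r in visible(analyst, store)])   # only the routine summary is returned

Co-locating the data does not by itself decide who sees what; the access check does, which is the point Anderson makes about securing information in the cloud.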

A New -INT Looms for Social Media

October 1, 2013
By Robert K. Ackerman

The Arab Spring, which rose from street-level dissent to form a mass movement, might not have come as a surprise to intelligence agencies if only they had been able to read the tea leaves of social media. The characteristics of social media that differentiate it from other messaging media are compelling intelligence officials to change the way they derive valuable information from it. As a result, experts are calling for the creation of a new discipline that represents a separate branch of intelligence activity.

The type of information found on social media is far different from that intercepted via any other type of messaging media. It is pushed by its sender out to large numbers of people. It often consists of information about individuals that is not readily available elsewhere. And, it can represent an indication of groupthink that is not discernible from traditional intercepts.

Sir David Omand is a visiting professor, Department of War Studies, King’s College, London. He is a former U.K. intelligence and security coordinator and the former director of the U.K. Government Communications Headquarters (GCHQ), which provides both signals intelligence (SIGINT) and information assurance as one of the United Kingdom’s three intelligence agencies.

“This is more than a shift from one kind of communications medium to another,” Omand declares. He points out that SIGINT experts have accommodated the shift from copper wires to fiber because the same messages were being carried by the different media. The only change was the transport mode.

ICITE Builds From the Desktop Up

September 9, 2013
By Robert K. Ackerman

As the intelligence community moves into the cloud, its first step begins at the desktop level.

Learning Real-World Intelligence Analysis

September 6, 2013
By George I. Seffers

Officials at Auburn University, Auburn, Alabama, are developing a program that allows students from any academic discipline to work closely with the U.S. intelligence community on a variety of actual national security-related problems. The university is on track to begin offering a minor in intelligence analysis in the relatively near future and a major within the next five years.

Implemented about a year ago, the program is described as a work in progress. In fact, it has not yet been officially named, but will likely be called the Intelligence Analysis Program. “The goal of the program is to train the future analysts for the intelligence community, the military and business. What we are trying to do is to provide a learning environment in which students have to deal with real analytical problems,” reports Robert Norton, professor and director of the Open Source Intelligence Laboratory, Auburn University. “We’re not just using things like case studies. We’re actually working current problems. And we do so in an environment where they’re working under an operational tempo similar to what is experienced in the intelligence community.”

Future intelligence analysts learn how analytical products are put together, how data is validated and how to communicate findings in a timely manner. “What we say is that our students work on real problems with real customers. We are working with the intelligence community, we’re working with various combatant commands, and we’re working with various businesses,” Norton says.

Have We Gone Down the Rabbit Hole?

September 1, 2013
By Kent R. Schneider

Do you ever find yourself trying to reconcile with your environment? That is where I am now with regard to national security and reaction to leaks and programs designed to protect against terrorist threats.

In 2010, Julian Assange and his WikiLeaks organization got themselves on the world stage by publishing large volumes of classified documents, many provided by Pfc. Bradley Manning, USA, an intelligence analyst. At that time, and since, both Assange and Manning have been held up as villains by some and as heroes and whistle-blowers by others.

In May of this year, Edward Snowden, a computer analyst hired by Booz Allen Hamilton to work on U.S. National Security Agency (NSA) programs, leaked a massive amount of classified data concerning NSA intelligence-gathering programs to the British newspaper The Guardian. Again, Snowden is a traitor or a hero, depending on whom you talk to. A recent USA Today poll found that 55 percent of Americans felt Snowden was a whistle-blower and hero.

The government continues to address these massive leaks, their implications for national security and the changes to law that may be needed. In the Manning case, the administration consistently has been determined to prosecute him for treason and aiding the enemy. On July 30, USA Today reported the verdict on its online front page under the headline “Manning verdict redefines meaning of traitor.” While the military court found Manning guilty of a number of the charges, including violations of the Espionage Act, he was found not guilty of “giving aid to the enemy,” the most serious of the charges, because the prosecutors did not prove beyond a reasonable doubt that he had “a specific intent to aid or assist the enemy.” Legal analysts now are saying that Congress should review the Espionage Act in light of the pervasiveness of technology and its new role in warfighting and terrorism.
