
Intelligence Must Plan to Develop Tomorrow's Analyst

April 2009
By Christopher Zinner

A specific approach to human capital is needed for effectiveness and high performance.

As the intelligence community looks to a future in which better intelligence decisions emerge from smarter use of available but limited resources, human capital must take center stage. Attention to culture, values, education and lifelong learning will provide the right ingredients to evolve the intelligence community into a true learning organization and to achieve high performance for the missions it supports.

From warfighters on the front lines to policymakers on Capitol Hill, people depend on the intelligence community to protect national security interests. The community faces an incredible and increasingly complex challenge in bringing this information to practical application. Quantity, sources and types of information are exploding.

But the needs of the intelligence community demand a changing skill set. The ability to access and manipulate information is so important to intelligence analysts in today’s world that deficiencies must be addressed immediately. Too much potentially available intelligence remains unseen because it is impenetrable to analysts who lack the boldness and the technological savvy to translate important data into meaningful, actionable information.

The traditional response to improving intelligence effectiveness has been to turn to technology. The intelligence community spends hundreds of millions of dollars on technology integration services, satellite equipment and new technologies in areas related to knowledge discovery, data mining and collaboration.

But perhaps more than ever, intelligence agencies now have a special obligation to operate with leanness, efficiency and high accuracy so that every dollar possible can be directed to support missions on the ground. As the U.S. economic squeeze grows tighter, the imperative grows to work smarter and achieve better outcomes with fewer resources.

The call for a different approach—one that acknowledges the limits of technology and of the ability to spend on technology—seems clear. Technology is only one part of the intelligence picture, and it potentially is the smaller part. The intelligence community can collect all the data in the world, but that data only will be as useful as the people charged with analyzing and disseminating it can make it.

The real capital of the intelligence community is its people. For example, among the key players are the all-source intelligence analysts who pull together information from multiple contexts and make sense of it quickly and accurately. This provides those who must act with the highest likelihood of taking the appropriate action.

Extensive research has revealed the linkages between talent management and high performance. For example, executives who manage high-performance organizations view their talent strategies as a top priority in sustaining the superior performance of their organizations. Truly talent-powered organizations are adept at defining talent needs, discovering diverse sources of ability, developing individual and collective skills, and deploying talent in ways that align people with strategic objectives. However, research also has found that, while both aligning and engaging the right talent is crucial to achieving strategic objectives, too many organizations still have fragmented talent management systems, processes and practices.

These points are as true in the intelligence community as they are elsewhere. As the intelligence community looks toward a new future of higher performance, it must turn its attention first to developing and nurturing its human resources. This is a fundamentally more effective and cost-effective approach than seeking salvation in technology.

Accordingly, the intelligence community needs a new strategy for human capital development. It must begin with a vision of a new type of intelligence analyst, and it must continue with a logical plan for turning its work force into a group of collaborative individuals. They must be able to provide ever-greater value by knowing how to tame overwhelming data streams and being unafraid to assert their well-founded opinions.

Today’s intelligence analysts typically come from a liberal arts background. In college, they may have majored in history, political science, languages or international studies. They enter the intelligence community with limited experience manipulating multiple streams of data and little exposure to the technologies that could do much of the heavy lifting for them.

Technology competency is a major differentiator between elite and average analysts. The top performers are technology savvy: They know how to use automated tools; they know how to set up and run the best queries; and they can manipulate information very well. The intelligence community has the responsibility for nurturing its analysts. This entails urging them to embrace technology and to bring their knowledge to a problem set that can be cracked open with the help of smart technology tools. The intelligence community’s number-one priority should be to train analysts to use already available technology to search intelligently, interpret the results and turn the resulting information into good analysis for users.

One related idea would be a variation on the software engineering concept of extreme programming, in which two people work side-by-side at a computer, programming as a team for faster and better results. In the intelligence community, this concept would translate into pairing two or three intelligence analysts with a technologist who could help them obtain the information they need to do their jobs. While initial costs may be somewhat high, the concept would pay dividends through higher-quality intelligence, and eventually the analysts would learn to use the technology successfully to manipulate the data themselves.

The second change the intelligence agencies need to encourage is for their analysts to start making more aggressive assertions. Currently the intelligence community does not train its analysts to go out on a limb and give an opinion. Instead, analysts too often fall into the habit of caveats and hedging their assertions, which profoundly dilutes value to the intelligence consumer.

For example, an analyst simply saying that how a particular foreign government will react depends on six factors—and then listing the factors—is not nearly as helpful as an analyst saying for each of those six variables, “If X happens, then the foreign government will do this, but if Y happens, then the government will react this way.” The job of the analyst is not merely to go out and find and assemble information. Analysts need to make hypotheses and assertions—to give educated opinions on the likelihood of different scenarios and outcomes.

Encouraging analysts to take the leap will require some fundamental changes in the intelligence community’s culture. Right now, analysts are afraid of being wrong. They fear punishment if people take the wrong action based on the information they provide, even if their original assertion is made logically on the basis of seemingly solid information. Likewise, intelligence consumers have not been conditioned to push back—to ask for those assertions rather than accept information without an overlay of analysis on top of it.

Even before training people to think about and present intelligence differently, however, intelligence agencies need to define a communitywide standard of what analysis actually is. This would entail formally agreeing on the differences between search and research—filtering data to establish facts, trends and patterns—and analysis, which would involve sifting through the research to determine its implications within a specific context. Defining an intelligence analyst competency framework then will serve as an anchor for any subsequent talent development and management program.

Once the intelligence community has established a vision of the types of skills and work habits it wants its analysts to exhibit, it will need a new plan for developing these skills and habits in its employees.

The jobs that analysts perform differ widely, and intelligence community training must be restructured to become more specialized. For example, even among a group of all-source analysts, some individuals will focus more on strategy while others focus more on tactics. Some analysts will do high-value individual targeting by using human intelligence, signals intelligence and similar means to find the people who are financing and leading enemy operations. Some experts will be indications-and-warnings specialists who identify events as they happen and put together plans for immediate action. Geopolitical strategic analysts will study the relationships between individual nations and make predictions on the influence their likely actions will have on the rest of the world in terms of trade, military action and other consequences.

These are just a few examples. What they all point to is the need for training that goes far beyond lectures and PowerPoint presentations to large groups of individuals. The common training approach within the intelligence community today is a model of “buttonology,” characterized by a passive syllabus, slide shows, readings or multimedia presentations—and ultimately, limited interaction. While a starting point, the buttonology approach never will provide a complete foundation for a quality analyst’s education.

No matter which of these jobs analysts perform, analytical effectiveness cannot be taught entirely in a classroom. Formal learning is most useful for building lower levels of proficiency, while job experience, active collaboration with other practitioners and teaching others are more effective for building higher levels of proficiency. Intelligence training has to be hands-on and contextually relevant to the analyst, such as all-day scenario training that mixes theory, tools of the trade and practical exercises.

To make this new model of individualized analyst training meet requirements of scale, the intelligence community should consider implementing an intelligence community university. Instead of each organization having its own training, agencies would tap into a formalized communitywide training program and a single set of curricula.

As intelligence people from different agencies come together to work on specific threats, they may have trouble collaborating because they have not all been trained in the same way. An intelligence community university not only would standardize training and skills, but it also would allow intelligence individuals from different organizations to begin building valuable networks across participating agencies.

Analyst training and development is not a one-time event. It is a continuous process that occurs both formally and informally to eventually create a culture of collaboration and continuous learning. Individuals learn best from one another, and one of the finest ways to encourage a culture of collaboration and knowledge-sharing is through the common experience that an intelligence community university would provide.

Finally, in addition to specialized and ongoing training, intelligence analysts need the structure of a more formal feedback loop. Measuring the accuracy of an individual’s analysis is a fairly straightforward task. Yet, typically it does not happen often enough. A feedback loop gives analysts a starting point for improvement. After all, people cannot improve without knowing how they are doing in the first place. Finding out that they have been inaccurate encourages analysts to apply more scrutiny to their own analytical skills and future judgments.
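One illustration of how straightforward such measurement can be: if analysts attach explicit probabilities to their assertions, as the if-then judgments described earlier encourage, their track record can be scored with a standard forecasting metric such as the Brier score. The sketch below is purely illustrative and assumes a simple record of stated probabilities and observed outcomes; it is not drawn from any actual intelligence community system.

```python
# Illustrative sketch only: scoring an analyst's probabilistic
# assertions with the Brier score (lower is better). The data and
# names here are hypothetical examples, not a real system.

def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.

    forecasts: list of (probability, outcome) pairs, where probability
    is the analyst's stated chance of the event (0.0 to 1.0) and
    outcome is 1 if the event occurred, 0 if it did not.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Three hypothetical judgments and what actually happened:
assertions = [
    (0.8, 1),  # asserted 80% likely; event occurred
    (0.3, 0),  # asserted 30% likely; event did not occur
    (0.9, 0),  # asserted 90% likely; event did not occur
]
print(round(brier_score(assertions), 3))  # prints 0.313
```

A perfectly calibrated and confident analyst scores near 0.0; always hedging at 50 percent scores 0.25 regardless of outcomes, which is one way a scoring rule rewards the bolder, well-founded assertions the article calls for over blanket caveats.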

However, it is important to understand possible unintended consequences. The objective of measuring accuracy is not to punish. This point goes hand-in-hand with creating a culture where intelligence analysts do not fear retribution if they perform due diligence, make a reasonable assertion and still turn out to be wrong. Measuring accuracy must be done for two reasons: people improve when they are measured, and top performance can be rewarded accordingly.

Christopher Zinner is a senior manager with Accenture’s Public Service Operating Group.