Rarely does a week go by without a military-sponsored exercise, experiment or demonstration. As participants ready for the event, excitement mounts. During the event, there never seem to be enough hours in the day as troops immerse themselves in the job at hand, and the days fly by all too quickly. Usually, the purpose of these activities is to determine whether a technology or a concept has merit. But as the event draws to a close, participants pack their bags, file their reports, shake hands and return to their “real jobs.”
Given today's operational tempo, these processes for evaluating concepts and technologies are understandable. Unfortunately, the detailed reports these activities produce often spend more time on the shelf than in commanders' hands, where the information would do the most good.
Enter the Joint Systems Integration Center (JSIC). Rather than sponsor its own exercises, experiments or demonstrations, JSIC members become flies on the walls of event sites around the world and create Joint Systems Baseline Assessments (JSBAs). Although a useful resource, the program has been criticized in the past because assessment data was reported annually—generally in late summer or early fall. Many warfighters, including military leaders, said the document took too long to produce and was too unwieldy to be useful when making technology acquisition decisions—and they were right, admits Frank Hunt, JSBA project manager.
So, like the rest of the military, JSIC decided it was time to become more agile. Rather than studying individual exercises and reserving its findings for publication in a single document, JSIC restructured the JSBA, carving one large project into several smaller assessment projects. As a result, agility increases while continuity remains, and assessments can be delivered at nearly the speed of battle, Hunt offers.
The events JSIC focuses on primarily revolve around command, control and intelligence interoperability. Beginning in the first quarter of fiscal year 2010, JSIC examined the exchange of data between PRISM and MAJIIC. The goal of the event was to establish interoperability between U.S. and NATO collection management capabilities. “Part of our [new] model is to take advantage of something that someone else has set up as an operational venue, and that way we don’t have to set up our own operational venue,” Hunt explains.
The Austere Challenge interoperability assessment was JSIC’s second JSBA project, and it rode on the coattails of the U.S. European Command exercise of the same name. Austere Challenge involved creating a French-led coalition for an air component command as part of a combined joint task force. The exercise gave JSIC personnel the opportunity to evaluate how NATO-integrated command and control systems would work with the U.S. Global Command and Control System on a NATO-releasable network. Before the exercise began, three technology assessments took place: integration, targeting system interoperability and event integration testing. “That’s an operational assessment opportunity for us because all of these individual systems must work together from one end to another to see if they are going to support the mission,” Hunt states.
“Those were the first two of our assessments, and we feel that a lot of the lessons learned there are destined for use in Afghanistan because it’s a NATO ISAF joint command, so they have to have the international systems working together. What we learned the hard way in our interoperability assessment, we push out to them so they can use the assessments to make the systems work better together,” he relates.
In addition to these two projects, JSIC will be involved in three other assessments this fiscal year. Trident Warrior is the first and is in progress. Its goal is to determine whether full-motion video can be sent from an airborne asset to a ship, relayed to a ground station and then distributed to the decision makers who need it.
The objective of all these assessments is multifaceted, Hunt explains. In some cases, they help the services or commanders make acquisition decisions. During current operations, they may alert troops about challenges they will face in the field and offer solutions to commanders.
Hunt also emphasizes that the JSBAs are now being written in terms that are easily understood. Assessments of a technical capability include fact sheets that feature “red,” “yellow” and “green” indicators so commanders can quickly understand the risks and rewards of certain capabilities.