Communications Capabilities Connect

April 2005
By Maryann Lawlor

Spc. Kenon Burns, USA, unmanned vehicle operator, B Company, 104th Military Intelligence Battalion, controls a Shadow tactical unmanned aerial vehicle (UAV) from a ground control station in Baqubah, Iraq. By employing extensible markup language (XML), UAV controllers will be able to view information from different types of UAVs simultaneously.

Common language allows machine-to-machine information exchange.

U.S. Joint Forces Command is harnessing the power of extensible markup language to lash together three capabilities and to facilitate collaboration between intelligence and operations activities. If successful, the integrated capability would increase an individual warfighter’s ability to control intelligence, surveillance and reconnaissance assets and share targeting information, reducing the time between target identification and strike from minutes to seconds. Integration of those capabilities is in the experimental stage, but the project’s director is encouraged by the initial results and believes it could eventually lead to humans on, rather than in, the targeting-strike loop.

Enhancing the effectiveness of intelligence, surveillance and reconnaissance (ISR) information is the primary purpose of the project. It links the capabilities developed in three separate projects: the Multisensor Aerospace-Ground Joint ISR Interoperability Coalition (MAJIIC) advanced concept technology demonstration (SIGNAL, October 2004, page 27); the Adaptive Joint Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance Node (AJCN) advanced concept technology demonstration (SIGNAL, November 2004, page 49); and the Joint Operational Test Bed System (JOTBS) initiative. U.S. Joint Forces Command (JFCOM), Norfolk, Virginia, has named the trilogy ISR Troika, a term that refers to a team of three horses harnessed side by side.

Cmdr. James Joyner, USN, director, ISR Troika, JFCOM, explains that the groundwork for the current effort was laid last year during an experiment called Forward Look. Considered the first step toward the network-centric operation of unmanned aerial vehicles (UAVs), the experiment integrated the location information and sensor points of interest of Predator, ScanEagle, Shadow and Silver Fox UAVs in a common operational picture (COP).

As designed, information about the four UAVs is displayed only on their own individual ground control stations. Cmdr. Joyner allows that the first question was whether enabling these systems to work together would offer warfighters some benefit. Proceeding on the hypothesis that the answer to this question was yes, the initial thought was to develop a common ground station for all UAVs. “The idea was that this piece of hardware would talk to all other pieces of hardware. This failed miserably because some UAVs need different types of interfaces, and we’re just not there yet,” the commander says.

The breakthrough occurred with the second approach. The JFCOM team decided to interface with the ground control stations, to gather the data and then to share the relevant information among ground control stations. “That’s where the Cursor on Target XML [extensible markup language] schema comes in. It gave us a standard to use so one system could communicate ‘where, what and when’ information with the team. It expedited our process to integrate the systems because once integration occurred for one system, it was done for all systems,” Cmdr. Joyner states.

Cursor on Target (CoT), developed by The Mitre Corporation, Bedford, Massachusetts, fuses “what, where, when” target information from a laser rangefinder, a compass and a global positioning system receiver. It then sends that information to an intelligence system, where the data is refined for high-precision resolution. XML, which Cmdr. Joyner believes is a convenient and economical solution to providing integrated information, is actually a metalanguage that allows users to design their own customized markup applications for exchanging information about specific topics. XML schemas provide the mechanisms to define and describe the structure, content and, to some extent, the semantics of XML documents.
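The article describes CoT only at the “what, where, when” level; as an illustration, the following Python sketch builds a minimal CoT-style event message. The specific attribute names (uid, type, time, lat, lon, hae) are assumptions about how such a schema might lay out those three questions, not a reproduction of Mitre’s actual schema.

```python
import xml.etree.ElementTree as ET

def make_cot_event(uid, event_type, lat, lon, hae, time):
    """Build a minimal CoT-style event carrying the 'what' (uid/type),
    'when' (time) and 'where' (point) fields described in the article."""
    event = ET.Element("event", {
        "version": "2.0",
        "uid": uid,          # what: unique identifier of the object
        "type": event_type,  # what: type code for the object
        "time": time,        # when: observation time (ISO 8601)
    })
    ET.SubElement(event, "point", {
        "lat": str(lat),     # where: latitude in decimal degrees
        "lon": str(lon),     # where: longitude in decimal degrees
        "hae": str(hae),     # where: height above ellipsoid, meters
    })
    return ET.tostring(event, encoding="unicode")

# Hypothetical UAV position report
xml_msg = make_cot_event("UAV-1", "a-f-A", 33.59, 44.64, 120.0,
                         "2004-12-07T14:30:00Z")
```

Because every system emits and consumes the same small message, integrating one system with the schema integrates it with all the others, which is the effect Cmdr. Joyner describes.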

In Forward Look, information about the four types of UAVs, such as position and sensor points of interest, was integrated in a COP, so one UAV operator could see information from all the other UAVs simultaneously. In addition, the information could be exported to others, such as the commander or the warfighter on the front tactical edge, the commander says.

The next step was to enable the warfighter to control the position of the UAV and its sensors. To do this, Cmdr. Joyner explains, requires that the ground control station be able to accept the waypoints generated by the forward tactical station, “but it’s a matter of a couple lines of code,” he says.

This and other capabilities were demonstrated in the most recent experiment, Extended Awareness 1 (EA1), held last December in New Orleans, and will be further refined in two upcoming experiments scheduled for June and September at Fort Huachuca, Arizona.

Cmdr. Joyner describes EA1 as a combination exercise, experiment and training effort. The two-week event took place in conjunction with the 26th U.S. Marine Expeditionary Unit’s training, URBAN ENVIRONMENT EXERCISE.

During EA1, the JFCOM team enabled forward air controllers to direct and task dissimilar UAVs using a single console, which provided immediate battlefield surveillance. XML served as a computer language interpreter for the systems, allowing the services and combatant commands to receive and understand data collected from a variety of sensors onboard the UAVs. This data was indexed, which facilitated faster referencing. In addition, reconnaissance personnel on the ground could mark a target and electronically relay the coordinates directly to aircraft for automatic tasking of the aircraft’s weapons to the target.

The experiment took the capability one step further. Not only were the UAV systems integrated, but also a gunshot detection system was incorporated into the system using CoT XML. As a result, the information about the location of a gunshot was fed directly into the UAV, and the UAV’s imaging sensor automatically sought out the source of the sound. “Being able to tip and tune a sensor to relevant information on the ground in an automated fashion is pretty effective. This would not have been possible without the Cursor on Target XML schema,” Cmdr. Joyner states.
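The tip-and-cue behavior described above can be sketched in a few lines: a detection event arrives in the common schema, and its “where” becomes a point of interest for the imaging sensor. The event type code and the dictionary layout here are hypothetical; the point is only that the same where/what/when message format drives the automation.

```python
import xml.etree.ElementTree as ET

def cue_sensor_from_detection(detection_xml):
    """Extract the 'where' of a detection event and return a sensor
    point of interest that a UAV ground station could slew to."""
    event = ET.fromstring(detection_xml)
    point = event.find("point")
    return {
        "lat": float(point.get("lat")),
        "lon": float(point.get("lon")),
        "reason": event.get("type"),  # e.g. a gunshot-detection code
    }

# Hypothetical gunshot-detection event in the common schema
detection = ('<event version="2.0" uid="shot-42" type="b-d-gunshot" '
             'time="2004-12-07T14:30:05Z">'
             '<point lat="29.95" lon="-90.07" hae="3.0"/></event>')
poi = cue_sensor_from_detection(detection)
```

Because the gunshot detector and the UAV never talk to each other directly, only through the shared event format, neither system needed to be modified to know about the other.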

The ScanEagle UAV was one of four unmanned aircraft involved in both the Forward Look and Extended Awareness 1 experiments.
The commander explains that the XML schema that JFCOM is using features 13 required fields such as time, latitude, longitude and height above ellipsoid. “That gets you in the door with CoT. You have to be able to speak where, what and when. The word ‘extensible’ means the ability to add, so you can add sub-schemas after those 13 fields, and the systems use just the one they can recognize. That gives us the flexibility to grow this capability as needed and parse it as necessary, and we can geographically filter the information. So, if you’re working on a large, global network, you have guards up that wouldn’t let information pass if it is not relevant to the area,” he says.
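The two properties the commander describes, systems reading only the fields they recognize and guards filtering by geography, can be sketched as follows. The area bounds and field names are illustrative assumptions, not values from the experiments.

```python
import xml.etree.ElementTree as ET

# Hypothetical area of responsibility used by a geographic guard
AREA = {"lat_min": 33.0, "lat_max": 34.0,
        "lon_min": 44.0, "lon_max": 45.0}

def parse_known_fields(event_xml):
    """Read only the core where/what/when fields; sub-elements added
    by extended sub-schemas are simply ignored by this consumer."""
    event = ET.fromstring(event_xml)
    point = event.find("point")
    return {
        "uid": event.get("uid"),
        "type": event.get("type"),
        "time": event.get("time"),
        "lat": float(point.get("lat")),
        "lon": float(point.get("lon")),
    }

def in_area(evt, area=AREA):
    """Geographic guard: pass only events relevant to the area."""
    return (area["lat_min"] <= evt["lat"] <= area["lat_max"]
            and area["lon_min"] <= evt["lon"] <= area["lon_max"])

# Event carrying an extension sub-schema this consumer does not know
msg = ('<event version="2.0" uid="UAV-2" type="a-f-A" '
       'time="2004-12-07T15:00:00Z">'
       '<point lat="33.5" lon="44.4" hae="900"/>'
       '<detail><custom_sub_schema foo="bar"/></detail></event>')
evt = parse_known_fields(msg)
```

The unknown `<detail>` content passes through the network untouched, so capability can grow without breaking existing consumers, while the guard drops events outside the area of interest.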

The capabilities demonstrated during EA1 improved joint warfighters’ awareness of the ISR data they were viewing in near-real time. The information was relayed in less than 10 seconds, and XML linked dissimilar radios without a third-party relay.

Cmdr. Joyner and his team already are planning for the goals they intend to accomplish during the next two experiments. The objective for EA3 is to merge the capabilities MAJIIC, AJCN and JOTBS offer. “We have the MAJIIC server bringing in all the feeds into a common information picture—that’s the where, what and when information. All these ISR feeds are now brought together, and we can chop it up, parse it up via Cursor on Target and disseminate that awareness information in near real time to the warfighters and commanders. Combine that with the ability to take control of a UAV and to gain information on a target on a map, and the information shows up in front of them in a machine-to-machine fashion. That is a big integration hurdle, and that’s what we’ve bit off for ourselves for the final experiment,” he relates.

Although the commander admits that what his team is attempting is challenging, he also is going to endeavor to demonstrate some capabilities that warfighters could accomplish immediately using existing equipment and just a few software modifications. For example, a battlefield air operations kit, which is built using CoT XML, currently displays the position of a target on a handheld device; however, that is the only information that is shown. “With Cursor on Target XML, that information is brought into a laptop, and CoT XML sends it to a PRC-117 [radio]. The PRC-117 then sends it to a Link 16 inject area called an Air Defense Systems Integrator that brings the information to an aircraft. At the same time, that PRC-117 brings the information back to the UAV joint support module that I’m putting together,” Cmdr. Joyner explains.

As a result, when the warfighter receives the information, instead of just getting one piece of information on the liquid crystal display, the UAV as well as a Multifunctional Information Distribution System-equipped fighter can be looking at the same area, and the warfighter can be receiving the video on a terminal, all with one press of a button. “They would all be on the same sheet of music. And taking that even further, the people who would be there to help them with what they need at the moment of execution—all the imagery analysts—would also be on the same sheet of music, and they’d be able to communicate via chat protocols of Cursor on Target XML,” Cmdr. Joyner explains.

With this integrated capability, a warfighter and pilot could view a target simultaneously. After the soldier on the ground refines and confirms the coordinates, the pilot would accept the information and the bombs would be released. “What would normally take minutes now takes just seconds. The systems are great systems, but they are much stronger when they’re brought together like this,” the commander states.

At the tactical level, information is presented on laptop displays for the capabilities being examined today. Although the equipment is larger than the more portable handheld devices, Cmdr. Joyner points out that currently a soldier would need several laptops to view all the different information. He allows, however, that ideally the JFCOM team would like to be able to present the data on a smaller, more robust platform.

The commander notes that technology is not the challenge the team faces in ISR Troika. “The challenge is the momentum behind certain ways of thinking, and that needs to be managed and guided. Transformation of doctrine, the development of operational tactics and the integration of technology need to happen in unison. We recognize that this is a constraint, and we’re working within this constraint. We’re trying to let warfighters use the equipment they have but use it in a better way. We’re not telling them what kind of hardware to buy. This is a relatively low-cost solution set for getting these devices to work together. I think that is the genius of it,” the commander says.

Once the ISR systems are brought together and the information is in a common operational picture, Cmdr. Joyner says the next step will be finding a way to handle larger quantities of ISR tasking. “In my vision, I believe this is going to happen through automation. We would need some kind of top-level control above it. Once we have the information and can direct and move it around, it needs to be handled with a man on the loop, not a man in the loop,” he notes. Although strikes can take place automatically when ISR information indicates an obvious target, the commander admits that under some circumstances, such as when an assessment is required, there is definitely room for a human in the loop.

Even though EA1 as well as the experiments later this year are geared toward supporting the warfighter, a real-world, nonmilitary emergency arose during the December event that demonstrated the usefulness of the systems and of sharing information quickly. Near the end of the event, a New Orleans air traffic controller contacted the pilot of a Pelican involved in the experiment and asked for assistance to locate an aircraft crash site. The Pelican, an aircraft that can be flown either by a pilot or remotely, was equipped with Predator imaging systems. The pilot used the systems to send imagery to 26th Marine Expeditionary Unit headquarters where a Marine analyst processed the information. “It would have taken them hours if not days to locate this plane, and, in under a minute, we gave them 1-meter accuracy coordinates as well as the best ways to access the crash site,” the commander relates.

“When it comes down to the time of execution, service doesn’t matter. Agency doesn’t matter. The only thing that matters is getting the right people with the right qualifications together to get the mission done. This is the guiding factor in the development. That in itself is a transformational find,” Cmdr. Joyner states.



