Roadblocks to Interoperability Frustrate Coalition Communicators

November 2000
By Robert K. Ackerman

Soft commitments go nowhere; shared costs, international standards and testbeds may open the way to future operational successes.

Nations seeking to enable information exchange among international coalition partners face several daunting tasks in laying the groundwork for vital interoperability. Many of these efforts involve individual national commitments to build interoperability into systems and practices, while others require consultation and consensus before proceeding down equipment deployment paths.

Many issues remain to be resolved even before a road map to interoperability can be drawn. Items such as standards, security and cost-bearing all weigh heavily on efforts to take the necessary international steps. Prospective coalition members are tasked with beginning to lay the groundwork now for operations that may be several years in the future.

Ultimately, it may take the establishment of an international interoperability testbed to resolve, and even prevent, communications conflicts among North Atlantic Treaty Organization (NATO) and Partnership for Peace (PfP) nations’ systems.

Sir Robert Walmsley, chief of defense procurement and chief executive of the U.K. Defence Procurement Agency (DPA), is not reticent about discussing the need for interoperability—and the challenge in achieving it—among allies. He calls for “coupling the genuine desire for interoperability with an absolute commitment to delivering it.”

“Interoperability is a key determinant of any coalition’s military capability,” Walmsley declares. “The increasingly important role of information in military operations—both direct operational information as well as logistic information—is hugely more important today in the thinner battlefield that we now experience than it was when soldiers could touch hands and pass messages by word of mouth.”

Walmsley continues that international interoperability presents several extremely difficult technical and management challenges. First and foremost is technical interoperability. Despite the fact that it should be the easiest to deliver, the solution to this challenge remains elusive. Walmsley notes as an example that opting to use super high frequency satellite communications does not automatically confer interoperability. Waveforms must be compatible, and no adequate technical descriptor of a waveform exists to guarantee compatibility.

Other technical issues include cryptography and system security policy compatibility. This entails determining which person is entitled to view specific information. Compatibility of doctrine and concepts of operation are other challenges. An effective doctrine should, for example, resolve whether target descriptions are defined alphanumerically or with imagery. These variations would impose different demands on communications systems.

One of the keys to delivering interoperable communications is the adoption of standards. Support for open standards is widespread, Walmsley notes, but merely conforming to an open architecture does not necessarily deliver interoperability. A greater commitment to keeping NATO standards up to date is important because commercial standards tend to erupt spontaneously and may offer more attractive solutions.

It is no longer realistic to expect defense needs to shape commercial standards, Walmsley warrants. Given the volume of commercial communications, civil standards should prove attractive to the military except for special applications such as resistance to electronic countermeasures. The defense community must expend more effort seeking ways to exploit commercial standard technologies, he adds. The NATO standards group can help in this effort, but individual nations must address the issue when designing their own systems.

Emphasizing NATO interoperability standards also is important for new and prospective member nations. “If new members of NATO are going to have faith in the organization, it’s no good if we tell them that NATO standards don’t work,” Walmsley warns. “They would find that astonishing.”

While standards help, they are not an answer in and of themselves, Walmsley continues. Any given standard includes complex components. A system may comply with standards, but ultimately it must be demonstrated in a testbed.

This testbed activity is essential to interoperability, he declares. Most nations have testbeds, but they have been slow to incorporate interservice interoperability. As hard as it is to deliver interoperability within a service, the difficulty increases among different services in a single nation. In turn, this problem pales in comparison with interoperability among different nations, he adds.

These difficulties highlight the need for international testbeds. Unfortunately, Walmsley points out, they generally are not available. He offers that the key to achieving international testbeds is having a central authority in each country owning and understanding the configuration of its national testbed. This authority would be able to enter into discussions with its counterparts in other countries. While this step requires “real will,” Walmsley believes it will come to pass because leaders will see it as the central way of demonstrating interoperability. As these national testbeds develop and their definition becomes more precise, experts will be able to lay the groundwork for a coordinated international effort. “Testbeds are an absolute sine qua non of interoperability—we’ve seen that,” he emphasizes.

Bridging existing systems can be difficult enough, but ensuring interoperability between two developing systems can vex both their customers and the contractors producing the systems. “Quite often, we seek interoperability between two systems that are under contemporaneous development,” Walmsley states. “Defining the end-state of the two systems—which may be several years away—is something that is not as simple as it sounds. Therefore, defining the conditions that each system must meet to interoperate produces a complexity that is enormously difficult.”

This problem also applies to existing systems and to those under evolutionary acquisition. Telecommunications systems particularly are prone to upgrades as new capabilities are introduced in the competitive commercial marketplace. Major incompatibilities, such as between old and new switches, can erupt. Testbeds can help solve this dilemma by evaluating upgraded systems for interoperability before they are fielded.

Another roadblock to interoperability, Walmsley allows, is the tendency to be too greedy on security levels. Insisting on interoperability at the top-secret level across a large network is “like trying to scale Mount Everest with both hands tied behind your back,” he analogizes. The proper course is to implement compatible security policies that will secure interoperability at lower security levels. At the same time, planners must recognize that this action will place limitations on interoperability at the highest security levels. Until technical solutions emerge, the only recourse is traditional pro tem methods.

“After all, we got by with liaison officers for a long time,” Walmsley points out. “The problem with liaison officers is that they can’t work with huge volumes of information and data, and huge volumes of information usually only arise at the lowest security levels. So, we should take that load off them.

“Achieving interoperability between the haves must not start with some fantastic globalized picture of everything that must be done to completely secure interoperability at every security level between every system,” Walmsley declares. “That will just result in mounds of paper, and meanwhile the world will be zipping on at great speed.

“If ever there is a case where the perfect becomes the enemy of the good, it is in the security constraints surrounding interoperability,” he declares.

Another vital issue involves funding. Enabling interoperability between two existing systems costs money, and Walmsley raises the question of who pays for this interoperability. “Everybody espouses interoperability, and the normal way of delivering it is to say ‘I have a system, and I will tell you what you need to do to be interoperable with me,’” he explains. Describing that approach as a very soft commitment to interoperability, he calls for both parties to agree on interoperability steps and shared costs.

This sharing would be allocated on a case-by-case basis. Nations must first examine which systems they want to interoperate, and then they must open a dialogue to determine the best way to secure that interoperability. Two countries working on two systems may need only build an “interoperability bridge” between the two systems, but enabling interoperability among several systems tends to rule out that solution.

“Deciding who pays to build the interoperability bridge is a really tough one. … The [ideal] route is to say ‘Not only do I want you to be interoperable with me, but I will pay you to do that,’” he continues. While that may be “pretty unthinkable” in most circumstances, it expresses the difficulty in determining who shoulders the financial burden, Walmsley concludes.

“We must firm up our international commitment to interoperability if we are to sort that problem out,” he declares.

The PfP is contributing to the search for a solution by drawing NATO nations and prospective members into discussions on the topic. Walmsley observes that these nations are able to look at “the nuts and bolts” of international interoperability and consider possible solutions. Many of these former Warsaw Pact nations achieved interoperability by fiat from the Soviet Union; their choice of equipment largely was forced on them. Not only does NATO not operate that way, but the alliance also must view its security arrangements as a partnership of equal sovereign nations.

This means “getting down to brass tacks and discussing the details,” Walmsley says. Once these details begin to unfold, then costs become clearer and nations can determine to whom these costs are allocated.

Walmsley states that the United Kingdom’s work toward internationalizing its interoperability efforts includes sending senior experts to NATO standardization working parties. Understanding commercial standards is another time-intensive, but productive, interoperability activity. The DPA has eliminated about 30 percent of its defense-specific standards, and more cuts lie ahead. These simple measures yield tangible results, he maintains.

The agency is employing a new managerial structure to address the potential for stovepipe systems to emerge in an atmosphere of more autonomous development. Giving responsibility to individuals for developing new information systems requires ensuring that integrated project teams deliver products that are compatible, Walmsley relates. Because they are given narrowly defined responsibilities, these teams could easily become stovepiped in both managerial and technical procedures.

To avoid that pitfall, the United Kingdom has implemented an integration authority. It is tasked with influencing each information system program, understanding the extent of its interoperability requirement, and confirming that the program will meet that requirement. While not given police powers to stop a project, it can alert senior officials among the user community—as well as the DPA chief and his executive directors—if it fears that interoperability is threatened by project team decisions.

Walmsley notes that this integration authority ultimately may become the owner of the United Kingdom’s testbed design definition. It also would be responsible for the design definition of the country’s contribution to an international testbed should one be developed.
