Defense Knowledge Management Hinges On Compatibility

May 2005
By Robert K. Ackerman

A U.S. Army officer uses a Force XXI Battle Command Brigade and Below, or FBCB2, system. U.S. Defense Department planners are striving to ensure that data inherent in this and other position location displays can be shared with similar systems across service lines.
The key is to allow users to view different data their own way.

A broad-based initiative underway in the U.S. Defense Department aims to ensure that all of the data amassed and processed in the future battlespace truly can become useful knowledge to all U.S. forces. This effort is trying to make all data sources, correlators and user interfaces resident on the defense network so that a user could select those best suited for his or her requirements.

The advent of new sensors and platforms means that new types of information will be entering the battlespace cybersphere, and combining them will be the linchpin for providing the right knowledge to the right user at the right time. Platforms will be collecting information not just for their immediate operators but also for the entire networked force. And, when a platform can contribute its data to the network, it will have access to other data that it can assimilate into a single knowledge product greater than the sum of its parts.

Currently, data flow to and from a platform largely is vertical and is engineered for that platform. The platform has its own particular correlators and user interface. This approach has worked well for mission-specific data that suits the platform's needs, but it limits the user to exploiting only this platform-geared data and prevents sharing what could be valuable information with other blue forces in the battlespace.

Overcoming these limitations will require significant advances in knowledge management. The first step is to characterize the data in ways that permit byte-level interoperability. Data collected by individual sensors and platforms in the battlespace must be tagged for discovery so that other users in the network-centric environment can find it in real time.
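In software terms, such tagging might look like the following minimal sketch, which wraps a raw position report in a few discovery fields so that other users on the network could find it. The element names and fields here are invented for illustration and are not the department's actual metadata schema.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def tag_for_discovery(report_id, producer, keywords, payload):
    """Wrap a raw sensor payload in illustrative discovery metadata."""
    record = ET.Element("record")
    meta = ET.SubElement(record, "discoveryMetadata")
    ET.SubElement(meta, "identifier").text = report_id
    ET.SubElement(meta, "producer").text = producer
    ET.SubElement(meta, "created").text = datetime.now(timezone.utc).isoformat()
    ET.SubElement(meta, "keywords").text = ",".join(keywords)
    ET.SubElement(record, "payload").text = payload
    return ET.tostring(record, encoding="unicode")

# A hypothetical FBCB2 position report, now discoverable by keyword search.
xml_doc = tag_for_discovery("track-0042", "FBCB2",
                            ["position", "blue-force"], "38.89N 77.03W")
print(xml_doc)
```

Once every record carries metadata of this kind, an enterprise discovery service can index the tags without understanding each producer's payload format.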

Defense Department planners are striving to create the data environment that will permit this new kind of data fusion and exploitation. “We are hoping to enable a very agile collaborative environment,” says Michael Krieger, director of information management in the office of the Defense Department deputy chief information officer. “If you really manage to achieve visibility of your data assets, then that means the guy at the edge who needs to know something really quick can use an enterprise discovery service to find the data he is looking for really quick. If you’ve really made the data accessible, then when he finds it he can actually get to it. If you’ve made it understandable, then not only can he get to it, but also he can bring it in and do something with it.”

With current defense knowledge management, two data-centric systems may be designed to exchange information. However, the warfighter may want to move the information to a third location, which may require yet another interface. A better solution is to build systems from conception so that information is visible and accessible. This approach, in which information is the key, would solve the problem for legacy systems as well as for programs in the acquisition pipeline.

“The excitement of what we are doing is that we can go back and, without many additional resources, take legacy systems and have them share data network-centrically,” Krieger offers.

The department is counting on its Network-Centric Data Strategy to sort out many of the issues regarding knowledge management. After the strategy was implemented, a data sharing directive—8320.2—was signed by then-deputy secretary of defense Paul Wolfowitz this past December. That directive established the policy aspect of this knowledge management effort.

The main problem with Defense Department knowledge is that users may know that a certain type of data or information exists, but they may not know where or how to obtain it. Past solutions have involved engineering information exchange requirements so that data could be shared between two rear nodes. However, this did not always ensure that a user could get the data when it was needed.

The key to effective knowledge management, Krieger offers, is to make the data visible so that someone who needs it can discover it, to make it accessible to an authorized user and to make it understandable for a user to be able to exploit it.

Making data available across the Defense Department requires leading all of the department's organizations in a concerted effort, states Anthony Simon of the Information Management directorate. Simon, who led the development of the DOD Network-Centric Data Strategy, explains that this drive is complicated by factors such as the variety of types of data and issues of authorized users. An unanticipated user must be able to discover hitherto unfamiliar information, determine whether it will help in decision making and pull that information for use in the decision process.

Krieger relates that earlier Defense Department attempts at data management aimed to establish a single data model for the entire department. The commercial sector tried that same approach only to discover that it did not work. Instead, the private sector focused on establishing data formats and exchanges within business units.

The Defense Department has chosen to create communities of interest, or COIs, to decide these issues. These COIs will determine the data elements and their titles as well as which data to exchange. This data schema will be imposed on the particular COI. Ultimately, the different COIs will have to determine how to share data across their boundaries as well.

The difficult task is to stand up the COIs and to establish the data models, Krieger continues. A couple of pilot COIs are working toward those goals, but that clearly is the challenge for the department, he declares. “We now have a ‘buy’ on the strategy; we have a directive out; and the three services have said that they agree with the strategy and the directive. Now we have to start standing up the communities of interest,” he states.

One reason this is proving difficult is that a platform-centric program of record may be a member of several COIs as that platform may provide information to those COIs. The platform-centric organizations must stand up communities that exchange that data, and this effort must go through other department processes. A COI inherently is joint and faces similar funding and logistics hurdles.

From an information technology perspective, the department is divided into four mission areas. The chairman of the Joint Chiefs of Staff heads the warfighting area; the undersecretary of defense for acquisition, technology and logistics leads the business area; the chief information officer leads the enterprise information environment; and the undersecretary of defense for intelligence leads the Defense Department’s portion of the intelligence mission area. These mission area leads, along with the services, are the focal point of efforts to choose the COIs in their mission areas.

Achieving a proper balance may be challenging. Krieger allows that having too many COIs will hurt the department, especially by requiring too many translators across horizontal lines. However, having too few COIs also will be counterproductive.

Simon offers that the COIs should come together naturally rather than having the secretary designate them. Some already are standing up: intelligence, surveillance and reconnaissance; network operations; space situational awareness; and Blue Force Tracking are among the self-forming COIs beginning to discuss how to exchange data.

A solution may be at hand. Defense officials have been heartened by a recent proof-of-concept effort conducted by The MITRE Corporation and involving the U.S. Army, Navy and Air Force. Researchers employed two Army systems that cannot communicate with each other, the Enhanced Position Location Reporting System (EPLRS)-based Blue Force Tracking system and the L-band Force XXI Battle Command Brigade and Below (FBCB2) system, along with the Movement Tracking System and the Air Force's Cursor on Target system for putting tracks on Link 16. Using Web services technology and a laptop computer, the researchers separated the FBCB2 application from Blue Force Tracking data according to an established schema. An extensible markup language (XML) wrapper exposed the discovery metadata to a portal that updated every 30 seconds.

This permitted a user to go to that portal and view all of the data producers on a single page. The user could sign up for whichever producers were desired, or for whichever geographic area was of interest.
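The subscription step described above can be sketched as a simple filter: each producer advertises its reported position, and a subscriber pulls only those inside a geographic area of interest. The producer names and coordinates below are invented for illustration.

```python
# Each producer advertises a position; a subscriber pulls only those
# inside its geographic bounding box. All names and values are invented.
producers = [
    {"name": "FBCB2-alpha",   "lat": 33.3, "lon": 44.4},
    {"name": "MTS-bravo",     "lat": 34.1, "lon": 45.0},
    {"name": "EPLRS-charlie", "lat": 36.7, "lon": 42.2},
]

def subscribe(producers, lat_min, lat_max, lon_min, lon_max):
    """Return the producers whose reported position falls inside the box."""
    return [p["name"] for p in producers
            if lat_min <= p["lat"] <= lat_max
            and lon_min <= p["lon"] <= lon_max]

print(subscribe(producers, 33.0, 35.0, 44.0, 46.0))  # ['FBCB2-alpha', 'MTS-bravo']
```

A real portal would evaluate such a filter against the discovery metadata each time the feed refreshes, rather than against a static list.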

But the key result was that this user could view the data on his or her own legacy system without changing any of the display representation. In effect, a user who never had employed Blue Force Tracking could nonetheless view its data using the familiar symbology of that user’s legacy system. Users of different systems would see different icons representing the same data.
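That per-system rendering amounts to a lookup from a shared data element to each display's own icon. A minimal sketch follows, with invented icon codes standing in for real military symbology.

```python
# The same track rendered with each legacy system's own symbology.
# The icon codes below are invented placeholders, not real symbols.
ICONS = {
    ("friendly", "FBCB2"):  "blue-rect",
    ("friendly", "Link16"): "blue-circle",
    ("hostile",  "FBCB2"):  "red-diamond",
    ("hostile",  "Link16"): "red-triangle",
}

def render(track, system):
    """Map one shared track to the icon a given legacy system expects."""
    icon = ICONS[(track["affiliation"], system)]
    return f"{icon}@{track['lat']},{track['lon']}"

track = {"affiliation": "friendly", "lat": 34.1, "lon": 45.0}
print(render(track, "FBCB2"))   # blue-rect@34.1,45.0
print(render(track, "Link16"))  # blue-circle@34.1,45.0
```

The data itself never changes; only the final presentation step differs per system, which is what lets a legacy display show unfamiliar sources in familiar form.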

“You’re offering a service that people can subscribe to,” Krieger continues. “Using existing communications, I can share Blue Force Tracking data with people who could not get it previously. The Air Force now can subscribe to the Blue Force Tracking system and push up to Link 16 the friendlies within the radius of the pilot’s area of interest so that now he can see not only the red—which we already could get up there through Cursor on Target—but also where the friendlies are.

“So for us, the Blue Force Tracking demonstration was a proof of concept that the data strategy worked,” he concludes. MITRE used open-source software and legacy investments over a span of six months. “To date, it is our best reference implementation [that] the technology actually exists to enable doing what we said.”

If this concept is to advance, the government must perform an operational assessment on it, Krieger offers. Issues would include information security requirements, communications bandwidth and whether the system would work in a combat environment. “It appears from this effort that it does not take a huge level of effort—people or resources—to modify legacy systems,” he says. “If that all works, then there is your metric for the level of effort required to apply this to legacy systems.”

Simon extends this achievement further. “Once the department can position itself well to do portfolio management, then the portfolio manager can look under his portfolio and see how to change that portfolio to be more network-centric,” he says. “Some systems may fall out—they’re not worth it either financially or requirement-wise. [But] other systems or capabilities will be needed.”

However, one element of that success also may portend the difficulties that lie ahead for Defense Department knowledge management efforts. Krieger relates that the experts setting up that proof of concept spent more time debating the schema and the portal's 12 fields than actually implementing the demonstration. "We expect the same thing from the communities of interest," he predicts. Nonetheless, splitting the effort among the different COIs presents a far less daunting task than trying to develop a single departmentwide schema, Simon observes.

Resources remain an issue for setting up the COIs. The information management directorate is taking the approach that the money to achieve the goals is in the programs that are the producers and consumers. A well-constructed COI with a strong lead will be able to influence programs of record effectively, Krieger offers.

When a COI agrees on a data schema, that will enable the building of correlators or fusing engines that allow incorporating other data. Engineers then can write separate user interfaces that present the data according to user needs and systems.
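A correlator of this kind can be sketched as a routine that merges reports referring to the same entity, keeping the freshest position while noting which sources contributed. The entity names and report format below are invented for illustration.

```python
def correlate(reports):
    """Fuse reports about the same entity: freshest position wins,
    and the set of contributing sources is accumulated."""
    fused = {}
    for r in sorted(reports, key=lambda r: r["time"]):
        entry = fused.setdefault(r["entity"], {"sources": set()})
        entry["sources"].add(r["source"])
        entry["position"] = r["position"]  # later reports overwrite earlier
    return fused

reports = [
    {"entity": "unit-7", "source": "BFT", "time": 1, "position": (33.3, 44.4)},
    {"entity": "unit-7", "source": "MTS", "time": 5, "position": (33.4, 44.5)},
    {"entity": "unit-9", "source": "CoT", "time": 2, "position": (36.0, 43.0)},
]
fused = correlate(reports)
print(fused["unit-7"]["position"])  # (33.4, 44.5), the freshest report
```

Separate user-interface layers would then consume the fused result, each presenting it in its own system's terms.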

As always, security is a high-priority issue. The department developed the Defense Department Discovery Metadata Specification, or DDMS, which tells the department how to tag its information. In that tagging-for-discovery schema resides a security layer that determines level of classification, releasability and other attributes. This ensures that the data is tagged correctly for discovery by users. The information assurance architecture developed last year by the National Security Agency and the COI office for the Global Information Grid serves as the road map for integrating security, Krieger adds, but this will require a lot of work.
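The effect of such a security layer can be sketched as a discovery filter that honors classification and releasability tags. The tag names, levels and records below are illustrative only and do not reflect the actual DDMS schema.

```python
# Toy security layer: each record carries classification and releasability
# tags, and a discovery query returns only what the requester may see.
# Levels and records are invented for illustration.
LEVELS = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

records = [
    {"id": "r1", "classification": "UNCLASSIFIED", "releasable_to": {"USA", "GBR"}},
    {"id": "r2", "classification": "SECRET",       "releasable_to": {"USA"}},
    {"id": "r3", "classification": "TOP SECRET",   "releasable_to": {"USA"}},
]

def discover(records, clearance, nation):
    """Return only the records the requester is cleared and authorized to see."""
    return [r["id"] for r in records
            if LEVELS[r["classification"]] <= LEVELS[clearance]
            and nation in r["releasable_to"]]

print(discover(records, "SECRET", "USA"))  # ['r1', 'r2']
print(discover(records, "SECRET", "GBR"))  # ['r1']
```

Because the attributes travel with the discovery metadata, the filtering can happen at the enterprise discovery service rather than inside every producing system.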

Among the technical challenges is how to bind discovery metadata to the object securely. This metadata is in a database and is referenced to the object, but now planners must be able to bind it securely so that only authorized individuals can change it. “We have a vision; we are trying to harness the technology; we must continually identify the risk and how we are going to manage it,” Krieger states. The ongoing paradigm shift from need-to-know to need-to-share will require a clear understanding of the risks that will be introduced and how they can be mitigated, he adds.
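One common technique for such binding, offered here purely as an illustration rather than as the department's chosen approach, is a keyed hash computed over the object and its metadata together, so that any unauthorized change to either invalidates the tag.

```python
import hashlib
import hmac
import json

# Illustrative only: a keyed hash binds metadata to its object so that
# tampering with either is detectable. The key and field names are invented.
KEY = b"authority-held-secret"

def bind(obj: bytes, metadata: dict) -> str:
    """Compute a tag over the object and a canonical form of its metadata."""
    canonical = json.dumps(metadata, sort_keys=True).encode() + obj
    return hmac.new(KEY, canonical, hashlib.sha256).hexdigest()

def verify(obj: bytes, metadata: dict, tag: str) -> bool:
    """Check, in constant time, that neither object nor metadata changed."""
    return hmac.compare_digest(bind(obj, metadata), tag)

obj = b"track payload"
meta = {"classification": "SECRET", "producer": "FBCB2"}
tag = bind(obj, meta)
print(verify(obj, meta, tag))            # True
meta["classification"] = "UNCLASSIFIED"  # unauthorized downgrade attempt
print(verify(obj, meta, tag))            # False
```

Only holders of the key can produce a valid tag, which is one way to ensure that only authorized individuals can change the metadata without detection.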

Industry should turn away from “selling us another interface” and focus on providing solutions that eliminate the need for interface-type solutions, Simon offers. One vital item that could be provided by even a small business is a way of automating data tagging according to the guidelines of the DDMS. That would be a “hot seller in the Department of Defense,” he adds.

Krieger suggests that another key private sector contribution would be the development of a solution for binding metadata securely to an object. Simon notes that industry already has contributed to the development of a strategy.


Web Resources
Defense Department Network-Centric Data Strategy:
Defense Department Data Sharing Directive 8320.2:
Defense Department Discovery Metadata Specification (DDMS):
DDMS Schema Information:
COI Directory: