Testing and Evaluation Nurture Valuable Simulation Expertise
Information technology, modeling and security experience coalesce to produce networked interactive systems.
Engineers are applying information technology practices to create complex simulations that go beyond the reach of existing single systems. Extensive simulations that model large-scale military operations can be generated by networking established systems to produce data in near real time.
Much of the expertise to develop these networked simulations can be found in other related fields. Testing and evaluation, for example, establishes performance parameters for individual components as well as entire platforms. Computerized wargaming systems contribute information on human actions. Hardware trainers such as flight simulators can provide baseline data on hardware performance.
All this data can be plugged into a model to improve its fidelity. Adding command, control and communications to the mix enables experts to cook up networked simulations that are far-reaching in their coverage.
A German company has applied expertise in this range of disciplines to produce extensive networked models. IABG mbH, based in Ottobrunn, is able to link proprietary and foreign models into an extensive multifaceted simulation network.
Thomas Kreitmair, IABG vice president for ground-based air defense, explains that this networking is not designed to add participants to a simulation, but instead to add functions and complexity. It also aids in making full use of existing capabilities without extensive software rewrites that reinvent the wheel.
Much of this builds on company experience and ongoing projects in other fields. More than 20 German brigade, division and corps staff exercises have used company models for computer-aided exercises. Its command and control expertise includes requirements analysis, system specification and risk analysis, and testing. And, in information security, company history includes identifying emerging information technology risks and helping to establish German information security standards.
Kreitmair explains that the classical approach to developing simulation tools focuses on specific force models that reflect particular battle levels for individual services. This approach basically involves a hierarchy of models. At the lowest level, designated level 0, the system process level, component data is aggregated for the system level, and simulation networking takes place within this level. This includes sensor data, which is displayed in real time.
Level I, the system level, features individual weapons, communications or intelligence systems, Kreitmair relates. Models are implemented and networked at the system level, and data from this level is partly real time. System processes are aggregated at the battle system level. Level II, the combined systems level, features simulation networking using the standardized distributed interactive simulation (DIS) and high-level architecture (HLA) protocols. Various systems are combined at the systems level and at the combined systems level. For example, an infrared air defense system might be teamed with its radar-based counterpart.
At the highest level, designated level III, all defense forces work together to conduct a military campaign. This level constitutes an aggregate-level simulation protocol with high-level architecture connectivity to the systems level, and the simulation systems of diverse forces are networked.
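To make the hierarchy concrete, the sketch below models it as levels 0 through III, with each model aggregating results from the level beneath it. This is purely illustrative; the class names, fields and example systems are hypothetical and do not represent any actual IABG software.

```python
# A minimal sketch of the four-level model hierarchy described above.
# All names are illustrative only, not an actual IABG software interface.
from dataclasses import dataclass, field
from enum import IntEnum


class SimLevel(IntEnum):
    SYSTEM_PROCESS = 0    # level 0: sensors, seekers, signal processing
    SYSTEM = 1            # level I: individual weapon, communications or intelligence systems
    COMBINED_SYSTEMS = 2  # level II: systems networked via DIS/HLA
    CAMPAIGN = 3          # level III: joint forces conducting a campaign


@dataclass
class Model:
    name: str
    level: SimLevel
    protocol: str                                        # e.g. "native", "DIS", "HLA", "ALSP"
    feeds: list["Model"] = field(default_factory=list)   # lower-level models it aggregates

    def aggregate(self) -> dict:
        """Roll lower-level results up into this model's level."""
        return {m.name: m.level.name for m in self.feeds}


# Example wiring: a seeker model feeds a missile system model, which in
# turn is combined with a radar-based system at level II.
seeker = Model("IR seeker", SimLevel.SYSTEM_PROCESS, "native")
missile = Model("SHORAD missile", SimLevel.SYSTEM, "DIS", feeds=[seeker])
combined = Model("Mixed IR/radar air defense", SimLevel.COMBINED_SYSTEMS, "HLA", feeds=[missile])
print(combined.aggregate())
```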
In this configuration, the length of time for data presentation becomes an issue, with simulation experts seeking as close to real-time display as possible. “You want to know within 20 minutes what the outcome of an air combat could be,” Kreitmair explains. At this level, where sensor simulations produce results in real time, simulating weapon systems may require several iterations to obtain necessary information. Combined systems are more of a challenge, especially if simulators are trying to analyze the outcome of a multiday war.
To work in this type of system, the company developed in-house models and modified others from outside partners. One example of this approach is its use of the extended air defense testbed, known as EADTB. Begun in the late 1980s, EADTB originally operated under the U.S. Strategic Defense Initiative Organization. It features universal weapon system simulation; air, land and sea warfare; and command and control with an emphasis on extended air defense. Its software development cost about $200 million, and it can run at only a few sites because of its complexity. A smaller version, the extended air defense simulation, is far more widespread, with about 180 users in the United States. European EADTB users include the North Atlantic Treaty Organization (NATO) Consultation, Command and Control (C3) Agency in The Hague, the U.K. Defence Evaluation and Research Agency, and IABG.
Kreitmair states that the company’s specialty is the model of command and control air defense, or MOCCA. This enables simulating various elements of a coordinated air defense comprising many different sensors, weapons and forces. These can include ground-based air defense systems, command and control, air surveillance and ground-controlled interception. The command and control process can incorporate satellite early warning data, for example, to be processed and routed to the appropriate recipients for generation of fire control orders. This information would be disseminated to the weapon systems to execute engagements, to be followed by evaluations and, where necessary, re-engagements.
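The engagement cycle described here, with warning data routed into fire control orders, engagements executed, results evaluated and surviving threats re-engaged, can be sketched as a simple loop. The sketch below is only an illustration; the functions, kill probabilities and data structures are hypothetical and are not MOCCA's actual interfaces.

```python
# Illustrative sketch of the command and control cycle described in the
# article: route warning data, issue fire control orders, engage, evaluate,
# and re-engage surviving tracks. Names and values are hypothetical.
from dataclasses import dataclass
import random


@dataclass
class Track:
    track_id: int
    threat: str
    destroyed: bool = False


def route_warning(tracks, weapon_units):
    """Assign each incoming track to a weapon unit (round robin for the sketch)."""
    return [(t, weapon_units[i % len(weapon_units)]) for i, t in enumerate(tracks)]


def engage(track, unit, p_kill=0.7):
    """Execute a fire control order; the outcome is random for illustration."""
    track.destroyed = random.random() < p_kill
    return track.destroyed


def c2_cycle(tracks, weapon_units, max_rounds=3):
    """Engage, evaluate, and re-engage surviving tracks up to max_rounds times."""
    for _ in range(max_rounds):
        remaining = [t for t in tracks if not t.destroyed]
        if not remaining:
            break
        for track, unit in route_warning(remaining, weapon_units):
            engage(track, unit)
    return [t for t in tracks if not t.destroyed]


tracks = [Track(i, "cruise missile") for i in range(5)]
leakers = c2_cycle(tracks, ["SAM battery A", "SAM battery B"])
print(f"{len(leakers)} tracks survived after re-engagements")
```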
A challenge facing networked simulation implementers is to move information from the system level (I) to the combined systems level (II). Previously, Kreitmair relates, many operators ran hundreds of runs on the system process, aggregated the data and presented an average as the normal result. Now, using the DIS and HLA standards, the company can combine hundreds of in-house simulations for a networked result.
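The earlier practice Kreitmair describes, running a system model hundreds of times and reporting the average, amounts to a Monte Carlo estimate. A minimal sketch follows, using a stand-in engagement model rather than any of the company's simulations.

```python
# Sketch of the older approach: run a stochastic system model many times and
# report the average as the representative result. The "engagement" model
# below is a stand-in, not one of the company's models.
import random


def single_run(p_kill=0.65, shots=4):
    """One stochastic run of a simple shoot-look-shoot engagement."""
    return sum(random.random() < p_kill for _ in range(shots))


def monte_carlo(runs=500):
    results = [single_run() for _ in range(runs)]
    return sum(results) / len(results)


print(f"average kills over 500 runs: {monte_carlo():.2f}")
```

Under the networked approach, individual runs of this kind would instead exchange state with other federates over the DIS or HLA standards rather than being collapsed into a single average.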
Kreitmair describes early efforts using this approach as a tedious process, especially as many simulations did not meet the two simulation standards. As more simulations met those standards, it became easier to combine the necessary sets.
This approach also helps maintain a level of confidentiality in a networked simulation. If a user does not want to reveal technical details such as particular performance data or techniques, communication with the outside world during the simulation can be performed using one of the standards. This way, the system performance can be part of the simulation without an outsider knowing precisely what is in the system. For example, sensitive U.S. weapon systems can be part of a NATO simulation without revealing the weapons’ characteristics to others.
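One way to picture this confidentiality arrangement is a federate that publishes only standardized state updates while keeping its performance parameters internal. The sketch below assumes a simplified stand-in for a DIS/HLA entity-state message; the class names and parameter values are hypothetical.

```python
# Sketch of the confidentiality idea: a federate publishes only standardized
# state updates, while its sensitive performance parameters stay internal.
# The message format is a simplified stand-in, not the actual DIS/HLA encoding.
from dataclasses import dataclass


@dataclass
class EntityState:          # what the outside simulation sees
    entity_id: int
    position: tuple
    velocity: tuple
    engaged: bool


class SensitiveWeaponFederate:
    def __init__(self, entity_id, position):
        self.entity_id = entity_id
        self.position = position
        self._max_range_km = 120.0       # proprietary value, never published
        self._p_kill = 0.82              # proprietary value, never published

    def step(self, target_range_km):
        engaged = target_range_km <= self._max_range_km
        # Only the standardized, non-sensitive state leaves the federate.
        return EntityState(self.entity_id, self.position, (0.0, 0.0), engaged)


federate = SensitiveWeaponFederate(42, (48.05, 11.66))
print(federate.step(target_range_km=95.0))
```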
The company has set up a model network of four major simulations. The first is the man-in-the-loop simulation, or MILSIM. This aviation simulation features two pilot stations with screens, controls and instrument panels; a high-fidelity aircraft model; a high-fidelity missile model for short- or medium-range missiles; and detailed onboard sensor simulation. The second is the air combat simulation SILKA. It comprises a closed-loop air combat simulation; threat evaluation; pilot logic; sensor models; and an aircraft model. The last two are MOCCA and EADTB.
MILSIM originally was used to pit two fighter aircraft pilots against one another to evaluate tactics and missile performance, for example. Combining these capabilities with the other networked systems allows the two pilots to fly in formation against any number of simulated targets. A large air campaign could feature two leading escort pilots controlled by live humans in real time. They could be placed in a four-ship formation, and all other aircraft in the simulation would react to their actions. The air campaign itself would be modeled by MOCCA or EADTB.
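Conceptually, the four models behave as federates attached to a common run-time infrastructure that advances them in step and shares their published state. The sketch below is a bare-bones illustration of that wiring; the bus and step interfaces are hypothetical stand-ins, not the actual interfaces of MILSIM, SILKA, MOCCA or EADTB.

```python
# Illustrative wiring of the four-model network as federates on a shared bus.
# The bus and step interface are hypothetical stand-ins for an HLA run-time
# infrastructure, not actual product APIs.
class Federate:
    def __init__(self, name):
        self.name = name

    def step(self, shared_state):
        # A real federate would advance its own models here and publish
        # updated entity states; the sketch just records participation.
        shared_state.setdefault("participants", []).append(self.name)


class SimulationBus:
    def __init__(self, federates):
        self.federates = federates
        self.state = {}

    def tick(self):
        self.state["participants"] = []
        for federate in self.federates:
            federate.step(self.state)
        return self.state


bus = SimulationBus([Federate(n) for n in ("MILSIM", "SILKA", "MOCCA", "EADTB")])
print(bus.tick())
```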
Kreitmair notes that this generates a highly detailed model, but the detail limits its real-time performance. Sharing the load increases its effectiveness, as exemplified by having MOCCA supply the majority of the “pilots” in a simulation featuring only two live participants.
In addition to air campaign analysis, the company is engaging in similar work for ground battle simulations. The firm also is using the hierarchy inherent in this networked simulation for the whole spectrum of evaluation. At lower levels, this is being applied to signal-to-noise ratio calculations, seeker head evaluations and guidance system assessment, for example. At the next level, it is evaluating weapons systems, including aggregations of systems. Higher levels include mixed ground and air campaigns, while the highest levels test theater war scenarios.
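As an example of the lowest-level calculations mentioned here, the signal-to-noise ratio for a radar sensor follows from the classical radar range equation. The sketch below uses textbook physics with arbitrary parameter values; it is not drawn from any specific system or IABG model.

```python
# Standard radar-equation signal-to-noise estimate, as an example of the
# level 0 calculations mentioned above. Parameter values are arbitrary and
# not drawn from any specific system.
import math

BOLTZMANN = 1.380649e-23  # J/K


def radar_snr(pt_w, gain, wavelength_m, rcs_m2, range_m,
              noise_temp_k=290.0, bandwidth_hz=1e6, losses=1.0):
    """Single-pulse SNR from the classical radar range equation."""
    signal = pt_w * gain**2 * wavelength_m**2 * rcs_m2
    noise = (4 * math.pi) ** 3 * range_m**4 * BOLTZMANN * noise_temp_k * bandwidth_hz * losses
    return signal / noise


snr = radar_snr(pt_w=1e6, gain=10**3.5, wavelength_m=0.03, rcs_m2=1.0, range_m=50e3)
print(f"SNR: {10 * math.log10(snr):.1f} dB")
```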
The company is working on joint programs with the NATO C3 Agency. This involves integrating model families from different nations into an interoperable simulation.
The tools employed in this approach already have been used in command and control training exercises. The military is more likely to adapt to these types of tools as they become more familiar, Kreitmair says. In the future, they may become decision support tools implemented inside the simulation.
Some elements have problems generating results in real time, Kreitmair allows. Many simulations involve disciplines, such as air defense, that cover a wide arena and carry a lot of overhead. Some aspects of this discipline can be omitted from the full simulation exercise without affecting the mission. The company can tap its in-house stock of simulations for one that suits an exercise without taxing the system, Kreitmair states.
Simulation quality is driven by the available data, he observes. The company has access to a wide range of data from outside sources as long as it observes sensitivity issues and protects proprietary interests. An in-house department based in the former Eastern bloc analyzes data that was unavailable a decade ago. When verified, this data is inserted into placeholders in existing simulations.
Data also can be collected from experimental programs. The X-31 experimental test aircraft, a joint venture between the United States and Germany that evaluated advances such as thrust vectoring, was tested in MILSIM before its actual test flights. This data could be applied to future simulations that feature this type of aircraft or component performance.