
Advanced Surveillance Spawns New Challenges

The boom in battlespace surveillance and reconnaissance applications has triggered a search for new technologies that could both help and hinder network-centric warfighters. Many revolutionary sensor systems in the laboratory pipeline offer the potential of widening the supremacy gap that the U.S. military owns over potential adversaries. However, using them effectively will require new data fusion techniques, advanced security measures, enhanced training and education, and greater bandwidth capacities.

 

The introduction of new surveillance and reconnaissance systems will require new ways of collecting, processing and disseminating various forms of data. The distributed common ground system, or DCGS, is designed to fuse imagery, tactical data, signals intelligence and open source information.

As sensors increase in number and variety, so do system complexities.


According to many government and industry experts, the top priority for the U.S. intelligence, surveillance and reconnaissance community is to achieve persistent surveillance, and it remains one of their biggest challenges. Had the enabling technologies been pursued when the concept came to the fore in the early 1990s, the goal might have been achieved by now, these experts say. That commitment was not made, however, and current attempts to fulfill the concept are lagging.

Budgetary constraints are partly to blame, but so are past policy decisions. Some intelligence officials were reluctant to assign vital assets to battlefield surveillance that might focus largely on battle damage assessment. Since then, the community has embraced the concept for its more complex missions.

The proliferation of intelligence platforms over the past few years has improved surveillance and reconnaissance, but true persistent surveillance remains elusive. Unmanned aerial vehicles (UAVs), for example, can stare at a particular point, but they cannot scan an area for changes in the manner the concept prescribes (SIGNAL, May 2002, page 17).

Funding must be increased to meet this goal, intelligence experts say. Space-based radar is high on everyone’s list, and that will be fairly expensive. Funding must be secured for other elements as well. “It’s a question of priorities,” one expert offers.

Another key thrust is how to attack the hard over-the-horizon target. This will require a yet-undetermined breakthrough sensor or collection approach for finding and defeating underground targets. Related needs include obtaining more insight into leadership intentions and focusing more on stateless targets that are not fixed. This will require sensors that help provide an understanding of enemy concepts of operations.

Several factors complicate the tasks facing surveillance and reconnaissance planners. Chief among them is the proliferation over the past few years of platforms from which data is produced. That growth began in the mid-1990s, after some intelligence community planners sought to cut platforms and collection systems while others argued for continuing to collect information and store it for future use. The 1991 Gulf War strengthened the latter view: many of the target folders assembled for that conflict featured data from sources as old as KH-7 satellites, which argued for maintaining, and actually expanding, the number of data sources.

Now, collection platforms comprise constellations of classified and commercial remote sensing satellites; land, sea and air collectors specializing in image intelligence (IMINT), signals intelligence (SIGINT) and measurement and signature intelligence (MASINT); and open source information. Collection by air-breathing aircraft, which used to be confined to U-2 and SR-71 missions, now also is performed by UAVs such as Global Hawk and Predator.

All of these platforms complicate the effort by the intelligence community to process data into information that can be disseminated to the customer in a useful form. Members of the intelligence community and their private sector contractors are working to deal with this challenge in the intelligence chain, government and industry officials declare.

Two main goals must be addressed: culling the information properly and closing the time gap between collection and dissemination. Two problems stand in the way. The first is the sheer volume of data, which is growing exponentially. The second is a long-standing intelligence problem: customers do not know what is available beyond traditional requirements and information.

Both of these problems are affected by work underway to develop the next generation of sensors and collection systems. Guy DuBois, vice president for information management and dissemination systems, Raytheon, Reston, Virginia, offers that any architecture that can accommodate these new sensors should be open and scalable. This architecture also must be platform independent, he emphasizes. This would ensure that surveillance and reconnaissance sensors would be effective regardless of their chosen land, sea or air platform.
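
As a rough illustration of what such an open, scalable and platform-independent architecture implies, the following Python sketch shows a common sensor contract that a ground system could code against. The interface, class and field names are hypothetical and are not drawn from any fielded program.

# Minimal sketch of a platform-independent sensor interface.
# All names here (Sensor, Observation, and the example sensors) are
# hypothetical illustrations, not part of any fielded architecture.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Observation:
    sensor_id: str
    platform: str           # "land", "sea", "air" or "space"
    collected_at: datetime
    payload: dict            # sensor-specific data inside a common wrapper


class Sensor(ABC):
    """Common contract every sensor plug-in must satisfy."""

    def __init__(self, sensor_id: str, platform: str):
        self.sensor_id = sensor_id
        self.platform = platform

    @abstractmethod
    def collect(self) -> Observation:
        """Return one observation in the shared format."""


class ImagerySensor(Sensor):
    def collect(self) -> Observation:
        return Observation(self.sensor_id, self.platform,
                           datetime.now(timezone.utc),
                           {"type": "IMINT", "resolution_m": 0.5})


class SignalsSensor(Sensor):
    def collect(self) -> Observation:
        return Observation(self.sensor_id, self.platform,
                           datetime.now(timezone.utc),
                           {"type": "SIGINT", "band": "UHF"})


# The consuming system sees only the Sensor interface, so a new sensor
# type or a new host platform requires no change to this loop.
sensors = [ImagerySensor("u2-eo-1", "air"), SignalsSensor("ship-es-3", "sea")]
for obs in (s.collect() for s in sensors):
    print(obs.payload["type"], "from", obs.platform)

Because processing depends only on the shared interface, adding a sensor type or moving one to a different platform would not disturb the consuming system, which is the essence of the openness DuBois describes.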

 

An Atlas IIAS rocket prepares to launch a National Reconnaissance Office (NRO) payload. The NRO is examining a host of new sensing technologies that would be placed in space.

These new platforms are proving to be much more flexible in accommodating different types of sensors. Legacy platforms such as the U-2 now can be equipped with a host of plug-and-play sensor systems. Newer platforms such as the Global Hawk are being considered for an array of various sensors. Topping these are the advanced sensor systems that the National Reconnaissance Office is looking at for deployment in space. These efforts all contribute to the proliferation of different types of sensors.

Maintaining data integrity throughout processing and dissemination is another challenge. In addition to knowing what the data says, analysts often want a metric on its accuracy. This metric would describe the accuracy of a point, its date and its source. In effect, it would establish the bona fides of a piece of information. The result, if achieved, would be sufficient user confidence in the information to act on it.
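
A minimal sketch of what such an integrity metric might look like in practice appears below, written in Python. The field names, thresholds and the crude confidence test are assumptions made purely for illustration.

# Hedged sketch of a per-datum integrity record: an accuracy figure,
# a date and a source attached to each point. Field names and the
# confidence rule are illustrative assumptions only.
from dataclasses import dataclass
from datetime import date


@dataclass
class IntegrityTag:
    accuracy_m: float       # estimated positional accuracy of the point
    observed_on: date       # when the datum was collected
    source: str             # collecting discipline, e.g. "IMINT"

    def confident_enough(self, needed_accuracy_m: float, max_age_days: int) -> bool:
        """Is the datum accurate and fresh enough to act on?"""
        age_days = (date.today() - self.observed_on).days
        return self.accuracy_m <= needed_accuracy_m and age_days <= max_age_days


tag = IntegrityTag(accuracy_m=3.0, observed_on=date.today(), source="IMINT")
print(tag.confident_enough(needed_accuracy_m=5.0, max_age_days=30))   # True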

Raytheon’s DuBois notes that his company is working on a program for the National Geospatial-Intelligence Agency (NGA) known as the geospatial intelligence database integration program, or GIDI. It entails consolidating the agency’s geospatial data into a single database that permits experts to trace the data’s integrity back across the tools that handled it.

This approach has potential drawbacks, however. Establishing data integrity by attaching source information runs the risk of revealing sources to an enemy. This risk increases as the information is passed further down the command line. A foreign intelligence service could determine the sources and methods of producing that information. This problem could be especially acute with SIGINT, for example.

One potential solution would be to apply a tearsheet principle to digitized information. In this approach, information and its risk-laden integrity data would be highly detailed only above a certain command level. Below that level, those at-risk details would be omitted. However, this might not be applicable for all types of information, particularly non-orbit clandestine SIGINT.
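
The tearsheet idea reduces to a simple filtering rule, as the following Python sketch suggests. The echelon names, the release floor and the list of sensitive fields are invented for illustration and reflect no actual doctrine.

# Sketch of the tearsheet principle: below a chosen command echelon the
# source- and method-revealing fields are stripped before dissemination.
# Echelon names, the release floor and the field names are assumptions.
ECHELONS = ["company", "battalion", "brigade", "division", "corps", "national"]
RELEASE_FLOOR = "division"          # full detail only at this level and above
SENSITIVE_FIELDS = {"source", "collection_method", "accuracy_m"}


def tearsheet(report: dict, echelon: str) -> dict:
    """Return the report, omitting at-risk provenance below the release floor."""
    if ECHELONS.index(echelon) >= ECHELONS.index(RELEASE_FLOOR):
        return dict(report)                        # full integrity data retained
    return {k: v for k, v in report.items() if k not in SENSITIVE_FIELDS}


report = {"target": "bridge 14", "status": "damaged",
          "source": "clandestine SIGINT", "collection_method": "intercept",
          "accuracy_m": 3.0}
print(tearsheet(report, "corps"))     # source and method included
print(tearsheet(report, "company"))   # provenance omitted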

Developing new sensors and systems for surveillance and reconnaissance must overcome other hurdles as well. One challenge is the difference in characteristics between high-end and low-end users. The high-end user, the president and the national command authority, has different requirements than the warfighter running a joint task force.

A related challenge involves presenting this new intelligence information composed of data from multiple, diverse sources. Analysts or other experts may be required to educate both the high-end and the low-end user on just what they are seeing in their intelligence product. This goes beyond the procedures of the days when a photoanalyst would point out and identify specific targets in aerial imagery. The new intelligence picture will provide a variety of facts that may be as diverse as the sources that generate it. Intelligence experts note that humans tend to be literal both in expression and comprehension, and the sensors currently under development are less literal than their predecessors, even though their products contain much greater quantities of content.

The commercial imagery market already is offering an example with its hyperspectral products. These products have a large amount of information, but they often must be interpreted using an expensive workstation operated by an experienced analyst. This type of information could become less useful as it moves up the chain of command.

On the military side, radar technology has improved far beyond traditional scan displays. The radar sensor equipping a U-2, for example, can produce imagery revealing a broad range of information—to a trained analyst. An inexperienced eye might not be able to discern even a fraction of the information visible to the expert.

Having all this diverse sensor data increases the need for effective data fusion. One priority item is to develop the ability to fuse a wealth of information into one box. This removes the need for analysts to have a number of different boxes at their workstations or vehicles. After data fusion, dissemination of large amounts of information becomes paramount.

DuBois notes that the distributed common ground system (DCGS) program fuses imagery, tactical data, SIGINT and open source information to provide a comprehensive picture of the battlefield to the tactical user. For dissemination, DuBois cites the global broadcast system as an example of an approach to providing necessary bandwidth to move the common operating picture, intelligence information and weather data to locations lacking a fiber infrastructure.
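
The following toy example, again in Python, illustrates the general idea of merging reports from several intelligence disciplines into a single picture keyed by location. It is purely notional and does not represent the DCGS design or its data formats.

# Illustrative only: a toy fusion step that merges multisource reports
# into one entry per grid reference. Grid values and report contents
# are invented for the example.
from collections import defaultdict

reports = [
    {"grid": "38SMB1234", "discipline": "IMINT",  "detail": "vehicle column"},
    {"grid": "38SMB1234", "discipline": "SIGINT", "detail": "VHF activity"},
    {"grid": "38SMB9876", "discipline": "OSINT",  "detail": "press report of convoy"},
]

picture = defaultdict(list)            # one fused entry per grid reference
for r in reports:
    picture[r["grid"]].append(f'{r["discipline"]}: {r["detail"]}')

for grid, entries in picture.items():
    print(grid, "->", "; ".join(entries))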

Achieving communications on the move presents its own challenge to disseminating surveillance and reconnaissance information. Currently, networks have little difficulty moving vital information down to corps and division. However, pushing that information below division becomes difficult as those elements tend to be actively mobile. One approach currently under development seeks to install a global broadcast system receive suite in a high-mobility multipurpose wheeled vehicle.

Improved communications on the move will affect training as well. A brigade will receive one picture, a battalion another and a company a third. This problem may be largely doctrinal rather than technological, as military planners must establish how information is to be selected for each level.

The private sector is far ahead of the military in this type of technology application, DuBois says. A military version of wireless fidelity (Wi-Fi) could lead the way down this path. In turn, this would lead to further developments in crypto technologies to support Wi-Fi use on the battlefield.

Text messaging could be applied directly to situational awareness capabilities. A soldier or air traffic controller on the battlefield could punch “target destroyed” into a handheld unit, which in turn would lead to changes in an air tasking order.
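
A notional sketch of that message flow, in Python, appears below. The message format, target identifiers and air tasking order fields are invented for illustration only.

# Hypothetical sketch: a short status message from a handheld updates the
# planning data so an air tasking order (ATO) line can be retasked.
ato = {"ATO-0415": {"target_id": "TGT-220", "status": "scheduled"}}
targets = {"TGT-220": {"destroyed": False}}


def handle_status_message(message: str) -> None:
    """Parse a message such as 'TGT-220 target destroyed' and flag affected ATO lines."""
    target_id, *words = message.split()
    if " ".join(words) == "target destroyed":
        targets[target_id]["destroyed"] = True
        for tasking in ato.values():
            if tasking["target_id"] == target_id and tasking["status"] == "scheduled":
                tasking["status"] = "retask"      # free the sortie for another target


handle_status_message("TGT-220 target destroyed")
print(ato)   # {'ATO-0415': {'target_id': 'TGT-220', 'status': 'retask'}}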

Even if these technological challenges are met, they may exacerbate an already worsening situation: coalition interoperability. The U.S. military is far ahead of even its best allies in network-centric warfare, and that technology and capability gap is widening. During the Iraq War, U.S. forces encountered bandwidth difficulties when sending data-rich imagery intelligence to their British counterparts (SIGNAL, September 2003, page 33). The bandwidth gap is likely to grow as new sensor types and systems are introduced into the battlespace.

Some industry officials have adopted a newly coined adage that “information is the only thing outrunning Moore’s Law.” In practical terms, this means that bandwidth availability never will catch up to bandwidth requirements, so experts must plan accordingly.
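
A back-of-the-envelope calculation, sketched below in Python, shows why. Every figure is an illustrative assumption rather than a measured system parameter, but the arithmetic captures the planning problem.

# Rough bandwidth check; all numbers are illustrative assumptions.
image_size_mbits = 800.0        # one compressed wide-area imagery product
images_per_hour = 20            # desired refresh rate for the area of interest
link_capacity_mbps = 1.5        # assumed share of a broadcast channel

required_mbps = image_size_mbits * images_per_hour / 3600
print(f"required ~{required_mbps:.2f} Mb/s vs available {link_capacity_mbps} Mb/s")

When the required rate exceeds the available rate, as it does here, planners must thin, compress or prioritize products rather than wait for the links to catch up.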