U.S. Recoups Nighttime Primacy

June 2006
By Clarence A. Robinson Jr.

 
Short wave infrared (SWIR) sensor technology is a key advantage in U.S. forces regaining combat supremacy at night. The Multispectral Adaptive Networked Tactical Imaging System (MANTIS) night vision helmet operates with cameras in three spectral bands, a digital camera, an image fusion capability, inertial navigation and a global positioning system receiver.
Fused imagery enables every squad’s soldiers to share digital sensor input simultaneously.

Fundamental advances in U.S. night vision technologies are unfolding rapidly. These sweeping military developments already are being demonstrated successfully. Called visual collaboration, the sharing of real-time image, graphics and information from soldier to soldier is enabled by exploiting new sensor and digital network technologies.

America’s once-commanding night vision advantage is evaporating quickly as Internet night vision retail Web sites ship image intensifier night vision goggles for $489 to anyone with sufficient cash, including Islamic terrorists.

The U.S. military no longer owns the night but shares it with the enemy. However, all of that is about to change, according to Jeffrey L. Paul. With the help of his work, the night vision technology pendulum is swinging back to again favor dominance by the U.S. military in combat operations after dark. He is the Defense Advanced Research Projects Agency (DARPA) manager of two key programs. One is the Multispectral Adaptive Networked Tactical Imaging System, or MANTIS, and the other is Networked Embedded Systems Technology, or NEST.

The MANTIS program harnesses a new spectral region of short wave infrared (SWIR) technology, along with pioneering adaptive image fusion in three spectral bands. In addition to SWIR, the bands are visible/near infrared (V/NIR) and long wave infrared (LWIR), Paul reveals. New processor-enabled innovations also offer tactical capabilities that help turn night into day, enabling precision target handoff in the darkest conditions with a point-click-kill capability, he asserts. MANTIS-related advances also make every soldier a real-time node in a network.

“We have made a breakthrough in SWIR technology, and this is the first step in developing MANTIS to go after a new spectral region. We also have moved into digital technology for imaging within the whole spectral region,” Paul states. A new all-digital helmet-mounted sensor suite operates in three bandwidths for a dramatic night vision achievement.

Paul explains that V/NIR, which operates in the 0.4- to 1-micron range, provides the most literal imagery in high light, covers the night vision goggle spectral range and provides color cues. “This is where the human eye works in the visible spectrum. V/NIR sees ambient starlight that falls on a scene.

“The SWIR sensor operates in the 1- to 2-micron range, providing low light performance, a primary image and scene context with the ability to see through fog,” Paul continues. “The LWIR camera operates in the 8- to 12-micron range, and as a thermal imager needs no light; it penetrates smoke and dust and can find partially hidden targets. All of these bandwidths can be digitally imaged. Once that occurs, we can do whatever we want with the imagery in real time, including fusing it to use that one best image to present to the soldier. That is a function of the processor and software.”
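The three bands Paul describes can be summarized in a small sketch. This is purely illustrative; the wavelength ranges are as quoted in the article, and the role notes paraphrase his descriptions, but the data structure and helper function are hypothetical, not part of MANTIS.

```python
# Summary of the three MANTIS spectral bands as described in the article.
# Ranges are in microns; the "role" notes paraphrase Paul's descriptions.
MANTIS_BANDS = {
    "V/NIR": {"range_um": (0.4, 1.0),
              "role": "literal imagery in high light, color cues, goggle coverage"},
    "SWIR":  {"range_um": (1.0, 2.0),
              "role": "low-light primary image, scene context, sees through fog"},
    "LWIR":  {"range_um": (8.0, 12.0),
              "role": "thermal imaging, needs no light, penetrates smoke and dust"},
}

def band_for_wavelength(um):
    """Return which MANTIS band covers a given wavelength (in microns), if any."""
    for name, info in MANTIS_BANDS.items():
        lo, hi = info["range_um"]
        if lo <= um <= hi:
            return name
    return None
```

Note the gap between 2 and 8 microns: the mid-wave infrared region falls outside the three bands the system images.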

More than just sensors, MANTIS is an advanced high-speed, low-power integrated processor—a $37 million development program with prime contractor Raytheon Missile Systems, Tucson, Arizona. The development team also includes Raytheon Vision Systems, Santa Barbara, California, for the sensor suite; Sarnoff Corporation, Princeton, New Jersey, for the processor, software, algorithms and fusion; and Rockwell Collins Optronics, Carlsbad, California, which is integrating and building the system, Paul relates.

“This program leverages the Army Night Vision Laboratory’s expertise and technology investments to regain the night and exploit the network,” Paul emphasizes (SIGNAL Magazine, April 2006). “This is what MANTIS is all about. Once we are in the digital realm and have a soldier-to-soldier communications capability, we fully exploit the technology. Not only will that individual soldier see a fused image, but he simultaneously shares that image with his buddies in real time—visual collaboration between soldiers.”

As a program manager, Paul functions within DARPA’s Information Exploitation Office. He previously served as the acting director for sensor systems in the Office of the Deputy Under Secretary of Defense for Science and Technology. He also participated in the Land Warrior and Thermal Weapon Sight programs and managed the U.S. Defense Department-sponsored neural network technology for target recognition applications. A research physicist, Paul earlier spent 18 years at the Night Vision Laboratory (NVL), Fort Belvoir, Virginia.

MANTIS is designed to display instantly on the visor of each soldier’s helmet imagery from that squad, so that each person sees what every other person sees. “We also have a TiVo-like record and playback capability so that the last 10 seconds can be called up and played again. Digital information and high-speed processors handle these functions and connect them over the network to enable image sharing,” Paul maintains. “MANTIS also uses inertial navigation and global positioning system receivers so that each soldier will precisely know his location and the processor will know where he is looking at all times, his fields of vision and of fire.”
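The TiVo-like replay Paul describes amounts to a rolling buffer of recent frames. A minimal sketch, assuming a fixed-length buffer sized for 10 seconds of 30-hertz video (the class and its interface are illustrative, not the actual MANTIS design):

```python
from collections import deque

class FrameBuffer:
    """Rolling record/playback buffer: keeps only the most recent frames,
    so the last 10 seconds can be called up and played again."""

    def __init__(self, fps=30, seconds=10):
        # deque with maxlen silently discards the oldest frame
        # once the buffer holds fps * seconds frames
        self.frames = deque(maxlen=fps * seconds)

    def record(self, frame):
        self.frames.append(frame)

    def playback(self):
        """Return buffered frames, oldest first, for instant replay."""
        return list(self.frames)
```

At 30 frames per second, the buffer holds 300 frames; each new frame pushes the oldest one out, so memory stays constant no matter how long the system runs.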

Automatically, MANTIS hands off range information and a target’s exact coordinates from the processor to loitering weapons for the point, click and kill capability, Paul allows. “This feature not only provides eyes on the target but also recognizes the network by plugging each soldier into it for situational awareness and true lethality.

“The first thing we had to do in developing MANTIS was to advance SWIR state-of-the-art technology. Prior to this program, the available cameras were big, heavy and required a lot of power. There was no way to mount them on a soldier’s head. To achieve SWIR imagery required performance at lower power with higher resolution and in a smaller size,” Paul recounts. The SWIR development’s motivation was to overcome natural drawbacks in night lighting conditions. The 0.4- to 1-micron V/NIR regime is fine when there is sufficient ambient light. However, when it is really dark, with no moonlight and starlight blocked by overcast, the goggles are very noisy with snow and scintillation, he adds.

“Mother nature provides an advantage in the SWIR bandwidth that we have known about for several years, but until now we were unable to exploit it,” Paul notes. “Both the NVL and Office of Naval Research exploited SWIR technology so that the light level from the moon and stars no longer matters. Regardless, you will have good SWIR energy flowing on the target. With a camera you can see this, and we built that camera in a potentially head-mountable small package. We are very excited with the obvious advantages of SWIR, and NVL concurs this is a true breakthrough in night vision technology.”

Before moving on to multispectral development, it was necessary first to perfect SWIR technology in a head-mounted system, Paul claims. This technology was demonstrated successfully about a year ago in the 1- to 2-micron band. In the next multispectral developmental phase, other sensors were added with two channels of SWIR to provide scene context. LWIR, a well-developed technology, was incorporated as the thermal imager for day or night applications to penetrate fog, smoke and dust.

A commercially available charge coupled device V/NIR camera was included to provide color in twilight conditions and for additional context. The V/NIR device also allows soldiers to see aiming lights or to detect any type of light used by an enemy, Paul comments.

The SWIR is the primary imager for MANTIS scene context; however, the other sensors also provide color and thermal images. The processor takes care of this simultaneously and automatically, selecting the best image for display. The soldier does not have to be concerned; he sees only the fused image output as the system measures and determines range and monitors contrast, Paul continues. “The best possible output is displayed in real time at a 30-hertz video rate as very-high-speed processing takes place. The result is better than any one of the three spectral images.”

In the second phase of the program, about a year ago, helmet-mounted hardware and a multisensor testbed with displays emerged along with a bench-top processor. The multivision processor is still larger than required for the helmet but is being scaled down. “Each soldier will transmit and receive imagery from every other soldier in a squad during phase three of the program. But first, we had to prove we could do this using digital technology in real time at the 30-hertz video rate,” Paul offers. “The processor and fusion technologies are really the payoff for MANTIS.”

Paul declares that the real-time fusion capability provides adaptive imagery for the system. That capability was demonstrated to SIGNAL via video at DARPA. In this technique, the processor determines how much SWIR or LWIR to fuse into the image at any point. The image is always rapidly changing—at every frame the processor determines which is the best image to display. This determination is based on contrast and edges, as the processor optimizes the image continuously. The algorithm breaks down the image and builds it back up very quickly. “During this process, it determines whether the SWIR looks better than the thermal, as an example. The key to the fusion is to present the best possible image,” he says.
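The per-frame selection Paul describes can be sketched in toy form. The article indicates the actual algorithm decomposes and rebuilds each image pyramid-style; the simplified version below merely keeps, per pixel, the sample from whichever band shows stronger local contrast against its horizontal neighbors, which captures the contrast-and-edges idea without the pyramid machinery. All names here are hypothetical.

```python
def local_contrast(row, i):
    """Toy contrast measure: absolute differences against the
    horizontal neighbors of pixel i (clamped at the row edges)."""
    left = row[max(i - 1, 0)]
    right = row[min(i + 1, len(row) - 1)]
    return abs(row[i] - left) + abs(row[i] - right)

def fuse_rows(swir_row, lwir_row):
    """Per-pixel adaptive fusion sketch: for each position, keep the
    sample from the band with the stronger local contrast, favoring
    SWIR on ties since it is the primary imager."""
    fused = []
    for i in range(len(swir_row)):
        if local_contrast(swir_row, i) >= local_contrast(lwir_row, i):
            fused.append(swir_row[i])
        else:
            fused.append(lwir_row[i])
    return fused
```

Run on a row where SWIR carries an edge and LWIR is flat, the fused output preserves the SWIR edge; where the thermal band carries the detail, the thermal samples win instead. The real system makes such decisions at every frame, 30 times a second, across the full multiresolution decomposition.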

The record and playback capability of the MANTIS processor allows a soldier to drop down and immediately play back an image to verify a possible target he may have seen, Paul clarifies. The image also can be played in slow motion, fast forward or in a zoom mode. In parallel, that same image goes to other nearby soldiers in real time so that they can collaborate and share MANTIS images. These scenes also can be sent up the chain of command and included in a database for later use if necessary. “All the while, the processor in the helmet keeps track of where that soldier is located and where he is looking within a 10-mile radius, sending the information to his buddies or to a weapons system.”

The processor and sensors make the soldier a network node with intelligence, surveillance and reconnaissance capability—“eyes on the target, up close and personal, connected to the network for scouting, patrolling or engaging in combat,” Paul points out. “There are tradeoffs with MANTIS—whether to use a display for one eye or for two eyes in stereo viewing. One eye now sees very high resolution in the center of a scene, and the other sees a wide 70-degree field of view. This system offers dichoptic or binocular fusion that appears seamless to a soldier.”

Now in the two-year third phase of the program, plans already are being made to transition MANTIS through the NVL to the Soldier Program Executive Office and the Future Force Warrior effort. Two prototype helmets are being built for this phase and for soldier evaluation in the field. “This is a very well-funded and focused effort by the Army. The plan is to transition the technology we believe revolutionary in the most expeditious manner possible. MANTIS will make every soldier a moving sensor node to view things and communicate in real time around the battlefield,” Paul comments.

Dr. Jon Leonard explains that development of the SWIR sensor involves a large format with 1,280 x 1,024 pixels. The detector’s material is indium gallium arsenide from Sensors Unlimited, Princeton, New Jersey. Leonard is the deputy director, Raytheon Advanced Technology directorate, and has nearly 40 years of experience in industry with defense electronics and commercial products. He holds a doctorate in mathematics and a bachelor of science degree in physics from the University of Arizona. He earned a master of science degree in aerospace engineering from the University of California, Los Angeles.

“What makes the detector material work is tiny indium bumps at the back of every pixel,” he explains. “That pixel is bumped onto the back of a readout integrated circuit. With 1,280 x 1,024 contacts, there are more than 1 million contact points with the readout circuit. This technical approach occurs right at the sensor. Incoming photons hit the detector material, which shoots out electrons that are captured through the indium bump into the readout circuit. This circuit stores up electrons and reads them out. While involving complicated circuitry, the detector provides an ultra low noise readout of the detected photons.”

There are many ambient photons that previous systems could not detect, but nature provides an incredible opportunity to see things in the dark, Leonard remarks. As the program progressed, fusion technology was integrated with the sensors in a helmet-mounted testbed, including five night vision cameras in various spectral bands shared between the user’s eyes.

“Using advanced software from Sarnoff, we warped those camera images so that they appear to be coming into the human eye from the same point—we fuse them to take the best information from any camera and present it to the eye. Or, we can average several to get the best images that way,” Leonard adds. “You are fusing for each eye three different spectral bands with V/NIR, SWIR and LWIR so that both eyes see virtual images. Seeing the world through the cameras provides an enormous advantage that has not previously existed. MANTIS can operate with displays covering both eyes in a completely virtual function, or [the soldier can] lift the display off one eye. All the while, the imagery is transmitted to every man in the squad.”

The key to MANTIS-to-MANTIS communications is in digitizing the information. There also is an extremely powerful computer in the helmet that enables the information to be processed and made available to the eye. “It is one thing to have a camera but another thing entirely to see it in such a way that the human brain knows that it looks right. What you want is really good images that don’t look peculiar or confuse the brain. MANTIS images come to the brain in a natural way, and we are very close to reality with this system.”

The Army will soon have soldiers wear and evaluate prototypes to determine the helmets’ ability to exchange data. This evaluation also will provide user feedback to the program. Meanwhile, development continues for an extremely fast applications-specific integrated circuit with MANTIS algorithms embedded. Other military applications for the technology are spinning off for handheld devices with different lenses that also can precisely locate military targets.

 

Web Resources
Defense Advanced Research Projects Agency, Defense Sciences Office: www.darpa.mil/dso/solicitations/solicit.htm
Raytheon Missile Systems: www.raytheon.com/businesses/rms/index.html
Raytheon Vision Systems: www.rayjobs.com/index.cfm?NavID=49
Sarnoff Corporation: www.sarnoff.com
Rockwell Collins Optronics: www.rockwellcollins.com/optronics/Contact/page510.html