Multispectral Camera System to Provide Soldiers With Enhanced Night Vision
Small cameras use smartphone technology to fuse data into high-resolution color images.
A prototype sensor technology under development will enable soldiers to identify threats more rapidly in low-light environments and to share target images with other squad members. Consisting of several types of small multispectral cameras, the system will use smartphone technology in the form of a warfighter’s handheld mobile device to process and fuse the camera data into high-resolution color images for the soldier’s helmet display. That display imagery can then be transmitted wirelessly to other soldiers.
Night vision and vision enhancement systems have been used by U.S. and allied forces for decades. But while they provide troops with a critical tactical advantage in the dark, there are tradeoffs. Such systems can be heavy, awkward to use and generate monochrome imagery that makes it difficult for warfighters to detect enemy combatants under certain conditions.
The goal of the Defense Advanced Research Projects Agency’s (DARPA) Pixel Network for Dynamic Visualization (PIXNET) program is to provide individual soldiers and squads with increased situational awareness through an improved night and low-light vision system. To achieve this, the program is setting out to develop very small multispectral cameras that will clip onto helmets and weapons. Sensitive to a range of spectral bands, from visible and near infrared to mid-wave infrared and thermal, the cameras will transmit data wirelessly to the soldier’s smartphone, which uses its processors to automatically fuse the data into a high-resolution color image. This capability, when combined with specifically designed software applications loaded onto soldiers’ smartphones, will allow warfighters to better pick out targets in combat and other operational situations, says PIXNET’s program manager, Dr. Nibir Dhar, who works in DARPA’s Microsystems Technology Office.
An important part of the program is developing the small cameras, Dhar says. PIXNET is working on three types of cameras—two helmet-mounted types and a third model that is clipped directly onto a soldier’s weapon. The weapon-mounted camera will pick up thermal wavelengths, he adds.
Besides the cameras, Android-based smartphones are at the heart of the effort. One of the key drivers behind the program is to harness the processing power of the handheld mobile devices the Army and Marine Corps want to issue to all of their front-line warfighters, Dhar says. For PIXNET, the smartphones will process and combine the cameras’ multispectral images and then wirelessly network those images between the mobile device and the soldier’s helmet display. This same wireless networking process also may allow soldiers viewing a target to share the image with their squad mates.
While PIXNET will work on inter-squad image sharing, Dhar notes that the program will not develop its own peer-to-peer data sharing processes to push images out to others. Other DARPA programs already are focusing on using soldiers’ smartphones as servers in squad-level networks, he says.
Although PIXNET began issuing solicitations in the fall of 2012, it did not officially launch until the summer of 2013, Dhar says. The program will be 45 months long. The first four months of the effort will focus on testing some of the core technological concepts behind the new multispectral cameras. Following this phase, the first brass-board prototypes should be completed within 24 to 30 months after launch, he explains. At the end of the 45 months, the prototype cameras should be ready for field tests, but he adds that at least one of the cameras may be developed enough to undergo testing as early as 28 to 30 months into the program.
Three companies are working with DARPA to develop cameras: Raytheon, DRS Technologies and UTC Aerospace Systems. Raytheon and UTC are developing the helmet-mounted versions, while DRS is working on the weapon-mounted camera. Of the three firms, Dhar notes that Raytheon is currently the furthest along in developing its camera.
The two helmet-mounted cameras cover different wavelength combinations. Like the weapon-mounted camera, they will be able to share data with each other via the personal Wi-Fi network supported by the wearer’s smartphone.
Current night vision and vision-enhancement technology uses individual spectral bands and overlays, Dhar says. This means that warfighters need different pieces of individual equipment to see targets under differing circumstances, which can be cumbersome or impractical. PIXNET’s cameras will collect images across four spectral bands: visible light, near infrared, mid-wave infrared and thermal. The system then combines those images into a single color image. Viewing such high-resolution, fused multispectral color imagery is something troops cannot do now, he explains.
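To illustrate the kind of fusion the article describes, the sketch below maps three normalized spectral bands onto the channels of one color image. The band names and the specific channel mapping are illustrative assumptions for this example, not the PIXNET algorithm, whose fusion method is not described in detail here.

```python
import numpy as np

def normalize(band):
    """Scale a single-band image to the 0-1 range."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo) if hi > lo else np.zeros_like(band)

def fuse_bands(visible, near_ir, thermal):
    """Map three normalized spectral bands onto RGB channels.

    A simple hypothetical scheme: thermal drives red (so hot spots
    stand out), visible drives green, near infrared drives blue.
    """
    rgb = np.stack([normalize(thermal),
                    normalize(visible),
                    normalize(near_ir)], axis=-1)
    return (rgb * 255).astype(np.uint8)

# Tiny synthetic 2x2 scene: one "hot" pixel in the thermal band.
visible = np.array([[0.2, 0.4], [0.6, 0.8]])
near_ir = np.array([[0.1, 0.1], [0.9, 0.9]])
thermal = np.array([[300.0, 300.0], [300.0, 320.0]])

fused = fuse_bands(visible, near_ir, thermal)
print(fused.shape)     # (2, 2, 3)
print(fused[1, 1, 0])  # 255 -> the hottest pixel maps to full red
```

A real fusion pipeline would weight and register the bands far more carefully; the point is only that per-band data reduces to a single color frame a soldier can read at a glance.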
Mixing and overlaying images in different wavelengths offers a number of advantages over single-wavelength devices. For example, Dhar notes that while thermal imagers can detect the temperature of the surface of a building’s windows, they can’t detect anything behind the glass. But shortwave infrared imagers can see through glass, he adds.
Another problem is size and power use. Most thermal imaging systems are large, intended for use on vehicles or on fixed sensor or weapon systems. One of the major goals of the program is to develop new types of low-cost, low-power electro-optic camera technologies. According to DARPA, PIXNET is pushing for breakthroughs in aperture design, focal plane arrays, electronics, packaging and materials science. By these criteria, Dhar says, program success will be measured by significant reductions in camera size, weight, power and cost while maximizing overall functionality.
Military infrared systems, particularly those worn by individual troops, use long-wave infrared, which operates at wavelengths around 10 microns, Dhar says. Mid-wave infrared offers similar advantages, but because it operates at shorter wavelengths, in the 3- to 5-micron range, it provides better resolution. The micron, one-millionth of a meter, is the standard unit for measuring wavelength in this part of the electromagnetic spectrum. Many vehicles use mid-wave infrared imagers, but the goal of PIXNET is to provide soldiers with this high-resolution capability. “Mid-wave technology in the soldiers’ hands would be a significant advantage,” he says.
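The resolution advantage of mid-wave over long-wave infrared follows from basic diffraction: for the same aperture, angular resolution improves as wavelength shrinks (Rayleigh criterion, theta = 1.22 * lambda / D). The quick calculation below makes the comparison concrete; the 25 mm aperture is an illustrative assumption, not a PIXNET specification.

```python
def rayleigh_limit_urad(wavelength_um, aperture_mm):
    """Diffraction-limited angular resolution in microradians,
    from the Rayleigh criterion: theta = 1.22 * lambda / D."""
    wavelength_m = wavelength_um * 1e-6
    aperture_m = aperture_mm * 1e-3
    return 1.22 * wavelength_m / aperture_m * 1e6  # radians -> microradians

# Assumed 25 mm aperture for both imagers (illustrative only).
lwir = rayleigh_limit_urad(10.0, 25.0)  # long-wave IR, ~10 micron wavelength
mwir = rayleigh_limit_urad(4.0, 25.0)   # mid-wave IR, within the 3-5 micron band

print(f"LWIR limit: {lwir:.0f} microradians")   # 488
print(f"MWIR limit: {mwir:.0f} microradians")   # 195
print(f"MWIR resolves {lwir / mwir:.1f}x finer detail")  # 2.5x
```

With identical optics, a 4-micron mid-wave imager resolves roughly 2.5 times finer angular detail than a 10-micron long-wave one, which is the advantage Dhar refers to.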
By using an Android-based smartphone to process and fuse camera images, the program also seeks to incorporate the open operating system as a platform for developing applications to support the system. For example, Dhar speculates that it will be possible for a soldier or service agency to write an application that highlights thermal and infrared hot spots so people stand out better against a background image. He notes that the firms participating in the program are only writing applications directly relevant to supporting their prototype cameras.
The ability to create new applications offers a variety of possibilities for future capabilities, Dhar observes. But while there is considerable potential to develop new applications to support a range of functions, they have not been developed yet and probably will not be until the technology is more mature. “It is best to wait. Then the apps will come,” he offers.
Any applications written for smartphones supporting PIXNET will have to be able to migrate to new devices as the military upgrades or replaces them. This is why the program chose an Android-based system, as well as to comply with the Army’s move to issue mobile devices that run on the open operating system, Dhar says. While the program has not chosen a specific platform, the key part of it will be its ability to run Android, he adds.
Supporting an Android-based system also potentially allows PIXNET to cooperate or ultimately interoperate with other DARPA programs or Army efforts such as Nett Warrior (see page 40). At this early stage of the program, Dhar notes, exactly how it may interact with other programs has not yet been defined, but he is confident that as the program develops and smartphones become more powerful, new and more effective capabilities will be created to support PIXNET.
In addition to sharing and fusing sensor data, the cameras also will provide specific data for soldiers. For example, the weapon-mounted camera will be able to display imagery directly onto a soldier’s monocle or goggle-viewers. This would allow warfighters to shoot from the hip or around corners without exposing themselves to enemy fire, sighting their weapons remotely, Dhar says.
When the system is mature, soldiers will be able to control and modify the PIXNET cameras to suit their needs. These capabilities may include selecting from a variety of applications and tools written for the system or setting the cameras to operate in specific ways to support a mission, Dhar speculates. The final phase of the program will focus on developing affordable and efficient processes to produce the cameras in bulk.