
Navy Researchers Target, Virtually

A scientist at the Naval Research Laboratory engages a virtual reality simulation in the laboratory’s VirtuSphere system. The user can walk, run, dart to one side or reverse direction physically as if he were moving in the real world.
Warfighter simulations employ varying technologies to emphasize different skills.

Scientists in the U.S. Navy’s Office of Naval Research are developing new virtual reality simulations that address the needs of land-based warriors such as U.S. Marines. These simulations seek to reproduce the motions and scenarios of battlefield conditions as faithfully as possible.

The researchers are combining conventional game systems with specialty technologies to generate virtual reality simulations. Many of the commercial technologies had to be adapted for greater military realism and more natural human motion. Several systems have been developed, each emphasizing a different skill set or scenario. Ultimately, these varying types of simulations may be networked into a single virtual reality for a full range of warfighter ground scenarios.

Many of these efforts are underway in the Virtual Technologies and Environments program, or VIRTE, run by the Office of Naval Research, Washington, D.C. This program has been operating for more than five years, and it has more than a year remaining before evaluation of its systems is concluded.

Roy Stripling of the Naval Research Laboratory’s (NRL’s) Information Technology Division explains that the program is experimenting with a number of systems for particular tasks. No single system may be right for every kind of training task, so the answer may be to incorporate a range of systems operated separately or in tandem.

“We range from very-low-cost systems to more advanced tracking systems and immersive systems,” adds Stripling, who is a neuroscientist at the NRL’s Naval Center for Applied Research in Artificial Intelligence. “We can look at all these systems and try to get a sense of what kinds of tasks you need to spend extra money on versus using computers that are already available in most places.”

Jim Templeman, a computer engineer who heads the Immersive Simulation Laboratory in the Naval Center for Applied Research in Artificial Intelligence, explains that many different systems are interfaced to the same simulation environment. This permits planners to mix and match simulations. Some users could be sitting at desktop simulations while others are immersed in virtual reality systems that physically replicate field activities.

One system that seeks to replicate the user’s physical activities with high fidelity is the VirtuSphere. Resembling a giant hamster ball, the plastic sphere stands about 10 feet tall and has a full 360-degree range of motion. A user enters the sphere wearing a backpack and a helmet-mounted display and carrying a simulated rifle. The backpack carries batteries and a Bluetooth wireless link to the simulation computer.

Both the head-mounted display and the rifle are tracked using inertial sensors that transmit head and hand motion to the computer controlling the virtual environment. As the user turns his or her head, the computer shifts the virtual display in the headset accordingly. The rifle appears in the scene when the user points it into the field of view.
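
A minimal sketch of how such a head-tracked view update might work, assuming hypothetical yaw and pitch readings from the inertial sensor; the function names and device call are illustrative, not the NRL’s actual code:

import math

def view_direction(yaw_deg: float, pitch_deg: float) -> tuple[float, float, float]:
    """Convert head yaw/pitch (degrees) reported by an inertial sensor
    into a unit gaze vector for re-aiming the virtual camera."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (
        math.cos(pitch) * math.sin(yaw),  # x: left/right
        math.sin(pitch),                  # y: up/down
        math.cos(pitch) * math.cos(yaw),  # z: forward
    )

# Each frame: read the tracker, then re-aim the camera.
# yaw_deg, pitch_deg = inertial_tracker.read()  # hypothetical device call
camera_forward = view_direction(yaw_deg=30.0, pitch_deg=-10.0)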

“It’s a low-technology solution,” says Stripling. “It’s just basically a big plastic sphere sitting atop a bunch of inline skate wheels.”

Not all of the VirtuSphere is so low-technology. The tracking system uses a Doppler ultrasound system featuring two locators to track X and Y directions. In effect, the sphere is a giant trackball with the operator inside.
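
In trackball terms, the two locators report how far the sphere has rolled about each horizontal axis, and multiplying that roll angle by the sphere’s radius gives the distance the walker has covered. A rough sketch of the conversion, with the sensor interface and the 5-foot radius assumed for illustration:

import math

SPHERE_RADIUS_FT = 5.0  # assumed: a roughly 10-foot sphere has a 5-foot radius

def sphere_to_displacement(roll_x_deg: float, roll_y_deg: float) -> tuple[float, float]:
    """Convert sphere roll about the X and Y axes (degrees, as reported by
    the two ultrasound locators) into ground displacement in feet, much as
    a mouse driver converts trackball rotation into cursor motion."""
    dx = math.radians(roll_x_deg) * SPHERE_RADIUS_FT
    dy = math.radians(roll_y_deg) * SPHERE_RADIUS_FT
    return dx, dy

# A stride that rolls the sphere 30 degrees forward covers about 2.6 feet:
print(sphere_to_displacement(0.0, 30.0))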

What sets this system apart from many other soldier simulations is that it accurately represents real motion on the part of a person active on the ground. A user can crawl, walk or run in the sphere in the same way that he or she would move on real terrain. The actual running effort translates into movement by the participant in the virtual world. The user is not limited to moving a joystick or churning out steps on a treadmill to simulate running motion.

Running in a giant sphere does take some getting used to, Stripling admits. A user probably cannot take full advantage of its capabilities on the first try in the VirtuSphere. Practice will improve the balance and coordination necessary for moving effectively while wearing a virtual reality head-mounted display.

This giant sphere exemplifies the program’s use of commercial hardware. The 485-pound prototype system is built by VirtuSphere Corporation, which also developed the tracking system and its proprietary software for the virtual simulation.

The next step in this effort is to link the weapon’s trigger to the simulation. Stripling says that the office is developing the necessary interface for that capability.

The tracking fidelity of this and the office’s other simulations is not so precise that users can bring marksmanship-level fire to bear, Stripling notes. Precise gunfire accuracy has been traded for other realistic approaches.

Moving out of the sphere and into a room environment, another VIRTE effort places the user in a sensor-equipped room while wearing a head-mounted display. This room is rigged with eight optical cameras that track red light-emitting diodes (LEDs) on the user’s head-mounted display, backpack and weapon. Each LED flashes at a different rate so that the computer system can identify it individually. The system, from PhaseSpace, tracks the user’s torso, head and weapon.
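
One way to picture how per-LED flash rates let the computer tell markers apart: sample each detected blob over time and match its blink frequency against a table of known rates. The sketch below is a simplified, hypothetical version of the idea, not PhaseSpace’s actual protocol:

# Hypothetical assignment: each LED gets a unique flash rate in hertz.
MARKER_RATES_HZ = {"head": 60.0, "backpack": 75.0, "weapon": 90.0}

def identify_marker(blink_times: list[float], tolerance_hz: float = 2.0) -> str | None:
    """Estimate a blob's flash rate from the timestamps (seconds) of its
    on-transitions and match it to a known marker, or None if no match."""
    if len(blink_times) < 2:
        return None
    intervals = [b - a for a, b in zip(blink_times, blink_times[1:])]
    rate_hz = 1.0 / (sum(intervals) / len(intervals))
    for name, hz in MARKER_RATES_HZ.items():
        if abs(rate_hz - hz) <= tolerance_hz:
            return name
    return None

# A blob that flashes every 1/75 second is the backpack marker:
print(identify_marker([i / 75.0 for i in range(10)]))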

Mounted on the side of the weapon is a tiny joystick. Lacking a giant trackball in which to run, the user moves this joystick to travel through the virtual reality scene. To turn around or perform some other rotational motion, he or she simply does so physically in the room. The prop rifle is an Airsoft replica with a gamepad controller embedded in it.

Although the user is wearing a head-mounted display, the rifle directs the user’s motion, not the headset. The system features audio as well as visual imagery. This system’s virtual reality environment is the same as the one used for the VirtuSphere, so the two systems could be networked into a single simulation relatively seamlessly.
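
A sketch of how such rifle-directed locomotion might combine the joystick with the tracked weapon heading: the stick supplies speed, the optically tracked rifle supplies direction, and physical turns in the room enter through the tracker automatically. The function and parameter names are assumptions:

import math

def step(pos_x: float, pos_y: float,
         stick_forward: float,  # joystick deflection, -1 to 1
         rifle_yaw_deg: float,  # heading from the optical tracker
         speed_fps: float = 6.0, dt: float = 1 / 60) -> tuple[float, float]:
    """Advance the avatar along the rifle's heading. Turning is never done
    on the stick; the user physically rotates in the room, and the tracked
    rifle yaw changes to match."""
    yaw = math.radians(rifle_yaw_deg)
    dist = stick_forward * speed_fps * dt
    return pos_x + dist * math.sin(yaw), pos_y + dist * math.cos(yaw)

# Full deflection while aiming 90 degrees right moves the avatar along +x:
print(step(0.0, 0.0, stick_forward=1.0, rifle_yaw_deg=90.0))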

The office is collaborating with scientists at Clemson University and the University of Central Florida on furthering this system, Stripling relates. Part of this work compares the simulations to a real-world environment.

While these two systems sacrifice marksmanship for other capabilities, the VIRTE program has a marksmanship training function that accommodates accurate shooting. Its goal is to adapt these marksmanship systems for more complex training. The NRL version is a beam-hit laser detection system built around an actual demilitarized M-16 rifle. The rifle employs a track pad on its side for user motion, and it includes a carbon dioxide-charged recoil system that simulates the kick felt when firing an M-16 with live ammunition.

A laser embedded in the rifle barrel provides marksmanship measurement, and the user views the simulation on a standard liquid crystal display projector. It is a fairly low-cost system, Stripling offers.
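
The scoring mechanic reduces to locating the laser spot on the projected image at the moment of trigger pull and comparing it with the target’s position. A simplified sketch, with the pixel coordinates and ring sizes assumed for illustration:

import math

# Assumed scoring rings, in pixels from the target center on the projected image.
RINGS = [(10, 10), (25, 9), (50, 8), (90, 7)]  # (radius_px, score)

def score_shot(spot_x: float, spot_y: float,
               target_x: float, target_y: float) -> int:
    """Score a detected laser spot against the target center; 0 is a miss."""
    miss_px = math.hypot(spot_x - target_x, spot_y - target_y)
    for radius_px, points in RINGS:
        if miss_px <= radius_px:
            return points
    return 0

print(score_shot(512, 390, 500, 384))  # about 13 pixels off center -> 9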

A more advanced system can be found in the Immersive Simulation Laboratory, which is run by Templeman. His laboratory has conducted extensive research on simulating human locomotion.

He notes that optical tracking systems can track six degrees of freedom, so the position and orientation of both the head and the rifle are tracked. Some lower-end systems track only orientation without knowing spatial location, which affects whether a weapon can be used realistically. Until about three years ago, optical systems could not provide good simulation in real time. Since then, he notes, they largely have come into their own thanks to software and processing improvements.
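
Six degrees of freedom means each tracked object carries a three-dimensional position as well as a three-dimensional orientation; an orientation-only tracker drops the position half. A minimal illustration of the distinction, with made-up field names:

from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """Full six-degree-of-freedom state from an optical tracker."""
    x: float          # position, 3 degrees of freedom
    y: float
    z: float
    roll: float       # orientation, 3 degrees of freedom
    pitch: float
    yaw: float

@dataclass
class Pose3DOF:
    """What a lower-end, orientation-only tracker reports: the weapon's
    aim is known, but not where in the room it is held."""
    roll: float
    pitch: float
    yaw: float

rifle = Pose6DOF(x=1.2, y=1.5, z=0.4, roll=0.0, pitch=-5.0, yaw=30.0)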

“A lot of the computational resources that cost a lot five years ago are the relatively least expensive part of the system these days, especially the graphics,” Templeman observes. The laboratory largely is running regular Windows and Linux operating systems.

One system that Templeman’s team has put together is known as Gaiter. It consists of a sensor and head-mounted display system suspended from above by a large framework that turns with the user. This harness permits the user to walk in place as if he or she were moving in the real world. Passive sensor markers are worn on major body segments such as the lower legs, lower arms, pelvis, back and head, as well as on the rifle. A network of cameras picks up reflections off these markers from light generated by LEDs.

Robert Page, a computer scientist with the Immersive Simulation Laboratory, demonstrates the Gaiter simulation system. Cameras track the reflective passive sensor markers to provide realistic full-body motion.
One element that sets this system apart from others is that the user’s entire body is part of the simulation. Although other systems record motion, they often display it as general head and/or torso movement. This system takes limb motion into account, which has a considerable effect on simulated movement. Templeman notes, for example, that a person accelerating from a walk to a run passes through a transition phase in which both feet are off the ground.

Because this simulation takes place in a room without a giant sphere or even a treadmill, the user alters his or her gait to simulate motion in place. Walking or running in place translates as forward motion, but to move to the side, the user performs a side step in place by swinging a leg out. The user backs up by pushing a foot back and kicks open a door by kicking forward. “It’s the direction of the motion of your legs that allows you to move while you’re addressing targets with your upper body,” Templeman explains.
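
A crude sketch of that mapping: classify which way the swinging foot travels relative to the pelvis and translate the avatar in that direction, leaving the upper body free to address targets. The threshold and coordinate conventions are invented for illustration:

def leg_gesture_direction(foot_dx: float, foot_dy: float,
                          threshold_m: float = 0.15) -> str:
    """Map a tracked foot excursion (meters relative to the pelvis,
    x = right, y = forward) to a locomotion command. Plain stepping in
    place reads as forward motion; a leg swung out to the side reads
    as a side step; a foot pushed back reads as backing up."""
    if abs(foot_dx) < threshold_m and abs(foot_dy) < threshold_m:
        return "forward"  # ordinary walking or running in place
    if abs(foot_dx) >= abs(foot_dy):
        return "right" if foot_dx > 0 else "left"
    return "forward" if foot_dy > 0 else "backward"

print(leg_gesture_direction(0.3, 0.05))   # leg swung out right -> side step
print(leg_gesture_direction(0.0, -0.25))  # foot pushed back -> back up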

The system also permits some physical interaction. In addition to kicking open doors, users can push aside many small obstacles such as tables and room partitions. Or the user can simply kneel, and the simulation will represent it accurately.

Large display screens allow observers to view both the scene as the user experiences it and a third-person perspective that shows the user as a computer-generated figure. A team can monitor all of a user’s body motions, which are recorded in the virtual display.

Templeman relates that the laboratory has been developing Gaiter for about seven years. It began as an experiment in leg tracking, then it was applied to a virtual simulator. Researchers have learned much about how users relate to the step-driven system versus a simulation employing conventional game controls.

Another new system that emerged from Gaiter developments is known as Pointman. The goal is to give users the same kind of control for executing tactical movements that they have in the real world. Instead of the user physically acting out body movements while wearing a virtual reality head-mounted display, this version features simple foot pedals under a desk to generate movement. Rather than moving in place, the user works the pedals to simulate a variety of forward movements.

One key challenge has been to enable manipulation of objects in the virtual world, Templeman offers. This may involve moving or picking up objects, climbing over obstacles or helping another person climb through a window. This traditionally has been unknown territory, and the laboratory has teamed with others to explore solutions.

Researchers also have changed the mapping of game joysticks. The conventional thumb-joystick scheme has been replaced by one that realistically supports tactical movement. This is key to giving a user high-fidelity control of tactical movement.

Templeman relates that the laboratory would like to transition its work into more gamelike venues, but many game interfaces are very simplistic. The characters move too much like vehicles, he observes.

Recreating Warfighter Movement: More Than Just Game Playing

Experts at the Naval Research Laboratory are working to generate true motion in warfighter simulations—not the flawed movement that characterizes most computer games. Jim Templeman, head of the Immersive Simulation Laboratory in the Naval Center for Applied Research in Artificial Intelligence, explains that studying human locomotion has helped researchers develop a different scheme for simulating U.S. Marines moving on foot across terrain and in urban environments.

“Figuring out what the difference is between the virtual and the real is the biggest challenge we have overcome—just coming to terms with what that is and then addressing it,” he says. “Our primary emphasis today has been locomotion.”

The motion strategies in most first-person-shooter games do not match actual infantry tactics. In effect, typical gamepad and mouse-keyboard configurations emphasize strafing motions, with the user moving in whatever direction the gun is pointed.

In real warfare, infantrymen constantly scan for danger as they walk through terrain or urban environments. They hold their rifles in a ready position and always move them with their heads so they can quickly bring them to bear on a target. This “guns and eyeballs” approach keeps the two aligned as the individual moves the upper body as if it were a turret.

A Marine may be moving in a straight line in one direction but scanning side-to-side with his weapon pointing in the direction that he is looking. The problem with replicating that movement and attitude through conventional commercial game controls is that these systems do not follow scanning patterns. They tend to favor an oblique motion in which the head remains in a fixed direction and steering the weapon redirects the course of the participant. So, the user cannot scan without affecting the course of the warfighter. The offset knob on the game control can counter that effect somewhat, but it is difficult to coordinate with the desired motion.

The laboratory’s Pointman system addresses this problem by changing the function of the conventional game controller. Templeman relates how the laboratory’s version empowers the right controller knob to set motion more directly by controlling steering. The left knob provides directional control of the user’s scanning motion. The two knobs are more easily coordinated because they are directional, he adds.
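
A sketch of that remapping on a standard dual-stick gamepad: the right stick steers the course, the left stick independently steers the scan, and because the two are integrated separately, looking around never bends the path of travel. Names and rates are illustrative:

def pointman_update(course_deg: float, scan_deg: float,
                    right_stick_x: float, left_stick_x: float,
                    turn_rate_dps: float = 120.0,
                    dt: float = 1 / 60) -> tuple[float, float]:
    """Decoupled Pointman-style control: the right stick adjusts the
    direction of travel while the left stick swivels the 'human turret'
    (head and weapon) for scanning."""
    course_deg += right_stick_x * turn_rate_dps * dt
    scan_deg += left_stick_x * turn_rate_dps * dt
    return course_deg % 360.0, scan_deg % 360.0

# Scanning hard left for half a second while holding course:
course, scan = 0.0, 0.0
for _ in range(30):
    course, scan = pointman_update(course, scan, right_stick_x=0.0, left_stick_x=-1.0)
print(course, scan)  # course stays 0.0; the scan swings 60 degrees left (300.0)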

So a Marine on virtual patrol would set his direction with one knob and swivel using the other. This human turreting motion would be relatively natural, Templeman says. This approach encourages participants to look around while they are moving instead of when they stop at a destination, which is common behavior in a conventional game.

“Our goal is to make it easier to make the kinds of moves that you should make for tactical [operations] and actually express your motion in a more natural way,” Templeman states.


Web Resources
Naval Research Laboratory Information Technology Division: www.itd.nrl.navy.mil
VirtuSphere Corporation: www.virtusphere.com
PhaseSpace: www.phasespace.com