Post-Terrorist-Attack Triage Teams Rehearse on Virtual-World Platform
Emergency caregivers learn valuable lessons about dangerous, complex situations.
Emergency response personnel are exploring virtual reality as a way to practice dealing with chemical or biological attacks. This combination of medical expertise and technology gives medical teams the opportunity to learn, and to make mistakes, on virtual patients that can simply be rebooted.
In the past, military personnel in combat were the most likely to be exposed to these dangers; however, in recent years as terrorist attacks have increased, new alarms have sounded. Shopping malls, subway stations and airports have been added to the list of possible targets. Although the military must continue to be prepared to respond to these threats, civilian medical professionals, including emergency medical technicians, also must be ready to address casualties that result from exposure to airborne toxins.
Paralleling the evolution of the threat, work on high-fidelity virtual reality (VR) trainers began in the military but is expanding to the civilian sector. In 1998, the Defense Advanced Research Projects Agency sponsored work conducted at Sandia National Laboratories, Albuquerque, New Mexico. A team of scientists from Sandia and other research institutions developed MediSim, a virtual reality system that immersed medics in a world in which they confronted battlefield casualty situations.
Building on this work for the military, the Sandia team designed a new scenario that can be used to help train civilian medical personnel to triage average citizens caught up in a terrorist attack. The biological simulated medical emergency response (BioSimMER) system allows up to 10 trainees to assess and attend to virtual patients that react to a hypothetical environment and respond to treatment.
Wearing virtual reality goggles and donning sensors on their arms, legs and waists, simulation participants begin their training. The simulated scenario is a small airport that has been taken over by terrorists who are holding hostages. The terrorists claim they have released a biological agent but have not indicated what type. After a standoff, the airport is assaulted, and the terrorists set off an explosive device that causes conventional injuries and may disperse more of the unknown agent.
At this point, the rescuers are sent in to begin triage and treatment. They find a variety of injuries. As a result of the assault, the victims have incurred conventional injuries, including head trauma and tension pneumothorax. In addition, hostages have been exposed to a biotoxin; however, emergency personnel do not know what type. For the purposes of this virtual scenario, Sandia researchers chose staphylococcal enterotoxin B (SEB) as the biological agent because it severely incapacitates victims and its symptoms develop quickly, according to Sharon A. Stansfield, project leader for technical development of BioSimMER at Sandia.
Finally, to enhance the realism of the scenario, the system’s developers added a psychological casualty: one victim is catatonic. This case was included because emergency personnel could encounter this type of casualty in the field, yet the observed symptoms may lead them to believe the patient has been physically injured by the assault or the toxin, Stansfield explains.
Upon entering the virtual world, participants can see both the casualties and the other team members and must triage, diagnose and treat each casualty. Visual indicators, such as movements, labored breathing or change in skin color, and vital signs, such as blood pressure, temperature and heart rate, give clues to each victim’s condition. If the rescuer is wrong or the victim is not treated quickly enough, the virtual casualty dies.
The simulation features a medical kit that includes the most likely tools a medical team would carry into a situation. As treatment is administered, the virtual victim reacts in an appropriate manner. For example, if oxygen is cut off or the patient is inappropriately treated, the patient turns blue, Stansfield relates.
Because a biotoxin is present, the first order of business is to conduct decontamination procedures. The participant may be required to place a mask over a patient’s nose and mouth. He or she also may be responsible for placing sensors or other equipment near the victim so that biotoxin dispersal can be monitored.
One critical lesson trainees learn is to protect themselves. “A player who dies a quick cyberdeath will not likely forget the importance of personal protective equipment in the future,” Stansfield says.
BioSimMER consists of several components. Virtual reality interfaces include the VR station, which is the display driver for the users and allows them to control their viewpoint and motion independently within the virtual world. Real-time updates of the view of objects are driven remotely by position trackers worn by the participants. Tracker input modules acquire the position of each user and provide this data to other modules that require it. Each VR station uses the position of the users’ head trackers to update their view of the world.
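The per-frame view update described above can be sketched roughly as follows. This is a deliberately simplified 2-D, yaw-only illustration with invented names, not the actual VR station code: it transforms a world-space object position into the viewer's frame given the head tracker's pose.

```python
import math

# Sketch: each VR station takes its user's head-tracker pose (position plus
# yaw) and re-expresses world-space object positions in that viewer's frame,
# so every participant sees the shared world from an independent viewpoint.
def to_view_frame(obj_xy, head_xy, head_yaw):
    # Translate the object relative to the head position...
    dx, dy = obj_xy[0] - head_xy[0], obj_xy[1] - head_xy[1]
    # ...then rotate by the negative head yaw into the viewer's frame.
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (c * dx - s * dy, s * dx + c * dy)
```

In the real system this runs continuously, driven by the tracker input modules feeding each station.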
The VR multicast allows all simulation and interface modules to share information about the state of objects and users within the simulation environment. It is implemented using Ethernet multicasting of user datagram protocol datagrams on a local area network.
Voice recognition software permits the trainee to request information such as vital signs, speak to the virtual patient or command certain actions.
The Medic Avatar, another component of BioSimMER, allows participants to see each other as full human figures. This is a critical element because position, posture, gesture and body language are all vital components of team coordination and communication, Stansfield explains.
The avatar driver utilizes all four trackers worn by the user to update the position and posture of that user’s graphical body. It combines several techniques, including general-purpose kinematic solutions that are first generated using input from four user-worn sensors. Special heuristics are then applied to selectively reduce the number of actions to those that are reasonable to simulate. Heuristics are based on knowledge of the human body and probable motions of limbs.
The avatar also acts semiautonomously. For example, when a user reaches for and grasps an object, the motion made by the trainee determines the placement of the arm; however, the hand posture is selected automatically based on the object that the participant is trying to grab.
Virtual objects also contain knowledge to aid the user. For example, protective gloves place themselves on the trainee’s hands when they are grasped and the fingers are touched.
The system incorporates dynamic casualty models that manifest the symptoms of injuries. If the trainee does not properly assess and treat these injuries, the victim may die. These models were created using ExGen software developed by the Tekamah Corporation, Rockville, Maryland. The program uses a finite state automaton to model the dynamic state of the patient, the degradation of that state over time, and the changes brought about by the participant’s actions.
The agent transport model is the final element of BioSimMER. This Sandia-developed software originally was created to model the interior movement of nuclear radiation through a facility in case of an accident. It has been extended to model movement of chemical and biological agents within a building. In the BioSimMER system, the software accurately simulates the dosage of SEB for casualties within the fictitious airport and the readings for simulated sensors placed within the facility.
More than 20 emergency medical technicians had the opportunity to train in the prototype BioSimMER last summer at the National Emergency Response and Rescue Training Center, Texas A&M University, College Station, Texas. According to Daniel M. Shawver, senior member of Sandia’s technical staff, participants first viewed an instructional videotape to familiarize themselves with the equipment and the scenario. Once suited up, users spent 10 minutes becoming comfortable with the environment and learning how to manipulate the virtual medical equipment. Next, they each chose a patient, and the training began. After the virtual reality experience, participants filled out a questionnaire to help the system’s developers learn what improvements could be incorporated from the user’s viewpoint.
Trainees contributed some valuable suggestions, Shawver says. For example, they said they were hindered in some instances because the virtual medical toolkit did not include items they would traditionally carry with them in an emergency situation. In addition, the emergency medical technicians said they would like to be able to hear verbal cues or breathing sounds because they often rely on these to make accurate diagnoses.
Despite these shortcomings, Shawver says participants fully immersed themselves in the scenario and were disappointed when they were unable to save a virtual victim.
According to Dr. Annette Sobel, principal member of the technical staff at Sandia and a physician and researcher, the early military prototypes laid much of the technological groundwork for BioSimMER. As one of the key researchers in the military’s program, Sobel helped develop the system to support the training of medics who would be the first on the scene of a battlefield casualty.
As with BioSimMER, after the initial capabilities were developed, users were brought in to evaluate and make suggestions for improvements. “The technology is wonderful and can be very useful, but it can’t just be developed by technicians. It has to be user friendly,” Sobel relates.
After research and prototyping on the technology were complete, the project did not receive funding to continue the work, so Sobel and Stansfield, with the assistance of Sandia, began designing a system that could be used by the civilian sector.
Sobel, who also is a colonel in the U.S. Air National Guard and works in the civil support group that provides weapons-of-mass-destruction support, says the transition into the nonmilitary setting is facilitated by the underlying design of the technology. “It was developed as an open architecture so you can pull in other types of hardware and software. In the urban environment, if a sensor is developed for determining the type of biological agent, the system could just pull in another data stream,” she explains.
According to Stansfield, another benefit of the design is its portability. “The goal is to have a system that would have enough of a scenario for people to train and learn something. We also want to make it affordable so small organizations can use it. We want to import it into a platform that makes it affordable,” she says.
“This would augment training. Right now, training consists of classroom and maybe CDs. This would provide experiential learning. These are not done often now because it’s expensive to do live training. Also, small organizations only have a few people, so they can’t afford to let all of them go to an exercise for training. Chemical and biological training is hard to do because of the dangers. These are the places where virtual reality can augment training.
“Virtual reality can also be used for sustainment training. People are trained in these things, but if they don’t use it, they lose it,” she adds.
Although much of the BioSimMER technology is a direct result of the work done for the military, the system’s growth is also benefiting from a nontraditional source—the video-game industry. “What has helped the development in technology so much is that the gaming people are pushing things to the personal computer level. So, the cost of PC material is down, and the quality of the PC materials is up,” Stansfield says.