Scientists Use Brain Waves to Navigate, Interact With Cybernetic Environments
System translates specific signals, opens doors to future applications for the disabled.
Researchers are testing a prototype computer interface that allows users to interact with a virtual reality world through brain impulses. If successful, this proof-of-concept device could greatly increase the mobility and independence of people who are paralyzed or have similar conditions.
The ability to manipulate the physical world with only a thought has long been a staple of science fiction. Current developments in medical and computer sciences open the potential for a future where bedridden patients can enter virtual realms or navigate in the real world with wheelchairs or other conveyances controlled by systems that translate brain waves into electronic commands.
Jessica Bayliss, a graduate student in computer science at the University of Rochester, New York, developed the concept of linking the brain to a virtual reality (VR) environment. She notes that the idea of a brain-computer interface (BCI) is not new—the first one was created in 1973. But these initial attempts did not progress because they lacked the real-time processing capabilities available to today’s researchers.
The three-and-a-half-year research project is part of her doctoral dissertation, which was inspired by an article that appeared in Scientific American in 1996. In selecting a research subject, Bayliss chose BCIs because she wants to help people who are paralyzed and have difficulty communicating with others. The most severe cases are patients in a locked-in state, meaning they are completely incapable of any voluntary muscle movement even though their brains otherwise function normally.
Bayliss’s BCI works by identifying and translating an easily identifiable brain signal called a P300. The name combines P for positive, because the signal appears as a positive bump on an electroencephalogram (EEG), and 300 for the approximate number of milliseconds after a stimulus at which it appears. She chose this particular signal because it is well understood, having been studied since 1965, and because it can be detected from the surface of the head with electrodes.
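As a rough illustration of what locating a P300 involves, the sketch below scans a post-stimulus EEG epoch for its largest positive deflection near the 300-millisecond mark. It is written in Python for readability; the sampling rate and search window are assumptions for illustration, not figures reported for Bayliss's system.

```python
import numpy as np

FS = 250                              # samples per second (assumed)
WIN_START_MS, WIN_END_MS = 200, 450   # search window around the expected 300 ms latency

def p300_peak(epoch_uv, fs=FS):
    """Find the largest positive deflection in the post-stimulus window
    where a P300 is expected. `epoch_uv` is a 1-D array of EEG samples
    in microvolts, beginning at stimulus onset."""
    start = int(WIN_START_MS * fs / 1000)
    stop = int(WIN_END_MS * fs / 1000)
    window = epoch_uv[start:stop]
    idx = int(np.argmax(window))
    latency_ms = (start + idx) * 1000 / fs
    return latency_ms, float(window[idx])   # latency and amplitude of the bump
```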
As a signal, the P300 does not represent a person’s direct thoughts. The impulse is a reaction to a stimulus, such as seeing a stop sign. For example, if a person in a virtual environment wishes to turn on the lights in a room and a flashing image appears on the light switch, the user concentrates on the switch and wills the lights on. The resulting P300 spike is interpreted by the system as a command to turn on the lights. This approach has advantages over methods that demand constant concentration, which are fatiguing. Bayliss notes that, with some exceptions, the operator should not have to concentrate on a mouse pointer every step of the way, for example.
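The interaction pattern this describes resembles the classic oddball selection paradigm: each control in the scene flashes in turn, and the control whose flash elicits a P300 is the one the user is attending to. A minimal sketch, with all function and control names invented for illustration:

```python
import random

CONTROLS = ["light_switch", "door", "television"]   # hypothetical scene controls

def run_selection(flash, record_epoch, detect_p300):
    """Flash each control in random order; the control whose flash
    elicits a P300 is the one the user is concentrating on."""
    for control in random.sample(CONTROLS, len(CONTROLS)):
        flash(control)             # highlight the control in the VR scene
        epoch = record_epoch()     # capture one second of EEG after the flash
        if detect_p300(epoch):
            return control         # the user "willed" this control
    return None                    # no selection detected this round
```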
The system features a VR helmet and electrodes attached to the user’s head; Bayliss is currently working only with student volunteers. P300 signals received by the electrodes are transmitted to a set of analog amplifiers that in turn feed into an analog-to-digital conversion board. The converted digital signals are then fed into a standard Pentium series PC.
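Recovering a microvolt-level scalp potential from the digitized samples is simple arithmetic once the amplifier gain and converter resolution are known. The figures below are assumptions for illustration; the article does not report the actual hardware specifications:

```python
ADC_BITS = 12         # converter resolution (assumed)
V_RANGE = 10.0        # full-scale input span in volts, -5 V to +5 V (assumed)
GAIN = 20_000.0       # total analog amplification before digitization (assumed)

def counts_to_microvolts(counts):
    """Convert raw A/D counts back to the microvolt-level potential
    originally present at the scalp electrode."""
    volts_at_converter = counts * V_RANGE / (2 ** ADC_BITS)
    return volts_at_converter / GAIN * 1e6
```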
Bayliss believes that one of her major accomplishments has been the system’s flexibility. Most other devices of this type are large, monolithic creations with only one specific function. While they may perform that one task well, they are hard to reconfigure for other tasks, she says. Instead, she chose Matlab, a commercially available scientific software tool, to process the signals. The advantage of this approach is that new signal processing routines can be written and easily plugged into the system.
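The plug-in idea can be sketched in a few lines. Bayliss's routines are written in Matlab, so the Python registry below is only an analogy for how a new signal-processing routine might be written and dropped in without touching the rest of the system:

```python
PROCESSORS = {}

def register(name):
    """Decorator that plugs a signal-processing routine into the system."""
    def wrap(fn):
        PROCESSORS[name] = fn
        return fn
    return wrap

@register("p300")
def detect_p300(epoch):
    ...  # filtering and template matching would go here

def process(name, epoch):
    """Run whichever routine is selected by name; swapping in a new
    routine requires no change to the rest of the system."""
    return PROCESSORS[name](epoch)
```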
P300 signals are then passed through a robust Kalman filter, which Bayliss describes as a complex template-matching device. Because a P300 impulse appears as a spike, the filter takes a one-second slice of data around the bump and determines whether a true P300 is present. If it is, a recognition code is activated, instructing the user application to react to the impulse by shutting off a light or performing some other action in the virtual environment.
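Bayliss's filter is a robust Kalman filter, considerably more sophisticated than the stand-in below, but a normalized correlation against a stored P300 template conveys the template-matching idea she describes. The threshold value is invented:

```python
import numpy as np

THRESHOLD = 0.6   # invented cutoff; a real system would tune this per user

def matches_template(epoch, template):
    """Correlate a one-second slice around the candidate bump with a
    stored P300 template; a strong match activates a recognition code."""
    e = (epoch - epoch.mean()) / (epoch.std() + 1e-12)
    t = (template - template.mean()) / (template.std() + 1e-12)
    score = float(np.dot(e, t)) / len(e)   # Pearson correlation, -1 to 1
    return score > THRESHOLD
```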
This application operates on top of the other programs and currently resides on a Silicon Graphics Onyx II reality engine, which runs the VR software. The recognition codes can work with anything added into the system. For example, instead of a VR application, the system could react to a communications interface of some kind, she says.
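Decoupling recognition codes from any particular back end might look something like the following, where the same dispatch mechanism could drive a VR action or a communications interface. All names here are hypothetical:

```python
def toggle_light():
    print("VR scene: light toggled")

def advance_speller():
    print("Speller: cursor advanced")

# Recognition codes map to handlers; the signal processor never needs
# to know whether a code drives a VR scene or a spelling device.
HANDLERS = {
    "P300_LIGHT": toggle_light,
    "P300_SPELL": advance_speller,
}

def fire(code):
    """Called when the signal processor activates a recognition code."""
    HANDLERS[code]()
```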
The system has a one-second response time based on the signal processor’s one-second time slice. The BCI is currently 85 percent accurate at recognizing a signal; however, several important qualifications exist, Bayliss says. While the hardware may process the information in a second, it must first receive the signal from the user, and that depends on how fast the person can make a decision. Because the experiments currently focus on environmental control, it is difficult to compare them with communication applications; a person does not necessarily want to turn on a light once every second, she says.
Although the primary signal processing work is done by the Matlab software, other programs were needed to hold the system together. Commercial programs to read EEG signals proved to be very expensive and inefficient, which prompted Bayliss to write the code herself in Visual C++.
According to Dana Ballard, a professor of computer science and Bayliss’s dissertation adviser, using a virtual environment is a unique approach to BCI research. He believes the application could be tremendously freeing to bedridden patients but cautions that the research is still at the proof-of-concept stage.
Bayliss notes that other groups also are involved in BCI research. A team of scientists led by Niels Birbaumer of the Institute of Medical Psychology and Behavioral Neurobiology at the University of Tübingen, Germany, recently conducted BCI work with patients in a clinical setting. Birbaumer’s group made the first interfaces for use by individuals in a locked-in state. Two subjects were able to communicate with researchers through an electronic spelling device. However, the process was extremely time-consuming, taking up to 16 hours to form a few sentences, Bayliss observes. One of the goals of her work is to increase the speed at which such interfaces operate.
The current extent of Bayliss’s research is environmental control: getting users to react to what they see. Because this is visually driven and time critical, the interface is extremely important. The use of immersive VR equipment proved attractive because it helps people concentrate and acts as a motivator. By comparison, the European group had difficulty convincing patients to use its system because of the slow speed, she observes.
The University of Rochester has its own VR laboratory with equipment that is extremely important to the project, she says. The VR equipment is part of a grant to the university by the National Institutes of Health, a small portion of which funds her dissertation research. A fellowship from the National Aeronautics and Space Administration’s Goddard Space Flight Center provides the remainder of her project’s financing.
The university already possessed a virtual car simulation, which made it easy to create a variety of dynamic environments to test the technology. A virtual mock-up of Bayliss’s apartment is also used in testing. It is precisely the ease with which these settings can be created that offers many options for both researchers and patients, she says.
Because quick reaction time is important to the project, VR offers an added safety benefit. Trying to use the system to pilot a real wheelchair could be dangerous, but in a virtual setting such activities can be tested extensively. A VR environment also offers a potential psychological benefit for a patient. “If you are in bed all day, every day, something like VR might actually be a good thing because it gives you a lot of different possibilities for things to see, rather than just your hospital room,” she offers.
Although the work shows great promise, technical issues still must be worked out before the system can be made widely available, David Loiselle, director of clinical neurophysiology at the University of Rochester Medical Center, cautions. Loiselle, who coordinates Bayliss’s research and is an adviser on her dissertation committee, notes that the human body generates a lot of biological noise capable of corrupting data.
While P300 signals are among the easiest of the brain’s impulses to detect, they are still quite faint, registering in the microvolts. Movements, such as those of the eye muscles, produce far larger electrical signals, measured in millivolts, and these introduce noise, he says. Another issue is that the work is being conducted in a room that is not electrically quiet. Because the brain produces such low-powered signals, even fluorescent lighting can interfere with data collection.
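This amplitude gap suggests a simple first line of defense: reject any epoch whose peak-to-peak swing is too large to be brain activity. The 100-microvolt cutoff below is a common EEG convention, not a figure from the article:

```python
import numpy as np

REJECT_UV = 100.0   # peak-to-peak cutoff in microvolts (conventional, assumed)

def is_clean(epoch_uv):
    """Reject any epoch whose peak-to-peak amplitude is too large to be
    brain activity; millivolt-scale eye-movement artifacts fail this test."""
    return float(np.ptp(epoch_uv)) < REJECT_UV
```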
The work is multidisciplinary. Bayliss is approaching the project from the computer science field’s point of view, but she spent a year learning about clinical neurology and EEG signals. Research groups with primarily engineering backgrounds often do not fully understand the signals they receive from neurologists when they conduct pattern recognition work, she notes.
Another issue the scientists must address is keeping up with computer advances. No single technology can do this kind of work, Bayliss admits. This is especially true if any flexibility is designed into the system, which she believes is necessary to adapt it to constant hardware and software changes. If it can adapt easily to new protocols and equipment, it will stand the test of time, she concludes.
Virtual reality technology also is not trouble free. One immediate drawback of VR is cybersickness, a type of motion sickness that affects some people who wear VR equipment for prolonged periods. Other research groups are working on correcting these problems, she says.
Because the BCI system runs from a personal computer, Bayliss hopes to make it available to other researchers in the near future. It is currently undergoing extensive testing to identify and eliminate any problems in the equipment. She notes that one of the major barriers to conducting this type of research is the speed of current BCI systems. “You need something that is going to react quickly enough and that can easily do the processing in order to be able to study these things,” she says.
The primary goal of the technology is to help people with disabilities, Bayliss maintains. If users retain some movement, the system can be configured to take advantage of it: the BCI can read eye movements as well as P300 signals. Because recognition codes are used, new codes are easy to add; eye blinks and left and right eye movements have already been added to the existing code library, she says. This allows flexibility in approach, she adds, because in some circumstances it can be desirable to use a combination of P300 signals and eye movements, or to rely on one type of input entirely.
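An extensible code library of this kind might be organized as a list of detectors, each mapping incoming signals to a recognition code, so that eye blinks and eye movements slot in beside the P300 detector. A sketch with invented names:

```python
DETECTORS = []

def add_detector(fn):
    """Add a new input detector to the code library."""
    DETECTORS.append(fn)
    return fn

@add_detector
def p300_code(signals):
    return "P300_SELECT" if signals.get("p300") else None

@add_detector
def blink_code(signals):
    return "BLINK_CONFIRM" if signals.get("blink") else None

def recognize(signals):
    """Collect every code fired this epoch; a configuration may rely on
    P300s, eye movements, or a combination of both."""
    codes = []
    for detect in DETECTORS:
        code = detect(signals)
        if code:
            codes.append(code)
    return codes
```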
An important distinction also is made between systems that operate via brain waves and those that work through eye and muscle movements. Some systems claiming to read brain waves actually read eye and muscle movements, Bayliss observes, because these movements are larger and easier to detect than brain waves.
The U.S. Air Force has been experimenting with brain-body actuated control to improve pilots’ response times. However, the goals of these systems are entirely different from those of medical research, and they track the most easily detectable and usable reactions, such as eye or muscle movements. For example, every time pilots see something coming at them, they move to the right; this reaction is picked up and used by the system.
Bayliss adds that she chose to work with brain waves, even though they are harder to detect, because some patients do not have any voluntary muscle control and cannot benefit from methods based on movement detection.