Mind Over Mouse Click

April 15, 2011
By Rachel Eisenhower, SIGNAL Connections

A new eye-controlled laptop could change how humans interact with computers inside offices, hospitals and homes. The capability could be part of the mass market in the next two years.

The developer of the laptop, Tobii Technology, started its journey into eye-tracking technology with a long-term vision of perfecting it for a broad audience. However, the company originally focused on assisting people with disabilities, Barbara Barclay, general manager of Tobii Technology North America, says. The company used the technology to assist people with quadriplegia, amyotrophic lateral sclerosis and cerebral palsy. In addition, early research concentrated on eye tracking for detection of autism and attention deficit disorder.

But beginning two years ago, Tobii partnered with Lenovo to adapt the capability to a size and price range that would appeal to a larger audience. The team developed 20 prototype laptops with eye tracking for demonstration purposes and debuted them last month. The laptops show that eye-tracking technology can succeed with a standard computer interface, Barclay relates.

The capability complements the traditional mouse and keyboard, using natural eye movement to control the computer in a more automatic and intuitive way. Invisible infrared lights illuminate the eyes; using the reflection from this light and the glint in the eye, two high-quality cameras locate the pupil and take rapid pictures that are used to build a 3-D image. By constantly photographing the eye in real time, the computer can determine where the user is looking on the screen with extreme precision.

The computer can automatically scroll through and switch pages based on the user’s eye movement during reading. In addition, it can use eye control to pull up a menu, activate a toolbar, open files, read through emails and browse songs without the touch of a hand. Beyond these basic functions, the eye tracking can provide gamers with a more integrated experience by allowing them to control the character with their hands while aiming with their eyes.
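The automatic-scrolling behavior described above amounts to a simple dwell rule: when the reader's gaze lingers near the bottom of the viewport, advance the page. The sketch below is an illustrative assumption about how such a rule might look, not Tobii's software; every parameter name and default is hypothetical.

```python
# Illustrative gaze-driven auto-scroll rule (assumed design, not Tobii's):
# scroll only when the gaze is in the bottom zone of the viewport AND has
# dwelled there long enough to rule out a stray glance.

def scroll_amount(gaze_y, viewport_height, dwell_ms,
                  zone=0.85, dwell_threshold_ms=300, step=120):
    """Return pixels to scroll for the current gaze sample.

    gaze_y: vertical gaze position in viewport pixels (0 = top).
    dwell_ms: how long the gaze has stayed inside the bottom zone.
    zone: fraction of viewport height where scrolling is triggered.
    """
    if gaze_y >= zone * viewport_height and dwell_ms >= dwell_threshold_ms:
        return step
    return 0

print(scroll_amount(1000, 1080, dwell_ms=400))  # gaze dwelling near bottom
print(scroll_amount(500, 1080, dwell_ms=400))   # gaze mid-page: no scroll
```

The dwell threshold is what separates intentional reading progress from the rapid saccades that make raw gaze data too jittery to use directly.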

Eye movements are the fastest movements the human body can produce, explains Barclay, and eye tracking could prove helpful for military interactions where speed is critical. One of the research team’s earliest projects involved the simple computer game Asteroid; players had to look directly at falling asteroids to stop them from hitting Earth. Similar programs that address speed and reaction time could be implemented as educational tools. “The kind of training you could do would be unimaginable,” Barclay states. 

In addition, the technology gathers data on pupil dilation, which can help track human performance in office and military environments. It records and monitors real-time attention and vigilance and notes if the user drifts off during a work shift or is no longer at the computer. It also can alter screen contents based on where a person is looking to prevent other people from viewing the information.
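The vigilance monitoring described above can be reduced to a simple classification over recent gaze samples: if too few of them land on the screen, the user has drifted off or stepped away. This is a minimal sketch under that assumption; the sampling model, threshold and labels are all hypothetical.

```python
# Minimal attention-monitoring sketch (assumed design, not Tobii's):
# classify a recent window of gaze samples as attentive or away based
# on the fraction of samples that landed on the screen.

def attention_state(samples, min_valid_fraction=0.2):
    """Classify attention over a window of gaze samples.

    samples: booleans, True if that sample hit the screen.
    Returns "attentive" or "away".
    """
    if not samples:
        return "away"  # no eyes detected at all
    on_screen = sum(samples) / len(samples)
    return "attentive" if on_screen >= min_valid_fraction else "away"

print(attention_state([True] * 8 + [False] * 2))  # mostly on screen
print(attention_state([False] * 10))              # user looking elsewhere
```

A production system would also track pupil dilation and blink rate over time, as the article notes, but the windowed-classification pattern is the common core.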

The capability’s use ultimately depends on software developers in industry, academia and government. “The software that we’ve built is really just the beginning,” asserts Barclay, and Tobii offers a free software development kit for engineers interested in finding new applications for the product. While many types of computer interactions still rely on a computer mouse as the optimal tool, Barclay hopes developers will find unique ways to utilize the capability when eye control is the most efficient option.

For Barclay, completing the prototype laptops is one step toward helping people see the potential for eye-controlled technology outside of research laboratories. “From the manufacturers themselves to all of the players who are a part of making a computer, all of them are considering eye-tracking at some level, but sometimes you need to see it to understand how it can really be helpful and how it might work in day-to-day life.”