Eye-tracking poised to become a standard computer-graphics feature

SolidWorks is only one of many companies looking to incorporate eye tracking into future updates of their user interfaces.

By L. Stephen Wolfe, P.E.

[Editor’s Note: This is not the first time Steve Wolfe has written about Tobii; take a look at his 2005 article, “Tobii pioneers eye-tracking software.”]

When I first saw Tobii Technology’s eye-tracking hardware and software at Siggraph 2005, they were costly research tools aimed at people designing Web sites, computer applications, and children’s toys. The custom monitor with eye-tracking hardware carried an asking price of $24,500, and the “Clearview Usability” software cost $6,900. All the software did was record the path of a person’s eye movements and the areas of the screen where users spent the most time looking; it had no closed-loop control capability.

Fast-forward six-and-a-half years to SolidWorks World 2012. In a booth at the back of the exhibit hall, SolidWorks developers showed how eye tracking might be used in CAD software. The demonstration used not production SolidWorks but a prototype program that let users perform tasks such as selecting objects and centering them on the display just by looking at them.

A month later, at CeBIT in Hanover, Tobii announced a smaller eye-tracking sensor that could be built into a laptop computer. Eye tracking appears poised to become a common feature in many types of computers.

Tobii’s eye-tracking hardware for PCs is much smaller than when we first looked at the technology in 2005. (Source: Tobii Technology)

In August, I had a chance to try Tobii’s Eye Tracker at the Siggraph convention in Los Angeles. The so-called Gaze interface knows where a user is looking and allows the computer to act on that information. For example, if you look at an object, the application could highlight it and move it to the center of the display. It could also let the user select the object with a press of the space bar. As Carl Korobkin, Tobii’s vice president of business development, put it, “We’re just using information we already know to make the selection faster and more natural.” Why make someone move a mouse cursor or point with a finger when the computer already knows where he or she is looking?
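
The underlying pattern is easy to sketch. The snippet below is a minimal illustration of the gaze-plus-keypress interaction described above, not Tobii’s actual API: the gaze samples and key events are simulated, and the object names are invented. The application hit-tests the current gaze point against on-screen objects, highlights whatever the user is looking at, and commits the selection only when a key is pressed.

    # Minimal sketch (NOT Tobii's API): gaze chooses the object, a key confirms it.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class SceneObject:
        name: str
        x: int       # top-left corner of the object's screen-space bounds
        y: int
        width: int
        height: int

        def contains(self, gx: int, gy: int) -> bool:
            """Hit-test a gaze point against this object's bounding box."""
            return (self.x <= gx < self.x + self.width and
                    self.y <= gy < self.y + self.height)

    def object_under_gaze(objects: List[SceneObject],
                          gaze: Tuple[int, int]) -> Optional[SceneObject]:
        """Return the first object whose bounds contain the gaze point, if any."""
        gx, gy = gaze
        for obj in objects:
            if obj.contains(gx, gy):
                return obj
        return None

    if __name__ == "__main__":
        scene = [SceneObject("bracket", 100, 80, 200, 150),
                 SceneObject("fastener", 400, 300, 60, 60)]

        # Simulated (gaze point, key event) pairs; a real driver would deliver
        # gaze samples continuously and key events asynchronously.
        events = [((150, 120), None), ((430, 320), None), ((430, 325), "space")]

        for gaze, key in events:
            hovered = object_under_gaze(scene, gaze)
            if hovered:
                print(f"highlight {hovered.name}")   # feedback while the user looks
            if key == "space" and hovered:
                print(f"selected {hovered.name}")    # gaze chooses, the key confirms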

The possibilities for eye control of one’s computer seem almost limitless. Gamers would need only to look at their adversaries before shooting or clubbing them. Designers could select a part of an assembly or building and center it much faster than a mouse allows. Map readers could look at the intersection of two streets and have the map scroll that point automatically to the center of the display. And when browsing the internet, looking at a link to select it could make everyday tasks seem more natural, especially on mobile devices.
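
Centering the looked-at point is also a short computation. The helper below is a hypothetical sketch for the map example, not code from any shipping product: the pan offset is simply the vector from the window center to the gaze point, scaled into map units.

    # Hypothetical sketch: scroll so the gazed-at point lands at the window center.
    def pan_to_gaze(viewport_origin, window_size, gaze_point, units_per_pixel=1.0):
        """Return a new viewport origin that centers the gazed-at location.

        viewport_origin -- (x, y) map coordinates of the window's top-left corner
        window_size     -- (width, height) of the window in pixels
        gaze_point      -- (x, y) gaze position in window pixels
        """
        cx, cy = window_size[0] / 2, window_size[1] / 2
        dx = (gaze_point[0] - cx) * units_per_pixel
        dy = (gaze_point[1] - cy) * units_per_pixel
        return (viewport_origin[0] + dx, viewport_origin[1] + dy)

    if __name__ == "__main__":
        # Looking at an intersection drawn at pixel (900, 200) in a 1280 x 800
        # window shifts the view so that point ends up at the center.
        print(pan_to_gaze((0.0, 0.0), (1280, 800), (900, 200)))  # (260.0, -200.0)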

The art of designing great computer controls will depend in part on combining eye tracking with older input technologies such as keyboards, mice, touch screens, and 3D navigation devices. It may also involve mixing gaze with newer input methods, such as hand gestures, Korobkin suggests. Tobii has developed an eye-tracking driver for Microsoft’s Windows 8 operating system. (See the video “Tobii Gaze Interface for Windows 8.”) This software offers a good starting point, but application developers need to embrace the power of eye tracking too.

For more information: www.tobii.com

L. Stephen Wolfe, P.E. is a contributing analyst for Jon Peddie Research.