
Eyes are the Prize: Evaluating the benefits of eye-tracking equipment

by Jon West

Smart navigation of the variable-rich world of eye-tracking systems will help you find the right product for your needs.

Eye-tracking technology is on the rise and creating a lot of buzz in the UX community. But unless you happen to be a robotics or optical engineer who understands the physics and optics of eye-tracking instruments, it can be hard to know which products offer the best value. Manufacturer specifications don’t always capture the whole performance picture, so it’s best to test-drive alternative systems to see which one is really best for your application.

This article explores the fundamental characteristics of today’s trackers and provides some tips to help you evaluate which eye tracker is best for you. The issue of pricing isn’t addressed here due to the variability of the individual products and solutions presented, but will of course be a factor as you make any decisions surrounding eye-tracking systems.

An eye tracker’s capabilities have a lot to do with what’s under the hood. First, there has to be good video camera hardware to capture high-quality images of the user’s eyes. There is also image-processing software that analyzes the video images. These image-processing algorithms play an especially critical role in interpreting the wide range of eye images the eye tracker will encounter. The algorithms find the eyes in the images, do their best not to get confused by all the other non-eye objects in the images, measure critical eye features such as the pupil center and corneal reflection, calculate the position and orientation of the eyes in space, and finally project the eyes’ gaze lines to their “gaze points” within the scene being viewed, typically on a computer screen.

There are several key performance factors for evaluating eye trackers:

  • Gaze-point tracking accuracy
  • Tolerance to head motion
  • Accommodation of typical variations in the human eye
  • Ease of calibration
  • Tolerance to ambient infrared light
  • Sampling speed
  • Robustness of gaze path analysis tools

Gaze-Point Tracking Accuracy

In the UX world, the most basic output from an eye tracker is its measurement of the user’s gaze point, i.e., the point in the user’s viewing scene where he is looking. On a computer monitor, the gaze point is a 2-dimensional coordinate on the screen surface. In 3D environments, the gaze point is a 3-dimensional coordinate within the scene. 3D gaze-point measurement requires tracking both eyes (binocular eye tracking) to compute where the gaze lines of the two eyes converge.
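As a rough illustration of how a binocular gaze point can be computed, the sketch below (Python with NumPy, using made-up variable names) finds the point where two gaze rays come closest to intersecting. Real trackers use the manufacturer’s own geometry models; this shows only the underlying idea.

```python
import numpy as np

def convergence_point(o_left, d_left, o_right, d_right):
    """Approximate the 3D gaze point as the midpoint of the shortest segment
    between the two gaze rays (eye positions o_*, unit gaze directions d_*)."""
    o_left, d_left = np.asarray(o_left, float), np.asarray(d_left, float)
    o_right, d_right = np.asarray(o_right, float), np.asarray(d_right, float)
    w = o_left - o_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w, d_right @ w
    denom = a * c - b * b                  # near zero when the rays are parallel
    t_left = (b * e - c * d) / denom
    t_right = (a * e - b * d) / denom
    return (o_left + t_left * d_left + o_right + t_right * d_right) / 2.0
```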

Because the fundamental output from an eye tracker is the location of the user’s gaze point, the most important performance metric for a device is the accuracy of its gaze-point measurement. Since the eyes move their gaze by rotating, eye-tracking accuracy is often measured in terms of angular error. Good eye trackers typically have average errors of 0.5 degrees or less. At a viewing distance of about two feet, a 0.5-degree measurement error translates to about 0.21 inches on the screen. And given that tracking errors can be positive or negative, an eye tracker with a 0.5-degree average error can reasonably resolve what a person is looking at if the objects or icons in the scene are a half inch in diameter or larger. Now that crowded screens and small on-screen objects are common, high eye-tracking accuracy is often critical for good results.
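The conversion from angular error to on-screen error is simple trigonometry; the short calculation below reproduces the half-degree example above (the values are illustrative, not measured).

```python
import math

viewing_distance_in = 24.0   # roughly two feet from the screen
angular_error_deg = 0.5      # typical "good" tracker accuracy
linear_error_in = viewing_distance_in * math.tan(math.radians(angular_error_deg))
print(f"{linear_error_in:.2f} inches")   # ~0.21 inches on the screen surface
```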

There are a couple of good ways to evaluate an eye tracker’s accuracy. First, ask the manufacturer for a program that displays the predicted gaze point in real time, and see how well the predicted point matches your real gaze point as you look around the screen. Second, run a program that collects the sequence of gaze points of a user looking around an image, and then play the gaze trace back after the run. Again, see how well the measured gaze points match what the user actually looked at.
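If the vendor’s software lets you export recorded gaze points along with the known target locations they correspond to, a summary accuracy figure is easy to compute yourself. The sketch below is a simplified version that assumes a flat screen viewed roughly head-on; the function name and inputs (screen-pixel coordinates) are hypothetical.

```python
import numpy as np

def mean_angular_error_deg(target_px, gaze_px, px_per_mm, viewing_distance_mm):
    """Average angular error between known target points and measured gaze
    points (both Nx2 arrays in screen pixels); simplified flat-screen geometry."""
    err_mm = np.linalg.norm(np.asarray(gaze_px, float) - np.asarray(target_px, float),
                            axis=1) / px_per_mm
    return float(np.degrees(np.arctan(err_mm / viewing_distance_mm)).mean())
```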

Unconstrained Head Movement

It is vital that an eye tracker continues to track the user’s gaze accurately as she moves naturally. Unfortunately, in eye-tracker design there is often a direct tradeoff between accuracy and freedom of head movement. If you want high accuracy, head motion is limited. If you want more freedom of head motion, you give up accuracy.

To achieve high gaze-point measurement accuracy, the eye-tracker camera has to provide a relatively high-resolution image of the eye—at least ten camera pixels per millimeter at the eye. If the camera sensor is 1000 pixels wide, the total camera field of view is only 100 millimeters, or about four inches. Since people’s eyes are about 2 ¼ inches apart, the user cannot move his head over a range of more than an inch and a half before one of the eyes moves out of the camera’s field of view.
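The numbers in that example work out as follows (a back-of-the-envelope sketch, not the specification of any particular camera):

```python
pixels_per_mm = 10                                    # resolution needed at the eye
sensor_width_px = 1000                                # horizontal sensor resolution
field_of_view_mm = sensor_width_px / pixels_per_mm    # 100 mm, about 4 inches
interocular_mm = 57                                   # ~2 1/4 inches between the eyes
lateral_headroom_mm = field_of_view_mm - interocular_mm
print(lateral_headroom_mm)                            # ~43 mm, about 1.7 inches
```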

In modern eye-tracking technology, there are several approaches to providing larger “head boxes” while maintaining good eye-tracking accuracy. One method is to use higher-resolution cameras with increased numbers of pixels in the image sensor. However, with today’s camera technology, increased sensor resolution means smaller pixels and less light per pixel, resulting in grainier images that reduce gaze-point tracking accuracy.

Another method for increasing head freedom is to use multiple cameras oriented so their respective fields of view are stacked side by side. Unfortunately, each doubling of the desired head-box volume requires a doubling of the number of cameras, so both the number of cameras and the complexity of the camera configuration can grow rapidly when a large head box is required.

Finally, it is possible to increase the head-box volume by up to 100 times while maintaining high accuracy by placing the eye-tracking cameras on motorized gimbals that keep each camera pointed at and focused on the user’s eyes as he moves his head around. The camera uses a telephoto lens to obtain a high-resolution image of the eyes, and, much like an eyeball, the gimbal rotates and refocuses the camera to keep the user’s eyes in the small, high-resolution field of view. When the telephoto eye-tracking cameras are not pointed at the eyes, separate low-resolution, wide-field cameras may be used to detect the user’s face and point the eye-tracking cameras at the user’s eyes. Thus, motorized cameras permit very large freedom of head movement while simultaneously achieving high gaze-point tracking accuracy.

When evaluating alternative eye trackers for your needs, carefully consider the tradeoff between gaze-point tracking accuracy and freedom of head movement. See if the accuracy holds as the users move around freely.

Accommodating Typical Variations in the Human Eye

To produce consistent, accurate, and reliable gaze-point measurements, an eye tracker’s image-processing algorithms must accommodate wide variation in the characteristics that naturally occur in different people’s eye images. The algorithms must accommodate variables like eye color, pupil dilation, pupil drift, eyelid occlusion of the pupil, contacts/glasses, varying lighting conditions, and varying head position and orientation.

No eye tracker today can track all people’s eyes in all circumstances, but some systems handle a wider range than others. When comparing eye trackers for your research, test the systems on a range of different people, using displays and environments typical for your application. Use the same people, test scenes, and environments when you run tests on the different eye-tracking systems.
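One simple number to record for each participant on each system is the fraction of samples where the tracker reported a gaze point at all. The sketch below assumes the exported data marks lost samples as NaN, which varies by vendor; treat it as a generic illustration rather than any product’s data format.

```python
import numpy as np

def valid_sample_ratio(gaze_samples):
    """Fraction of samples for which the tracker produced a gaze point;
    gaze_samples is an Nx2 array with NaN rows where tracking was lost."""
    gaze = np.asarray(gaze_samples, float)
    return 1.0 - float(np.isnan(gaze).any(axis=1).mean())
```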

Ease of Calibration

To achieve accurate gaze-point tracking, eye trackers must accommodate several physiological features of individual eyes. For eye-tracking purposes, the shape of the cornea and the location of the foveola (central vision point) on the retina vary considerably from one eye to the next. To measure these individual physiological properties, eye trackers generally require that each user perform a calibration procedure where he visually fixates on a series of points that move around the screen.
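Conceptually, calibration fits a mapping from raw eye features to screen coordinates using the known dot positions. The sketch below shows the general idea with a generic second-order polynomial fit over pupil-to-glint vectors; actual vendors use their own, typically more sophisticated, models, so this is illustrative only.

```python
import numpy as np

def _features(eye_vectors):
    """Second-order polynomial terms of the (vx, vy) pupil-to-glint vectors."""
    vx, vy = np.asarray(eye_vectors, float).T
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def fit_calibration(eye_vectors, screen_points):
    """Least-squares fit from eye vectors (one per calibration dot) to known
    on-screen dot positions; returns a (6, 2) coefficient matrix."""
    coeffs, *_ = np.linalg.lstsq(_features(eye_vectors),
                                 np.asarray(screen_points, float), rcond=None)
    return coeffs

def predict_gaze(coeffs, eye_vectors):
    """Map new eye vectors to predicted screen coordinates."""
    return _features(eye_vectors) @ coeffs
```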


When comparing eye trackers, it is instructive to see how quickly and easily people can complete the calibration procedure, and what percentage of your test population gets good calibration results. It is also important to see how well the calibration holds, i.e., whether tracking accuracy tends to drift five or ten minutes after the calibration process, or changes if the user leaves his station and comes back later.

Infrared Light

Most eye trackers use infrared light from light-emitting diodes (LEDs) to illuminate the eye, and measure the relative locations of the pupil and the LED’s reflection off the cornea to calculate the eye’s position and orientation. This approach is called the pupil center corneal reflection (PCCR) method. When evaluating an eye tracker, it is wise to consider how much infrared light the system exposes the subject’s eyes to. While the majority of trackers stay within safe exposure limits, many eye trackers use high infrared illumination levels to compensate for low-sensitivity cameras. According to the U.S. National Institute for Occupational Safety and Health, the maximum permissible exposure for the wavelength of infrared light used in eye trackers is 0.7 mW/cm2 as measured at the eye (LaMarre, D. A. (1977). Development of Criteria and Test Methods for Eye and Face Devices. NIOSH).
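If the vendor publishes, or you can measure, the infrared irradiance at the eye, checking it against the NIOSH figure is trivial; the reading below is made up for illustration.

```python
measured_irradiance_mw_cm2 = 0.35   # hypothetical reading at the eye
niosh_limit_mw_cm2 = 0.7            # maximum permissible exposure cited above
print("within limit" if measured_irradiance_mw_cm2 <= niosh_limit_mw_cm2 else "exceeds limit")
```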

A new class of eye trackers is emerging that use simple webcams and no infrared light. While these systems are inexpensive and involve no infrared exposure, the low-resolution cameras and lack of a controlled light source severely reduce the system accuracy and will adversely affect results.

Bright vs. Dark Pupil

A critical job of the eye tracker’s image-processing algorithm is to measure the pupil-center location. Accurate pupil measurements require good contrast between the pupil and the surrounding iris. Given the structure of the mammalian eye, there are two ways to achieve pupil–iris contrast: the bright pupil method and the dark pupil method.

With the dark pupil method, the LEDs are mounted some distance away from the eye-tracking camera lens, causing pupils to appear as dark objects, the same way they appear to our own eyes. With the bright pupil method, the light source is mounted at the center of the camera lens, causing pupils to appear as bright objects.

The manufacturer’s choice between bright and dark pupil effect is strongly influenced by the amount of ambient infrared light in the environment relative to the power of the eye tracker’s illuminator. If the ambient infrared light is low, the bright pupil effect provides better pupil–iris contrast; irises are dark and pupils are bright. As ambient infrared increases, however, iris images get brighter while bright pupil images stay the same brightness, which reduces contrast. When there are large amounts of ambient infrared (in sunlight, for example), better pupil–iris contrast is achieved by using the dark pupil effect because the dark pupil contrasts well with a relatively bright iris.

Generally, the bright-pupil effect provides better pupil–iris contrast in indoor environments, and the dark pupil effect provides better contrast in sunny outdoor environments.
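If you can capture raw eye images from a candidate system, you can quantify the pupil–iris contrast it achieves under your own lighting. The sketch below uses the standard Michelson contrast on pixel intensities from hand-labeled pupil and iris regions; the function and inputs are hypothetical, not part of any vendor’s toolkit.

```python
import numpy as np

def pupil_iris_contrast(pupil_pixels, iris_pixels):
    """Michelson contrast between mean pupil and mean iris intensities;
    higher contrast generally makes the pupil easier to segment reliably."""
    p, i = float(np.mean(pupil_pixels)), float(np.mean(iris_pixels))
    return abs(p - i) / (p + i)
```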

Sampling Speed

Some modern eye trackers feature increased sampling rates. While traditional sampling rates vary between 30 and 60 Hz (samples per second), some new systems provide up to 1000 Hz. These high sampling rates are particularly valuable for studying the high-speed dynamics of eye rotations (saccades) or experimenting with visual stimuli that respond instantaneously to the eye’s activity. If, on the other hand, you want to study what people choose to look at in typical user experiences, a 60 Hz sampling rate is generally quite adequate. Eye fixations typically last between 100 and 500 milliseconds, so a 60 Hz sampling rate provides between 6 and 30 samples per fixation, ample to measure both fixation points and fixation durations.
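The samples-per-fixation arithmetic is straightforward:

```python
sampling_rate_hz = 60
fixation_duration_ms = (100, 500)    # typical range of fixation durations
samples_per_fixation = [d * sampling_rate_hz // 1000 for d in fixation_duration_ms]
print(samples_per_fixation)          # [6, 30] samples per fixation
```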

Remote vs. Head-Mounted Eye Trackers

Eye trackers can be divided into two types: remote and head-mounted. Remote eye trackers are typically mounted under a computer screen or a display scene and view the user’s eyes from a distance. Head-mounted eye trackers are typically mounted on eyeglass frames or caps on the user’s head and view the eyes from a very close range. Remote eye trackers are typically more accurate and less intrusive in that nothing is attached to the user’s head. The accuracy of head-mounted eye trackers is highly susceptible to error as the glasses or cap shift around on the user’s head. On the other hand, only a head-mounted system can track a person’s eyes while he walks or runs around freely.

Gaze Path Analysis Tools

The majority of eye trackers are paired with data-collection and analysis software to support UX testing and evaluation. These programs make it possible to configure UX experiments, run experiments while collecting eye-tracking data, and analyze and present the experimental results. Different levels of eye-tracking analysis tools allow you to collect data on still images, videos, websites, simulated environments, and real-world environments. The analysis and presentation tools typically include real-time gaze overlays, fixation analyses, heat maps, and dynamic areas of interest over websites, videos, and dynamic environments. Analysis tools should be chosen to meet your specific experimental needs.
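Most analysis packages derive fixations from the raw gaze stream automatically. If you need to process exported data yourself, a dispersion-threshold detector of the kind sketched below is a common starting point; the thresholds here are illustrative, not tuned values, and the function is a generic example rather than any product’s algorithm.

```python
import numpy as np

def detect_fixations(gaze_xy, timestamps_ms, max_dispersion_px=35, min_duration_ms=100):
    """Minimal dispersion-threshold (I-DT style) fixation detector: grow a window
    of consecutive samples while its spatial spread stays small, and report it as
    a fixation if it lasts long enough. Returns (start_ms, end_ms, centroid) tuples."""
    gaze = np.asarray(gaze_xy, float)
    times = np.asarray(timestamps_ms, float)
    fixations, start = [], 0
    while start < len(gaze):
        end = start
        while end + 1 < len(gaze):
            window = gaze[start:end + 2]
            dispersion = np.ptp(window[:, 0]) + np.ptp(window[:, 1])
            if dispersion > max_dispersion_px:
                break
            end += 1
        if times[end] - times[start] >= min_duration_ms:
            fixations.append((times[start], times[end], gaze[start:end + 1].mean(axis=0)))
            start = end + 1
        else:
            start += 1
    return fixations
```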

Conclusion

There is a wide mix of strengths and weaknesses in the eye-tracking devices currently available. If you choose the right system for your application, it will add tremendous insight and value to the results of your research. When you investigate and test different eye-tracking systems carefully, it will be clear which one is best for you.



Jon West is the Director of Marketing and Opportunities for LC Technologies, located in Fairfax, Virginia. Over the last five years, he has directed company development efforts for several companies in the Washington, DC metro area, ranging from financial investment firms to high-tech engineering companies.

LC Technologies is passionate about building the world's best eye-tracking hardware and software solutions. Founded in 1986 with the goal of creating an unobtrusive human-computer interface that will revolutionize the way humans interact with computers and devices, LC now operates in 40 countries. Their eye-tracking systems are hands-off, unobtrusive, remote human-computer interfaces that provide highly accurate gaze-point prediction.

 

Tweet
Share
Post
Share
Email
Print

Related Articles

In this article, I’ll share seven incredibly useful design resources that can elevate your skills as a UX designer in 2024 and make you a well-rounded designer overall. These resources not only help you work more efficiently but also automate repetitive tasks and enable you to create outstanding user experiences. So, let’s dive into this without further delay.

Article by Abhi Chatterjee
7 Must-Have Resources for UX Designers in 2024
  • The article explores seven indispensable resources for UX designers in 2024, offering insights into design methodologies, case studies, UX laws, and practical challenges to elevate designers’ skills and efficiency.
Share:7 Must-Have Resources for UX Designers in 2024
3 min read
Article by Eleanor Hecks
8 Key Metrics to Measure and Analyze in UX Research
  • The article outlines eight essential metrics for effective UX research, ranging from time on page to social media saturation
  • The author emphasizes the significance of these metrics in enhancing user experience and boosting brand growth.

Share:8 Key Metrics to Measure and Analyze in UX Research
6 min read

Did you know UX Magazine hosts the most popular podcast about conversational AI?

Listen to Invisible Machines

This website uses cookies to ensure you get the best experience on our website. Check our privacy policy and