Unlocking the potential of eye tracking technology

The concept of measuring and responding to human eye motion, or eye tracking, isn’t new, but the past year has seen rising interest in the technology, with a slew of acquisitions of eye tracking startups by large firms and the rollout of several devices and software products that support eye tracking.

“Eye tracking sensors provide two main benefits,” says Oscar Werner, vice president of the eye tracking company Tobii Tech. “First, it makes a device aware of what the user is interested in at any given point in time. And second, it provides an additional way to interact with content, without taking anything else away. That means it increases the communication bandwidth between the user and the device.”

There’s a chance that eye tracking will soon be a standard feature of a new generation of smartphones, laptops and desktop monitors, setting the stage for a huge reëvaluation of the way we communicate with devices—or how they communicate with us.

“In the past year eye tracking technology moved from being a promising technology to being adopted in commercial products in a wide array of consumer segments simultaneously,” Werner says.

Dominic Porco, chief executive officer at Impax Media, a digital advertising company, says progress in eye tracking has been driven by less expensive and more powerful hardware, new open source software platforms, and easier, faster ways of obtaining data to train recognition models.

“Companies like NVIDIA have launched products with more powerful GPUs at competitive prices, accelerating the image recognition speeds,” Porco says.

Porco adds that popular crowd-sourcing marketplaces such as Amazon Mechanical Turk have enabled the collection of larger and broader datasets to train recognition algorithms. “These developments have accelerated progress in eye tracking technology significantly, allowing researchers and developers to go faster through their cycles of experimentation and implementation.”

But no technology grows unless it fulfills specific demands and use cases. And in the case of eye tracking, there seems to be no shortage.

Virtual Reality

In a push to create a more immersive experience, VR headset companies are making large investments in eye tracking technology. In fact, in many ways, eye tracking is seen as the technology that complements VR.

“VR is about immersion,” Tobii’s Werner says. “But a VR headset without eye tracking will assume that I am speaking to the person in front of my forehead. It is approximating my area of interest to the direction of my forehead. We all know this is not true. Our real interest is where I am looking, and there is often a difference between where I look and the direction of my head. VR headsets need to take your gaze into account to become truly immersive.”

Eye tracking technology is key to foveated rendering, a technique where only the portion of the image that lands on the fovea (the part of the retina that can see significant detail) is rendered in full quality.

With foveated rendering, Werner says, there will be a 30 to 70 percent decrease in the number of pixels drawn, a processing power saving that can translate into higher framerates and the ability to achieve high quality output with 4k headsets, as opposed to the 24k level needed to match natural human vision.
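
To make the idea more concrete, here is a minimal sketch of how a renderer might assign per-tile quality based on distance from the gaze point and estimate the resulting pixel savings. The tile grid, radii and quality factors are invented illustration values, not Tobii’s or any headset vendor’s actual pipeline.

```python
# Sketch of foveated rendering: pick a shading quality per screen tile based on
# how far the tile is from the user's gaze point. Radii and quality factors are
# illustrative, not values from any real headset.
import math


def tile_quality(tile_center, gaze, inner_radius=0.10, outer_radius=0.30):
    """Return the fraction of full resolution to render a tile at.

    tile_center and gaze are (x, y) in normalized screen coordinates [0, 1].
    """
    dist = math.dist(tile_center, gaze)
    if dist <= inner_radius:   # foveal region: full detail
        return 1.0
    if dist <= outer_radius:   # near periphery: half detail
        return 0.5
    return 0.25                # far periphery: quarter detail


def pixel_savings(gaze, tiles_x=16, tiles_y=9):
    """Estimate the share of pixels a foveated pass skips versus full-res rendering."""
    full = foveated = 0.0
    for i in range(tiles_x):
        for j in range(tiles_y):
            center = ((i + 0.5) / tiles_x, (j + 0.5) / tiles_y)
            full += 1.0
            foveated += tile_quality(center, gaze)
    return 1.0 - foveated / full


if __name__ == "__main__":
    # Gaze slightly left of center: most tiles fall in the periphery.
    print(f"pixels skipped: {pixel_savings((0.4, 0.5)):.0%}")
```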

Also, eye tracking technology will make it possible to reduce graphics distortion caused by not taking eye position into account when rendering VR graphics, Werner adds.

Fove, a Kickstarter-funded project, is the first VR headset to have embedded eye tracking. Others are not far behind. In the past months, Google and Facebook acquired eye tracking startups Eyefluence and Eye Tribe respectively, and are expected to embed the technology in their future products.

And SMI, a leader in eye tracking technology, has initiated several partnerships and projects to bring eye tracking to both standalone VR head-mounted displays and smartphone slot-ins.

Eye tracking will also be part of the upcoming Khronos VR API, an open standard under development which has garnered the support of Oculus, Google, NVIDIA and others.

“The view that eye tracking will be a key part of second generation headsets is shared by a large number of VR HMD vendors,” Werner says. “This drives technology development and innovation.”

PC Gaming

For decades, we’ve used gamepads, joysticks, keyboards, mice and other peripherals to make PCs and video game consoles understand what we’re focusing on. With eye tracking, your computer already knows what you’re looking at and can react accordingly.

“When you want to interact with an object you just look at it and press a button,” says Werner. “The computer understands which object you want to interact with. You don’t need to drag the mouse or controller to the place you are already looking.”

Whether it’s hacking at an object, aiming at a target, designating a location for the game character to run to, or simply changing the direction of the point-of-view camera, eye tracking might make it a whole lot easier for gamers to interact with the gaming environment. This can be a big deal for games that require a high level of mouse and controller handling.
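
As a rough illustration of the “look at it and press a button” model Werner describes, the following sketch maps a gaze point to whichever on-screen object it falls on. The object names and screen-space rectangles are hypothetical, and a real engine would typically cast a ray into the 3D scene instead.

```python
# Sketch of "look and press": map the current gaze point to whichever on-screen
# object it falls on, so a single button press acts on that object. The object
# names and rectangles below are invented for illustration.
from dataclasses import dataclass


@dataclass
class GameObject:
    name: str
    bounds: tuple  # screen-space box in normalized coordinates: (x_min, y_min, x_max, y_max)

    def contains(self, point):
        x, y = point
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1


def object_under_gaze(gaze, objects):
    """Return the first object whose bounds contain the gaze point, or None."""
    return next((obj for obj in objects if obj.contains(gaze)), None)


def on_button_press(gaze, objects):
    target = object_under_gaze(gaze, objects)
    if target is None:
        print("no gaze target; fall back to mouse or controller aim")
    else:
        print(f"interacting with {target.name}")


if __name__ == "__main__":
    scene = [GameObject("door", (0.1, 0.2, 0.3, 0.8)),
             GameObject("chest", (0.6, 0.5, 0.8, 0.7))]
    on_button_press(gaze=(0.7, 0.6), objects=scene)  # -> interacting with chest
```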

While it might render previously challenging games too easy, it can also pave the way for much faster-paced games.

Eye tracking can also create a cleaner and less intrusive user interface.

“In games, graphical artists spend a lot of time creating beautiful environments,” Werner says, “and are in a constant battle with UI interaction designers that need to place UI elements on top, since it clutters the immersive feeling.”

With eye tracking, Werner explains, you can hide the UI or make it transparent and only make it visible when the gamer’s gaze is directed toward it. “This creates a more immersive feeling and solves a constant battle between graphical artists and UI designers,” he says.
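
A minimal sketch of that kind of gaze-aware HUD might look like the following, where a panel fades in while the player’s gaze rests on it and fades out otherwise. The panel bounds and fade speed are made-up values, not taken from any shipping game.

```python
# Sketch of a gaze-aware HUD: each frame, fade a UI panel in while the player is
# looking at it and out when they are not. Panel bounds and fade speed are
# illustrative values.
def update_panel_opacity(opacity, gaze, panel_bounds, dt, fade_per_second=4.0):
    """Move opacity toward 1.0 if the gaze point is inside the panel, else toward 0.0."""
    x, y = gaze
    x0, y0, x1, y1 = panel_bounds
    looking_at_panel = x0 <= x <= x1 and y0 <= y <= y1
    target = 1.0 if looking_at_panel else 0.0
    step = fade_per_second * dt
    if opacity < target:
        return min(opacity + step, target)
    return max(opacity - step, target)


if __name__ == "__main__":
    minimap = (0.8, 0.0, 1.0, 0.2)  # panel in the top-right corner of the screen
    opacity = 0.0
    for frame in range(10):         # the player glances at the minimap for 10 frames
        opacity = update_panel_opacity(opacity, gaze=(0.9, 0.1),
                                       panel_bounds=minimap, dt=1 / 60)
        print(f"frame {frame}: opacity {opacity:.2f}")
```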

For simulations and virtual worlds, eye tracking can enable gaze aware objects, where game objects or characters will react to the gaze of the player to make the simulation more realistic, Werner says. This means that you’ll have to be careful not to stare too long at a mercenary’s purse when entering a tavern in your favorite RPG.
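
A gaze-aware object of that sort can be as simple as a dwell timer: accumulate how long the player’s gaze stays on the object and trigger a reaction once a threshold is crossed. The threshold and the reaction line below are invented for illustration.

```python
# Sketch of a gaze-aware game object: accumulate how long the player's gaze rests
# on an object and trigger a reaction once a dwell threshold is crossed. The
# threshold and the reaction line are invented examples.
class GazeAwareObject:
    def __init__(self, name, bounds, dwell_threshold=1.5):
        self.name = name
        self.bounds = bounds            # (x_min, y_min, x_max, y_max) on screen
        self.dwell_threshold = dwell_threshold
        self.dwell_time = 0.0
        self.triggered = False

    def update(self, gaze, dt):
        x, y = gaze
        x0, y0, x1, y1 = self.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            self.dwell_time += dt
        else:
            self.dwell_time = 0.0       # looking away resets the timer
        if not self.triggered and self.dwell_time >= self.dwell_threshold:
            self.triggered = True
            print(f"The owner of the {self.name} notices your stare: "
                  f"'Keep your eyes to yourself, stranger.'")


if __name__ == "__main__":
    purse = GazeAwareObject("coin purse", bounds=(0.4, 0.4, 0.5, 0.5))
    for _ in range(120):                # two seconds of staring, at 60 frames per second
        purse.update(gaze=(0.45, 0.45), dt=1 / 60)
```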

Tobii itself offers a line of add-on eye tracking devices and laptops with embedded eye tracking, and has worked with gaming companies on eye-tracking-enhanced versions of popular games such as Rise of the Tomb Raider, Deus Ex and Watch Dogs 2.

It is unlikely that eye tracking will replace controllers any time soon, but according to Werner, thanks to the technology, “PC games will take into account one of the most powerful interaction methods we as humans have, our eyes. PC gamers will be able to utilize an additional control mechanism complementing the mouse and the controller. This will drive more natural interaction without taking anything away.”

Medicine and accessibility

Beyond the consumer level, the benefits of eye tracking expand to other realms where measurements of human gaze are key to obtaining results and insights.

“There is an increasing interest in using eye tracking to help diagnose, and potentially treat, neurological disorders,” says Bryn Farnsworth, science editor at biometric research company iMotions. “For example, infants usually like to look at images with people’s faces—scenes that have a social element.”

Farnsworth explains that infants who go on to develop autism are much more likely to prefer images that feature geometric shapes, while for children with Williams syndrome the situation is reversed: they show a marked preference for social scenes in comparison to neurotypical children.

This all suggests, Farnsworth says, “that the analysis of eye movements may help guide early diagnosis.”

A research paper by students at UCSD states that eye tracking technology holds promise as an objective methodology for characterizing the early features of autism, because it can be implemented with virtually any age or functioning level.

Labs such as iMotions are helping researchers obtain those metrics through eye tracking devices in order to better understand and assess the conditions of patients.

Eye tracking company RightEye uses the technology to help physicians administer tests and identify symptoms of illnesses ranging from simple concussions to Alzheimer’s disease and dyslexia, and to help treat children with autism.

Eye tracking can also be a breakthrough for patients with physical disabilities, especially as affordable, consumer-level devices become available. “This is a large area for eye tracking and it is evolving,” says Tobii’s Werner. Gaze keyboards and control panels powered by eye tracking, Werner explains, give people with diseases such as cerebral palsy and spinal cord injuries a means to communicate, control their environment and develop skills through therapy.

Advertising

Currently, the best metrics advertisers get from ads are impressions and click numbers. But those numbers do not precisely reflect the effectiveness of ad campaigns, because a lot of what gets counted as impressions is wasted on non-human sources. That is something that changes with eye tracking technology.

“The advertising industry is currently in the midst of some major upheaval when it comes to universal standards for measuring ad impact,” says Porco, the Impax chief executive. “The whole concept of ‘viewability’ is now being redefined to make more sense in the age of ad blockers and bot traffic.”

With eye tracking technology, online advertisers will be able to measure exactly how many human eyes actually view their ads when they appear on the page. While precise metrics will remain out of reach until every computer and mobile device ships with embedded eye tracking, the technology does give insights into how users interact with ads.

In the physical world, however, eye tracking is already showing promise.

“Market research firms are experimenting with directly measured biometric data to precisely determine the composition of people in out-of-home media environments such as retail stores, for audience measurement purposes,” Porco says.

Porco’s company, Impax Media, is investing heavily in eye tracking technology along with other computer vision techniques to collect attention metrics from its proprietary in-store advertising screens. “We’re big believers that the future of the ad industry is going to be grounded in attention metrics, as opposed to impressions, and eye tracking is, hands down, the best way to track attention,” Porco says.

The data, Porco says, helps advertisers and location partners to assess audience interest in various messaging angles, and to correlate this information with parameters like location, timing and demographics. “It’s great for media buyers seeking to get the most for their budgets, and for store managers dealing with questions about everything from inventory to staff shift schedules.”

While retailers always benefit from collecting information about customers, the area is something of a gray zone, often subject to controversy, and runs up against privacy regulations. However, Porco underlines that there’s no need to collect identifying information in order to glean useful insights; anonymized data about gaze point, age and gender, along with duration of view, suffices.
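
In code, the kind of anonymized attention metric Porco describes could be as simple as aggregating view records that carry no identity at all, only a screen id, coarse demographics and a view duration. The field names and sample data below are invented; they are not Impax Media’s actual schema.

```python
# Sketch of an anonymized attention metric: each record carries only a screen id,
# an hour, coarse demographics and a view duration, never an identity. Field names
# and sample data are invented, not any company's actual schema.
from collections import defaultdict
from statistics import mean

views = [
    # (screen_id, hour, age_bracket, gender, seconds_of_attention)
    ("store-12-checkout", 17, "25-34", "F", 4.2),
    ("store-12-checkout", 17, "35-44", "M", 1.1),
    ("store-12-checkout", 18, "25-34", "M", 6.8),
    ("store-07-entrance", 17, "18-24", "F", 2.5),
]


def attention_by_screen_and_hour(records):
    """Aggregate anonymized view durations into per-screen, per-hour metrics."""
    buckets = defaultdict(list)
    for screen, hour, _age, _gender, seconds in records:
        buckets[(screen, hour)].append(seconds)
    return {key: {"views": len(durations), "avg_seconds": round(mean(durations), 2)}
            for key, durations in buckets.items()}


if __name__ == "__main__":
    for (screen, hour), stats in attention_by_screen_and_hour(views).items():
        print(f"{screen} @ {hour}:00 -> {stats}")
```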

Market research

It’s important for market researchers to “evaluate people’s interactions and expectations across the whole omnichannel customer journey and its key touchpoints,” says Simone Benedetto, UX researcher at TSW, an Italy-based market research lab.

Recent advances in eye tracking, Benedetto explains, open up new possibilities in both lab and real world neuromarketing tests.

“It is crucial for us to involve users in product and service design and evaluation,” Benedetto says. “This doesn’t mean just to ask them their opinion, but to collect objective data coming from their eyes and brain while interacting with the product or service.”

TSW uses mobile eye tracking units along with other wearables in order to get precise user and customer metrics on a wide variety of products and services, both digital (such as online ads, mobile apps, websites, software and device control panels) and physical (such as print material, product packages, cars, home furniture and retail stores).

Being able to measure users’ natural interaction with products and services enables researchers to identify real usability problems and frustration points, and to collect actionable information that gives insight into customer satisfaction and engagement and drives design decisions.

“One of the great advances last year has been the introduction of object tracking in relation to the analysis of data from mobile eye trackers,” says iMotions’ Farnsworth, referring to the process by which specific visual features can be delineated from a scene, and information about how that particular feature is attended to can then be recorded.

“This means that an individual can, for example, wear portable eye tracking glasses, interact with their environment normally, and how they attend to certain features can be automatically analyzed—how long they look at a street map when they’re out walking, if they notice an advert that they passed,” Farnsworth says. “Being able to automatically understand how specific features are attended to clearly has great ramifications for understanding humans, and using that knowledge further.”
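
Once an object tracker has labeled each gaze sample with the feature it landed on, the analysis Farnsworth describes reduces to summing dwell time per label. The following sketch assumes a fixed sampling rate and invented labels; it is not iMotions’ actual pipeline.

```python
# Sketch of dwell-time analysis for mobile eye tracking: once an object tracker has
# labeled each gaze sample with the feature it landed on (an "area of interest"),
# summing time per label is straightforward. Labels, timings and the fixed sampling
# rate are invented for illustration.
from collections import defaultdict

# (timestamp_seconds, area_of_interest) pairs; None means the gaze was not on any
# tracked feature at that moment.
samples = [(0.00, None), (0.05, "street map"), (0.10, "street map"),
           (0.15, "street map"), (0.20, "advert"), (0.25, "advert"), (0.30, None)]


def dwell_times(gaze_samples, sample_interval=0.05):
    """Total time spent on each area of interest, plus when it was first looked at."""
    totals = defaultdict(float)
    first_seen = {}
    for timestamp, aoi in gaze_samples:
        if aoi is None:
            continue
        totals[aoi] += sample_interval
        first_seen.setdefault(aoi, timestamp)
    return {aoi: {"dwell_seconds": round(totals[aoi], 2),
                  "first_looked_at": first_seen[aoi]} for aoi in totals}


if __name__ == "__main__":
    print(dwell_times(samples))
```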

“From my perspective there’s a huge market behind the exploitation of eye tracking into UX-neuromarketing investigations,” Benedetto says. “Eye tracking allows the implicit measurement of user behavior, and turns that measurement into quantitative objective data. We have only relied on subjective data for years, and it’s definitely time for a change.”

The future of eye tracking

Tobii’s Werner told me he believes a new paradigm of PC usage will emerge, where eye tracking is a fifth modality that, in combination with touch screens, mouse/touchpad, voice and keyboard, will make computers much more productive and intuitive. “Gaze always precedes any kind of action that you do with mouse, keyboard and voice, so much smarter user interactions will be designed using these technologies,” he says.
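
One long-studied example of gaze complementing the mouse is gaze-assisted pointing (sometimes called MAGIC pointing): when the user starts moving the mouse, the cursor warps near the gaze point so only a small corrective motion remains. The sketch below uses an invented warp radius and normalized screen coordinates.

```python
# Sketch of gaze-assisted pointing: when the user starts moving the mouse while the
# cursor is far from where they are looking, warp the cursor near the gaze point so
# only a small corrective motion remains. The warp radius is an illustrative value.
import math


def next_cursor_position(cursor, gaze, mouse_delta, warp_radius=0.15):
    """Return the new cursor position for this frame.

    All positions are (x, y) in normalized screen coordinates. Gaze does the coarse
    positioning; the mouse does the fine positioning.
    """
    moving = mouse_delta != (0.0, 0.0)
    far_from_gaze = math.dist(cursor, gaze) > warp_radius
    if moving and far_from_gaze:
        cursor = gaze                       # jump to where the user is looking
    return (cursor[0] + mouse_delta[0],     # then apply the small mouse movement
            cursor[1] + mouse_delta[1])


if __name__ == "__main__":
    cursor = (0.1, 0.9)   # cursor parked in a corner
    gaze = (0.72, 0.31)   # the user is looking at a button elsewhere
    print(next_cursor_position(cursor, gaze, mouse_delta=(0.01, 0.0)))
```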

As vision is the most used sense among human beings, being able to track and measure it digitally will have a great impact on how we make our intentions known to computers, wittingly or unwittingly.