Eyefluence’s Eye-Tracking Interface Lets You Navigate Virtual Reality Hands-Free

When it comes to virtual reality tech, seeing is always believing.

Earlier this month, I joined the team at Eyefluence, an eye-tracking startup, for a demo in San Francisco. Over the course of the meeting I was walked through the process of navigating menu screens, selecting icons and whacking moles in a mini game, all with my eyes.

I first demoed the company’s iUi eye interaction tech on a pair of ODG R6 Smartglasses. What surprised me most was how little mental energy it took to navigate menus with the eye-tracking tech – I found myself subconsciously making the movements that would take me to the next screen, simply because that was exactly what I expected to happen. At first I was skeptical and wondered if the demo had been engineered to make it easier for me to get through, but after trying to trick the sensors for a bit and getting stuck in the demo, I realized that I have perhaps gotten a bit too jaded and paranoid for a 22-year-old tech reporter.

I managed to learn the controls for navigation in only a minute or two, but it was clear after watching members of the Eyefluence team navigate the iUi that you can soar through the interface quite a bit faster with practice.

Eyefluence CEO Jim Marggraff said his team wanted to eschew the main visual navigation methods being pushed by others, which largely pair eye movements with winking. Besides making it look like users have a pretty severe twitch, that whole process can presumably leave your eyes feeling fatigued or disoriented after just a few interactions. Eyefluence’s iUi tech deals entirely with a coordinated system of eye gestures (the specifics of which they’ve asked me to keep under wraps).

“Inevitable”

Right now, Eyefluence’s Bay Area-based team is composed of about 25 people, a third of whom hold PhDs. Marggraff is a serial entrepreneur and inventor who created the LeapFrog LeapPad and the Livescribe Smartpen, both pieces of tech that were undoubtedly on your tech-savvy kids’ Christmas lists at some point over the past decade. He believes this particular tech of his is going to rapidly invade most VR/AR headsets on the market.

“This is a pivotal moment for our company, and for the HMD [Head-Mounted Display] industry as a whole, that will accelerate the adoption of AR and VR hardware and experiences, with our technology deployed in forthcoming headsets,” said Marggraff. “All HMDs are fundamentally incomplete without eye-interaction and all will be enabled with eye-interaction technology in the future.”

Eyefluence is most likely right that eye-tracking tech, or something closely resembling it, will be in the next generations of HMDs that come to market. What is less certain is how developers will integrate the flow of eye gestures with their own in-game controls, and how the two should be combined to keep the experience intuitive.

In terms of adoption risk for headset manufacturers, the tech actually boasts quite a small physical footprint. The flexible circuit boards that host the entire system also cost “single-digit dollars” and could comfortably fit on top of a dime.

Huge for Mobile

Speaking of compactness, the implications for mobile VR with this interface technology are huge.

Mobile VR has a huge input problem right now. This iUi technology could easily replace the touch controls on the side of the Gear VR headset, and I believe it would offer a much, much smoother experience, one that leaves users less aware that they’ve got a hulking virtual reality headset strapped to their face.

“Foveated rendering” is another one of the bigger reasons for game developers to incorporate eye-tracking tech as soon as possible. The technique mimics the way the eye focuses: the outer reaches of your vision drop out of focus, which drastically lightens the load on system resources and lets devices spend that headroom on higher frame rates. In practical terms, it means that only a small percentage of the screen actually needs to be rendered at full resolution, because that’s how your eyes work anyway. Here’s an example of the concept (not Eyefluence).
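To make the idea concrete, here is a minimal, hypothetical sketch of how a foveated renderer might decide how much detail each part of a frame gets: full resolution in a small region around the tracked gaze point, progressively coarser shading toward the periphery. This is purely illustrative Python, not Eyefluence’s (or anyone else’s) actual implementation; the function name, radii and falloff values are assumptions made for the example.

```python
import numpy as np

def foveated_shading_rate(width, height, gaze_x, gaze_y,
                          fovea_radius=0.1, periphery_scale=0.25):
    """Per-pixel shading-rate map: 1.0 = full resolution near the gaze
    point, falling off toward periphery_scale at the edges of the frame.
    (Illustrative only; real renderers work on tiles, not pixels.)"""
    ys, xs = np.mgrid[0:height, 0:width]
    # Eccentricity: normalized distance of each pixel from the gaze point.
    dist = np.hypot(xs - gaze_x, ys - gaze_y) / max(width, height)
    # Keep full rate inside the foveal radius, then fall off linearly.
    falloff = np.clip((dist - fovea_radius) / (1.0 - fovea_radius), 0.0, 1.0)
    return 1.0 - (1.0 - periphery_scale) * falloff

# Example: gaze fixed at the center of a 2160x1200 stereo frame.
rates = foveated_shading_rate(2160, 1200, gaze_x=1080, gaze_y=600)
print(f"Average shading-rate reduction: {1.0 - rates.mean():.0%}")
```

Even with a fairly generous foveal radius, the bulk of the frame ends up eligible for reduced-rate shading, which is where the frame-rate headroom comes from.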

I had the chance to demo foveated rendering on an Oculus DK2 sporting a pair of Eyefluence sensors, one positioned at each eye, and the key thing is that I didn’t really notice anything, and that’s awesome. Until the resolution was drastically reduced and things got totally blurry, I was just looking around while only the area at the center of my focus was being fully rendered.

This will be especially important to mobile VR and the next generation of headsets that companies like Samsung and Google are looking to build. Mobile VR, which offers a much more accessible cost of entry to the virtual reality platform, always seems to be bumping up against system limitations. Evolving mobile processor chipsets will certainly play a major role in powering next-gen mobile VR experiences, but they will undoubtedly be doing so alongside foveated rendering as well.

Eyefluence is not alone in this space; other companies like Fove and Tobii have shown off their own eye-tracking interfaces. But this was my first personal demo with the technology in virtual reality, and I have to say, I loved it.

Whether Eyefluence’s iUi is sustainable for long bouts of consumer use, and whether it can grow to reliably tell when I’m looking at an AR display versus looking through the device like normal glasses, are a couple of big questions left on my mind after trying the tech out. But I am convinced that when most consumers get around to buying their first virtual reality headset, eye-tracking will be a key part of the experience.