The Eye Tribe, which took the stage today at TechCrunch’s CES Hardware Battlefield, is developing hardware that allows users to control technology with the motion of their eyes.
In fact, co-founder and CEO Sune Alstrup Johansen told me that the company has started shipping its first units and software development kits (they’re available for $99), and that the initial users should be receiving them now.
Johansen said The Eye Tribe has also raised another $1 million in seed funding, bringing its total seed/angel funding to $1.8 million. (It also received a $1.3 million grant from the Danish government.) The money comes from “primarily existing investors, board members and key individuals from the US,” he said — new backers include former semiconductor executive Richard Sanquini.
CES marks the first time that the finished product, not just a prototype, has been demonstrated publicly, he added. And although the initial version was built for Windows, he said the company is unveiling a Mac version too. As for the iOS and Android versions that the company has mentioned in the past, Johansen said they’re still on the product roadmap but declined to get specific.
I didn’t get a chance to try the product out for myself, but if you’ve ever wanted to see someone play Fruit Ninja with their eyes, well, watch this video.
As you can probably guess from the fact that an SDK is included, the company is currently focused on recruiting the developers who, it hopes, will actually build applications that take advantage of these capabilities. In fact, when a prototype of The Eye Tribe Tracker was demonstrated in our Hardware Alley at last fall’s Disrupt Europe conference, the company said it would also provide free trackers to the developers with the best ideas.
Those ideas also help answer the question, “Why the heck would I want to control software with my eyes?” — they give a sense of what people could potentially do with the technology. The winners include an idea for a device combining eye tracking and EEG technology to help those with ALS (Lou Gehrig’s Disease) communicate, as well as ideas for driver assist applications, breast cancer detection, drone control, and improved reading on tablets.
Last fall, a company representative told us that users don’t have to train themselves to act differently. Instead, they claimed that after the initial calibration, users could just let their eyes interact normally with applications and the software should respond accordingly.
The company has also said the eventual goal is to partner with hardware makers who want to integrate these capabilities — so in the future, you could get a tablet with eye tracking built in, rather than having to buy a separate device. In fact, Johansen told me this week that the company is setting up an office in Palo Alto “as we believe this will be the best place for us to engage” with the manufacturers.