Google Launches Project Tango Smartphone To Experiment With Computer Vision And 3D Sensors

Google today announced Project Tango, an Android-based prototype 5″ phone and developer kit with advanced 3D sensors, built by its Advanced Technology and Projects (ATAP) hardware skunkworks group.

Using its sensors, the phone doesn’t just track motion: it can actually build a visual map of rooms using 3D scanning. The company believes the combination of these sensors with advanced computer vision techniques will open up new avenues for indoor navigation and immersive gaming, among many other things.
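The core technique behind that mapping is easy to sketch, even though Google hasn’t published its APIs: each depth reading is a 3D point in the camera’s own coordinate frame, and the device’s tracked pose (a rotation R and a translation t) transforms it into a shared world frame, where points from successive viewpoints accumulate into a map of the room. Below is a minimal illustration in Java; every class and method name here is ours, invented for this sketch.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: depth scans plus tracked motion become a room map.
// All names are illustrative; Project Tango's real APIs are unreleased.
public class RoomMapper {

    // A simple 3D point.
    static class Point3 {
        final double x, y, z;
        Point3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    // Accumulated world-frame points form the "visual map" of the room.
    private final List<Point3> worldMap = new ArrayList<>();

    // Fuse one depth frame: rotate each camera-frame point by the device's
    // tracked orientation (3x3 rotation matrix R), then shift by its tracked
    // position t, so points captured from different viewpoints land in one
    // shared coordinate frame.
    public void addDepthFrame(List<Point3> cameraPoints, double[][] R, Point3 t) {
        for (Point3 p : cameraPoints) {
            double wx = R[0][0] * p.x + R[0][1] * p.y + R[0][2] * p.z + t.x;
            double wy = R[1][0] * p.x + R[1][1] * p.y + R[1][2] * p.z + t.y;
            double wz = R[2][0] * p.x + R[2][1] * p.y + R[2][2] * p.z + t.z;
            worldMap.add(new Point3(wx, wy, wz));
        }
    }

    public List<Point3> map() {
        return worldMap;
    }
}
```

Real systems layer plenty on top of this (noise filtering, drift correction, surface reconstruction), but transform-and-accumulate is the heart of turning scans plus motion into a map.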

Starting today, Google will allow developers to sign up for access to these phones, but the first run will be limited to a hand-vetted group of 200 developers. Developers will have to provide Google with a clear idea of what they want to build with the device, and the company expects to allocate all devices by March 14th, 2014. It will allocate the devices to developers who want to build apps for “indoor navigation/mapping, single/multiplayer games that use physical space, and new algorithms for processing sensor data.”

Developers will be able to write apps in Java, C/C++ and with the help of the Unity game engine. The company notes that the APIs for the phone remain a work in progress.
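Since those APIs aren’t public, the snippet below is only a guess at their general shape, with every interface and method name invented for illustration: an app registers a callback and receives a stream of position-and-orientation updates as the phone moves through a room.

```java
// Purely hypothetical sketch of what a pose-streaming API might look like;
// Google has not published Project Tango's actual APIs, so every name here
// is invented for illustration.
public class PoseDemo {

    // A device pose: where the phone is and how it is oriented,
    // relative to where motion tracking started.
    interface Pose {
        double[] translation();  // {x, y, z}, in meters
        double[] orientation();  // {x, y, z, w} quaternion
    }

    // Apps register a listener and react to each motion update.
    interface PoseListener {
        void onPoseAvailable(Pose pose);
    }

    // Stand-in for the device's on-board sensor-fusion pipeline.
    interface MotionTrackingService {
        void registerListener(PoseListener listener);
    }

    public static void start(MotionTrackingService service) {
        service.registerListener(new PoseListener() {
            @Override
            public void onPoseAvailable(Pose pose) {
                double[] t = pose.translation();
                // e.g. move a game camera or a map marker as the phone moves
                System.out.printf("Device at (%.2f, %.2f, %.2f)%n", t[0], t[1], t[2]);
            }
        });
    }
}
```

A callback-driven design along these lines would suit all three supported environments, since pose updates arrive continuously and a Java activity, native C/C++ code, or a Unity script would each consume the same stream.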

“Project Tango strives to give mobile devices a human-like understanding of space and motion through advanced sensor fusion and computer vision, enabling new and enhanced types of user experiences – including 3D scanning, indoor navigation and immersive gaming,” said Johnny Lee, ATAP’s technical program lead.


The idea behind Project Tango is to see what kinds of applications developers will dream up for this technology. Google hopes that the 3D sensing and vision technology it has built into the phone can unlock new kinds of smart, vision-based applications. By giving apps an almost human-like understanding of space, it will let developers create experiences that simply weren’t possible before.

The phones are outfitted with a compass and gyroscope, just like any other phone, but in addition they feature Kinect-like visual sensors that can scan the room around them.
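The compass-and-gyroscope half of that suite is already exposed to any Android app through the platform’s standard SensorManager API; what Tango adds is the depth-sensing hardware on top. For reference, reading the stock sensors looks like this:

```java
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

// Reads the stock motion sensors every Android phone already exposes.
// Tango's depth sensors go beyond anything this standard API offers.
public class SensorDemoActivity extends Activity implements SensorEventListener {

    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Gyroscope: angular speed (rad/s) around the device's x, y, z axes.
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_GAME);
        // Magnetometer: the raw field (microtesla) behind the compass.
        sensorManager.registerListener(this,
                sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this); // stop polling to save battery
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // Feed readings into motion tracking, a game camera, and so on.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```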

It’s worth noting that the idea here isn’t to create Leap Motion-like, gesture-based interfaces. It’s about the apps developers can create once they know exactly where a phone is in space.

In its announcement, Google asks: “What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping? What if directions to a new location didn’t stop at the street address? What if you never again found yourself lost in a new building?”

Tango’s Sensors

Google is using Movidius’ Myriad 1 vision processor platform for Project Tango. For a long time, embedding these kinds of sensors into phones was prohibitively expensive, and because the processing they require is computationally demanding, they would also drain a phone’s battery rapidly. The latest generation of vision processors, however, uses significantly less power, which is likely one reason Google was able to go ahead with this project. You can read more about the sensors in our post here.


The project was headed up by Lee, who worked on Microsoft’s Kinect technology before he left for Google in early 2011. Today’s announcement also marks the first public hardware release from Google’s ATAP group, which is one of the few Motorola units Google decided to keep even as it sells off the rest of the company.

Besides Tango, the group is also involved in Project Ara, the modular phone concept that has received quite a bit of attention. Google considers ATAP to be its “moonshot tech group” outside of Google[x], and its mission, as far as we can see, is to test advanced mobile technologies. The group is headed by Regina Dugan, a former DARPA director who joined Google in 2012.