Some of the most promising mobile apps being built today use a cell phone’s camera and GPS to overlay data onto the real world. In other words, instead of looking at a browser, you look through the camera lens at the real world around you, and information is layered on top of the view projected on the small screen. (It’s not just a viewfinder, you know.) Last year at TechCrunch 50, the Sekai Camera demo from Japan, which does exactly this, blew away the audience. More recently, Layar showed us similar augmented reality apps for Android phones. Now IBM has its own augmented reality mobile app for Wimbledon called Seer Android (see demo in the video above).
In order for these apps to be worthwhile, people first have to do the hard work of tagging the world; otherwise, the apps have no data to pull down and display. Since it is the technology provider for the tennis tournament, IBM decided to tag Wimbledon. Using the Android G1’s compass, camera, and GPS, IBM’s app shows pop-up windows whenever it recognizes what you are pointing at: tennis courts (along with who is playing), bathrooms, buses, and so on. It can tell the user how far away a court or food concession stand is, and it can stream in live data such as scores.
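IBM hasn’t published how Seer Android matches the phone’s sensors against its tagged locations, but the underlying geometry is straightforward: take a GPS fix and a compass heading, compute the bearing and distance to each tagged point of interest, and pop up the ones that fall inside the camera’s field of view. Here is a minimal Python sketch of that idea — the coordinates, POI names, and field-of-view angle below are invented for illustration, not IBM’s data:

```python
import math

# Hypothetical tagged points of interest; real apps would pull these
# from a geo-coded database (coordinates here are made up).
POIS = [
    {"name": "Centre Court", "lat": 51.4339, "lon": -0.2143},
    {"name": "Food stand",   "lat": 51.4350, "lon": -0.2130},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, 0-360 clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    y = math.sin(dlam) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def visible_pois(lat, lon, heading, fov=45.0):
    """Return (name, distance_m) for POIs inside the camera's field of view."""
    hits = []
    for poi in POIS:
        b = bearing_deg(lat, lon, poi["lat"], poi["lon"])
        # Smallest angular difference between compass heading and bearing.
        off = abs((b - heading + 180) % 360 - 180)
        if off <= fov / 2:
            hits.append((poi["name"], haversine_m(lat, lon, poi["lat"], poi["lon"])))
    return hits
```

Standing west of the courts and facing roughly east (`visible_pois(51.4340, -0.2160, 95.0)`), only the POI inside the 45-degree cone comes back, along with its distance — which is essentially what Seer Android overlays on the live camera view.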
As you look through the camera, Seer Android essentially reads the environment and pumps that data into the app. It is very cyborg. But the app only works at Wimbledon. It is location-specific, which brings us back to the challenge of tagging the world. For IBM, Seer Android is a cool way to demonstrate how much data they are collecting at Wimbledon. But as more of the world gets tagged with geo-coded data, this sort of mobile app will become more practical to use everywhere.