I attended the inaugural event of the AR Commons in Tokyo today, a new initiative that’s supposed to help set “standards of augmented reality as a public environment”. In other words, the symposium is an attempt to understand what augmented reality, the mixing of the real world with computer data, really is and what consequences the concept will have on society.
And one part of the symposium was dedicated to one of TechCrunch’s favorite companies out there: Japan-based Tonchidot. CEO Takahito Iguchi, who delivered an unbelievable performance during his pitch at TechCrunch 50 (where his company was launched), presented an almost final version of Sekai Camera.
To recap, Sekai Camera is an application in development for iPhone and Android that “tags the real world”, meaning it overlays tagged information as a graphical layer on the live image from the mobile phone’s camera. And the Sekai Camera version I saw today was prettier than previous ones and worked (almost) flawlessly during Iguchi’s live demonstration (which lasted about 20 minutes).
Iguchi showed how Sekai Camera works on an iPhone 3GS, which is especially well suited to the app because of its built-in compass (3G users will probably have to flick their fingers left and right to overcome the directional alignment issue). Technically, the working prototypes I’ve seen over the last few months (back in February, I tried an early iPhone version myself) leave me no choice but to say the app really works.
The question is whether Sekai Camera ends up being just a gimmick (as opposed to being useful and fun over the long term) and whether it will really achieve mass adoption (without which there is no real point). There is no doubt this is a very intriguing app, but it might simply end up being way ahead of its time.
When does it come out? We’ll see a finished iPhone version in the “near future”, Iguchi told me today.
Have a look at the two videos I made below. This looks very, very promising so far.