Would you like to turn your pizza box into a computer interface, or use a banana as an alternative to your phone? Invoked Computing, a concept developed at the Ishikawa-Oku Lab at the University of Tokyo, makes that possible via “ubiquitous” augmented reality (video and sound).
The idea is to project screens, keyboards and other interface elements onto everyday objects, so that users would, ideally, no longer need dedicated hardware.
These “augmented” objects can be almost anything (like the pizza box or banana phone mentioned above) and can be manipulated directly: touching the projected volume bar on the pizza box and sliding a finger up or down, for example, adjusts the volume in the real world.
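The volume-bar interaction described above boils down to mapping the tracked fingertip's position on the projected bar to a volume level. The sketch below is a hypothetical illustration of that mapping; the bar geometry and the tracking input are assumptions for the example, not the lab's actual implementation.

```python
# Hypothetical sketch: map a tracked fingertip's vertical position on a
# projected volume bar to a 0-100 volume level. The bar coordinates and
# the fingertip input are illustrative assumptions.

def finger_to_volume(finger_y, bar_top_y, bar_bottom_y):
    """Convert a fingertip y-coordinate (pixels) to a volume percentage.

    bar_top_y is the pixel row of the bar's top (volume 100),
    bar_bottom_y the row of its bottom (volume 0).
    """
    # Clamp the fingertip position to the bar's extent.
    finger_y = max(bar_top_y, min(bar_bottom_y, finger_y))
    # Linear interpolation: top of the bar -> 100, bottom -> 0.
    fraction = (bar_bottom_y - finger_y) / (bar_bottom_y - bar_top_y)
    return round(fraction * 100)

# Usage: a bar projected between pixel rows 100 (top) and 300 (bottom).
print(finger_to_volume(100, 100, 300))  # fingertip at the top -> 100
print(finger_to_volume(300, 100, 300))  # fingertip at the bottom -> 0
print(finger_to_volume(200, 100, 300))  # fingertip midway -> 50
```

In a real system the fingertip coordinate would come from a camera-based tracker; the interpolation step, however, would look much like this.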
Lead researcher Alexis Zerroug explains:
In this project we explore the reverse scenario: a ubiquitous intelligence capable of discovering and instantiating affordances suggested by human beings (as mimicked actions and scenarios involving objects and drawings). Miming will prompt the ubiquitous computing environment to “condense” on the real object, by supplementing it with artificial affordances through common AR techniques. An example: taking a banana and bringing it closer to the ear. The gesture is clear enough: directional microphones and parametric speakers hidden in the room would make the banana function as a real handset on the spot. (…)
To “invoke” an application, the user simply mimes a specific scenario. The system then tries to recognize the suggested affordance and instantiate the represented function through AR techniques (another example: to invoke a laptop computer, the user could take a pizza box, open it and “type” on its surface).
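The invocation step described above can be sketched as a lookup: a recognized (object, gesture) pair is matched against a table of known affordances and mapped to the AR application that should be “condensed” onto the object. The table entries and function names below are hypothetical placeholders; the real system relies on computer vision, projectors, directional microphones and parametric speakers rather than a simple dictionary.

```python
# Hypothetical sketch of the invocation step: a recognized (object, gesture)
# pair is matched against a table of known affordances. The entries are
# illustrative placeholders, not the lab's actual recognizer.

AFFORDANCES = {
    ("banana", "bring_to_ear"): "phone_handset",
    ("pizza_box", "open_and_type"): "laptop",
    ("pizza_box", "slide_finger_on_bar"): "volume_control",
}

def invoke(obj, gesture):
    """Return the AR application to instantiate, or None if the mimed
    scenario suggests no known affordance."""
    return AFFORDANCES.get((obj, gesture))

print(invoke("banana", "bring_to_ear"))      # -> phone_handset
print(invoke("pizza_box", "open_and_type"))  # -> laptop
print(invoke("banana", "peel"))              # -> None (no known affordance)
```

The hard part, of course, is the recognition itself: deciding from camera input that a banana held to the ear “means” a handset is where the research effort lies, not in the dispatch.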
Diginfo TV recently shot a video with Zerroug in which he explains the current state of the Invoked Computing project: