Here’s an interesting little project that, while it’s unlikely to grow into a major product, nevertheless demonstrates the potential of alternative interfaces. Bruno Zamborlin’s Mogees (an abbreviation of “mosaicing gestural surface”) takes input from a contact microphone and analyzes it to determine the placement and direction of gestures on any surface through which vibrations can be detected.
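The first problem any vibration-based interface has to solve is simply noticing that a gesture happened at all. As a purely illustrative sketch (this is not the Mogees algorithm, which does far more sophisticated gesture analysis), here's roughly what threshold-based tap detection on a contact-mic signal looks like; the function name, threshold, and refractory window are all invented for the example:

```python
# Hypothetical sketch: detect percussive "taps" in a contact-mic signal
# by thresholding the amplitude. NOT the Mogees method -- it only
# illustrates the onset-detection step any vibration interface needs.

def detect_taps(samples, threshold=0.5, refractory=100):
    """Return sample indices where |signal| crosses the threshold,
    ignoring crossings within `refractory` samples of the last tap
    (so one tap's ringing isn't counted twice)."""
    taps = []
    last = -refractory
    for i, s in enumerate(samples):
        if abs(s) >= threshold and i - last >= refractory:
            taps.append(i)
            last = i
    return taps

# Synthetic signal: quiet noise with two sharp spikes (taps).
signal = [0.01] * 1000
signal[200] = 0.9   # first tap
signal[201] = 0.7   # ringing from the same tap, inside the refractory window
signal[600] = 0.8   # second tap

print(detect_taps(signal))  # -> [200, 600]
```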
I wrote a while back about how the “finger on a glass touchscreen” wasn’t the be-all and end-all of user interaction. The stylus, for example, has much life left in it. And interfaces we haven’t even thought of will emerge as well. Why not a puck that turns your table into a touchable surface?
It really has to be seen to be understood.
Naturally this demonstration doesn’t speak to the practicality of using it to, say, scroll down a webpage or control a cursor. But don’t you kind of get tired of resting your hand on your laptop, inching your fingers along a patch of plastic or glass to move the next paragraph into view, or some such action? I like the idea of taking gestures off of the device itself and moving them into its vicinity.
And objections to this particular system come readily enough: how would ambient vibrations and music affect it? What about typing? And so on.
But the point isn’t to take this device and apply it in your mind to something for which it wasn’t designed (the Mogees is a sound creation and control device). It’s to take what you already have and do something new with it. What if you could put your iPhone on the table and, when it rang, tap the table once to answer, tap twice for speakerphone, or put your whole hand down to silence it?
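That tap-the-table idea reduces to a small classification problem: group taps that arrive close together into one gesture, then map the tap count to an action. Here's a minimal sketch, with the time window and the action names invented purely for illustration:

```python
# Hypothetical sketch of the "tap the table" idea: collapse taps that
# arrive within a short window into one gesture, then map the tap
# count to a phone action. Timestamps are in seconds; the actions and
# the 0.4 s window are assumptions, not any real phone API.

ACTIONS = {1: "answer", 2: "speakerphone"}

def classify_taps(timestamps, window=0.4):
    """Split tap timestamps into bursts separated by more than
    `window` seconds; return the action for each burst's tap count."""
    if not timestamps:
        return []
    gestures = []
    count = 1
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev <= window:
            count += 1
        else:
            gestures.append(ACTIONS.get(count, "unknown"))
            count = 1
    gestures.append(ACTIONS.get(count, "unknown"))
    return gestures

print(classify_taps([0.0, 0.25, 3.0]))  # double-tap, then a single tap
# -> ['speakerphone', 'answer']
```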
This particular item may be more suited to Theremin-style musical noodling, but it descends from a larger concept of disengaging the controls for a device from the device itself — of improvising the medium of interaction while retaining the content.