New Screen Technology, TapSense, Can Distinguish Between Different Parts Of Your Hand

And you thought multitouch gestures were annoying – how about mashing your whole hand on your screen to close an app or rapping on it with your knuckle to summon Siri (or Iris?). A new technology from Carnegie Mellon’s Human-Computer Interaction Institute allows your device to distinguish between different types of taps using a microphone and touchscreen.

Created by Chris Harrison, the same guy who brought us OmniTouch, the technology “doubles the bandwidth” when it comes to touch interaction.

By attaching a microphone to a touchscreen, the CMU scientists showed they can tell the difference between the tap of a fingertip, the pad of the finger, a fingernail and a knuckle. This technology, called TapSense, enables richer touchscreen interactions. While typing on a virtual keyboard, for instance, users might capitalize letters simply by tapping with a fingernail instead of a fingertip, or might switch to numerals by using the pad of a finger, rather than toggling to a different set of keys.
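To give a rough sense of how that keyboard trick might work, here’s a minimal sketch that maps a classified tap type to a character. The labels “tip”, “pad” and “nail”, the NUMERAL_MAP table, and the handle_key_tap function are all stand-ins for illustration, not anything published by the CMU team:

```python
# Illustrative only: assumes an acoustic classifier has already labeled each
# tap as "tip", "pad", or "nail" (the label names are hypothetical).

NUMERAL_MAP = {"q": "1", "w": "2", "e": "3", "r": "4", "t": "5"}

def handle_key_tap(key: str, tap_type: str) -> str:
    """Decide which character one tap on a letter key should produce."""
    if tap_type == "nail":
        return key.upper()                 # fingernail tap capitalizes
    if tap_type == "pad":
        return NUMERAL_MAP.get(key, key)   # finger pad switches to numerals
    return key                             # ordinary fingertip tap

# The same key yields different characters depending on how it's struck.
print(handle_key_tap("e", "tip"))   # -> "e"
print(handle_key_tap("e", "nail"))  # -> "E"
print(handle_key_tap("e", "pad"))   # -> "3"
```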

The system can also sense different tools, including foam, multiple pen types, and brushes, and it could even tell who was using which pen, allowing for collaborative drawing.
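If each pen really does have a distinguishable acoustic signature, attributing strokes to users becomes a simple lookup. Here’s a hedged sketch of that idea; the pen IDs, PEN_OWNERS table, and record_stroke function are assumptions made up for this example:

```python
# Illustrative sketch: route each stroke to the owner of the pen the
# classifier "heard" (pen IDs and owners are hypothetical).

from dataclasses import dataclass

PEN_OWNERS = {"pen_a": "Alice", "pen_b": "Bob"}

@dataclass
class Stroke:
    points: list
    user: str

def record_stroke(points: list, pen_id: str, canvas: list) -> None:
    """Attribute a stroke to whoever owns the identified pen."""
    user = PEN_OWNERS.get(pen_id, "unknown")
    canvas.append(Stroke(points=points, user=user))

canvas: list = []
record_stroke([(0, 0), (10, 10)], "pen_a", canvas)
record_stroke([(5, 5), (5, 20)], "pen_b", canvas)
print([s.user for s in canvas])  # -> ['Alice', 'Bob']
```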

You can check out the project page here. The project is obviously still in the research stage, but I wouldn’t be surprised if it showed up in real-world applications in the next year or so.