The Kinetic Space project uses almost any 3D spatial scanner – including the Xbox Kinect – to record and recognize gestures. How does it work? First you register your body, then record a set of gestures. The system can then recognize those gestures and trigger events based on their speed and repetition. The best part is the granularity: it can even track hand motions for an interesting form of man-machine sign language.
The project code is available here and it supports the “PrimeSense PS1080, the Kinect or the Xtion sensors” so it runs the gamut from high-end to low.
The software observes and interprets the user's movements by processing his or her skeleton. The analysis routines can detect not only simple gestures such as pushing, clicking, forming a circle, or waving, but also more complicated gestures such as those used in dance performances or sign language.
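To get a feel for how skeleton-based gesture recognition works in general, here is a minimal sketch of template matching with dynamic time warping (DTW), a common technique for comparing a recorded gesture against a live joint trajectory while tolerating speed differences. This is an illustration only, not Kinetic Space's actual code or API; the single-joint model, function names, and threshold are all assumptions.

```python
import math

def joint_distance(a, b):
    """Euclidean distance between two 3D joint positions (illustrative)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def dtw_cost(template, observed):
    """Dynamic time warping cost between a recorded gesture template and
    an observed joint trajectory; a low cost means the shapes match even
    if they were performed at different speeds."""
    n, m = len(template), len(observed)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = joint_distance(template[i - 1], observed[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a template frame
                                 cost[i][j - 1],      # skip an observed frame
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

def recognize(observed, templates, threshold=1.0):
    """Return the name of the best-matching recorded gesture, or None
    if nothing scores below the (hypothetical) acceptance threshold."""
    best_name, best_cost = None, threshold
    for name, template in templates.items():
        c = dtw_cost(template, observed)
        if c < best_cost:
            best_name, best_cost = name, c
    return best_name

# Example: a right-hand "wave" recorded as four hand positions, then
# performed again slightly more slowly (five frames).
wave = [(0.0, 1.0, 0.0), (0.2, 1.1, 0.0), (0.0, 1.0, 0.0), (-0.2, 1.1, 0.0)]
observed = [(0.0, 1.0, 0.0), (0.1, 1.05, 0.0), (0.2, 1.1, 0.0),
            (0.0, 1.0, 0.0), (-0.2, 1.1, 0.0)]
print(recognize(observed, {"wave": wave}))  # prints: wave
```

A real system would track many joints per frame and normalize for body size and position before matching, but the core idea – compare the live skeleton stream against stored templates and fire an event when one scores well enough – is the same.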
What is it good for? Well, it can recognize a gesture recorded by one person when it is performed by another, and you can train it on tiny movements, potentially allowing full motion control of your PC. Minority Report it isn't, but that future keeps getting closer.