A new system created at Keio University adds lifelike motion capture and emotion sensing to apps, games, and design programs by scanning your face and body for cues. The system works on any PC and can recreate your facial expressions on a lifelike human avatar. In the test model, the team turns a sullen grad student into a cute girl with long pigtails.
The system lets the average user create lifelike motion captures of faces and upper bodies, bringing realism and ease to a process that has traditionally required special motion-capture suits and extensive post-processing.
The project is led by Associate Professor Yasue Mitsukura and is still in the experimental stage.