There is no dearth of impressive student projects here at the finals of Microsoft’s Imagine Cup in Sydney, but one of the six finalists that caught my attention was a project called EnableTalk by the Ukrainian team QuadSquad. There are currently about 40 million deaf, mute and deaf-mute people worldwide, and while many of them use sign language to communicate, very few hearing people actually understand it. Using gloves fitted with flex sensors, touch sensors, gyroscopes and accelerometers (as well as some solar cells to increase battery life), the EnableTalk team has built a system that can translate sign language into text and then into spoken words using a text-to-speech engine. The whole system connects to a smartphone over Bluetooth.
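The team hasn’t published its code, but the pipeline described above — raw sensor readings matched against a library of known gestures, with the winning label handed to a text-to-speech engine — could be sketched roughly like this. All names, vectors and thresholds here are hypothetical, a minimal nearest-neighbor illustration rather than the team’s actual recognizer:

```python
import math

# Hypothetical gesture library: each label maps to a template vector of
# normalized sensor readings (flex, touch, gyroscope, accelerometer).
GESTURE_LIBRARY = {
    "hello":     [0.9, 0.8, 0.1, 0.0, 0.2, 0.1],
    "thank you": [0.2, 0.3, 0.9, 0.7, 0.1, 0.4],
}

def euclidean(a, b):
    """Distance between two equal-length sensor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading, library=GESTURE_LIBRARY, threshold=0.5):
    """Return the label of the closest template, or None if no
    template is within the match threshold."""
    best_label, best_dist = None, float("inf")
    for label, template in library.items():
        dist = euclidean(reading, template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None

# A reading close to the "hello" template classifies as "hello"; the
# recognized text would then be passed on to a text-to-speech engine.
print(classify([0.85, 0.8, 0.15, 0.05, 0.2, 0.1]))
```

In practice the gloves would stream readings continuously over Bluetooth and the phone would segment them into gestures before matching, but the core idea — compare, score, pick the best label — stays the same.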
The team has built a number of prototypes and tested them with sign-language users in Ukraine. The idea for the project, said team member Osika Maxim, came from interacting with hearing-impaired athletes at the group’s school.
The few existing projects that come close to what EnableTalk is proposing generally cost around $1,200 and usually have fewer sensors, use wired connections and don’t come with an integrated software solution. EnableTalk, on the other hand, says that the hardware for its prototypes costs somewhere around $75 per device.
Besides the cost, though, another feature that makes this project so interesting is that users can teach the system new gestures and modify the ones the team plans to ship in a library of standard gestures. Given the high degree of variation among sign languages, which, like spoken languages, also have regional dialects, this will be a welcome feature for users.
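Teaching the system a new or modified gesture could be as simple as recording a few samples and averaging them into a template that overwrites (or extends) the standard library. Again, this is a hypothetical sketch of the idea, not the team’s implementation:

```python
# Hypothetical user-training step: average a few recorded sensor
# samples into a template and store it under the given label,
# replacing the stock entry if one exists.

def teach(library, label, samples):
    """Record a new or modified gesture as the mean of its samples."""
    n = len(samples)
    template = [sum(values) / n for values in zip(*samples)]
    library[label] = template
    return template

# Stock library ships with a default "yes"; the user re-records it
# to match their own regional variant of the sign.
library = {"yes": [0.9, 0.1, 0.1]}
teach(library, "yes", [[0.8, 0.2, 0.0], [0.6, 0.2, 0.2]])
print(library["yes"])  # the averaged template replaces the default
```

Averaging several samples smooths out the natural variation between repetitions of the same sign, which matters when every user signs slightly differently.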
This being a Microsoft competition, the system obviously mostly uses Microsoft technology. But as the EnableTalk team pointed out, Windows Phone 7 doesn’t give developers access to the Bluetooth stack, so the current version actually runs on Windows Mobile, the predecessor to Windows Phone that even most people at Microsoft would rather not think about anymore.