MIT’s remote control robot system puts VR to work

MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has come up with a use for virtual reality headsets that goes beyond firing them up, checking out a new game, muttering “cool” and then putting them back in the closet after five minutes: controlling robots remotely for manufacturing jobs.

The CSAIL research project combines two things of questionable standalone utility into one with real potential, marrying telepresence robotics and VR for manufacturing work. The system gives the operator a number of ‘sensor displays’ to make it feel like they’re right inside the robot’s head on site, and even employs hand controllers to provide direct control over the robot’s grippers.
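For a rough sense of what that controller-to-gripper link might look like, here is a minimal sketch of a teleoperation loop. Every name in it (read_controller, send_gripper_command) is a hypothetical stand-in; the article doesn’t describe CSAIL’s actual interfaces.

    # Hedged sketch of a hand-controller-to-gripper relay loop.
    # All functions here are invented placeholders, not CSAIL's real API.
    import time

    def read_controller():
        # Placeholder: a real implementation would poll the VR runtime for
        # the controller's position, orientation, and trigger value.
        return {"position": (0.3, 0.1, 0.4), "trigger": 0.8}

    def send_gripper_command(state, closed):
        # Placeholder: a real implementation would send this to the robot.
        print(f"move gripper to {state['position']}, closed={closed}")

    def teleop_loop(hz=30, ticks=3):
        period = 1.0 / hz
        for _ in range(ticks):          # a few ticks for demonstration
            state = read_controller()
            # Treat a mostly-pulled trigger as "close the gripper".
            send_gripper_command(state, closed=state["trigger"] > 0.5)
            time.sleep(period)

    if __name__ == "__main__":
        teleop_loop()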

This system actually uses a simplified approach compared to a lot of 3D virtual simulated remote working environments: it just takes the 2D images captured by the robot’s sensors and displays one to each of the operator’s eyes. The operator’s brain does all the heavy lifting of inferring 3D space, which keeps the experience graphically light and actually decreases queasiness and other negative effects.
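Here is a minimal sketch, in the same hypothetical spirit, of that stereo pass-through idea: each of the robot’s two 2D camera frames goes straight to the matching eye, with no depth reconstruction in between. The RobotCameras and Headset classes are invented placeholders, not CSAIL’s real system.

    # Hedged sketch of the stereo pass-through approach described above.
    import numpy as np

    class RobotCameras:
        """Stand-in for the robot's stereo feed (two offset 2D images)."""
        def capture(self):
            left = np.zeros((480, 640, 3), dtype=np.uint8)   # placeholder frames
            right = np.zeros((480, 640, 3), dtype=np.uint8)
            return left, right

    class Headset:
        """Stand-in for a VR headset with one display buffer per eye."""
        def show(self, left_frame, right_frame):
            # A real headset SDK would submit one texture per eye;
            # here we just report the frame sizes.
            print("left eye:", left_frame.shape, "right eye:", right_frame.shape)

    def stream_once(cameras, headset):
        # No depth map, no 3D scene: each eye simply gets its own 2D image,
        # and the operator's brain fuses the pair into a sense of 3D space.
        left, right = cameras.capture()
        headset.show(left, right)

    if __name__ == "__main__":
        stream_once(RobotCameras(), Headset())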

CSAIL’s team calls its robot Baxter, and operating Baxter makes you feel as if you’re right inside its head. The system is designed to create a “homunculus model of mind,” or the feeling that you’re a small human sitting in the brain of a large humanoid robot, essentially piloting it – like a mech pilot might in, say, Guillermo del Toro’s Pacific Rim.

Despite CSAIL’s unconventional approach, participants in the study had a higher success rate than with more complex, state-of-the-art alternatives, and gamers in particular proved adept at this kind of remote control. MIT CSAIL even suggests that commercial use of the system could put some of the growing population of young, jobless gamers on a new career path.