
Tell Me Dave Lets You Train A Robot To Respond To Complex Commands


Sudo make me a sandwich, anyone? A new research project by a computer science team at Cornell University is using human volunteers to train robots to perform tasks. What makes it unique? They’re teaching robots to infer actions from complex, natural human commands. Instead of having to say “move arm left 5 inches,” they are hoping that, one day, robots will respond to “Make me some ramen” or “Clean up my mess.”

The commands are quite rudimentary right now and center mostly on loose requests like “boil the ramen for a few minutes” which, with enough processing, can be turned into a step-by-step set of commands. For example, in the video above a subject asks for an affogato, basically coffee poured over ice cream. The robot has learned the basic recipe and so uses what is at hand — a barrel of ice cream, a bowl, and a coffee dispenser — to produce a tasty treat for its human customer.
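To give a rough sense of that “loose request to step-by-step plan” idea, here is a toy sketch. The real Tell Me Dave system learns this mapping from crowdsourced demonstrations; the recipe table and step names below are invented purely for illustration.

```python
# Toy lookup from a vague request to an ordered plan of concrete steps.
# All recipes and step names here are hypothetical, not the researchers' data.
RECIPES = {
    "make me an affogato": [
        "take bowl",
        "scoop ice cream from barrel into bowl",
        "dispense coffee over ice cream",
        "serve bowl",
    ],
    "boil the ramen for a few minutes": [
        "fill pot with water",
        "place pot on stove",
        "turn stove on",
        "add ramen",
        "wait a few minutes",
        "turn stove off",
    ],
}

def plan(request):
    """Normalize a loose natural-language request and look up its plan."""
    key = request.strip().lower().rstrip(".!?")
    if key not in RECIPES:
        raise ValueError(f"no plan known for {request!r}")
    return RECIPES[key]
```

A hand-written table like this obviously can’t generalize the way the learned system does — the point is only that a single fuzzy sentence expands into an ordered sequence of low-level actions.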

We’re not quite at 100 percent accuracy yet. But, interestingly, the robot performed the right steps 64 percent of the time even when the commands changed and the entire scene was reorganized. “There is still room for improvement,” said the researchers.

“In order for robots to perform tasks in real world, they need to be able to understand our natural language commands. While there is a lot of past research that went into the task of language parsing, they often require the instructions to be spelled out in full detail which makes it difficult to use them in real world situations,” they wrote. “Our goal is to enable robots to even take an ill-specified instruction as generic as ‘Make a cup of coffee’ and be able to figure out how to fill a cup with milk or use one if it already has milk etc. depending on how the environment looks.”
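The researchers’ “cup of milk” example — fill an empty cup, or just use one that already has milk — comes down to grounding an abstract step in whatever the scene contains. Here is a hypothetical sketch of that idea; the function and scene representation are assumptions for illustration, not the authors’ actual learned model.

```python
# Hypothetical grounding of the abstract step "get a cup of milk" in a
# scene, in the spirit of the researchers' example. The scene format
# (a list of dicts with "type", "id", "contents") is invented here.
def ground_milk_step(environment):
    """Pick concrete actions depending on what the scene contains."""
    cups = [obj for obj in environment if obj["type"] == "cup"]
    if not cups:
        return ["find a cup"]  # nothing suitable in the scene yet
    # Prefer a cup that already contains milk; otherwise fill one.
    for cup in cups:
        if cup.get("contents") == "milk":
            return [f"grasp {cup['id']}"]
    chosen = cups[0]
    return [f"grasp {chosen['id']}", f"pour milk into {chosen['id']}"]
```

The same abstract instruction yields different action sequences for different scenes, which is exactly the environment-dependence the quote describes.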

These sorts of skills are of utmost importance for robots. To be truly useful in the field, they have to understand our most banal commands. For example, a mechanic’s helper robot could understand “Bring me a wrench for this bolt” and, using its video cameras and remote measuring tools, supply the right size of wrench.

Created by doctoral students Dipendra K Misra, Jaeyong Sung, and Kevin Lee, along with Professor Ashutosh Saxena, this is also one of the most nicely designed project websites I’ve seen in a while. Because the team is hoping you’ll play along with Tell Me Dave and teach the robot simulator a thing or two about cleaning up the living room, they’ve created an interactive environment that allows you to train a virtual robot to make food or change the channels on a TV.

The whole project is so whimsical it’s hard to believe that robots like this will soon be taking our Starbucks orders, caring for the infirm, and building our iPhones. I, for one, welcome our ramen-making robot overlords.