A Unique Fingertip Sensor Helps Robots Touch The World Around Them


Robots need love, too. That’s why MIT researchers have added a touch-force sensor to the robotic Baxter, allowing him to register gentle caresses, tender hand-holding, and the sense that he is loved and in love. Okay, not really. But now Baxter, a robot used in manufacturing to perform repetitive tasks, can carry out those tasks with a certain gentleness and grace thanks to the addition of force pads on his pinchy fingers.

Called GelSight, the system adds “unprecedented” sensitivity to a robot’s pincers. By offering feedback on how hard to squeeze, it allows the robot to plug in a USB charger or even handle eggs and the like. It works using a chamber covered in thin rubber and lit from the inside with colored LEDs. The rubber pad, which is coated with reflective paint, deforms around whatever it touches; the sensor reads that deformation as an image of the object’s shape and infers the force being applied.
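The trick of turning colored light on a deformed surface into geometry is essentially photometric stereo. Here’s a minimal sketch of the idea, assuming a simple Lambertian reflectance model and made-up LED directions (the real GelSight calibration is more involved and isn’t described in the article):

```python
import numpy as np

# Assumed directions of the three colored LEDs inside the sensor chamber
# (hypothetical values for illustration; all tilted, all pointing upward).
L = np.array([
    [0.50,  0.000, 0.866],   # red LED
    [-0.25, 0.433, 0.866],   # green LED
    [-0.25, -0.433, 0.866],  # blue LED
])

def surface_normals(rgb):
    """Recover per-pixel surface normals of the deformed rubber skin.

    rgb: (H, W, 3) array where each channel is the brightness seen
    under one LED. With a Lambertian model, I_c = albedo * (n . L_c),
    so the three channels give a 3x3 linear system per pixel whose
    solution points along the surface normal.
    """
    h, w, _ = rgb.shape
    g = np.linalg.solve(L, rgb.reshape(-1, 3).T).T  # g = albedo * n
    norm = np.linalg.norm(g, axis=1, keepdims=True)
    n = np.where(norm > 1e-8, g / norm, [0.0, 0.0, 1.0])
    return n.reshape(h, w, 3)

# A flat, undeformed pad reflects all three LEDs equally, so every
# pixel recovers the straight-up normal (0, 0, 1). An object pressing
# in would tilt the normals around its outline.
flat = np.full((4, 4, 3), 0.5)
normals = surface_normals(flat)
```

Integrating those normals gives a depth map of the indentation, which is where the shape and force estimates come from.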

“I got interested in touch because I had children,” said Edward Adelson, professor of vision science at MIT. “I expected to be fascinated by watching how they used their visual systems, but I was actually more fascinated by how they used their fingers. But since I’m a vision guy, the most sensible thing, if you wanted to look at the signals coming into the finger, was to figure out a way to transform the mechanical, tactile signal into a visual signal — because if it’s an image, I know what to do with it.”

In other words, instead of depending on thousands of tiny force sensors, the system “sees” the shape of the object and how close it is to the light sensors. The rubber also acts as a sticky finger pad.

The coolest thing is that the sensors have millimeter accuracy. To wit:

In Platt’s experiments, a Baxter robot from MIT spinout Rethink Robotics was equipped with a two-pincer gripper, one of whose pincers had a GelSight sensor on its tip. Using conventional computer-vision algorithms, the robot identified the dangling USB plug and attempted to grasp it. It then determined the position of the USB plug relative to its gripper from an embossed USB symbol. Although there was a 3-millimeter variation, in each of two dimensions, in where the robot grasped the plug, it was still able to insert it into a USB port that tolerated only about a millimeter’s error.
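The reason the 3-millimeter grasp variation doesn’t sink the insertion is that the sensor tells the robot exactly how far off the plug is, so the gripper can compensate. A toy sketch of that correction (the actual controller in Platt’s experiment isn’t described here; function and values are illustrative):

```python
def corrected_target(port_xy, plug_offset_xy):
    """Shift the commanded gripper position to cancel a sensed grasp offset.

    port_xy: where the USB port is, in mm.
    plug_offset_xy: where the plug sits relative to the gripper,
    as read off the embossed USB symbol by the tactile sensor.
    Aiming the gripper at (port - offset) lines the plug up with
    the port, even if the original grasp was millimeters off.
    """
    return (port_xy[0] - plug_offset_xy[0],
            port_xy[1] - plug_offset_xy[1])

# Plug grasped 3 mm off in x and -2 mm in y; port at (100, 50) mm.
target = corrected_target((100.0, 50.0), (3.0, -2.0))
# The gripper now aims at (97.0, 52.0), so the plug itself arrives at the port
# well inside the ~1 mm tolerance.
```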