‘A robot doesn’t have to shoot back,’ Rodney Brooks says of machines in the military

Rethink Robotics co-founder and CTO, former CSAIL director and all-around robot luminary Rodney Brooks joined the Disrupt New York stage this afternoon to tackle some complex questions, from robots' place in the living room to their role on the battlefield.

Brooks has a fair bit of experience in both categories as a co-founder of iRobot, whose product offerings have ranged from vacuuming to bomb disposal. And while his current company deals more in the realm of factory automation, a number of these ethical issues still clearly weigh heavily on the Australian roboticist.

It was a question about whether robots should be considered unfit for any human tasks that really caused Brooks to ponder their place in the world.

“This might be controversial,” he said, pausing momentarily. “A lot of people say we shouldn’t have robots making firing decisions in the military – whether to pull the trigger. We thought about that a lot at iRobot. And while iRobot has not sold any robots that do this, we came to the conclusion that a robot can afford to fire second.”

Sending robots onto the battlefield has, of course, been a long-standing subject of debate among those who spend time thinking about future technologies. And certainly iRobot's own origins as a DARPA-funded startup have made Brooks keenly aware of all the baggage such questions carry. But Brooks argued that, as with practically every other bold, forward-thinking question, there are several sides to consider.

After all, he explained, while there's good reason for concern around essentially constructing killing machines, robots are capable of, among other things, calmly assessing context. He continued, "If you're sending an 18- or 19-year-old kid – which is what the US does – and putting them in a high-stress situation at night in a place where they don't speak the language and they think they're getting shot at, they shoot back. A robot doesn't have to shoot back. There's an example where they can be much more cautious. So, anything you can come up with, you can probably find arguments both ways. They're never easy."

Rodney Brooks (Rethink Robotics) at TechCrunch Disrupt NY 2017

The question of how important human characteristics are in developing robots is similarly up in the air. While Rethink imbues its own robots like Baxter with certain human traits like arms and a face, he once again returned to the Roomba as an example of when exact mimicry isn’t necessary.

"The Roomba is definitely not humanoid, but people project a lot on it," Brooks explained. "We had 6,500 robots in Afghanistan and Iraq dealing with roadside bombs, but people named them and said they were their friends. Roombas you can buy clothes for. People project a lot onto these robots, even though they're not of human form."

Even so, Brooks explained, future home robots could certainly benefit from better adapting to their environments.


“When we’re putting robots in environments where people normally are, some level of biomimicry is important,” he added. “The big problem for home robots is, right now, Roombas are low to the ground, and our homes are designed for something that’s between five and six feet tall and skinny. So that’s where we’ll ultimately see home robots, is tall and skinny, because the home is designed for tall and skinny.”