"RT @BorowitzReport Obama Sends Predator Drone to Pick Up Nobel Peace Prize" — Pele. Jack. Tipy. (@Jackandpele), December 10, 2009
“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, arms division director of Human Rights Watch, which just released an ominous new proposal to ban fully autonomous robots from war. The proposal stands in direct opposition to the Pentagon’s attempt to design war-crime-resistant robots. “Robot soldiers would not commit rape, burn down a village in anger or become erratic decision-makers amid the stress of combat,” as The Economist observes. The question is no longer just the philosophizing of science fiction: policymakers will soon have to answer whether robots can be more moral soldiers than humans.
Human Rights Watch’s damning new 50-page report pre-emptively concludes that, “to comply with international humanitarian law, fully autonomous weapons would need human qualities that they inherently lack. In particular, such robots would not have the ability to relate to other humans and understand their intentions.” Specifically, the report argues that robots cannot uphold the basic tenets of just war theory, namely, that non-combatants are off-limits and responses must be proportional (i.e., no nuclear strikes in response to a kidnapping). Such self-policing requires a sense of the “principles of humanity” and the “dictates of public conscience,” neither of which can be programmed.
The report notes that robotic weapons will soon be able to make life-or-death decisions without the command of a human. Automatic missile defense systems, such as Israel’s Iron Dome, have already been successfully battle-tested.
The U.S. Navy’s cutting-edge drone, the X-47B, will be able to land and refuel completely on its own, and has reportedly been designed “for eventual ‘combat purposes,’” according to the report. The Los Angeles Times warned that this could begin “a paradigm shift in warfare, one that is likely to have far-reaching consequences. With the drone’s ability to be flown autonomously by onboard computers, it could usher in an era when death and destruction can be dealt by machines operating semi-independently.”
On the other hand, The Telegraph reports that the U.S. Army has hired experts to build moral robot soldiers, safe from the psychoses of war. Indeed, proponents of robotic soldiers could point to tragic incidents, such as the alleged drunken rampage of U.S. Army Staff Sergeant Robert Bales, who purportedly murdered 16 Afghan villagers.
War is, ultimately, a human creation. The twisted question is whether greater human involvement makes war more or less humane.