If a robot in combat accidentally kills a civilian, who is to blame?
This isn't as straightforward a question as it sounds. A team of scientists presented a study at the International Conference on Human-Robot Interaction showing that although robots don't have free will, people sometimes treat them as if they do.
The researchers had 40 undergraduate students play a scavenger hunt game with a human-like robot named Robovie. The robot was controlled remotely, but it appeared autonomous to the students.
In the game, the participants had two minutes to find objects in the room from a list. All of them found at least the minimum number of objects (seven) needed to win the $20 prize, but Robovie claimed they had found only five.
That's when people started arguing with the robot, even accusing it of lying and cheating about the count. In the study, 65 percent of the participants said the robot was at least partly to blame.
A scavenger hunt is a far cry from combat, but the study showed that as robots get better at mimicking human interaction, people will hold them morally accountable, at least to a point. Robots have already been built with the capacity to make ethical decisions; as long ago as 2009, researchers at Georgia Tech developed ethical governors for military robots. But Peter Kahn, a University of Washington associate professor of psychology and the study's lead author, noted that the laws of armed conflict haven't kept up with these innovations.
Even if one decided that human operators were always responsible, the study suggests that the political fallout from a robotic massacre could complicate matters: people may understand intellectually that robots simply follow their programming, but that doesn't mean they react to them that way.
The use of drones in Pakistan has already been roundly criticized. Drones have been used to attack bases said to be used by terrorists, and in several cases the strikes have killed civilians. So far, the military robots in use aren't making tactical decisions, but the situation is changing rapidly, and that demands a close look at how we program them.
Credit: Human Interaction with Nature and Technological Systems Lab / University of Washington