
Human Rights Watch would prefer we not have to go back in time to stop killer robots


It’s a trope as old as video games themselves — the sentry gun, an unfeeling mechanical contrivance that fires on unsuspecting players as soon as they come within range. These enemies have been a staple of thousands of games, causing gamers of every stripe to throw down their controllers in frustration. Science fiction, too, is full of robotic killers like the Terminator, who wantonly disregard the first law of robotics as enshrined in author Isaac Asimov’s I, Robot: that a robot "may not injure a human being."

But today’s technology, with its electro-optical devices, advanced computers and robotic systems, makes it possible to bring these kinds of autonomous weapons systems into reality. For that reason, a team at Human Rights Watch has launched the Campaign to Stop Killer Robots.

Laugh all you like. It's a subject that even Stephen Hawking is thinking about.

Mary Wareham, advocacy director of HRW’s arms division, is deadly serious about stopping killer robots before it’s too late.

There’s a sense in Washington, especially in the defense community, that these things are inevitable.

"There’s a sense in Washington, DC — especially in the defense community, that these things are inevitable," Wareham told Polygon. "Some say that we should just get used to it. They’re irritated that we’ve come along now, that we’re starting to ask questions about this."

One of the weapons currently deployed that Wareham says blurs the line between science fiction and reality is called the SGR-1, an automated sentry gun built by a subsidiary of Samsung and fielded along the Demilitarized Zone between North and South Korea. The weapon uses infrared imaging to detect heat, as well as optics and video analytics to resolve human forms. It has the potential to fire on and kill people up to two miles away, even in the pitch dark. And Wareham says that making the device completely autonomous is a philosophical hurdle, not a technical one.


"It’s a stationary device that just sits there," Wareham said. "It could see some refugee trying to flee, or it could see a soldier. But right now we understand that it’s set to this configuration whereby if it sees something it alerts that base, and then a human looks through the camera at what the machine is seeing and says yes or no and fires it."

The SGR-1, Wareham said, is only the first of what promises to be a future filled with weapons systems that have the capability to operate autonomously. Just like the metal monsters that Skynet unleashed upon the world on the fictional Judgment Day portrayed in the Terminator movies, these kinds of weapons could become pervasive.

Human Rights Watch is concerned that autonomous weapons could become commonplace.

Even discussing them demands the creation of new definitions for modern weapons. Wareham describes three types of automated weapons systems. The first includes a human who is "in the loop" of the decision to fire. An example would be a remotely operated aerial drone, like those deployed by the U.S. in the Middle East and Africa. Here a human operator controls the weapon directly and ultimately makes the decision to fire.

A second type of weapon is one where humans are simply "on the loop," like the Samsung SGR-1. Here humans merely give the final approval for a machine to fire on a target that it has found on its own.

Finally, there is a third type of weapon system where a human is "out of the loop." Weapons that fire autonomously are the subject of the campaign. Currently these kinds of weapons are reserved for defensive measures, like reactive weapons that intercept anti-tank missiles faster than a human being could react. But turning an "in the loop" or an "on the loop" weapons system into one where humans are "out of the loop" is simply a matter of design, and can be achieved with existing technology. That, Wareham said, is what makes the campaign so important.


Technological trends, Wareham said, demonstrate ever-increasing autonomy in warfare and raise practical questions.

"What a lot of this technology has been driven by," Wareham said, "is a simple fact that a lot of these vehicles are flying over ever greater distances, ever faster. So fast that you cannot put a human in the cockpit. They just wouldn’t survive. And then they’re falling out of communications range when they’re going too far. And what’s the instructions to the device at that point? Does it set down? Does it complete the mission? Does it return to base? How do you program it? What do you program it to do? And if it completes the mission, who’s responsible for it if they get it wrong?"

This isn’t the first time that Human Rights Watch has tried to change the way war is waged. In the 1990s it led the campaign to ban landmines. Wareham herself worked with Jody Williams, who was awarded the Nobel Peace Prize in 1997 for her work to end the use of "victim-activated anti-personnel" mines. The result was an international treaty banning their use, signed by 162 countries.

Human Rights Watch helped negotiate an international treaty to ban landmines, so why not Terminators?

"That was the beginning of [HRW’s] arms division," Wareham said. "That’s followed with work on cluster munitions, or cluster bombs."

"We’ve kind of worked out a methodology with our advocacy whereby we like to work with other NGOs, and we like to work together in a broad-based coalition."

The Campaign to Stop Killer Robots currently has the support of 275 scientists, 70 faith leaders and 20 Nobel Peace laureates for a "pre-emptive ban." But so far, there's only one robotics manufacturer on their side.

In August the Canadian company Clearpath Robotics, a manufacturer of various wheeled vehicles with military applications, endorsed the campaign.

"Clearpath Robotics believes that the development of killer robots is unwise, unethical, and should be banned on an international scale," said Ryan Gariepy, the co-founder and chief technical officer of Clearpath. "We ask everyone to consider the many ways in which this technology would change the face of war for the worse."

So far, Wareham says, HRW has been unable to gain traction with larger robotics manufacturers like Google (which recently purchased Boston Dynamics) or iRobot. For now HRW’s arms division is focusing on changing governments’ perceptions of autonomous weapons systems. By changing the kinds of devices militaries want to field, or believe are legal to use in the field, Wareham believes HRW can change the kinds of devices that are manufactured.

The Campaign has the support of 20 Nobel Peace laureates.

"Ultimately, the governments are going to be the ones who will issue the regulations," Wareham said. "They are the ones who will issue the law and say this is what you can and cannot do. We’ll focus on the companies in that respect, and we welcome any statements or endorsements from anywhere at the moment."

"We’re under no illusions that all this running around, talking to all these diplomats is going to result in anything if we don’t do the hard yards at the national level and here in the capital," Wareham said. "That’s where the action’s gotta happen."

Correction: The article was edited for clarity in the number of countries that initially signed the mine treaty as well as the focus of the campaign generally.