
Rights Group Wants an International Ban on Killer Robots

Transcript
Nov 20, 2012

(Image source: Wikimedia Commons)

BY STEVEN SPARKMAN
ANCHOR CHRISTINA HARTMAN

Is it time to ban killer robots? A new international campaign headed by Human Rights Watch and the Nobel Women’s Initiative says it is.

In a new report, the rights group says high-tech militaries are developing robots with more and more autonomy — and may soon reach the point where robots decide who lives and who dies.

It might sound like science fiction, but the group says automated defense systems in use today already border on autonomous behavior, like a sentry robot deployed in South Korea that can detect when someone enters a restricted area.

“It then asks permission of a soldier back at base whether it should fire or not. If the soldier grants that permission, it shoots the individual. Our concern is that that permission may not always be required.”

The group raises two main arguments against fully autonomous weapons: decision-making and accountability.

Robots wouldn’t be able to distinguish between a civilian and a combatant, or understand the context of a battle. That makes it hard to trust them with the power to make their own decisions. (Video via Northrop Grumman Corp. / AIRBOYD)

And robots muddy the whole notion of accountability. Who is to blame if a robot commits a war crime? The manufacturer, the programmer, the government? (Video via Samsung Techwin)

Human Rights Watch says this report isn’t about drones, because drones still have a human pilot. But the group is looking ahead to the day when that isn’t the case. When will that be? One of the authors tells Democracy Now!:

“Most roboticists think it will take at least ten, maybe 20-30 years before these things might come online, though others think that more crude versions could be available in just a number of years.”

With the technology possibly decades away, the campaign may seem a little over the top. But a writer for VR-Zone says we’ve gotten out ahead of destructive technologies in the past.

“It is important to note that the banning of weapons in space was a serious issue in the early 1960s but many at the time thought the idea was a bit paranoid. Nevertheless, a treaty was made ... and now has over 100 nations agreeing to it.”

But not everyone agrees that so-called “killer robots” are a bad thing. The Economist’s digital editor Tom Standage argued there’s actually an ethical case for taking some moral decision-making out of human hands.

“If you could build robot soldiers, you’d be able to program them not to commit war crimes, not to commit rape, not to burn down villages. They wouldn’t be susceptible to human emotions like anger or excitement in the heat of combat.”

While there may be good points on both sides, Human Rights Watch is definitely right about one thing: these technologies are on the horizon. A writer for Slate says it’s time to have the discussion.

“Banning killer robots and other technologies may not be the solution, but as citizens of democratic states, it is both our right and our responsibility to consider whether the military advantages these technologies bring are worth the cost they may impose on our democratic order.”
