
  (Source: TriStar Pictures)
Humanitarian group predicts war crimes and worse if robot AIs are trained to target and kill humans

Thus far, no nation has produced a fully autonomous robotic soldier.

I. Human Rights Watch Warns of Robotic War Crimes

However, many observers fear we are creeping toward an era in which automated killing machines are a staple of the battlefield.  The U.S. and other nations have been actively developing land, air, and sea unmanned vehicles.  Most of these machines are imbued with some degree of artificial intelligence and operate in a semi-autonomous fashion, but they currently have a human operator in the loop who remains (mostly) in control.

But experts fear that within 20 to 30 years artificial intelligence and military automation will have advanced to the point where nations consider deploying fully automated war robots to kill their enemies.

International humanitarian group and war-crimes watchdog Human Rights Watch has published a 50-page report entitled "Losing Humanity: The Case Against Killer Robots", which calls on world governments to institute a global ban on autonomous killing robots, similar to current prohibitions on the use of chemical warfare agents.

Current generation war robots, like the MAARS robot, have a human operator in the loop.
[Image Source: Wired]

Comments Steve Goose, Arms Division director at Human Rights Watch, "Giving machines the power to decide who lives and dies on the battlefield would take technology too far.  Human control of robotic warfare is essential to minimizing civilian deaths and injuries.  It is essential to stop the development of killer robots before they show up in national arsenals.  As countries become more invested in this technology, it will become harder to persuade them to give it up."

II. Ban the 'Bots

The proposal, co-endorsed by the Harvard Law School International Human Rights Clinic, also calls for a prohibition on the development, production, and testing of fully autonomous war robots.

The groups address the counter-argument -- that robotic warfare saves soldiers' lives -- by arguing that it makes war too convenient.  They contend that an "autocrat" could turn cold, compassionless robots on his own civilian population -- something it would be much harder to convince human soldiers to do.

Countries could also claim their cyber-soldiers "malfunctioned" to try to get themselves off the hook for war crimes against other nations' civilians.

And of course, science fiction fans will recognize the final concern -- that there could be legitimate bugs in the AI that cause the robots to miscalculate a proportional response to violence, fail to distinguish between civilian and soldier, or -- worst of all -- "go Terminator" and turn on their fleshy masters.

Comments Mr. Goose, "Action is needed now, before killer robots cross the line from science fiction to feasibility."

Sources: Human Rights Watch [1], [2]



RE: fear this
By Misty Dingos on 11/21/2012 8:29:07 AM , Rating: 2
OK, because I am nice and I like to help folks out, I am going to clear some things up for you.

1) Isaac Asimov was writing a story when he came up with the "3 Laws". It is a work of fiction.
2) He did that because he wanted to make money and feed himself, put clothes on his body and have a place to sleep.
3) His fictional construct is not going to be used EVER in robots that defend nations.
4) Why is 3 true? Because it makes no sense.
5) Someday, soon, a government will decide that a weapons platform (robot) with the right software will be able to save the lives of the people they are sworn to protect by taking the lives of the people they are not. The decision will boil down not to "should we have a man in the loop" but "can we." And when they can't put a human in the loop, they will put the system on kill without consulting one.
6) Will this lead to a devaluation of human life? The cynic in me says: how can we devalue it any more? Children are killed by their fathers because they had the misfortune of being raped. People are tortured for months to obtain information that could be gained in hours, with no pain and suffering, with simple, safe chemical interrogation. So no, I don't think robots killing people will devalue human life. We may not be at rock bottom, but we haven't risen nearly as far as some of the self-important, sanctimonious hypocrites would like you to think.

Well anyway I am glad I could clear that up for you. Oh and have a happy Thanksgiving if you live in the USA.

RE: fear this
By jimbojimbo on 11/21/2012 10:58:45 AM , Rating: 2
In response to your #4:
It makes plenty of sense! You don't want your $2 billion robot walking off a cliff. It's in the books; read them.

In fact, rule #3 became an issue in one story because one of the robots was programmed to stress that rule more heavily (it cost so much) and got into a dilemma with rule #2. Fascinating, the number of complex and intriguing stories Asimov was able to write from those three rules.

RE: fear this
By tamalero on 11/22/2012 12:08:37 PM , Rating: 3
Your point #5 is wrong on so many levels, it's not even funny.

Why? Because it just requires the government to declare someone a "combatant," even if they are defending their own country from abuse (and the US has done a lot of invasions for "interests").

RE: fear this
By ShaolinSoccer on 11/24/2012 11:27:28 AM , Rating: 2
Just the US?

