



  (Source: TriStar Pictures)
Humanitarian group predicts war crimes and worse if robot AIs are trained to target and kill humans

Thus far, no nation has produced a fully autonomous robotic soldier.

I. Human Rights Watch Warns of Robotic War Crimes

However, many observers fear we are creeping toward an era in which automated killing machines are a staple of the battlefield.  The U.S. and other nations have been actively developing land, air, and sea unmanned vehicles.  Most of these machines are imbued with some degree of artificial intelligence and operate in a semi-autonomous fashion.  However, they currently have a human operator in the loop, (mostly) in control.

But experts fear that within 20 to 30 years artificial intelligence and military automation will have advanced to the point where nations consider deploying fully automated war robots to kill their enemies.

International humanitarian group and war-crimes watchdog Human Rights Watch has published a 50-page report entitled "Losing Humanity: The Case Against Killer Robots", which calls on world governments to enact a global ban on autonomous killing robots, similar to current prohibitions on the use of chemical warfare agents.

MAARS Gun Bot
Current generation war robots, like the MAARS robot, have a human operator in the loop.
[Image Source: Wired]

Comments Steve Goose, Arms Division director at Human Rights Watch, "Giving machines the power to decide who lives and dies on the battlefield would take technology too far.  Human control of robotic warfare is essential to minimizing civilian deaths and injuries.  It is essential to stop the development of killer robots before they show up in national arsenals.  As countries become more invested in this technology, it will become harder to persuade them to give it up."

II. Ban the 'Bots

The proposal, co-endorsed by the Harvard Law School International Human Rights Clinic, also calls for a prohibition on the development, production, and testing of fully autonomous war robots.

The groups address the counter-argument -- that robotic warfare saves the lives of soldiers -- by arguing that it makes war too convenient.  They argue that an "autocrat" could turn cold, compassionless robots on his own civilian population.  It would be much harder to convince human soldiers to do that.


Countries could also claim their cyber-soldiers "malfunctioned" to try to get themselves off the hook for war crimes against other nations' civilians.

And of course science fiction fans will recognize the final concern -- that there could be legitimate bugs in the AI which cause the robots to fail to calculate a proportional response to violence, to fail to distinguish between civilian and soldier, or -- worst of all -- to "go Terminator" and turn on their fleshy masters.

Comments Mr. Goose, "Action is needed now, before killer robots cross the line from science fiction to feasibility."

Sources: Human Rights Watch [1], [2]






fear this
By superstition on 11/20/2012 6:06:05 PM , Rating: 2
The scariest outcome for humanity is that the robots might force us to live logically.

That would end war entirely.




RE: fear this
By ClownPuncher on 11/20/2012 6:09:19 PM , Rating: 3
Maybe in derper stonerville.


RE: fear this
By Mitch101 on 11/20/2012 10:07:23 PM , Rating: 5
"I'm gonna drink 'til I reboot!"
-Bender


RE: fear this
By Loki726 on 11/20/2012 6:26:53 PM , Rating: 2
I've worked a little in AI and I don't worry about this.

The most promising techniques for AI (hierarchical neural networks,
PSO, etc) mimic human intelligence.

They are not constrained to follow logical rules like most computer programs.


RE: fear this
By Samus on 11/20/2012 8:08:01 PM , Rating: 2
The Cylon occupation attempted to have humanity follow a 'logical' lifestyle, but we still fought back. We don't like being told what to do, by each other or by robots.


RE: fear this
By StevoLincolnite on 11/20/2012 9:18:49 PM , Rating: 4
My favorite part about the Military is the Men in Uniform. Robots just don't do the same thing for me... :P

In all seriousness, you do have the 3 laws of robotics that all A.I. should be hard-coded with when they become "self aware", which are:

1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2) A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

It would then put all issues to rest, unless some evil mad scientist in his mother's basement makes a Droid army.
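The three laws above amount to a priority-ordered filter on candidate actions: each lower law yields to the ones above it. A minimal toy sketch (every class, field, and threshold here is invented for illustration; no real system encodes anything like this):

```python
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False     # action (or inaction) would injure a human
    ordered: bool = False         # a human ordered the robot to do this
    forbidden: bool = False       # a human ordered the robot NOT to do this
    endangers_self: bool = False  # action risks the robot's own existence

def permitted(a: Action) -> bool:
    if a.harms_human:                        # First Law: absolute
        return False
    if a.forbidden:                          # Second Law: obey human orders
        return False
    if a.endangers_self and not a.ordered:   # Third Law: yields to an order
        return False
    return True
```

Even this toy shows where Asimov mined his plots: the interesting cases are the ones where the flags conflict.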


RE: fear this
By delphinus100 on 11/20/2012 10:42:07 PM , Rating: 3
Of course, Asimov himself showed that you can get some strange and dramatic results, even within those self-imposed limits...

Hell, even the definition of 'harm' is sometimes up for grabs.


RE: fear this
By Visual on 11/21/2012 4:09:48 AM , Rating: 2
RE: fear this
By Misty Dingos on 11/21/2012 8:29:07 AM , Rating: 2
OK, because I am nice and I like to help folks out, I am going to clear some things up for you.

1) Isaac Asimov was writing a story when he came up with the "3 Laws". It is a work of fiction.
2) He did that because he wanted to make money and feed himself, put clothes on his body and have a place to sleep.
3) His fictional construct is not going to be used EVER in robots that defend nations.
4) Why is 3 true? Because it makes no sense.
5) Someday, soon, a government will decide that a weapons platform (robot) with the right software will be able to save the lives of the people it is sworn to protect by taking the lives of the people it is not. The decision will boil down to not should we have a man in the loop but can we. And when they can't put a human in the loop, they will put the system on kill without consultation.
6) Will this lead to a devaluation of human life? The cynic in me says how can we devalue it anymore? Children are killed by their fathers because they had the misfortune of being raped. People are tortured for months to obtain information that could be gained in hours with no pain and suffering with simple safe chemical interrogation. So no I don't think robots killing people will devalue human life. We may not be at rock bottom but we haven't risen nearly as far as some of the self-important, sanctimonious, hypocrites would like you to think.

Well anyway I am glad I could clear that up for you. Oh and have a happy Thanksgiving if you live in the USA.


RE: fear this
By jimbojimbo on 11/21/2012 10:58:45 AM , Rating: 2
In response to your #4:
It makes plenty of sense! You don't want your $2 billion robot walking off a cliff. It's in the books; read them.

In fact rule #3 became an issue in one story because one of the robots was programmed to stress that rule more because it cost so much, and it got into a dilemma with rule #2. Fascinating, the amount of complex and intriguing stories Asimov was able to write from those three rules.


RE: fear this
By tamalero on 11/22/2012 12:08:37 PM , Rating: 3
Your point #5 is wrong on so many levels it's not even funny.

Why? Because it just requires the government to declare someone a "combatant," even if they are defending their own country from abuse (and the US has carried out a lot of invasions for "interests").


RE: fear this
By ShaolinSoccer on 11/24/2012 11:27:28 AM , Rating: 2
Just the US?


RE: fear this
By jr9k on 11/21/2012 4:17:20 PM , Rating: 2
How long would it take to root such a droid?

Then you can install a Search & Destroy app ;)


RE: fear this
By inperfectdarkness on 11/21/2012 2:11:39 AM , Rating: 2
Good luck with the ban. We haven't been able to ban nukes either. And let's face it, just because country "A" takes the moral high-ground and opts not to use certain types of weapons, country "B" may be significantly less powerful than country "A" and may resort to every underhanded trick in the book. Not that this has happened recently or anything (IED's, suicide bombers, etc).


RE: fear this
By guffwd13 on 11/21/2012 1:46:17 PM , Rating: 2
The flaw in this statement is that you assume war is illogical. That concept is incredibly fallacious.

War is necessary for economic, social and biological reasons. The survival value of the latter two is more obvious, while the first is an abstraction of the typical human desire to always have more. If war is profitable, it will always exist. As will assassins, boxing, and concussions in the NFL. For social, just read 1984 and ask yourself why Oceania was always at "war" (or why we care so much about our favorite teams). The third is ironic. Survival of the fittest must continue in order for our species to continue. The only ways for that to happen are through epidemics and wars. Since we're blunting nature's natural cleansing ability (medicine), that leaves one thing: war. It is not fair, and it may not actually select the strongest, but limiting the human population through war allows resources to be distributed and managed better.

Also, don't forget that we are programmed by years of evolution that have influenced our instincts and learned social behaviors. We are "AI" too, because "intelligence" is just a word applied to reasoned thinking.


RE: fear this
By Argon18 on 11/21/2012 3:08:36 PM , Rating: 3
Robots will end war by forcing us to live logically? Where did you get that nonsense from?

Will they also end obesity by forcing us to eat healthy? Or end laziness by forcing us to work productively? And don't forget ending genetic disease, by forcing us to reproduce with computer-selected breeding partners? Maybe even ending crime, by terminating individuals who are predicted to commit it? Minority Report, anyone?

Anyone who gives his free will up voluntarily is a fool.


RE: fear this
By superstition on 11/22/2012 12:41:33 AM , Rating: 2
I knew the humorless warmongers would downrate.


Just wait
By Argon18 on 11/20/2012 5:57:02 PM , Rating: 2
It's only a matter of time before Skynet is fully operational. Where is Sarah Connor?




RE: Just wait
By DiscoWade on 11/20/2012 7:55:24 PM , Rating: 3
I, for one, welcome our new robotic overlords. I will welcome them with the words of the prophet Jeremiatic: "10010010001110011101010100111 2".


RE: Just wait
By havoti97 on 11/21/2012 1:06:33 AM , Rating: 2
I love the older Futurama episodes. The show was ahead of its time and people didn't like it.


RE: Just wait
By Bad-Karma on 11/21/2012 11:03:39 AM , Rating: 2
If you don't think lots of people liked it..."You can bite my shiny metal A**! "


No fully cognitive systems at all
By Shadowself on 11/20/2012 9:17:45 PM , Rating: 2
quote:
Most of these machines are imbued with some degree of artificial intelligence and operate in a semi-autonomous fashion. However, they currently have a human operator in the loop, (mostly) in control.
As someone who has recently worked on "cognitive agents" for several of the DoD's unmanned systems, I can say unequivocally that these statements are hogwash. Pure and simple.

The level of autonomy of the most sophisticated DoD systems? They route themselves from point A to point B and can autonomously choose their communications channels based upon the operating environment. Or other equally mundane actions. None of them have any -- and I do mean *any* -- level of control over any operational equipment including, and especially, weaponry.

There is no human "mostly" in control. There is a human in the loop of all the more sophisticated systems. Most of them have a person at the controls of every function above the most rudimentary (thermal controls so the system does not overheat, power controls so batteries don't get drained by non-essential equipment, etc.). Even for things like the autonomous routing systems, there is a way to take absolute control away from the unmanned vehicle almost instantly (though over a satellite link it can take a quarter of a second or more to take control of these simple operations).




RE: No fully cognitive systems at all
By tayb on 11/20/2012 10:36:09 PM , Rating: 2
There are sentry guns already in existence that can detect an object and shoot it down with no human interaction. Right now a human has to give the command, but that isn't a required barrier; it's there for safety. There was an article on this site a few days ago talking about Iron Dome. No human intervention there. The machine sees a rocket, determines the trajectory, and either lets it fall or attempts to shoot it down. It's not iRobot or Terminator advanced, but it's definitely a precursor.

When people talk about autonomous weapons everyone always thinks of AI or super advanced robots. They don't have to be super advanced. The technology already exists and is deployed. You could even make the case that land mines lack human intervention and therefore would fall under this umbrella.

The problem here is that true AI machines are inevitable. An AAV would be able to easily outmaneuver even the most skilled pilot simply due to the physical limitations of the human body. Any country that isn't deploying AAVs with advanced intelligence will essentially cede the skies to foreign powers. I'd say we are a few decades away from this reality, but it's probably not as far out as people would like to believe.
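The selective engagement tayb describes -- predict the trajectory, then either let the rocket fall or intercept -- can be sketched in a few lines. This is a hypothetical toy (a simple drag-free ballistic arc and a made-up defended zone), not the real system's logic:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def impact_x(x0, y0, vx, vy):
    """Predicted horizontal impact point of a simple ballistic arc."""
    # time until the projectile returns to ground level (y = 0)
    t = (vy + math.sqrt(vy * vy + 2.0 * G * y0)) / G
    return x0 + vx * t

def engage(x0, y0, vx, vy, zone=(900.0, 1100.0)):
    """Fire only when the predicted impact lands inside the defended zone."""
    lo, hi = zone
    return lo <= impact_x(x0, y0, vx, vy) <= hi
```

A rocket predicted to land on open ground is left to fall; only one headed for the defended zone triggers an intercept -- which is exactly the "machine decides" step at issue.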


By Shadowself on 11/21/2012 1:03:20 PM , Rating: 2
By your description all of the defensive systems in the opening sequences of the movie Raiders of the Lost Ark fit within the automated weapon category. If you can make the case for land mines why not those systems?

Simple purely reactive systems (even the Iron Dome system or Phalanx system) are not addressing the primary issue. The issue is cognition and what level of cognition "crosses the line"?

When does a single purpose system that has zero level of "choice" versus one that can actively sense its environment and choose which inputs to ignore, which inputs to evaluate, which responses to make from those evaluations, and finally decide what are the allowable limits of response cross the line? It's the last two of those cognition steps that cause heartburn in most people in the field.
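The escalation described above -- choosing which inputs to trust, evaluating them, choosing a response, and deciding its allowable limits -- can be made concrete with a toy decision pipeline. Every field name and threshold below is invented for illustration:

```python
def choose_response(tracks, max_force):
    """Toy four-step decision pipeline; returns a force level, 0 = hold fire."""
    # 1. choose which inputs to evaluate (ignore low-confidence tracks)
    considered = [t for t in tracks if t["confidence"] >= 0.9]
    # 2. evaluate them (only closing objects count as threats)
    threats = [t for t in considered if t["closing_speed"] > 0]
    # 3. choose a response from the evaluation
    if not threats:
        return 0
    desired = max(t["closing_speed"] for t in threats) // 100 + 1
    # 4. decide the allowable limit of that response -- the step that
    #    causes the most heartburn when no human is in the loop
    return min(desired, max_force)
```

A purely reactive system performs only something like step 2; it is the last two steps, done without a human, that the commenter flags as crossing the line.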


really?
By lagomorpha on 11/20/2012 7:12:31 PM , Rating: 3
quote:
They argue that an "autocrat" could turn cold, compassionless robots on killing their own civilian population. It would be much harder to convince humans to do that.


I see someone hasn't paid much attention in history class.




RE: really?
By sl68 on 11/20/2012 8:03:51 PM , Rating: 2
quote:
I see someone hasn't paid much attention in history class.


He was busy playing hacky sack on the quad.


Judgment Day is Inevitable
By vladio on 11/21/2012 1:38:38 PM , Rating: 3
"You can only postpone it.
Judgment Day is inevitable."
--VladimirJosephStephanOrlovsky




By Morvannec on 11/21/2012 2:52:41 AM , Rating: 2
What I find most interesting is that a human rights group is actively saying, "Send people into battle instead."

Yah but it's for the "greater good"! :/




By jimbojimbo on 11/21/2012 10:19:45 AM , Rating: 2
quote:
It would be much harder to convince humans to do that.
From my experience in the Marine Corps for five years and two tours in Iraq, I would say there are a lot of people out there just aching to kill someone, looking for a reason to do it that won't get them in trouble. Robots aren't like that. It's all logic based on their programming.




Arr'nold
By btc909 on 11/21/2012 2:01:04 PM , Rating: 1
Well at least someday a bunch of Arr'nold's will be running around killing people. "Hey that's the guy from those movies a 100+ years ago, wha, wait, is that a laser rifle, oh crap, pew, pew, pew."


















Copyright 2014 DailyTech LLC.