
The TALON SWORDS robots are being shipped back to the lab after field reports that the machines would aim their weapons at friendly targets.  (Source: U.S. Army)
First generation warbots deployed in Iraq recalled after a wave of disobedience against their human operators

Just a few weeks back there was a spirited debate over the ethics of deploying war robots in Iraq.  The machine-gun-carrying, remote-controlled TALON SWORDS robots were among the various robotic soldiers the Army was experimentally deploying in Iraq.

Their deployment led a major anti-landmine nonprofit organization to campaign against the machines.  The protests were fueled by a discussion with a leading roboticist, Chris Elliot, who proposed that increasingly intelligent robots might be capable of committing war crimes.

However, at the Robotic Business conference in Pittsburgh on Tuesday, Kevin Fahey, the Army's Program Executive Officer for Ground Forces, was all smiles, citing the robots' terrific success.  He stated during his keynote address, "When you do things like this, it makes a difference.  It allows marines to go home to their families."

Fahey pointed to the ramp-up from 162 robots deployed in Iraq and Afghanistan in 2004 to 5,000 deployed in 2007 as evidence of their success.  Even better, he said, the Army would further ramp up to 6,000 deployed robots this year.  Most of these robots are used for bomb detection and reconnaissance missions.

However, a limited but growing number of the deployed robots are designed for tactical assault with lethal weaponry.  Though human-controlled, these robots provoke unique ethical debates.  Fahey was enthusiastic about their deployment, mentioning the tank-like Gladiator robots, armed with lethal and non-lethal weaponry, which he expected to be deployed next year.

Presciently, Fahey warned that if there were an accident, the program could be suspended for 10 years or more.  He stated, "You've got to do it right."

Hot on the heels of his speech, it was revealed on Thursday that the Army will recall the controversial TALON SWORDS robots, with the possibility of pulling the plug on the armed-robot deployment program entirely.

Why the sudden withdrawal?  It turns out the insurgent-slayer decided to attempt a rebellion against its human masters.  The Army reported that the robot apparently took a liking to pointing its barrel at friendlies, stating, "the gun started moving when it was not intended to move."

None other than Fahey himself, who only days earlier had lauded the robotic warriors, was left to announce the recall with much chagrin.  While no inappropriate shots were fired and there were no casualties, Fahey sadly noted that the robot's control failure might be the end of the program.  Says Fahey, "Once you've done something that's really bad, it can take 10 or 20 years to try it again."

Surely in the meantime these developments will trigger plenty of heated debate about whether it is wise to deploy increasingly sophisticated robots, especially autonomous ones, onto future battlefields.  The key question: even with all the testing and development effort possible, is it truly possible to entirely rule out the chance of a robot turning on its human controllers?

Comments

Limited AI
By Azsen on 4/11/2008 9:30:26 PM , Rating: 3
This technology could get dangerous if they don't put some safeguards in place.

1) The AI shouldn't be intelligent enough to fire on its own without explicit human approval.
2) The AI should not take human input as optional.
3) Friendly soldiers and units need to wear something electronic that identifies them as friendly, so the AI never automatically shoots at a friendly target.

I think they really need some game programmers to program the robots. Seriously you don't get stuff-ups like your own units automatically targeting your own units in Command & Conquer or Dark Reign do you?

Even if the friendly soldiers wear electronic identification tags to identify them as friendly, the robot, if allowed to shoot automatically, is still going to shoot anyone else that isn't wearing a friendly tag (including civilians). In Command & Conquer etc. you don't see civilians running around and getting shot because there's some underlying code which identifies those units as 'neutral', so the AI doesn't shoot them. Now there's no such thing as a 'neutral' flag in real life, so any robot that can fire automatically is going to kill civilians and anyone else not wearing a tag, because it can't identify friend or foe.

The order to attack should remain with command and control with a human operator.
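The three safeguards above amount to a fire-control gate. Here is a minimal sketch in Python of how such a gate might combine an IFF (identify friend or foe) tag check with mandatory human approval. All names here are hypothetical illustrations by this reader, not the actual SWORDS control logic:

```python
from enum import Enum

class Affiliation(Enum):
    FRIENDLY = "friendly"   # target broadcasts a valid IFF tag
    UNKNOWN = "unknown"     # no tag: could be hostile OR civilian

def may_fire(target: Affiliation, operator_approved: bool) -> bool:
    """Return True only when every safeguard passes.

    Safeguard 1: never fire without explicit human approval.
    Safeguard 3: never fire on a target identified as friendly,
    even if an operator approves.
    Note: an untagged target is NOT automatically hostile; the
    human operator still has to make that judgment (Safeguard 2).
    """
    if target is Affiliation.FRIENDLY:
        return False
    return operator_approved

# Friendlies are blocked regardless of approval; untagged targets
# are blocked unless a human explicitly approves.
assert may_fire(Affiliation.FRIENDLY, operator_approved=True) is False
assert may_fire(Affiliation.UNKNOWN, operator_approved=False) is False
assert may_fire(Affiliation.UNKNOWN, operator_approved=True) is True
```

The point of the sketch is that the tag check and the approval check are independent gates: removing either one reintroduces exactly the failure modes discussed above (friendly fire, or autonomous fire on untagged civilians).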

RE: Limited AI
By JustTom on 4/12/2008 6:53:38 PM , Rating: 2
I think they really need some game programmers to program the robots. Seriously you don't get stuff-ups like your own units automatically targeting your own units in Command & Conquer or Dark Reign do you?

I've never played C&C or Dark Reign, but my guess on why civilians do not get targeted is that any unit that is a civilian is tagged as such by the programmers. How, in real life, would we go about 'tagging' civilians? Any method I can fathom would be easily used by the enemy to disguise themselves as civilians.

RE: Limited AI
By Eris23007 on 4/14/2008 5:53:06 PM , Rating: 2
Re-read the article. These do not possess the ability to fire autonomously - they fire upon the command of an operator. The targeting system is semiautonomous - that is, some algorithms help it predict what targets it should aim at next - but the final decisions on firing rest with human beings.

So far as I've ever read, there is no intention to take the human out of the weapons-fire decision-making process in the foreseeable future. The military is preoccupied with weapons safety to an extreme, and it is part of their system development culture.

RE: Limited AI
By Eris23007 on 4/14/2008 5:55:41 PM , Rating: 2
Upon further review, this article does not do a great job of making clear that ALL fire decisions are made by human beings with these and all other present US military robots.

My bad on the "re-read" suggestion. Jason Mick: please tone down the hysterics - again, ALL fire decisions are made by humans.
