
The TALON SWORDS robots are being shipped back to the lab after field reports that the machines would aim their weapons at friendly targets.  (Source: U.S. Army)
First generation warbots deployed in Iraq recalled after a wave of disobedience against their human operators

Just a few weeks back there was a spirited debate over the ethics of deploying war robots in Iraq.  The machine-gun-carrying, remote-controlled TALON SWORDS robots were among the various robotic soldiers the Army was experimentally deploying in Iraq.

Their deployment led a major anti-landmine nonprofit organization to campaign against the machines.  The protests were fueled by a discussion with a leading roboticist, Chris Elliot, who proposed that increasingly intelligent robots might be capable of committing war crimes.

However, at the Robotic Business conference in Pittsburgh on Tuesday, Kevin Fahey, the Army's Program Executive Officer for Ground Forces, was all smiles, citing the robots' terrific success.  He stated during his keynote address, "When you do things like this, it makes a difference.  It allows marines to go home to their families."

Fahey pointed to the ramp up from 162 robots in Iraq and Afghanistan deployed in 2004 to 5,000 robots deployed in 2007, as evidence of their success.  Even better, he said, this year the Army would further ramp up to 6,000 deployed robots.  Most of these robots were used in bomb-detection and reconnaissance missions.

However, a limited, but increasing, number of the deployed robots were designed for tactical assault with lethal weaponry.  While human controlled, these robots provoke unique ethical debates.  Fahey was enthusiastic about their deployment, mentioning the tank-like Gladiator robots, armed with lethal and non-lethal weaponry, which he expected to be deployed next year.

Fahey did warn, however, that if there were an accident, the program could be suspended for 10 years or more.  He stated, "You've got to do it right."

Hot on the heels of his speech, it was revealed on Thursday that the Army will recall the controversial TALON SWORDS robots, with the possibility of pulling the plug on the armed robot deployment program entirely.

Why the sudden withdrawal?  It turns out the insurgent-slayer decided to attempt a rebellion against its human masters.  The Army reported that the robot apparently took a liking to pointing its barrel at friendlies, stating, "the gun started moving when it was not intended to move."

None other than Fahey himself, who just days earlier had lauded the robotic warriors, was left to announce the recall with much chagrin.  While Fahey said that no inappropriate shots had been fired and there were no casualties, he stated sadly that the robot's control failure might be the end of the program.  Says Fahey, "Once you've done something that's really bad, it can take 10 or 20 years to try it again."

Surely in the meantime these developments will trigger plenty of heated debate about whether it is wise to deploy increasingly sophisticated robots, especially autonomous ones, onto future battlefields.  The key question: despite all the testing and development effort possible, is it truly feasible to entirely rule out the chance of a robot turning on its human controllers?

Comments

Curious on the Tracking Method..
By HyperTension on 4/12/2008 7:20:50 AM , Rating: 2
I'm curious as to how this system is able to differentiate between friend and foe. With RF you have a possibility of jamming / interference, as well as the enemy being able to cue into the signal and interrogate it.

Also, the bottom line is that at this time, those systems cannot deploy lethal / less-than lethal force until the human being behind the console "flips the toggle", and arms the system.

The scary thing is not that a system "tracked things he/it was not supposed to", but when we enable these systems to arm/deploy lethal force as needed outside of human control with no redundancy / fail-safe.

I am all for eliminating person/s that pose a defined threat to either myself, squad, family, or country, but as we as human beings advance ourselves, it seems an interesting dichotomy to preserve the life of those "pulling the trigger" vs. those on the receiving end. The more we place ourselves outside of the reality of ending an individual's life, the less we value life in general. War is NOT a video game where you can pick up a controller, get to the next objective, and save your progress. Generally, to get to the next objective, somebody has to die or have injury inflicted upon them, and that goes for both sides of the battle equation. As someone who has both inflicted and treated individuals in war, as well as dealt with the aftermath emotionally, this is concerning to me.

The next decade I look forward to with a bit of fear and apprehension. It's always easier to inflict your views / actions on an individual through a TV (or worse yet via an automated system) than to see it through your own senses and have to comprehend your actions.

By bigbrent88 on 4/12/2008 3:38:06 PM , Rating: 2
Nice post Hypertension. I'd also like to add that we are trying, through technology and other means, to make war easier on our country. Don't worry about our war with Iran, more servicemen will stay home while they wirelessly connect with roboguns on the battlefield. If war becomes easy then what is war? What is a conflict? Sounds like MGS4 to me.

"A lot of people pay zero for the cellphone ... That's what it's worth." -- Apple Chief Operating Officer Timothy Cook
Related Articles

Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki