

Autonomous and semi-autonomous robots like this stealthy Boeing X-45C may face new legal repercussions  (Source: Boeing)

iRobot's war-robot is one of the autonomous robotic soldiers that may eventually appear on the battlefield, legality pending. Could such a machine commit atrocities?  (Source: iRobot)

Another deadly autonomous bot is the Foster-Miller MAARS (Modular Advanced Armed Robotic System). The MAARS implements advanced technology to eliminate friendly fire problems.  (Source: The Wired Danger Room)
Researchers debate where the fault divides between the operator and the machine

The U.S. military is working very hard to develop autonomous robotic warriors.  The U.S. is not alone -- the technology is the wave of the future in warfare, and is being pursued by dozens of countries worldwide.  While expensive, weapons-toting robots can provide deadly accuracy and protect a nation's human soldiers.

Already, the U.S. has extensively utilized unmanned aerial vehicles in the war in Iraq, for both surveillance and offensive strikes, with drones such as the Hunter UAV launching deadly missile strikes against enemy hideouts.  SWORDS robots, developed for the Army and armed with machine guns, patrol the streets in Iraq.  Meanwhile, a semi-autonomous Bradley Fighting Vehicle, named the Black Knight, is under development in the U.S.  Not to be left out, the Air Force says it wants unmanned heavy bombers by 2020.

The key feature shared by all these robotic killers, whether deployed or under development, is that they need a human to pull the trigger.  However, there is growing sentiment in military circles that robots should eventually be developed to be fully autonomous -- fighting and killing enemies without human intervention.  Exactly when and how such robots could become legal, and the troubling moral issues they raise, were topics of discussion at the Royal United Services Institute's conference, "The Ethics of Autonomous Military Systems", a summit of international scientists and military officials held on February 27 in London.

The question "Can a robot commit a war crime?" was raised during the conference.  Such a concern -- that robots might malfunction and target civilians or friendly soldiers -- remains a frightening thought to many military men.

English barrister and engineer Chris Elliot carefully explained his thoughts on the legality of autonomous robotic weapons systems in terms of international criminal and civil law.  He points out that beyond a certain point the robot's engineers can no longer reasonably be held culpable, and the blame for errors resulting in catastrophic loss of life may come to rest on the machine that committed the assault.  He states, "We're getting very close to the point where the law may have to recognize that we can't always identify an individual - perhaps an artificial system can be to blame."

The idea of a robot being charged with murder raises provocative questions about punishment and the fairness of such measures.  Elliot did not back down from taking hard stances on other issues.  He made it plain that there is currently a clear legal burden on humans who choose to deploy systems lacking sufficient judgment.  He stated, "Weapons intrinsically incapable of distinguishing between civilian and military targets are illegal."

Elliot stated that robots should only be allowed to autonomously wage war when they pass a "Military Turing Test."  He explains, "That means an autonomous system should be no worse than a human at taking decisions [about valid targets]."

The original Turing test, devised by computing pioneer Alan Turing, holds that if a human judge conversing with both a machine and another human cannot tell which is the man and which is the machine, then the machine has achieved intelligence.

Elliot could not say when or how such a test could or would be administered, stating simply, "Unless we reach that point, we are unable to [legally] deploy autonomous systems. Legality is a major barrier."

Bill Boothby, of the UK Ministry of Defense, offered a slightly different perspective, arguing that if the situation were carefully controlled, an autonomous fighting machine would not need to be as intelligent as Elliot's test requires.  Boothby hopes that by lowering the requirements, robots could assist on the battlefield sooner rather than later.  He describes this stance, stating, "There is no technology today that can make that kind of qualitative decision about legitimate targets ... [but] it may be possible to take precautions in the sortie-planning phase that enable it to be used within the law."

Boothby argued that, in a way, human operators might be no better, as they might simply "rubber stamp" a robot's targeting decisions.  Boothby's comments appealed to many military commanders who want earlier deployment of robots such as the SWORDS autonomous fighters or Foster-Miller's MAARS combat robot, which implements advanced technology to reduce friendly-fire incidents.

The difference of opinion between Elliot and Boothby reflects society's mixed feelings about deploying independently operating killing machines in warzones.  Many fears persist in the public mind -- both realistic ones, based on practical assessment of current limitations, and fantastic ones, fueled by popular culture such as the Terminator movies, which depict a future in which cold-blooded lethal robots have turned upon mankind.  The issue is sure to grow only more contentious with time and technological advances.



Comments

RE: saving lives?
By Duwelon on 2/29/2008 10:08:33 PM , Rating: 2
What happens when someone comes after the "good" son, who is holding a loaded Colt .45 in his hand? This person is hell-bent on robbing him and raping his girlfriend. He is in a drug-induced haze, barely knows what he's doing, and can't be reasoned with.

What then? Should the good son shoot? Should he let his girlfriend get raped and his money stolen?


RE: saving lives?
By sfi786 on 2/29/08, Rating: 0
RE: saving lives?
By Duwelon on 2/29/2008 11:18:34 PM , Rating: 2
You may have read some history books, but it definitely wasn't about world history (as in the real-life planet Earth). Death and suffering didn't begin when the USA was born. The USA has held back the floodgates for over 100 years and helped close them when they opened.


RE: saving lives?
By sfi786 on 2/29/2008 11:48:24 PM , Rating: 1
As much as I, as an American, would like to believe what you believe, what you describe is not history; it is called whitewashed history. Most of the time we slap ourselves to make ourselves the victim, so we either jump into other conflicts or just start a fight. My direction is to approach history with the intention of not repeating it. We do not need another Cold War. We do not need hatred of long-lost Red China; we need a new direction where we start looking at ourselves before judging somebody else. Sometimes getting slapped is not a bad thing if it wakes you up from a bad dream.


RE: saving lives?
By roadhog74 on 3/3/2008 3:43:44 AM , Rating: 2
Ahh, the mythical drug-induced rape scenario.

Honestly, you could make up any scenario you want to excuse any situation you want.

If you live in South Africa (at least as of a few years ago), a weapon was highly recommended. Better not to live there, though.

But as for me, I learned aikido; I don't need a weapon. Being in a situation that requires a weapon is always bad odds.

And while we're on it, why can't the girl have a weapon? Why should she need a big manly man to protect her?
















Copyright 2014 DailyTech LLC.