



Autonomous and semi-autonomous robots like this stealthy Boeing X-45C may face new legal repercussions  (Source: Boeing)

iRobot's war robot is one of the autonomous robotic soldiers that may eventually appear on the battlefield, legality pending. Could such a machine commit atrocities?  (Source: iRobot)

Another deadly autonomous bot is the Foster-Miller MAARS (Modular Advanced Armed Robotic System). The MAARS implements advanced technology to eliminate friendly fire problems.  (Source: The Wired Danger Room)
Researchers debate where fault lies between the operator and the machine

The U.S. military is working very hard to develop autonomous robotic warriors.  The U.S. is not alone -- the technology is the wave of the future in warfare, and is being pursued by dozens of countries worldwide.  While expensive, weapons-toting robots can provide deadly accuracy and protect a nation's human soldiers.

Already, the U.S. has made extensive use of unmanned aerial vehicles in the war in Iraq, for both surveillance and offensive strikes, with drones such as the Hunter UAV launching deadly missile strikes against enemy hideouts.  SWORDS robots, developed for the Army and armed with machine guns, patrol the streets of Iraq.  Meanwhile, a semi-autonomous Bradley Fighting Vehicle, named the Black Knight, is under development in the U.S.  Not to be left out, the Air Force says it wants unmanned heavy bombers by 2020.

The key feature shared by all of these robotic killers, whether deployed or under development, is that they need a human to pull the trigger.  However, there is growing sentiment in military circles that robots should eventually be made fully autonomous, fighting and killing enemies without human intervention.  Exactly when and how such robots could be made legal, along with the troubling moral issues they raise, were topics of discussion at the Royal United Services Institute's conference, "The Ethics of Autonomous Military Systems", a summit of international scientists and military officials held on February 27 in London.

The question "Can a robot commit a war crime?" was raised during the conference.  Such a concern -- that robots might malfunction and target civilians or friendly soldiers -- remains a frightening thought to many military men.

English barrister and engineer Chris Elliot carefully laid out his thoughts on the legality of autonomous robotic weapons systems under international criminal and civil law.  He pointed out that at a certain point the robot's engineers can no longer reasonably be held culpable, and the blame for errors resulting in catastrophic loss of life may come to rest on the machine that committed the assault.  He stated, "We're getting very close to the point where the law may have to recognize that we can't always identify an individual - perhaps an artificial system can be to blame."

The idea of a robot being charged with murder raises provocative questions about punishment and the fairness of such measures.  Elliot took equally firm stances on other issues.  He made it clear that there is already a legal burden on humans who choose to deploy systems lacking sufficient judgment, stating, "Weapons intrinsically incapable of distinguishing between civilian and military targets are illegal."

Elliot stated that robots should only be allowed to autonomously wage war when they pass a "Military Turing Test."  He explains, "That means an autonomous system should be no worse than a human at taking decisions [about valid targets]."

The original Turing test, devised by computer pioneer Alan Turing, holds that if a human judge, conversing with both a machine and a real person, cannot tell which is which, then the machine has achieved intelligence.

Elliot could not say when or how such a test could or would be administered, stating simply, "Unless we reach that point, we are unable to [legally] deploy autonomous systems. Legality is a major barrier."
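Elliot gave no procedure for administering such a test, but its acceptance criterion can be sketched in code. The following is a purely illustrative Python sketch, not any real evaluation standard: all function names, the toy scenario data, and the pass/fail rule (robot error rate no worse than the human baseline on the same scenarios) are assumptions for illustration.

```python
# Hypothetical sketch of a "Military Turing Test" acceptance check:
# the system passes only if it is no worse than a human at deciding
# whether a target is valid. Names and criterion are illustrative.

def error_rate(decisions, ground_truth):
    """Fraction of target decisions that disagree with ground truth."""
    wrong = sum(1 for d, t in zip(decisions, ground_truth) if d != t)
    return wrong / len(ground_truth)

def passes_military_turing_test(robot_decisions, human_decisions, ground_truth):
    """Pass iff the robot misjudges targets no more often than the human baseline."""
    return error_rate(robot_decisions, ground_truth) <= error_rate(human_decisions, ground_truth)

# Toy scenario set: True = legitimate military target, False = civilian.
truth = [True, False, True, True, False]
human = [True, False, True, False, False]   # one mistake
robot = [True, False, True, True, False]    # no mistakes

print(passes_military_turing_test(robot, human, truth))  # True
```

A real test would of course need a far larger and adversarially chosen scenario set, and would weight mistakes by their consequences rather than counting them equally.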

Bill Boothby, of the UK Ministry of Defence, offered a slightly different perspective: if the situation were carefully controlled, an autonomous fighting machine would not need to be quite as intelligent as Elliot's test would require.  Boothby hopes that by lowering the requirements, robots could assist on the battlefield sooner rather than later.  He described this stance, stating, "There is no technology today that can make that kind of qualitative decision about legitimate targets ... [but] it may be possible to take precautions in the sortie-planning phase that enable it to be used within the law."

Boothby also argued that human operators might, in a way, be no better, as they might simply "rubber stamp" a robot's targeting decisions.  Boothby's comments appealed to many military commanders who wish to see robots such as the SWORDS fighters or Foster-Miller's MAARS combat robot, which uses advanced technology to reduce friendly-fire incidents, deployed sooner.

The differing opinions of Elliot and Boothby reflect society's mixed feelings about deploying independently operating killing machines to warzones.  Many fears remain in the public mind, both realistic ones, based on practical assessment of current limitations, and fantastic ones, fueled by popular culture such as the Terminator movies, which depict a future in which cold-blooded lethal robots have turned upon mankind.  The issue is sure to become only more contentious with time and technological advances.



Comments

RE: Sounds familiar
By sonoran on 2/29/2008 3:09:08 PM , Rating: 3
quote:
Now we have to enforce the Three Laws of Robotics
We're talking about robots specifically designed to VIOLATE Asimov's laws of robotics here - these are robots designed to kill people. In any case, I suspect we're decades away from any AI being smart enough to tell friend from foe. In many cases humans have a hard time doing that now, until the shooting starts.


RE: Sounds familiar
By BVT on 2/29/08, Rating: -1
RE: Sounds familiar
By Hakuryu on 2/29/2008 3:42:58 PM , Rating: 2
Because at some point in our future we can envision robots of many types working in our world. Should we just throw away any thoughts on how they should work just because a science fiction writer thought it up first?

Star Trek helped to bring about the cell phone (communicator), why is it so hard to believe laws of robotics could come from another piece of fiction?


RE: Sounds familiar
By mikefarinha on 2/29/2008 5:56:50 PM , Rating: 5
quote:
Should we just throw away any thoughts on how they should work just because a science fiction writer thought it up first?


Tom Cruise would agree with you.


RE: Sounds familiar
By SoCalBoomer on 2/29/2008 8:06:48 PM , Rating: 2
I gotta say that Asimov was far far more than just a Sci-Fi writer. . . and FAR more brilliant than Hubbard. "Asimov was one of the most prolific writers of all time, having written or edited more than 500 books and an estimated 90,000 letters and postcards. His works have been published in nine of the ten major categories of the Dewey Decimal System (all except the 100s, Philosophy)" Ph.D in BioChem, and so much more. http://en.wikipedia.org/wiki/Isaac_Asimov

Compare to Hubbard - a dropout. bah.



RE: Sounds familiar
By pauluskc on 2/29/2008 4:56:29 PM , Rating: 2
quote:
In any case, I suspect we're decades away from any AI being smart enough to tell friend from foe.


Implanted RFID anyone? Dogs & Cats have it available today. Why not infantry? Problem solved.

A quick battlefield scan and anyone not with a tag - kablooie!


RE: Sounds familiar
By MozeeToby on 2/29/2008 5:14:46 PM , Rating: 4
1) Where do civilians fit into this plan of yours?

2) RFID has a very, very limited range, so it's not really applicable here; let's just ignore this and assume some other similar but more effective technology.

3) How are you going to prevent jamming and/or spoofing?

4) What stops an enemy from taking a killed/captured soldier's tag and doing whatever he wants with it? (OK, you say implanted, but in a war situation do you think that would stop someone?)


RE: Sounds familiar
By Christopher1 on 3/1/2008 4:12:24 AM , Rating: 1
Four extremely good points. The real thing that we should be moving toward is a world where war is abhorred and that anyone who talks about starting a 'war' is immediately thrown into the closest mental health ward or prison that we can find, military person or not.

War will not have to be a 'reality of the future' if we stop allowing people to convince us that war is necessary, which in almost all cases it is not necessary. The only 'war' in the past 100 years that I personally believe was necessary and just was World War II. Then, we were fighting a country whose leaders had fooled their people into mass murdering another group of people, and we actually needed to get rid of those leaders.


RE: Sounds familiar
By morton on 3/1/2008 6:28:31 AM , Rating: 1
quote:
anyone who talks about starting a 'war' is immediately thrown into the closest mental health ward


isn't that place called the White House?


RE: Sounds familiar
By brenatevi on 3/1/2008 8:08:25 PM , Rating: 2
quote:
War will not have to be a 'reality of the future' if we stop allowing people to convince us that war is necessary, which in almost all cases it is not necessary.
You are neglecting the social, economic, and environmental pressures that are the driving forces behind war. Consider the "freshwater problem," where certain areas are drought ridden, and just over the border is a nice river that could provide all of the water that they need. Said political entity asks its neighbor nicely for access to the river, but they decline. What is a political entity to do? They have to think about the health of their citizens, and one way to get access to the water is to invade their neighbor and take the land that would give them water by force.

"But the neighbor could have just given them access." The thing about that is the river might not be able to sustain the citizens of both political entities. So who has the right to decide? How do you decide which citizens live and die? Looking at the situation dispassionately, you could say the war decides, and war might possibly thin out the populations enough that the river can sustain both political entities.

Worldwide, we talk about population pressure, how there aren't enough resources to sustain all of the people. So here's a question for you: how do you get rid of all of the "excess" people? Are you going to pick and choose?

Basically, when you get down to it, Life is ROUGH. Life is a struggle at all levels, not just for humans. Just watch a nature show for evidence. To think that the struggle, that war is going to go away just because it's "Insane" is to ignore that fact.


RE: Sounds familiar
By pauluskc on 3/3/2008 9:40:37 AM , Rating: 2
1) It's autonomous. It kills, therefore it is. Unavoidable, unfortunate casualties, I'm afraid. Oh well. Hopefully the "live battlefield tours" company never kicks off. And we aren't sending autonomous killing robots to do guerrilla warfare in the neighborhood shopping mall. If so, then yes, this tech is a long way off.

2) Oh yeah, I forgot. Walmart inventory systems are the only possible use of RFID and the only known RFID chips in existence. Maybe IRID? H2OID? Perhaps they'll inject all our soldiers with radium or whatever, and then this killing machine will be able to see their whole silhouette. Not to mention how scared the others will be when their Geiger counters go off with just the scouts approaching. BTW: Were these robots supposed to be long-range howitzers? Look at the design. Sure, there are pretty impressive-looking guns on them, but these are going to be designed for more close-in incursions. "Send in the Robots!" That's the whole point of armor.

3) How do they do it at Walmart now? Sort it out later. That's not my plan, but probably theirs. Like the last question, it doesn't have to be RFID, but that concept is my point: an external (or internally implanted) device. Spoofing, etc. gets to be more and more expensive. Of course, if they carried a live database of known nearby tags and got captured, that could cause issues. But the codes could be reissued daily or hourly or whatever. And if Charlie's Angels' tapes can self-destruct in 5 seconds, why not these robots? Go ahead, capture it. We just trigger the handy self-destruct mechanism.

4) Very good point. That's why they would follow standard procedure and change the coding frequently, as in daily/hourly. Just like the SAC, nuke subs, etc. Or in a SecurID way, where it's mathematically calculated. 1024-bit codes allow for a lot of options; 2048-bit, even better. Old codes could be remembered and given a second chance. Maybe a 3rd, 4th, 5th, 6th, but definitely not a 700th or 701st chance. Especially over a geographically diverse area. I barely know anything about authentication; there are a few hundred (thousand?) experts more knowledgeable than me. OH YEAH - these robots replace human troops, so why are you even questioning the "captured soldier" problem?
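The rotating-code idea above can be sketched concretely. This is a hypothetical illustration of a SecurID/TOTP-style friend-or-foe check, not a fielded protocol: the shared secret, the hourly rotation window, the 8-byte code size, and the one-window grace period for stale codes are all illustrative assumptions.

```python
# Hypothetical sketch of a rotating-code IFF scheme: the soldier's tag
# and the robot both derive a short-lived code from a shared secret and
# the current time window, so captured codes expire on the next rotation.

import hmac
import hashlib
import struct

def iff_code(secret: bytes, t: int, window: int = 3600) -> bytes:
    """Code for the time window containing t (rotated hourly here)."""
    counter = struct.pack(">Q", t // window)
    return hmac.new(secret, counter, hashlib.sha256).digest()[:8]

def is_friendly(secret: bytes, presented: bytes, now: int,
                window: int = 3600, grace: int = 1) -> bool:
    """Accept the current code or up to `grace` previous windows,
    so a tag that just missed a rotation isn't fired on."""
    for back in range(grace + 1):
        expected = iff_code(secret, now - back * window, window)
        if hmac.compare_digest(expected, presented):
            return True
    return False

secret = b"reissued-daily-shared-secret"  # illustrative shared secret
now = 1_700_000_000
tag = iff_code(secret, now)
print(is_friendly(secret, tag, now))             # True
print(is_friendly(secret, tag, now + 2 * 3600))  # False: code expired
```

The constant-time `hmac.compare_digest` comparison matters even here: a naive `==` on codes can leak timing information to a spoofer probing byte by byte.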

My whole point is that it's not decades away. My own amateur self can ponder ideas for a few months and come up with something, let alone a few minutes.

My additional point is that I don't like this, not one bit. Who's to say the companies designing these robots aren't all on the take with the enemy?


"Mac OS X is like living in a farmhouse in the country with no locks, and Windows is living in a house with bars on the windows in the bad part of town." -- Charlie Miller














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki