



Autonomous and semi-autonomous robots like this stealthy Boeing X-45C may face new legal repercussions  (Source: Boeing)

iRobot's war robot is one of the autonomous robotic soldiers that may eventually appear on the battlefield, legality pending. Could such a machine commit atrocities?  (Source: iRobot)

Another deadly autonomous bot is the Foster-Miller MAARS (Modular Advanced Armed Robotic System). The MAARS implements advanced technology to eliminate friendly fire problems.  (Source: The Wired Danger Room)
Researchers debate where the fault divides between the operator and the machine

The U.S. military is working hard to develop autonomous robotic warriors, and it is not alone -- the technology is the wave of the future in warfare and is being pursued by dozens of countries worldwide.  While expensive, weapons-toting robots can provide deadly accuracy while keeping a nation's human soldiers out of harm's way.

Already, the U.S. has made extensive use of unmanned aerial vehicles in the war in Iraq, both for surveillance and for offensive strikes, with drones such as the Hunter UAV launching missiles into enemy hideouts.  SWORDS robots, fielded by the Army and armed with machine guns, patrol the streets of Iraq, while a semi-autonomous Bradley Fighting Vehicle, the Black Knight, is under development in the U.S.  Not to be left out, the Air Force says it wants unmanned heavy bombers by 2020.

The common feature among all these robotic killers, whether deployed or still under development, is that they need a human to pull the trigger.  However, there is growing sentiment in military circles that robots should eventually be made fully autonomous -- fighting and killing enemies without human intervention.  Exactly when and how such robots could legally be fielded, and the troubling moral issues they raise, were the topics of discussion at the Royal United Services Institute's conference, "The Ethics of Autonomous Military Systems", a summit of international scientists and military officials held on February 27 in London.

The question "Can a robot commit a war crime?" was raised during the conference.  Such a concern -- that robots might malfunction and target civilians or friendly soldiers -- remains a frightening thought to many military men.

Chris Elliot, an English barrister and engineer, carefully explained his thoughts on the legality of autonomous robotic weapons systems under international criminal and civil law.  He points out that at a certain point a robot's engineers can no longer reasonably be held culpable, and the blame for errors resulting in catastrophic loss of life may come to rest on the machine that committed the assault.  He states, "We're getting very close to the point where the law may have to recognize that we can't always identify an individual - perhaps an artificial system can be to blame."

The idea of a robot being charged with murder raises provocative questions about punishment and the fairness of such measures.  Elliot did not shy away from other hard stances.  He made clear that there is already a legal burden on humans who choose to deploy systems lacking sufficient judgment, stating, "Weapons intrinsically incapable of distinguishing between civilian and military targets are illegal."

Elliot stated that robots should only be allowed to autonomously wage war when they pass a "Military Turing Test."  He explains, "That means an autonomous system should be no worse than a human at taking decisions [about valid targets]."

The original Turing test, devised by computing pioneer Alan Turing, holds that if a human judge conversing with both a machine and another human cannot tell which is which, the machine has achieved intelligence.

Elliot could not say when or how such a test could or would be administered, stating simply, "Unless we reach that point, we are unable to [legally] deploy autonomous systems. Legality is a major barrier."

Bill Boothby of the UK Ministry of Defence argued from a slightly different perspective: if the situation were carefully controlled, an autonomous fighting machine would not need to be quite as intelligent as Elliot's test would require.  Boothby hopes that by lowering the bar, robots could assist on the battlefield sooner rather than later.  He describes this stance, stating, "There is no technology today that can make that kind of qualitative decision about legitimate targets ... [but] it may be possible to take precautions in the sortie-planning phase that enable it to be used within the law."

Boothby also argued that human operators might, in practice, be no better, since they might simply "rubber stamp" a robot's targeting decisions.  His comments appealed to many military commanders who want earlier deployment of robots such as the new SWORDS fighters or Foster-Miller's MAARS combat robot, which uses advanced technology to reduce friendly-fire incidents.

The difference of opinion between Elliot and Boothby reflects society's mixed feelings about deploying independently operating killing machines to warzones.  Many fears remain in the public mind, both realistic ones, based on practical assessment of current limitations, and fantastic ones, fueled by popular culture such as the Terminator movies, which depict a future in which cold-blooded lethal robots have turned on mankind.  The issue is sure to become only more contentious with time and technological advances.



Comments



Scary times ahead
By Mojo the Monkey on 2/29/2008 3:41:29 PM , Rating: 2
I wonder if the general public has really stopped and thought about the ramifications of robotic warfare. Essentially, advanced nations being able to wage war with "rogue nations/factions" by just sending in robo-extermination forces.

Doesn't the risk of loss of a friendly life give some validation and weight to the decision to go to war? It's one thing to convince a nation that a cause is worth going to war, and potentially losing lives, over (ignore the present political climate/debates, just speaking generally) - but quite another to just push a button with no accountability.

And don't even get me started on the "no better than a human" decision-making. Shouldn't we strive for "far superior to a human"? Let's move forward, people.

I just foresee a turbulent future ahead if we really allow robots to do everything. We're already half-way there.




RE: Scary times ahead
By CU on 2/29/2008 3:58:42 PM , Rating: 1
There will still be accountability with going to war. You just will not lose as much friendly life going to war with a robot fighting force.

We should move forward, but that does not mean we should not use them when they are as good as us. I would rather see a robot blown up than a friend or family member.

We use attack dogs now in the police and armed forces, and I think I can determine who is friendly in some cases better than they can. Do you think we should not use them either?

What if we bio-engineer something to fight for us, call it an upgrade to the attack dog?


RE: Scary times ahead
By MozeeToby on 2/29/2008 5:20:11 PM , Rating: 5
I've quoted it before, and I will quote it again...

"It is good that war is so terrible, lest we grow too fond of it."

When we remove human suffering from war, we increase the chances that we will enter into it in the future. Of course, putting robots into the fight on our side only eliminates our side of the suffering, but history has shown that in times of war the enemy is often not seen as human.


RE: Scary times ahead
By Spuke on 2/29/2008 5:41:01 PM , Rating: 5
Begun, the clone wars have...


RE: Scary times ahead
By logaldinho on 3/1/2008 7:53:37 AM , Rating: 5
Skynet became self-aware at 2:14am EDT August 29, 1997


RE: Scary times ahead
By Spartan Niner on 3/1/2008 2:11:21 PM , Rating: 2
The singularity is approaching. Run for the hills!


RE: Scary times ahead
By roastmules on 3/3/2008 4:26:05 PM , Rating: 2
We must assure that no weapon ever becomes self-aware.


RE: Scary times ahead
By Mojo the Monkey on 3/1/2008 3:50:20 PM , Rating: 2
That was kind of my point. I mean, in a perfect world the robots would fight a "righteous war" of defense, from an aggressive, invading enemy. But in the real world, we deal in shades of grey. Nations possessing this technology are likely to be the aggressors or invaders.

And I'm not suggesting that we not move forward, merely calling attention to the fact that there is quite a stimulating debate to be had, but no one with authority seems to be having it.


RE: Scary times ahead
By borowki on 3/1/2008 9:04:39 PM , Rating: 3
In a perfect world, we would not have morons who think the existence of shades of grey implies the absence of black and white.


RE: Scary times ahead
By Christopher1 on 3/1/08, Rating: -1
RE: Scary times ahead
By kyp275 on 3/1/2008 5:02:30 AM , Rating: 3
too bad in order to do that you'd need all humans to agree with one another as to a universal standard of right and wrong, which will never ever happen.

Hell, I can think of one thing right now that I'll never agree with you on, I'm sure many others know what I'm talking about.


RE: Scary times ahead
By borowki on 3/1/2008 7:26:07 AM , Rating: 3
quote:
not to overthrow a 'dictatorship', not to 'put in democracy' (which will come if the people actually want it enough)


Fuzzy wuzzy is a dictatorship.
Fuzzy wuzzy allows its people to choose the system of government they want.
Fuzzy wuzzy is really a democracy.


RE: Scary times ahead
By sonoran on 2/29/2008 4:37:32 PM , Rating: 2
quote:
Doesn't the risk of loss of a friendly life give some validation and weight to the decision to go to war?
Indeed, that's THE major factor that prevents there from being more wars - the fact that your own people will be dying, which always carries serious political and social consequences. Take that away and anyone with these capabilities is free to be as aggressive as they can financially (and politically) afford to be.

In the end all our discussion doesn't matter. Militaries WILL build these things once it's possible. How they get used will be up to men to decide.


RE: Scary times ahead
By CU on 2/29/2008 5:22:54 PM , Rating: 2
I don't believe that is THE major factor, simply because we can already wage war and take out governments with little if any friendly life lost, and the world is not in total conflict everywhere. You can do this with tactical nukes and other long-range bombardments. It is difficult to claim and hold ground without human soldiers, but not impossible with current technology.

I also believe humanity has more good people than bad, and that most people care about others and do not want to just kill for money.


RE: Scary times ahead
By kontorotsui on 2/29/2008 10:19:35 PM , Rating: 1
quote:
And don't even get me started on the "no better than a human" decision-making. Shouldn't we strive for "far superior to a human"? Let's move forward, people.


If robots become "far superior to a human", they won't fight a war anymore.


RE: Scary times ahead
By Ringold on 2/29/2008 11:13:28 PM , Rating: 2
quote:
Essentially, advanced nations being able to wage war with "rogue nations/factions" by just sending in robo-extermination forces.


Am I the only one that doesn't see this as a new issue? How are conventional forms of destruction fundamentally different from the nuclear forms of automated mass destruction we've had since Eisenhower in the form of ICBMs? Someone somewhere in America gives a command, a series of actions take place that result in the launch of automated ballistic missiles, which then travel across the Earth to deploy their payload, all without risking a single American life.

The risk comes in retaliation from another major nuclear power, though a non-nuclear power couldn't respond at all. Likewise with this new-age technology; a major power could respond and, at the very least, make an attack extremely expensive. A minor power could only roll over and pray for it to stop.


RE: Scary times ahead
By crystal clear on 3/1/2008 9:05:05 AM , Rating: 2


"Death is the penalty we all pay for the privilege of life."


RE: Scary times ahead
By Cr0nJ0b on 3/1/2008 11:03:59 AM , Rating: 4
I would agree with this point. In my view, the person or group that deploys such a weapon would be responsible for its actions, not the engineer and certainly not the weapon itself.

The problem is that only in very rare cases are the generals ever held to account for "mistakes" that their weapons make. Collateral damage is an accepted fact of war, so if a general were to send a dumb robot into a building... and that robot kills every man, woman, and child inside... I think that the general would see it as a successful mission, albeit with some level of collateral damage. That's what I'm most afraid of. And why not have a human behind the trigger at some secure location? At least then there would be a chance to reduce the damage. The only issue you have to contend with then is the added cost of such a system.

I personally don't believe that war is necessary, but that's not the topic for today. It's robots on the battlefield, and my stance is...

1) They are a legitimate form of weapon
2) They should always have a human at the trigger
3) If they are ever released autonomously, the general in charge should be considered the person at the trigger.
4) There should be some code of conduct that codifies these things, so generals will think twice before jumping to the easier solution.


RE: Scary times ahead
By crystal clear on 3/1/2008 3:02:22 AM , Rating: 2
quote:
I just foresee a turbulent future ahead if we really allow robots to do everything. We're already half-way there


With or without robots we had/have & will have a turbulent future.

Going back into history, we have had wars for generations & we will continue to have them, in every part of the world.

"if we really allow robots to do everything"

Robots are programmed by human beings for a specific use or purpose.

Robots are a creation of man whilst human beings are a creation of nature.

Man is an animal by nature & animals by nature go by the "programme" of survival of the fittest.

The desire of animals to control territory & females is a known characteristic.

quote:
Shouldn't we strive for "far superior to a human"? Let's move forward, people.


There is nothing far superior to a human being - we are the most superior form created, & robots are clones of human beings.

Robots were invented & manufactured by man with the explicit purpose of replacing man to do work ONLY, or to kill/destroy, or whatever purposes they were created for.

They can be programmed to save life, from surgery to manufacturing to bomb disposal to space exploration, or to kill/destroy life, property, etc. in order to SAVE LIFE.

Man decides what the robots should do! Simple as that.


Let's be realistic & practical!

I just returned from a business trip to Russia, where I picked up some Russian....

ROBOTA means WORK! That's what ROBOTS do!


RE: Scary times ahead
By crystal clear on 3/1/08, Rating: 0
"If a man really wants to make a million dollars, the best way would be to start his own religion." -- Scientology founder L. Ron. Hubbard














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki