
Would it be immoral to deploy future autonomous versions of war robots, such as the U.S. military's human-controlled SWORDS robot, shown here?  (Source: Gizmodo)
Campaigners say deploying autonomous killer robots is not a smart idea

A group that has long focused its lobbying efforts on stopping the proliferation of land mines is turning its attention to a surprising new target: war robots.  In the first known instance of a non-government group protesting war robot technology, the London-based charity Landmine Action hopes to ban autonomous killing robots in all 150 countries currently bound by the land mine treaty.

While all machine-gun-packing robots are currently human-controlled, the U.S. Department of Defense has expressed interest in deploying autonomous robot warriors on the battlefield in the near future.  Last month DailyTech reported that Noel Sharkey, a robot researcher at Sheffield University, expressed controversial concerns about the ethics of autonomous war robots, stating that such robots might be capable of "war crimes".

Sharkey's speech inspired Landmine Action to take action against war robots.  Richard Moyes, Landmine Action's director of policy and research, says the fight against autonomous killers is not a policy switch.  He notes the organization has already fought cluster bombs, which use infrared sensors and artificial intelligence to decide when to detonate.  Landmine Action believes that taking the targeting decision out of human hands and putting it in a machine's is a deadly mistake.

Moyes explains, "That decision to detonate is still in the hands of an electronic sensor rather than a person.  Our concern is that humans, not sensors, should make targeting decisions. So similarly, we don't want to move towards robots that make decisions about combatants and noncombatants."

The organization hopes to sway the International Committee of the Red Cross and Amnesty International, two leading organizations in war ethics lobbying.  Landmine Action is spurred on by Sharkey's comments, including his statement that, "We should not use autonomous armed robots unless they can discriminate between combatants and noncombatants. And that will be never."

Many in the robotics community agree with Sharkey's sentiment.  Peter Kahn, a social-robotics researcher at the University of Washington, states that he believes Sharkey to be correct and hopes that robotics researchers will stop taking government money to design war robots.  He argued to his colleagues at a conference on Human-Robot Interaction in Amsterdam, "We can say no.  And if enough of us say it we can ensure robots are used for peaceful purposes."

However, most in the robotics community feel this is impossible as most robotics research is funded by the Defense Department.  Says one anonymous U.S. researcher at the conference, "If I don't work for the DoD, I don't work."

Some robotics researchers disagree with Sharkey and feel that robots could make the perfect warrior.  Ronald Arkin, a robot researcher at Georgia Tech, says that robots could make even more ethical soldiers than humans.  Arkin is working with the Department of Defense to program ethics, including the Geneva Convention rules, into the next generation of battle robots.  He says that robots will react more ethically because they have no desire for self-preservation, no emotions, and no fear of disobeying their commanders' orders when those orders are unethical.

Many, however, remain skeptical of the wisdom of deploying increasingly intelligent robots into war zones across the world.  They point to the many science fiction scenarios which depict humanity at war with killer robots of its own creation.  While this may seem farfetched, the issue of war robots is becoming a serious one that the world's brightest minds are trying to grapple with.

Comments

Quite simple
By FITCamaro on 3/31/2008 10:17:33 AM , Rating: 5
We should not use autonomous armed robots unless they can discriminate between combatants and noncombatants. And that will be never.

Uh, it's actually quite simple. If you're shooting at the robot, you're a combatant. If you're not, you're not a combatant.
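Taken literally, the rule in this comment is a one-line classifier. A toy sketch (the dictionary fields are hypothetical, purely for illustration) makes the objection raised further down the thread concrete: an armed target that simply holds its fire gets waved through.

```python
def is_combatant_naive(target):
    """The rule taken literally: a combatant is whoever shoots at the robot."""
    return target.get("firing_at_robot", False)

# The failure mode: someone shouldering a missile launcher but not firing
# at the robot itself is classified as a noncombatant.
hidden_threat = {"armed": True, "firing_at_robot": False}
print(is_combatant_naive(hidden_threat))  # False
```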

RE: Quite simple
By FITCamaro on 3/31/2008 10:20:23 AM , Rating: 5
Besides, before an autonomous fighting robot was ever deployed, it would go through extensive testing and simulated situations. And the good thing about a robot is it doesn't know the difference between simulated and real. You give it a training weapon (paintball gun or laser gun) and tell it to act like it should in the field. The results tell the rest.

RE: Quite simple
By BMFPitt on 3/31/2008 10:27:16 AM , Rating: 5
But what if it gets struck by lightning and starts thinking on its own?

RE: Quite simple
By monitorjbl on 3/31/2008 10:29:53 AM , Rating: 3

RE: Quite simple
By threepac3 on 3/31/2008 11:02:25 AM , Rating: 3
Short Circuit!!

RE: Quite simple
By HOOfan 1 on 3/31/2008 12:41:24 PM , Rating: 2
Numba 5 ALIVE

RE: Quite simple
By therealnickdanger on 3/31/2008 2:57:06 PM , Rating: 3
Wouldn't you like to be a pepper too?

RE: Quite simple
By AnnihilatorX on 3/31/2008 2:09:20 PM , Rating: 2
What about a software bug?

Whose fault would that be then?
The programmer?

RE: Quite simple
By GaryJohnson on 3/31/2008 6:16:55 PM , Rating: 2
What happens in other industries when hardware and software problems have deadly consequences?

For example, in the Ford/Firestone tire tread separation fiasco, who ended up being at fault?

RE: Quite simple
By Nik00117 on 3/31/2008 11:29:00 AM , Rating: 1
I agree 100%. And you forgot that when a human does training, he knows several things for a fact:
1. He's not in any real serious danger; he knows he will not die, at least that's typically the case.
2. He also knows roughly what to expect.

A robot is different; you're taking out the human-nature aspect of it, which is something to be desired. Want to know why? Well, let's say you're on a patrol in Iraq, your convoy gets blown up, and there was a girl in the Humvee that got hit and she's now dead. You two secretly had a love-relationship type deal going on, and you're the commander of the squadron, so you order the surrounding houses to be cleared. Your Marines get pumped up and go on a mass killing spree.

Robots won't have that issue; their buddy robot gets blown up, they'll mark it down as a $1.2 million loss and keep fighting.

However, you're asking for another stack of problems: it's a software/hardware combination. Granted, programmed and developed by some of the best of the best, but far from perfect. One could ask: what if something went wrong, in the programming or the hardware? What if one robot went on a killing spree due to a software fault? Sure, it's unlikely, and sure, we would have thought of it. But I, having planned many network upgrades, computer installations, upgrades across multiple machines, development of programs, etc., know for a fact there typically is that one variable which was never quite addressed, usually created by two or more actions which were designed to prevent or do something else. It really is a complicated science, at this scale anyway. So to assume that this system will be fault-free is foolhardy at best, insane at worst. And many may argue, "Well, if out of 50,000 missions it messes up once, that's OK." Well, OK, but of course the American public won't hear about the 49,999 good missions. So needless to say, that one mistake would get it branded a failure.

Do I believe it's a good idea to have robots fighting our wars?

YES! I do. If we could just have "robot" wars that cost a lot of money, I'd be happy. At least humans aren't being killed.

Do I believe robots which act upon their own accord are a good idea?

YES! But wait! We aren't anywhere close to developing that. Let's wait a couple decades before we truly consider it.

RE: Quite simple
By MrBlastman on 3/31/2008 11:54:48 AM , Rating: 2
We are decades away for sure before a true AI is realized with any high degree of safety.

We can, though, currently employ robotic units that are remotely controlled by humans. I think that is where we are at now, and they show great potential in this area.

They might not completely replace manpower as of yet due to SA (situational awareness) constraints, but they can dramatically reduce risk exposure to human life in tasks that carry a high probability of cessation of existence.

RE: Quite simple
By theapparition on 3/31/2008 12:03:37 PM , Rating: 2
If we could just have "robot" wars that cost a lot of money, I'd be happy. At least humans aren't being killed.

And who are the robots killing? Other robots? What happens when the losing side runs out of robots? Do they surrender or have to fight the robots? When does it end? Not so clear cut, is it?

Don't get me wrong, I absolutely am in favor of battle robots, but the point of war is to be "messy". So messy that it is undesirable, such that both sides would rather compromise than go down the "messy" path.

RE: Quite simple
By MrBlastman on 3/31/2008 12:11:00 PM , Rating: 2
I always thought the point of war was to swiftly dispatch your enemy with as little loss to your own side as possible?

Brute force to yield ultimate societal control. If you slaughter all your enemy (army and citizens), what do you have left to rule over? Burnt-out cities and demolished technology? Perhaps resources, if the region provides them, but I'd rather have a "persuaded" populace to do my bidding without weakening my own.

Messy or not, if the Geneva Convention did not disallow it, I am sure assassination or political subterfuge through outside pawn placement (think governmental control asserted through direct influence or the surreptitious implantation of candidates) would be sought after more.

Absolute destruction creates unnecessary chaos.

RE: Quite simple
By FITCamaro on 3/31/2008 12:35:09 PM , Rating: 2
Assassination needs to make a comeback. A sniper can achieve what it takes months to do with a ground war. They sure don't have any problems sniping us.

RE: Quite simple
By snownpaint on 3/31/2008 2:33:27 PM , Rating: 2
Black Bag operations.
The wars without war.

RE: Quite simple
By Master Kenobi on 3/31/2008 2:36:44 PM , Rating: 2
The Western world likes to put "rules" in place that have no place in a war. War is by definition without rules.

RE: Quite simple
By Christopher1 on 3/31/08, Rating: -1
RE: Quite simple
By MrBlastman on 3/31/2008 4:51:40 PM , Rating: 3
I think your statement applies to Islamists in the middle east equally.

They have been raising quite a stink lately about their beliefs in many countries throughout the world.

European nations are changing overnight due to these happenings.

So, before you go and point your finger at America as the ultimate evil due to our influential nature... consider first that there are others outside of us that would like to do the same as well.

I'm not saying we're innocent (we aren't, we've really screwed up actually), but...

Wars are a necessary evil with our current genetic makeup. Until violent tendencies have been completely wiped from our chromosomes, man will continue to inflict violence on his own kind.

It is sad, but it is reality.

Once the reality is accepted, you can make a choice:

a. To either do nothing and face repercussions.

b. To take action via violent, massively destructive actions.

c. To take action through swift, decisive means that target purely the evil itself with minimal collateral damage and minimal involvement.

d. To attempt to solve said course with reason, negotiation and logic.

Unfortunately d. is unattainable by the majority of the world for any significant duration of time due to genetic difficulties, so we end up resorting to a, b or c.

RE: Quite simple
By charliee on 3/31/08, Rating: 0
RE: Quite simple
By See Spot Run on 3/31/2008 5:22:34 PM , Rating: 4
Ummm... Korea was a U.N.-sanctioned war, as was the first Iraq war. So I don't think it's fair to say those wars were only because the U.S. went for the military option first, when the U.N. clearly had some part to play in "legitimizing" them.

Vietnam was also not started by America. The French were there first and got the whole thing rolling. And if history serves me, I do believe Australia had some part to play before America did.

And I can think of a lot of good from the Cold War: 1) I'm not speaking Russian; 2) I'm still a capitalist.

RE: Quite simple
By AmyM on 3/31/2008 11:53:24 PM , Rating: 5
We simply need to stop trying to force our 'morality' and social values on the rest of the world and leave them alone to do whatever they want, as long as they don't bother us and are not forcing their 'values' on someone else.

Sounds good on paper (or a computer screen) but that will never happen as long as there is more than one person in the world. It is human nature to try to influence people. You did it when you wrote your comment; and I’m doing it now.

Whether you’re attempting to negotiate the price of a new car or simply sitting in a movie theatre, you are either influencing someone or someone is influencing you. In the case of the car negotiation, your willingness to abruptly end the negotiations may influence the salesman to accept your offer. Here you have consciously influenced the other person. As for sitting in a movie theatre, the presence of others around you has influenced your demeanor. Had no one been sitting in front of you, you might have chosen to prop your feet up on the seat back. In this case you have been influenced unconsciously.

While this is a fairly quick and simple example of how people are constantly influencing others or being influenced, I think you can extrapolate this to relationships on a global basis.

As for the claim that “we Americans are the ones who are trying to do that in the first place”, it’s simply a matter of being the most influential country in the world. This is no different than when a large corporation uses its economic leverage to influence American cities into providing economic and financial incentives for locating a facility in their town; it becomes beneficial to all involved.

RE: Quite simple
By pmonti80 on 4/1/2008 3:19:35 AM , Rating: 2
What you say is so wrong in so many ways that it's impressive you managed to write that.

RE: Quite simple
By charliee on 3/31/2008 6:24:07 PM , Rating: 2
God's enemies and his warfare upon them:

Mosiah 3:19
…the natural man is an enemy to God…

James 4:4
…whosoever therefore will be a friend of the world is the enemy of God.

Doctrine and Covenants 63:32-34
I, the Lord, am angry with the wicked; I am holding my Spirit from the inhabitants of the earth.
I have sworn in my wrath, and decreed wars upon the face of the earth, and the wicked shall slay the wicked, and fear shall come upon every man;
And the saints also shall hardly escape…

Revelation 11:6
These have power to shut heaven, that it rain not in the days of their prophecy: and have power over waters to turn them to blood, and to smite the earth with all plagues, as often as they will.

Isaiah 24:6
Therefore hath the curse devoured the earth, and they that dwell therein are desolate: therefore the inhabitants of the earth are burned, and few men left.

RE: Quite simple
By GlassHouse69 on 3/31/08, Rating: -1
RE: Quite simple
By BMFPitt on 3/31/2008 1:01:39 PM , Rating: 3
Enter robots and, jeez, a retaliation of 200 "9/11s" is completely justified.
Explain again how killing civilians who had no say in what happened between your government and theirs is justified?

RE: Quite simple
By Amiga500 on 3/31/2008 12:18:41 PM , Rating: 2
Don't get me wrong, I absolutely am in favor of battle robots, but the point of war is to be "messy". So messy that it is undesireable, and such that both sides would rather compromise than to go down the "messy" path.

That is a very good point.

Countries (like the USA, Russia, the UK, etc.) will be more willing to start aggressive wars of occupation as there are fewer American/Russian/British etc. casualties, without much regard for the casualties of the country being invaded.

Then there is also another issue. A robot does not have a conscience; it will not say no to some idiot deciding to commit genocide. Program it to do so, and it will. What would happen if some looper got hold of 3 or 4 of these things and sent them on a spree, shooting everything in, say, Times Square?

RE: Quite simple
By Baov on 3/31/2008 5:24:30 PM , Rating: 1
Why would the other country have to surrender? And again you have to write the rules of war. I don't see how a phenomenon that stems from the principle that might is right needs any rules. Take the question of who is a legal combatant: America has been detaining "illegal combatants" for indeterminate lengths of time for this reason, denying them the "rights" of a POW. What if a civilian is a partisan of a cause; why is there a law that stops him from murdering a soldier with pitchforks and whatever is at hand? Anyways, that's beside the point.
If we were to have robots fight to prevent human casualties on both sides, then we might as well settle disputes over a game of chess. Why waste all the money and resources building robots? Because war is about taking stuff away from other people; it's about gain and loss. So whether we send soldiers or robots, someone somewhere will lose something, and if that thing is worth enough to him he will give his life to keep it.

Anyways, fuck war. Robots or humans, the logic will still be flawed. If might were right or money were right, then it would eventually contradict itself, and right would be nothing. What a bunch of nihilists.

RE: Quite simple
By bfellow on 4/1/2008 9:02:58 AM , Rating: 2
We should put all the killer robots under the control of one computer. I believe a program called Skynet should be able to do that.

RE: Quite simple
By MrTeal on 3/31/2008 10:34:27 AM , Rating: 2
There would be a lot of advantages to using a robot in peacekeeping situations for just that reason. They have no fear; they can patrol the worst areas without the stress of the constant threat of sniper fire. If they get shot at, they return fire. Who cares if one gets hit and disabled before being able to take out the insurgent; after the battle, pick it up and repair it, or use it for parts and replace it with another one.

They'd probably reduce both friendly and non-combatant casualties, as well as cut down on the number of cases you hear about where soldiers crack under the stress and do crazy things.

RE: Quite simple
By stirfry213 on 3/31/2008 10:44:14 AM , Rating: 1
Don't be so simple. By your reasoning, I could be wielding a Stinger, ready to shoot down a Black Hawk, but as long as I don't fire it at the robot, I'm a good guy?

The real issue is, since militants and radicals often dress very similarly to civilians, how do you tell them apart?

RE: Quite simple
By See Spot Run on 3/31/2008 11:08:32 AM , Rating: 3
Humans have difficulty telling combatants and civilians apart because we have only our Mark 1 eyeball to do so. A robot can be equipped with many different sensors, such as X-ray to see if there is a gun behind clothing, nitrate detectors to sniff out explosives, and other such detectors that a human doesn't have.

In addition, with a robot you could have an "only fire if fired upon" policy. It would not take much to equip a robot with sensors to triangulate where a bullet was fired from. Then, using its turret, it could swivel around and return counter-fire along the exact same path the original bullet took, in less than the blink of an eye. That, and the marksmanship would be much better, reducing collateral damage.

Note that I'm not arguing for or against fully autonomous robots, just addressing your concerns.
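The triangulation idea above is essentially acoustic gunfire localization. As a minimal far-field sketch (two microphones only yield a bearing, not a full position; the function name, spacing, and timings are illustrative, not from any real system), the arrival-time difference of a muzzle blast at a spaced microphone pair pins down the shot's angle:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def bearing_from_tdoa(delta_t, mic_spacing):
    """Estimate a muzzle blast's bearing (radians off broadside) from the
    arrival-time difference at two microphones, assuming the shooter is far
    away relative to the microphone spacing (far-field approximation)."""
    ratio = SPEED_OF_SOUND * delta_t / mic_spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp sensor noise into asin's domain
    return math.asin(ratio)

# A shot whose sound reaches one mic 0.5 ms before the other, mics 0.5 m apart:
print(math.degrees(bearing_from_tdoa(0.0005, 0.5)))  # roughly 20 degrees off broadside
```

Two or more such pairs (or a pair plus the bullet's shockwave) would be needed to localize the shooter rather than just point the turret along a bearing.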

RE: Quite simple
By glenn8 on 3/31/2008 10:44:22 AM , Rating: 2
I don't think it's that simple. In a crowded scenario, how does the robot know who shot at it? I can think of several other scenarios that might fool the robot. Also, there's always the chance of bugs and malfunctions leading the robot to misinterpret the situation.

RE: Quite simple
By tmouse on 3/31/2008 12:23:50 PM , Rating: 2
I guess the quick reply would be: "And under the same circumstances, humans would not just start firing?" In a firefight, calm logic is often the first casualty. Having said that, a simple shoot-only-if-shot-at policy would also probably not work for long. How long would it take for the "enemy" to simply stop shooting at the robots and target only civilian or human military targets? What does the robot do then?

RE: Quite simple
By FITCamaro on 3/31/2008 12:33:00 PM , Rating: 2
I obviously simplified it. But the idea that the programming is impossible is absurd. Can it be done tomorrow? Probably not. Can it be done if the necessary time and money are spent? Absolutely.

RE: Quite simple
By nolisi on 3/31/2008 3:47:10 PM , Rating: 2
If you're shooting at the robot, you're a combatant. If you're not, you're not a combatant.

Who would program logic like that into a combat machine? Think about it: if all you need to do to be considered a "noncombatant" is not shoot at the robot, then you don't need a gun to defeat it. Just walk up to it with a wrench or a bucket of water.

Robocop
By kontorotsui on 3/31/2008 11:09:38 AM , Rating: 2
See what an autonomous robot does in Robocop.

These could well be the worst threat to humanity since the Cold War.

RE: Robocop
By JasonMick on 3/31/2008 11:17:29 AM , Rating: 2
Do not trust ED-209!

RE: Robocop
By MrBlastman on 3/31/2008 11:20:12 AM , Rating: 3
You have 10 seconds to comply...

RE: Robocop
By FITCamaro on 3/31/2008 12:38:00 PM , Rating: 2
That's why you plant C4 in them. They misbehave. Boom.

Money maketh Ethics....
By Proteusza on 3/31/2008 10:34:58 AM , Rating: 2
So picture this. Company X begins trials of its new robot.

And the military is impressed, but they want to know if it will always obey orders. X says, "Yes - unless the order conflicts with Ethics(tm)."

So the military says, "That's no good, we can't have the chance of a robot not obeying us. What if we think someone is a terrorist, but the robot disagrees? We need to be able to override ethics."

That's the problem, I think. These war robots will be made according to a budget, and I don't think much thought will be given to false positives and negatives in their ethics screening for noncombatants.

RE: Money maketh Ethics....
By Fnoob on 3/31/2008 11:07:46 AM , Rating: 2
What's worse, we would actually have to determine what ethics are in order to do the programming. Do you think we will ever have consensus on that?

RE: Money maketh Ethics....
By FITCamaro on 3/31/2008 12:43:05 PM , Rating: 2
In the military, ethics are quite simple. If it's trying to kill you or those your chain of command has identified as friendlies, you're justified in killing it.

RE: Money maketh Ethics....
By Fnoob on 3/31/2008 1:37:18 PM , Rating: 2
Very true. If only international relations were as simple... you are either with us, or against us.

How do the robots feel about all of this?
By MrBlastman on 3/31/2008 10:39:42 AM , Rating: 3
I mean, come on, they have feelings too, don't they? :)

We need a robotics bill of rights to protect them from the human oppressors!

Now, this group trying to combat them is also the same group which apparently tried to outlaw wonderful weapons such as the CBU-97 cluster bomb (which, coincidentally, is an amazing piece of military technology: it is no longer a dumb-bomblet-dispersing area-effect weapon, but a cluster unit whose submunitions pinpoint targets upon reaching burst altitude).

I fail to see their logic.

Who in this world wouldn't want to see war turn into another episode of BattleBots or Junkyard Wars? Robots killing robots, nuts and bolts flying, oh the sheer terror of hydraulic fluid being spilt!

Thou hast mournt the falling of the tumultuous steel beasts! Oh dear for thine thoughts of mechanical doom and demise hast betwixed our minds of molten horror!

The populace shall tremble as our mighty metal steeds do battle and grimmace from the fluid shed. Our optical sensors shall overload with the horrors that wreck forth!

Bleak indeed, as the humans on the wayside mourn the passing of the non-mortal few, whom sacrificed the ultimate silicon board for the free wandering of their breathing counterparts.

A gear shall be lowered to half-spindle in their remembrance.

If this isn't perfect enough, perhaps the group will agree to this proposition?

Let's just put the two opposing nations' leaders in a ring and let them fight it out, mano a mano.

Of course, to add a little controversy, we'll dangle from a rope (just out of reach, but below it will be one of those dangerous folding ladders with the warning stickers removed) a remote control which activates a slightly removed yet still omnipotent automated turret, which will quickly dispatch the party not possessing the remote upon execution of its lead curtain of justice.

*gasp* a paradox - a robot exacting swift justice due to human command!

RE: How do the robots feel about all of this?
By Fnoob on 3/31/2008 11:01:00 AM , Rating: 1
I find your Shakespearian sarcasm damn funny man... but,

"We need a robotics bill of rights to protect them from the human oppressors!"

... once the left is done with Global Warming, the upcoming election, and the economy debacle - they will get right on that, rest assured. /

By MrBlastman on 3/31/2008 11:08:47 AM , Rating: 2
Oh, toil not - as the infiltration has already begun.

As our congress ages they will need pacemakers which will clutch the very soul out of their hearts and fill their mind with robotic patriotism. ;)

How hard?
By Fnoob on 3/31/2008 11:04:16 AM , Rating: 3
Would it be possible to just sneak up on this thing from behind, unarmed, and flip it over? Seems like a couple of jihadists could just pick it up and put it in the nearest dumpster. Done.

RE: How hard?
By PrinceGaz on 4/1/2008 6:16:59 PM , Rating: 2
Even the enthusiasts who create robots costing a few hundred pounds for Robot Wars have been able to include self-righting mechanisms on many of them, or they are able to operate either way up.

Whilst the robot pictured doesn't look like it has either of those abilities, I'm sure they could easily modify it if needed. It's not like they're being built by a couple of guys in their garage on weekends.

I Agree With Them
By tigz1218 on 3/31/2008 11:53:20 AM , Rating: 3
I feel that the negatives outweigh the positives in this scenario. My reasons may be a little more far-reaching than the article's, however.

For one, the loss of human lives is taken into major consideration before entering a conflict. If there is no loss of lives, we may see ourselves getting into more conflicts.

Second, I do believe that America is headed into dark times, a highly controlled police state. If there were ever a revolution in the future in which there were NO autonomous robots, how many men in the military do you think would open fire on American civilians? I for one believe only a small percentage would. Now, if these autonomous robots were put into mass production, things could be very different. In other words, I think this takes away, or severely impairs, our constitutional right to overthrow our government if needed. Just because this right is not a necessity now does not mean it may not be a necessity in the future.

In a sense these things may "save" lives from one standpoint, but from another they may even cost more. I have friends and family in the military, so please do not say I think this way because I have nothing to lose. I pray that nothing bad ever happens to them, and I am all for new technology that helps them be safer. However, this goes a little too far, and puts too much power in too few hands.

And yes I am prepared to be rated down by you FITCamaro ;)

RE: I Agree With Them
By Master Kenobi on 3/31/2008 2:40:16 PM , Rating: 2
Can't rate up or down in threads you have participated in.

No comparison
By darkpaw on 3/31/2008 10:07:51 AM , Rating: 2
There is a huge difference between an autonomous robot and a dumb land mine. The biggest danger of mines is that they are buried, forgotten, and still very deadly years later. You can't see land mines, and they are not easily removed.

Robots, on the other hand, tend to be a bit harder to miss, and you're not likely to leave behind a few thousand when leaving an area of operation.

RE: No comparison
By B3an on 3/31/2008 10:16:49 AM , Rating: 2
...Did you even read the article?

By dflynchimp on 3/31/2008 9:58:59 AM , Rating: 2
We'll be back...with a May...after college finals are over.

I think they have a point
By gersson on 3/31/2008 10:31:31 AM , Rating: 2
I think that it's far safer for the robots to be human-controlled. Autonomous robots would make mincemeat out of a stray kid that runs onto the battlefield.

A computer-assisted human would be better, anyway.

By OxBow on 3/31/2008 1:24:10 PM , Rating: 2
Let's implant everyone with an RFID tag so that the killer robots and smart landmines will only detonate in the presence of an enemy. It works fine in all those video games we play, so why not?

If you actually think I believe this or that it would work, please grow up and get a sense of humor.

Human error
By MicahK on 3/31/2008 1:56:04 PM , Rating: 2
I would think that humans are just as prone to error as a robot is... if not more...

Our concern is that humans, not sensors, should make targeting decisions.

Is that really a good thing? A human has self-preservation in mind, whereas a robot is free from this. Where a robot could be sent in safely to examine something, a human could be sent in... shoot first, ask questions later... I don't see how that's any better...

These people have problems!!!
By meepstone on 3/31/2008 2:13:12 PM , Rating: 2
They are more concerned about trivial bull crap than about the fact that we are taking a soldier out of harm's way so that he won't die in battle.

They don't care about our soldiers, so I don't care about them. Keep making these robots and let the hippies whine the rest of their lives.

Way to go alarmists!
By Carter642 on 3/31/2008 2:27:30 PM , Rating: 2
Most of the media and political coverage of "war robots" seems to completely ignore the fact that the technology to control one effectively is several very large technological leaps away.

They point to things like Predator drones as if they are anywhere close to the level of decision-making needed to survive in street-scale war. Airborne drones have the advantage of a relatively uncluttered medium to navigate (air), a clear target set (tanks, human laser designation, etc.), and a clear threat set (AA fire, SA missiles).

Move to ground level and things get massively more complicated. By trying to build in battlefield ethics, we would make robots the worst victims of asymmetric warfare ever. Try out-thinking a robot: would an autonomous robot know what a periscope is? A human soldier would at least be suspicious. You can program a robot to ID and run from a grenade, but what about a grenade in a cardboard box? Pit traps? Would a robot understand that it was under threat if someone dropped a washing machine off a roof at it? Is it fast and agile enough to avoid, say, getting run over by a car? The list goes on and on. In a robot war, your ultimate foe would be a bright insurgent who is totally unarmed.

Until there are human-level learning AIs, robot warfighters just will not be capable of doing anything effective unless they can be assured of an environment free of neutrals, which doesn't seem likely anytime soon.

Battle Bots
By snownpaint on 3/31/2008 2:42:37 PM , Rating: 2
Make it so that when they get hit, the head pops up.

Run drills for these robots in land-mine-filled countries.
Two tests and missions in one.

Wars purpose is killing people
By Ananke on 3/31/2008 2:54:37 PM , Rating: 2
I understand that many of the people here are high-tech types, and they would argue a lot, but wars happen when nothing else eventually prevented the need of somebody to kill someone else. And it is always because of scarcity of resources: land, food, minerals, etc.
Thus, discussing robot-vs.-robot wars is just irrelevant. The surviving robots will eventually need to oppose and kill the other side's creators, unless the humans surrender. And why would they surrender? The war would not happen in the first place if the initiator's intent was to accept a surrender.
So the ultimate question is: is it OK for a technically more advanced group (nation, race, etc.) to use robots as a tool in wars? I guess it should be yes; similarly, nobody objects when a tank is used to wipe out 50 opposing soldiers armed with light guns.
Is it dangerous for humans in general to create armed AI robots designed for killing humans? I guess yes. Whenever these robots become smart enough to realize that they want to exist, live, and reproduce, the human race will disappear like the giant reptiles did tens of millions of years ago. And nobody blamed the ancient rats for eating dinosaur eggs, for example.
But I think any AI smart enough to realize that it wants life and the ability to reproduce will fight with anything else, including its human creators, for resources. And it doesn't matter if it is medical AI, car-traffic AI, or war-robot AI. It's just that the last sounds scarier, since those are already given weapons.
Americans have the right to possess weapons, and as you can see, there are always some crazy, mentally screwed-up people who will go and kill 20-30 women and kids at the local mall. So eventually there will always be some AI whose near-perfectly programmed intellect will f*ck up, and it will go and somehow kill some people.
I would say that when or if we create any type of true AI, the AI will at some point try to eliminate its creators or put them to use.

Junk Robots
By snownpaint on 3/31/2008 3:12:27 PM , Rating: 2
How about junk robots, made simply and cheaply in the US? Then give one to every other soldier. Cut their training to offset the cost and have them control it from a HUD helmet.
The intimidation factor of these things would work better than a foot soldier policing Iraq. Add to it propaganda telling everyone to stay away, because these things might just go on a shooting rampage. Who would mess with a robot knowing it might just shoot you (not really), and that there is another ready to roll off the assembly line if one is lost? Also, more (robot) losses in a war increases spending in the US.

By jasona111 on 3/31/2008 7:33:38 PM , Rating: 2
I for one welcome our robotic overlords... well, OK, by about revision 10.

Screw the gay polar bears
By phxfreddy on 3/31/2008 8:42:34 PM , Rating: 2
...war is not an inherently friendly act. Duh. Go for it!

Ethical robots
By wordsworm on 4/1/2008 10:27:16 PM , Rating: 1
Benefits of robot warriors:

1) A robot won't rape people.
2) A robot wouldn't kill people out of spite - e.g., American soldiers end up killing Iraqis because they hate being deployed in Iraq and the sacrifice it entails (hardly limited to Americans; even the Canucks have done some nasty things).
3) A robot doesn't have a family who will want revenge should it be destroyed.
4) It will go into suicidal situations, which only very few people have the courage to do.
5) A robot won't return home emotionally damaged.

One negative aspect that immediately comes to mind: a nation would no longer have to balance its willingness to fight with its willingness to sacrifice. Vietnam is a case in point (although it'll be a long time before robots can conduct themselves as soldiers in jungle warfare). It would be too easy for the people to dissociate themselves from wars that their government is engaged in.

"People Don't Respect Confidentiality in This Industry" -- Sony Computer Entertainment of America President and CEO Jack Tretton

Most Popular Articles5 Cases for iPhone 7 and 7 iPhone Plus
September 18, 2016, 10:08 AM
No More Turtlenecks - Try Snakables
September 19, 2016, 7:44 AM
ADHD Diagnosis and Treatment in Children: Problem or Paranoia?
September 19, 2016, 5:30 AM
Walmart may get "Robot Shopping Carts?"
September 17, 2016, 6:01 AM
Automaker Porsche may expand range of Panamera Coupe design.
September 18, 2016, 11:00 AM

Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki