A group that has long focused its lobbying efforts on
stopping the proliferation of land mines is turning its attention to a
surprising new target: war robots.
In the first known instance of a non-government group protesting war
robot technology, the London-based charity Landmine Action hopes to ban autonomous
killing robots in all 150 countries currently bound by the land mine treaty.
While all machine
gun-packing robots currently are human-controlled, the U.S. Department of
Defense has expressed interest in deploying autonomous robot warriors onto the
battlefield in the near future. Last month DailyTech reported that Noel Sharkey, a robot researcher at
Sheffield University, expressed controversial concerns about
the ethics of autonomous war robots. He stated that such robots might
be capable of "war crimes".
Sharkey's speech inspired Landmine Action to campaign against war
robots. Richard Moyes, the organization's director of policy and research,
says the fight against autonomous killers is not a policy switch: the
organization has already fought cluster bombs, which use infrared sensors
and artificial intelligence to decide when to detonate. Landmine Action
believes that taking the targeting decision out of human hands and putting it
in a machine's is a deadly mistake.
Moyes explains, "That decision to detonate is still in the hands of an
electronic sensor rather than a person. Our concern is that humans, not
sensors, should make targeting decisions. So similarly, we don't want to move
towards robots that make decisions about combatants and noncombatants."
The organization hopes to sway the International Committee of the Red Cross and
Amnesty International, two leading organizations in war ethics lobbying. Landmine
Action is spurred on by Sharkey's comments, including his statement that,
"We should not use autonomous armed robots unless they can discriminate
between combatants and noncombatants. And that will be never."
Many in the robotics community agree with Sharkey's
sentiment. Peter Kahn, a social robotics researcher at the University of
Washington, says he believes Sharkey is correct and hopes that
robotics researchers will stop taking government money to design war
robots. He argued to his colleagues at a conference on Human-Robot
Interaction in Amsterdam, "We can say no. And if enough of us say it
we can ensure robots are used for peaceful purposes."
However, most in the robotics community feel this is impossible, as most
robotics research is funded by the Defense Department. Says one anonymous
U.S. researcher at the conference, "If I don't work for the DoD, I don't
work."
Some robotics researchers disagree with Sharkey and feel that robots could make
the perfect warrior. Ronald Arkin, a robotics researcher at Georgia Tech,
says that robots could make even
more ethical soldiers than humans. Arkin is working with the Department
of Defense to program ethics, including the Geneva Convention rules, into the
next generation of battle robots. He says that robots will behave more
ethically because they have no desire for self-preservation, no emotions, and no
fear of disobeying their commanders when given unlawful orders.
Many, however, remain skeptical of the wisdom of deploying increasingly
intelligent robots into war zones across the world. They point to the
many science fiction scenarios that depict humanity at war with killer robots
of its own creation. While this may seem farfetched, the issue of war
robots is becoming a serious one that the world's brightest minds are trying to
resolve.