UN seeks ban on killer robots


02:44 AM May 7th, 2013


In this undated artist’s rendering provided by BAE Systems, a Taranis aircraft is shown. A new United Nations draft report posted online this week objects to the use of weapons systems like the Taranis that can attack targets without any human input. AP PHOTO/BAE SYSTEMS

UNITED NATIONS—Killer robots that can attack targets without any human input “should not have the power of life and death over human beings,” a new draft UN report says.

The report for the UN Human Rights Council, posted online this week, deals with legal and philosophical issues involved in giving robots lethal powers over humans, echoing countless science-fiction novels and films.

The debate dates to author Isaac Asimov’s first rule for robots in the 1942 story “Runaround”: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”

Report author Christof Heyns, a South African professor of human rights law, calls for a worldwide moratorium on the “testing, production, assembly, transfer, acquisition, deployment and use” of killer robots until an international conference can develop rules for their use.

His findings are due to be debated at the Human Rights Council in Geneva on May 29.

According to the report, the United States, Britain, Israel, South Korea and Japan have developed various types of fully or semi-autonomous weapons.

Lethal autonomous robotics

In the report, Heyns focuses on a new generation of weapons that choose their targets and execute them. He calls them “lethal autonomous robotics,” or LARs for short, and says: “Decisions over life and death in armed conflict may require compassion and intuition. Humans—while they are fallible—at least might possess these qualities, whereas robots definitely do not.”

He notes the arguments of robot proponents that death-dealing autonomous weapons “will not be susceptible to some of the human shortcomings that may undermine the protection of life. Typically they would not act out of revenge, panic, anger, spite, prejudice or fear. Moreover, unless specifically programmed to do so, robots would not inflict intentional suffering on civilian populations, for example through torture. Robots also do not rape.”


The report goes beyond the recent debate over drone killings of al-Qaida suspects and nearby civilians who are maimed or killed in the air strikes. Drones do have human oversight. The killer robots are programmed to make autonomous decisions on the spot without orders from humans.

Heyns’ report notes the increasing use of drones, which “enable those who control lethal force not to be physically present when it is deployed, but rather to activate it while sitting behind computers in faraway places and stay out of the line of fire.

“Lethal autonomous robotics (LARs), if added to the arsenals of states, would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill—and their execution,” he wrote.

His report cites these examples, among others, of fully or semi-autonomous weapons that have been developed:

The US Phalanx system for Aegis-class cruisers, which automatically detects, tracks and engages anti-air warfare threats such as antiship missiles and aircraft.

Israel’s Harpy, a “fire-and-forget” autonomous weapon system designed to detect, attack and destroy radar emitters.

Britain’s Taranis jet-propelled combat drone prototype that can autonomously search, identify and locate enemies but can only engage with a target when authorized by mission command. It also can defend itself against enemy aircraft.

The Samsung Techwin surveillance and security guard robots, deployed in the demilitarized zone between North and South Korea, which detect targets through infrared sensors. They are currently operated by humans but have an “automatic mode.”

Decision in nanoseconds

Current weapons systems are supposed to have some degree of human oversight. But Heyns notes that “the power to override may in reality be limited because the decision-making processes of robots are often measured in nanoseconds and the informational basis of those decisions may not be practically accessible to the supervisor.

In such circumstances humans are de facto out of the loop and the machines thus effectively constitute LARs,” or killer robots.

Probe of drone killings

Separately, another UN expert, British lawyer Ben Emmerson, is preparing a special investigation for the UN General Assembly this year on drone warfare and targeted killings.

His probe was requested by Pakistan, which officially opposes the use of US drones on its territory as an infringement on its sovereignty but is believed to have tacitly approved some strikes in the past.

Pakistani officials say the drone strikes kill many innocent civilians, a claim the United States rejects. The other two countries requesting the investigation were two permanent members of the UN Security Council—Russia and China.

In April, an alliance of activist and humanitarian groups led by Human Rights Watch launched the “Campaign to Stop Killer Robots” to push for a ban on fully autonomous weapons. The group applauded Heyns’ draft report in a statement on its website.

  • http://www.facebook.com/Boycott.Made.N.China Val Sor

    The UN only asks that the decision to kill a human must be made by another human, meaning the robot will get clearance from a human remote operator before a missile is fired. Mind you, it is still deadly this way.

  • jess ravalo

    What about the extremists who are indiscriminate about whom they target with their bombs? Can the UN ban them too? Look who’s crying foul now.

  • joboni96

    The imperialist U.S. doesn’t want this.

    They’re now afraid of a face-to-face fight.

    Maybe the imperialists’ funding to the U.N. will be cut.

  • w33k3nd3r

    I’d rather have this than people with families running on the fields, with a risk of losing their lives.

  • phthlateous

    UN bans it. America follows the rules and shoots herself in the foot again.

    Then the other nations that have copied the drone technology will do what America did and not follow the rules set by the U.N.

    And there is nothing that the U.N. can do to enforce the rules.

    The more things change, the more they remain the same.

    • koolkid_inthehouse

      Repent, the end of the world is near.

  • farmerpo

    “Lethal autonomous robotics” is still a wish, probably a century away, if possible at all. Drones are human-controlled, and no autonomous robot yet exists except in the imagination of scientists, physicists and, most of all, science-fiction writers. Common sense and image recognition are the stumbling blocks in the development of ‘autonomous robots’. Brains cannot be manufactured. In 60 years of AI development, no computer is even on par with the brain of a cockroach. Fuzzy logic is just that: fuzzy.

    • pepito gwaps

      Correct. It’s like the Tower of Babel, which was never completed.

    • wunskroolooz

      Just because it’s taking too long doesn’t mean it’s not going to happen. Maybe not in our lifetime, but it will…

    • http://www.facebook.com/rogelio.silva.129 Rogelio Silva

      You are lagging behind the revolution in new technology today. The United States is already testing that, followed by Israel. “LAR” is just code embedded in drone warfare. You’ll be surprised: in the next 20 years, robotics will transform this world. It is already happening. Stay updated on the latest technological warfare; you are living in the past.

  • clanwolf

    So, no Skynet?

  • pepito gwaps

    We should also ask the terrorists not to use bombs but firecrackers instead.
