Human Rights Watch calls for a ban on ‘killer robots’ before they are fully deployed


In September of this year, I wrote an article about the Defense Science Board calling on the U.S. Department of Defense to “more aggressively use autonomy in military missions.”

Now Human Rights Watch (HRW) has produced a 50-page report entitled “Losing Humanity: The Case Against Killer Robots” (PDF), calling for a complete ban on “killer robots,” or, more precisely, “fully autonomous weapons.”

Before delving into this report, I am obligated to point out that HRW is far from a faultless organization, as seen in this article on HRW on SourceWatch. That being said, HRW is apparently the first non-governmental organization (NGO) to address the issue.

The use of drones is questionable enough, as highlighted by the work of applied ethicist Dr. Robert Sparrow, but the concerns surrounding autonomous systems go far beyond those of drones.

Note: to learn more about drones, their use, their future and the dangers of this type of technology, I highly suggest you read some of the many articles published on End the Lie. This is a topic that is not going away, and it requires a deep level of research to understand.

HRW is now calling on governments to “pre-emptively ban fully autonomous weapons because of the danger they pose to civilians in armed conflict,” a step that must indeed be taken in order to preserve human rights.

Fully autonomous weapons systems are even more dangerous than typical drones because “humans could start to fade out of the decision-making loop, retaining a limited oversight role—or perhaps no role at all,” according to the HRW report.

As hard as it may be to believe, the prospect of fully autonomous weapons is not pure science fiction, as evidenced by the aforementioned Defense Science Board document.

These intentions are also outlined in “Unmanned Systems Integrated Roadmap FY2011-2036” (PDF), an October 2011 document produced by the Department of Defense.

In the document, the Department of Defense reveals that it “envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure.”

There are quite strong driving forces behind this, not only the powerful drone lobby (and their friends in the aviation industry as a whole), but also the fact that the military is simply overwhelmed with the sheer amount of data gathered by drones around the world.

This constant data stream is only increasing with drones capable of capturing 36 square miles of imagery in a single blink along with the U.S. military operating drones domestically and sharing the captured data with law enforcement.

Indeed, according to the “Unmanned Ground Systems Roadmap,” released by the Robotic Systems Joint Project Office in July of last year, the U.S. military is working towards totally autonomous weapons systems.

“There is an ongoing push to increase UGV [unmanned ground vehicle] autonomy, with a current goal of ‘supervised autonomy,’ but with an ultimate goal of full autonomy,” the report (PDF) states.

“Unmanned Aircraft Systems Flight Plan 2009-2047,” a 2009 report (PDF) from the U.S. Air Force, revealed that “[i]ncreasingly humans will no longer be ‘in the loop’ but rather ‘on the loop’—monitoring the execution of certain decisions. Simultaneously, advances in AI will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.”

These plans are, perhaps surprisingly, not as new as one might think, as evidenced by a 2004 planning document (PDF) produced by the U.S. Navy.

The document, “The Navy Unmanned Undersea Vehicle (UUV) Master Plan,” states, “While admittedly futuristic in vision, one can conceive of scenarios where UUVs sense, track, identify, target, and destroy an enemy—all autonomously.”

This would entirely eliminate humans from the so-called “kill chain,” making the already common slaughter of civilians even more routine.

“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, the Director of the Arms Division at HRW, in a press release. “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”

The 50-page report, with its 182 footnotes, is an absolute must-read: it covers a great deal of ground while also making some sensible recommendations and pointing out many of the unseen dangers lurking behind this type of technology.

Did I forget anything or miss any errors? Would you like to make me aware of a story or subject to cover? Or perhaps you want to bring your writing to a wider audience? Feel free to contact me with your concerns, tips, questions, original writings, insults or just about anything that may strike your fancy.


Delivered by The Daily Sheeple

We encourage you to share and republish our reports, analyses, breaking news and videos.

Contributed by End The Lie of End the Lie.

End the Lie was founded in 2011 with the goal of publishing the latest in alternative news from a wide variety of perspectives on events in the United States and around the world. For more information, find End the Lie on Twitter and Facebook or check out our homepage.

  • BubbleUp

    Yeah, I demand the RIGHT to be killed up close and personal–by a thinking, feeling HUMAN BEING.

    • Former Soldier

      I agree, I’d rather have an actual person come for me; it gives me the opportunity to fight back instead of facing a nameless attacker that removes my right to defend myself. We the people have the right to life, liberty and the pursuit of happiness, and drones are machines that will deny us those rights…

  • Mia

    I was surprised when I found an article yesterday that had a list of several hundred names (around 500) of people who have died as a result of a taser being deployed on them by a police officer. The tech is referred to as “non-lethal,” but that should be changed to “not usually lethal” or “sometimes lethal” or even “potentially lethal.”
    If a taser were able to learn when and how it is to be deployed based on how it had been used previously, yet could be utilized more because of the sheer number of increased potential encounters (the taser doesn’t need to eat, sleep, or stop to use the bathroom…), I would rarely leave my house. The internet has numerous videos of tasers being deployed. It would appear that tasers are being “taught” to deploy on suspects in handcuffs, old ladies not moving quickly enough, and even on children.
    The taking of human life should not be something done lightly. Nor should it be part of a formula. The taking of life should be done very rarely, and only when there is NO OTHER CHOICE.
    The military uses weapons that function more like a video game, thus making the target seem less human. There were some occurrences in Iraq (the videos are still VERY easy to find) that clearly demonstrated that it’s not only safer due to the remote locale; this weapon system made it very easy for some soldiers to forget that those were real people they were killing, or not to care.
    A machine can only do what it is programmed to do, or learn to complete a task more efficiently as it completes more and more tasks.
    THIS IS A HORRIBLE IDEA!!! War will be made far more often, for much less reason, if human beings don’t have to perform the task themselves. We are only in the 12th year of a century that has so far been filled with war.
    There is a saying that “old men make wars; young men fight them.” I would hate to see it turn into “The robot sensed the need for war, so the robot killed them.”

  • Lutz

    Stay tuned for the upcoming part II of the American Revolution.