When Robots Commit War Crimes

The use of robots in combat will be accompanied by a lengthy learning curve — and, likely, many civilian casualties. (Pictured: Modular Advanced Armed Robotic System (MAARS) / popsci.com)

The use of robots in war presents a variety of problems, such as who has authority over them and who is accountable for them. When the robots are programmed to act of their own volition, those issues are only accentuated. Writing at the Independent in April 2015, Chris Green covers a report issued by Human Rights Watch and Harvard Law School’s International Human Rights Clinic.

Under current laws, computer programmers, manufacturers and military personnel would all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or “killer robots.”

… Professor Noel Sharkey, a leading roboticist at Sheffield University and co-founder of the International Committee on Robot Arms Control, said that if a machine committed a war crime its commander would have “lots of places to hide” to evade justice, such as blaming the software or the manufacturing process.

… The researchers added that although victims or their families could pursue civil lawsuits against the deadly machine’s manufacturers or operators, this would only entitle them to compensation and would be “no substitute for criminal accountability … ‘punishing’ the robot after the fact would not make sense.”

Which means, said Bonnie Docherty, the main author of the report, “No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party.”

The report added that even if a robot’s commander knew it was about to commit a potentially unlawful act, they may be unable to stop it if communications had broken down, if the robot acted too fast, or if reprogramming was only possible by specialists.

Thus, said Ms. Docherty: “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”

In fact:

Campaigners would like the use of such robots to be pre-emptively banned through a new international law. This would have to be written into the UN’s Convention on Conventional Weapons, which in 1995 outlawed the use of laser weapons with the ability to blind people before they could be developed.

  • hammerstamp

    Accountability IS the major pitfall when you’re talking about artificially intelligent, autonomous killer robots… isn’t it?

    • Stan Erickson

      Sentenced to be re-formatted.

  • squawneye

    A robot could just blame George Bush. That works well in about any situation.

    • dnarex

      And it would be an accurate statement.

      • Maybe?

        Hate to tell you simpletons, but Obama has been in office 7 years, and nothing has changed but the names on the doors of the White House.

        The complete ignorance and bias of the single-cell, non-functional brains of liberals simply amazes me.

      • Yhadda

        Like when Mao murdered 45 million naked Chinese in the snow…

      • Hawaii Doc

        Cool, liberals have invented dnarex, the robot troll.

      • squawneye

        Ah, an Obot.

  • SR-71

    Robots by their very definition cannot commit war crimes. They are not thinking, feeling entities. They simply do not kill out of hate. It is like VW and its software that turned off the pollution controls. What complete idiots. If you are going to cheat with software, make it an “engineered bug.” They are impossible to distinguish from actual bugs. Nobody can prove anything. Same goes for your war robot’s overzealous acquisition of civilian targets.

    • Stan Erickson

      What if the robot is smarter than humans by every test the humans invent? Or to invert the question, will ultrasmart AI hold humans responsible for stupid acts their wet noodles have them do? After all, they were misprogrammed.

      • SR-71

        Artificial Intelligence is a media- and Hollywood-created myth. I never use those words to describe what it is I am building, and I am squarely in that market. My systems “learn” from successes and failures (back propagation) and can determine future scenarios (history maps) from past experiences. You can fool the system once, but you can’t a second time doing the same thing. That system will never become intelligent and start thinking on its own, though. It is domain specific. If I were to build robots to identify humans and kill them (a rather simple task), they would eliminate all humans they encounter, much more viciously than any movie could portray. These machines would dominate the planet in a matter of weeks. I do not fear the machine getting to a point where it can think for itself like a human. That is impossible. It physically does not have the memory, or the raw processing power to do anything with that memory even if it had enough. What I fear is the MAN who controls the robots, because his bully force will be unstoppable.

        • Stan Erickson

          True, AI has been seized upon by the media, but that in itself doesn’t prove anything about the eventual information processing capability of silicon. In order to make the claims you do, you will have to define intelligence, and provide some tests that humans can pass and AI cannot. And yes, you are right, current systems need more memory and processing power to reach AI, probably more layers than your systems, probably multiple conjoined systems to handle the sensor processing, the strategy, and so on. My contention is that robot rulers could not arise except by human choice, as the dynamics of development provide enough checks.

          • SR-71

            “In order to make the claims you do, you will have to define intelligence” No, I make those predictions based on my own knowledge of expert systems and the domain knowledge required to create such systems. What you are missing is the fact that silicon systems cannot rewire themselves to optimize performance like any biological brain can. In order to do that you have to find something more adaptive than silicon. Haven’t you noticed our computing technology has slowed? We are now trying to stuff more cores into a single silicon construct because we’ve hit a wall with silicon. We lost Moore’s law nearly 7 years ago. Silicon is a dead end, and so are the expert systems that run on it.

          • Stan Erickson

            What on Earth would make you think silicon systems cannot rewire themselves? Not physical wires, obviously and trivially, but via software. Expert systems are a failure, to be sure, which is why they have been abandoned in favor of genetic networks. The rate of shrinking microchips has slowed as there are physical limits, but how do you make the leap from that obvious physical phenomenon to concluding AI will not be smarter than humans at some date in the future?

          • SR-71

            You are talking about simulating an analog process in a digital space. This software simulation is already in production in many, many systems. The problem is that analog systems (that thing in your head) grow the connections to shorten the distance between the two (or more) processors, whereas in the silicon system these links have to be interpreted. That interpretation adds to the computational load; it doesn’t reduce it. In the analog system these connections reduce the computational load. That is a huge difference. As a digital system learns, it requires more processing power. As an analog system learns, it requires less processing power. I didn’t write the rules, but I marvel at the design of the average brain. What a wonderful work of art it is in all of its squishy glory!

          • Stan Erickson

            You are implying that the architecture of current digital systems would have to be changed in order for AI to equal human intelligence according to some standardized test, as complex as desired. OK.

          • SR-71

            I am not implying anything of the sort. I just stated why digital simulations will never (as in ever) equal an analog system for computing. As to your testing thing: I bet I can find a human who will fail whatever test you have that supposedly proves you are human. So to me, a test doesn’t carry the same weight as it does with you. By the same token, I can build a machine that will pass your test. Just look at the 30-odd examples that compete favorably in the Turing test. None of them are alive, BUT they can fool a human into questioning whether the system is in fact a human. Digital, and especially silicon, are dead ends. Don’t mistake what I am saying, though. It’s all we’ve got, and advancements will certainly be welcome, but, long term, Skynet is not coming.

          • Stan Erickson

            Your assertions are neither justified nor justifiable. End of discussion.

          • SR-71

            Cool, stick your head in the sand. Others in this industry wouldn’t have been so kind as to try to educate you. When you want to really know where we are headed on this topic, I will still be right here.
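
For readers curious what the “back propagation” SR-71 mentions in the thread above actually involves, here is a minimal sketch: a toy two-layer network learning XOR by gradient descent. The network sizes, the task, and the learning rate are illustrative assumptions, not a description of any real targeting system.

```python
# Minimal backpropagation sketch: learn from "failures" (prediction
# errors) by pushing weight corrections backward through the network.
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 2 inputs -> 3 hidden units -> 1 output (arbitrary sizes).
W1 = rng.normal(size=(2, 3))
W2 = rng.normal(size=(3, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target, lr=0.5):
    """One update: forward pass, measure the error, backpropagate."""
    global W1, W2
    h = sigmoid(x @ W1)   # hidden-layer activations
    y = sigmoid(h @ W2)   # network output
    err = y - target      # the "failure" signal
    # Backward pass: chain rule through each sigmoid layer.
    grad_y = err * y * (1.0 - y)
    grad_h = (grad_y @ W2.T) * h * (1.0 - h)
    W2 -= lr * np.outer(h, grad_y)
    W1 -= lr * np.outer(x, grad_h)
    return (err ** 2).item()

# XOR: fool the net once and, as the error falls, it stops making
# the same mistake twice -- the behavior SR-71 describes.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
for _ in range(5000):
    loss = sum(train_step(np.array(x, dtype=float), t) for x, t in data)
print(f"final summed squared error: {loss:.4f}")
```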

  • bxdanny

    >> The researchers added that although victims or their families could pursue civil lawsuits against the deadly machine’s manufacturers or operators, this would only entitle them to compensation and would be “no substitute for criminal accountability …”

    It would still be a deterrent if the “manufacturers or operators” could be subject to paying millions of dollars in damages.

    • SR-71

      That all depends on whether you are on the winning side. We are talking about war here. Loser pays.

  • Cyunvwyatt

    I am sure that’s what the elite want: a no-strings-attached way to depopulate the planet. According to Bill Gates, 98% of humanity needs to be destroyed “to live in perpetual balance of nature”; the earth can only accommodate 500 million people.

    • Jack Whistler

      Meanwhile we pay farmers to not grow food.

      • Maybe?

        Most miss that small detail.

        With modern technology, we could have world peace, plenty of food, water and a real nice lifestyle for everyone, but that is not what the elite want.

        They want control and think they are god and get to decide the fate of humanity.

        You could take every family on earth and give them 3/4 of an acre, set it up sustainably so that nobody would have to work other than on their own property to maintain it, and we would only cover a land area about 3/4 the size of Australia.

        There is no overpopulation unless you isolate cities. These are what cause the problems, but the elite protect them because they churn the economic machine that keeps the elite in power.

        Without cities, the elite are absolutely nothing.

  • Peleus

    Keep all this in mind since we seem to be heading for fully autonomous cars too. So what happens if folks get run over by those devices?

    • Dan-in-IN

      Well, see, that’s all international and war-related. You get run over by an automated car back here, and I’m guessing you’ll be able to sue the car maker, the car owner, innocent bystanders who saw it happen, the person who sold them the gas for the car, the refinery that refined the gas for the car, and George Bush.

  • Verbotene Gedanken

    “Kill All Humans!”
    – Bender

  • Nanny mo

    You execute the programmer, duh!

    • Maybe?

      How about the ones who make the purchase order?

  • biggoil

    If the enemy would mark themselves more clearly in battle, like with a pink scarf, we would not have these problems.

  • SgtMAD

    This robot army dream is a disaster in the making for all of us.
    We, the people, need to force these politicians to incorporate Asimov’s three laws in all these robots:
    1) No robot will ever harm a person.
    2) No robot will allow harm to come to a person by inaction.
    3) Robots will protect themselves when not in conflict with the first two rules.

    These rules need to be embedded in every robot operating system by law, or we will see robots used to kill anyone these elites want to kill without any repercussions or cost, and war should never be cheap.
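
SgtMAD’s idea invites the obvious question of what “embedding” the three laws would even look like in code. Here is a minimal sketch under heavy assumptions: the Action type and its yes/no predicates are entirely hypothetical, and the genuinely hard part, deciding whether an act “will harm a person,” is exactly what no software can currently evaluate reliably.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_person: bool     # Law 1: would executing this harm someone?
    prevents_harm: bool    # Law 2: would *skipping* it let harm occur?
    endangers_robot: bool  # Law 3: does it put the robot itself at risk?

def permitted(action: Action) -> bool:
    """Check a proposed action against the three laws, in priority order."""
    if action.harms_person:
        return False                   # Law 1 is absolute.
    if action.prevents_harm:
        return True                    # Law 2 overrides self-preservation.
    return not action.endangers_robot  # Law 3 applies only when 1 and 2 don't.

# An order to fire is refused; shielding a bystander is allowed even
# though it endangers the robot.
print(permitted(Action("open fire on crowd", True, False, False)))  # False
print(permitted(Action("shield bystander", False, True, True)))     # True
```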

  • Snake Plissken

    Question to ponder: Are ‘war crimes’ against islamists truly crimes? Or just Karma?

    • Maybe?

      But what happens when they are turned on you because you disagree about Islamic terrorists? This is what’s coming.

  • LMJ313

    I’m sure the authoritarian elites would not hesitate to turn this against our own population when the opportunity presented itself.

    • franklingray

      That to me is the only real thing to worry about. Soldiers can and do disobey orders; I doubt robots would be programmed to disobey orders. And history with hackers has shown that stopping somebody from re-programming them to take orders from X, and only X, is impossible.

  • David Hedricks

    Unless we move to the autonomous robot, SOMEbody is driving that thing and taking orders from somebody else. Whoever pulls the trigger, or clicks the mouse to pull the trigger, should be held responsible for a war crime.

    • franklingray

      This article is about robots, not drones.

      • Maybe?

        Robots still need code written by humans, purchased by humans, activated by humans etc…. There is no escape. If there were, then you and I could build them, cutting them loose on society without repercussion.

  • LiberTEA

    Will killer robots get PTSD?

  • Juanita Broaddrick’s Lip

    Joe Biden is a drone.

    Does that count?

    • Major Remington

      He could certainly bore someone to death.

  • C. Adkins

    – we’ve been doing this in the Middle East for a long time.

    • franklingray

      Those are drones, idiot.

  • HAL 9000

    …something about the day Skynet becomes self-aware.

  • seatex

    “Existing laws hold neither the military nor computer companies and programmers accountable for the sins of robots.”

    But if a civilian builds one….

  • http://www.eliteword.com/ Choir Loft

    A lit bottle of gasoline should do the trick on this baby. Toss and stand back to watch the fun.

    If nobody claims responsibility for its actions, then no one can be held responsible for destroying it.

    The law swings both ways … assuming, of course, fair and impartial administration of the law … which isn’t going to happen in America. Sieg Heil

    I can dream, can’t I?

    and that’s me, hollering from the choir loft…

  • Major Remington

    What could possibly go wrong? …..go wrong? …..go wrong?…………

  • Ruckweiler

    Would love to see a robot trial at the court in The Hague.

  • MikkiDean

    I have a big red flag and a question.
    All computer systems are inherently flawed, since they were created by humans.
    What happens when self-learning systems watch movies, TV, and the news?
    And one afterthought: who watches the chip makers?

  • zombietimeshare

    “… robots in war present a variety of problems, such as who has authority over them and who is accountable for them.”

    Megatron 2016: Why vote for the lesser evil?

  • ciscobiscuit

    “Campaigners would like the use of such robots to be pre-emptively banned through a new international law. (…), which in 1995 outlawed the use of laser weapons with the ability to blind people before they could be developed.”

    We can probably accurately assume that every major world power, and some smaller nations, have been developing or re-purposing lasers to blind humans. They are probably on the shelf in every real nation, ready to equip soldiers or drones.

    Every major weapons manufacturer on the planet probably has a prototype in its lab.

  • Mr Stones

    “This would have to be written into the UN’s Convention on Conventional Weapons, which in 1995 outlawed the use of laser weapons with the ability to blind people before they could be developed.”

    Yep, so go ahead and ignore the giant megawatt laser the US Navy is going to deploy:
    http://www.popsci.com/technology/article/2011-01/navys-free-electron-laser-weapon-takes-big-leap-forward-powerful-new-electron-injector

    I’m sure a ‘ban’ on Terminators would be as effective as a ‘ban’ on guns, or a ‘ban’ on drugs.

  • ciscobiscuit

    Perhaps employees who machine the parts used in a rifle should be held civilly AND criminally liable for the deaths “caused” by the rifle itself.

    After all, firearms do levitate and aim themselves, and pull their own triggers.
    Firearms are racist too.

  • Red_State_Eddie

    One day, we’ll read that a grieving husband sent his weaponized drone over to machine gun the family of the fully-automated car that ran over his family.

  • PhysicsWon

    International laws banning increasingly effective weapons of war will work about as well as “gun free zone” signs deter shooters bent on mass murder.

  • jack nichols

    Danger Will Robinson
