The use of robots in war presents a variety of problems, such as who has authority over them and who is accountable for their actions. Because these machines are programmed to act autonomously, those issues are only accentuated. In April 2015, Chris Green, writing in The Independent, covered a report issued by Human Rights Watch and Harvard Law School’s International Human Rights Clinic:
Under current laws, computer programmers, manufacturers and military personnel would all escape liability for unlawful deaths and injuries caused by fully autonomous weapons, or “killer robots.”
… Professor Noel Sharkey, a leading roboticist at Sheffield University and co-founder of the International Committee on Robot Arms Control, said that if a machine committed a war crime its commander would have “lots of places to hide” to evade justice, such as blaming the software or the manufacturing process.
… The researchers added that although victims or their families could pursue civil lawsuits against the deadly machine’s manufacturers or operators, this would only entitle them to compensation and would be “no substitute for criminal accountability … ‘punishing’ the robot after the fact would not make sense.”
As Bonnie Docherty, the report’s lead author, put it: “No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party.”
The report added that even if a robot’s commander knew it was about to commit a potentially unlawful act, they may be unable to stop it if communications had broken down, if the robot acted too fast, or if reprogramming was only possible by specialists.
Thus, said Ms. Docherty: “The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”
Campaigners would like the use of such robots to be pre-emptively banned through a new international law. This would have to be written into the UN’s Convention on Conventional Weapons, which in 1995 outlawed blinding laser weapons before they could be deployed.