
14 April 2015, Vol. 2, No. 2

Editorial: The ethics of action
Ray Acheson | Reaching Critical Will of WILPF


Download full edition in PDF

Their development would cross a fundamental moral boundary. They would be the next revolution in military affairs, like gunpowder and nuclear weapons before them. They would lower the threshold for the use of force. They would make war even more inhumane and undermine human dignity. They would not be able to comply with international humanitarian or human rights law. These are some of the concerns that states and civil society organisations raised about autonomous weapons during the general debate of the CCW experts meeting. But, as civil society pointed out, the development of autonomous weapons is not inevitable.

We have the opportunity to prevent the development of autonomous technologies of violence. No state intervening during the general debate indicated that it is pursuing autonomous weapons. Only Israel suggested such weapons could be beneficial, stating that they might somehow promote compliance with international humanitarian law, a view at odds with that of the majority of delegations taking the floor. Most delegations argued that the use of any weapon requires meaningful human control, rejecting the idea that matters of life and death should be delegated to machines. The majority appear to agree with Germany that the autonomous selection and engagement of targets “is a line that should not be crossed.”

This concept of meaningful human control has rapidly become the central focus of deliberations on autonomous weapons. It is broadly understood as meaning that humans need to be engaged in analysing a target area, selecting targets, and using force. As WILPF noted in its statement to the general debate, the laws of war and protection of human rights require human engagement. Under international humanitarian law and human rights law, the legality of an attack is context-dependent. It is generally assessed on a case-by-case basis. Questions of distinction and proportionality cannot be resolved through automated mechanisms, and in any case, the law requires that human commanders make such judgments.

“The law doesn't exist on its own; it's derived from what we believe is right or wrong,” noted Thomas Nash of Article 36 in his remarks at a side event hosted by the Campaign to Stop Killer Robots. This link between law and morality is critical to understanding the full implications of autonomous weapons and the threat they pose to humanity. The principles of humanity require deliberative moral reasoning, by humans, over each individual attack decision. Human beings can be violent and we can break laws. But we have something that machines do not have, and likely cannot be programmed to have: moral reasoning. We can value human life—even if we sometimes don’t. As the Chilean delegation remarked, the only restraint we have on the use of any weapon is the ability of people to identify with the human being at the other end.

Several delegations, including Cuba, Ecuador, Pakistan, and Sri Lanka, and all of the civil society groups addressing the meeting, argued that a ban on autonomous weapons is necessary to ensure there is meaningful human control over targeting and attack decisions. The Chinese delegation argued that it would be better to take precautionary measures than to deal with the aftermath of autonomous weapon systems. Mexico argued that weapons that conflict with IHL should be prohibited.

As the Irish delegation conveyed forcefully, the mandate of the CCW and its protocols “is to regulate or ban the use of specific categories of conventional weapons that have effects which trouble the conscience of humanity.” It remains to be seen how many delegations will be ready to move towards negotiations, based on their concerns that this technology could fall foul of humanitarian and human rights law. As the Dutch civil society group PAX noted in its statement, failing to take action on the development, production, and use of autonomous weapons also has ethical implications. It is time for action. Delegations should use this week to set out their proposals for how they intend to take forward concrete work on autonomous weapons in the CCW, the Human Rights Council, and in their national debates at home.
