
Final Edition, Vol. 2, No. 6

Editorial: Towards a prohibition on autonomous weapons
Ray Acheson | Reaching Critical Will of WILPF



The CCW wrapped up its second informal meeting of experts on autonomous weapons last Friday. Discussions over the past week were wide-ranging, covering a variety of critical issues from a range of perspectives. By the end of the week, it was clear that the majority of delegations believe that the use of any weapon requires meaningful human control and reject the idea that matters of life and death should be delegated to machines. The alternative, developing and using fully autonomous weapons programmed to identify, select, and engage targets without human intervention, was overwhelmingly portrayed as a moral line that should not be crossed. On this basis, states should begin formal work towards an international instrument to prohibit fully autonomous weapons.

The only delegations overtly resistant to this approach appear to be from countries determined to maintain their hegemony in global violence. The prevalent view, though, seemed to be that the principles of humanity require deliberative moral reasoning, by humans, over each individual use of force. Once technical, legal, and moral considerations are taken into account, violence administered solely by machines becomes meaningless: it threatens the coherence of legal rules; it is subject to errors, malfunctions, misuse, and exploitation; and it corrodes our common humanity.

As the NGO Article 36 noted, “Processes of calculation and computation in a machine are not equivalent to deliberative human reasoning within a social framework. Machines do not make ‘legal judgments’ and ‘apply legal rules’.” Without meaningful human control, machines would not be enacting a human will towards a specific act of violence. Rather, this machine-based violence would represent a social acceptance that human beings can be processed or put in harm’s way simply as objects, subjected to an abstract calculus. Moral reasoning is the only thing that makes us accountable for violence. Allowing weapons to identify, select, and apply force to targets without human supervision or intervention means relinquishing human responsibility. At the same time, it means dehumanising those we expose to harm.

A number of states have claimed that the CCW is the most appropriate forum within which to address this issue. The mandate of the CCW and its protocols “is to regulate or ban the use of specific categories of conventional weapons that have effects which trouble the conscience of humanity.” The treaty affirms the “need to continue the codification and progressive development of the rules of international law applicable in armed conflict.” This recognition that the law is not static and that the general rules of armed conflict are not wholly sufficient to address the problems raised by certain weapon technologies–existing and future–is the cornerstone of the CCW regime.

Therefore work can and should begin to move us towards an international prohibition treaty. To this end, CCW States Parties meeting in November 2015 should establish a dedicated process of work towards a prohibition in 2016. This process should focus on the most critical issues emerging from preceding discussions, including meaningful human control, human rights issues, and moral and ethical issues.

Issues like transparency could be included but should not be the central or exclusive focus. Transparency is important for building collective understandings and approaches to constraining the development and use of tools of violence. Understanding what technologies currently exist and what might be under development would be helpful. But we have an opportunity to prevent the development of certain technologies. Transparency is an element of this preventative effort but is not the solution itself.

Similarly, reviews of weapons are a good idea overall, but they are not the best way forward on autonomous weapons. A multilateral response is necessary. We cannot rely on individual states to undertake hypothetical technical assessments or to interpret existing legal rules in divergent ways. We need a mechanism that also takes into account moral and ethical considerations, human dignity, and human rights law, because these are the fundamental issues at the heart of this debate. There should be broader scrutiny of weapons development, and the CCW should engage with this robustly, but that work is not about autonomous weapons. Fully autonomous weapons don’t need to be reviewed; they need to be banned.

Most importantly, the CCW should be oriented towards the development of an international prohibition of autonomous weapons. The Review Conference in 2016 is the best opportunity to establish a negotiating mandate for a new protocol on autonomous weapons.

Failing to take this action has ethical implications. What kind of future are we building if we pass up this chance to prevent dangerous new technologies? The development of autonomous weapons is not inevitable. We have the opportunity to act collectively to prevent new technologies of violence and uphold some fundamental principles of humanity. CCW States Parties should seize this opportunity.