
CCW Report, Vol. 3, No. 2

Seeking action on autonomous weapons


Ray Acheson
12 April 2016 


The third UN meeting on lethal autonomous weapon systems (LAWS) opened on Monday morning with a general discussion by states. Once again, the majority of delegates taking the floor agreed that human beings must always be responsible for the use of force, in particular over decisions about life and death. Some states indicated their support for a multilateral instrument to prevent the development and deployment of LAWS, which would operate without meaningful human control. A handful of delegations reiterated their well-known arguments against such an instrument. However, a reflection on their policies, consideration of the state of technological development, and recognition of the majority opinion in favour of retaining meaningful human control over individual attacks would seem to highlight problems with these positions rather than critical divisions amongst states within the CCW.

Arguments against action

The following represent some of the well-rehearsed arguments heard on Monday against the development of new law prohibiting the development and deployment of LAWS.

The technology is far away

Some states, such as Israel, Japan, Russia, Spain, and the United Kingdom, argue that LAWS are a possibility of the distant future and may never exist at all. Yet the United States has a list of existing weapon systems it considers beyond the remit of LAWS discussions, such as armed drones, the Patriot or Aegis missile defence systems, or torpedoes. The existence of such weapon systems indicates that the development of fully autonomous weapons is not so distant after all.

The UK is already investing in the development of a weapon system, the Taranis, which has included the testing of autonomous capabilities including target location and engagement. Israel operates the Harpy drone, which automatically detects, attacks, and destroys radar emitters. The US Phalanx system for Aegis cruisers automatically detects, tracks, and engages anti-ship missiles and aircraft. Further, as Sierra Leone noted, increasing the autonomy of existing systems could turn them into fully autonomous systems, warranting their inclusion in ongoing talks.

States have “no plans” to develop LAWS

Over the past few years a number of states have been vague in their orientation toward the possible development of LAWS. The US, for example, has a policy that neither encourages nor prohibits the development of LAWS and indicates it will review any applications to develop such technology. Japan says it “has no plans to develop robots out of the loop, which may be capable of committing murder.” Others have been more emphatic, such as the UK, which has declared that it will never deploy weapons without human control.

Yet even where states make such declarations, questions remain about their interpretation of human control. As explained by the UK-based NGO Article 36, “UK policy has not yet provided an explanation of what would constitute human control over weapons systems whilst at the same time suggesting a narrow and futuristic concept of LAWS that appears permissive towards the development of weapons systems that might have the capacity to operate without the necessary levels of human control.”

Existing law is adequate to regulate development and use of LAWS

Some states have suggested they believe all weapons should have meaningful human control yet do not support the development of new law in this direction. The Netherlands indicates it does not support the deployment of weapons without human control, but also does not support a moratorium on the development of specific technologies at this time. Turkey says it supports human control over weapons, but is hesitant about a preemptive prohibition of LAWS because they are “hypothetical”. Canada says it does not support banning LAWS, even while it acknowledges challenges LAWS would pose to national level weapon reviews such as those mandated by article 36 of the 1977 Additional Protocol I of the Geneva Conventions, particularly around testing of these systems. 

A number of states and civil society actors have pointed out other potential problems with relying on article 36 reviews as a response to LAWS. For example, the NGO that takes its name from the legal provision requiring weapon reviews argues that given the global implications of LAWS, decisions about their development must not reside solely with the states considering their acquisition. In addition, “narrow interpretations and inconsistent outcomes across states ... could lead to the introduction of unacceptable technologies.” Furthermore, the development of LAWS would represent “an unprecedented shift in human control over the use of force,” which raises ethical, political, and legal concerns that may go beyond specific weapon systems under review.

Existing law applies

An even less helpful variation of the argument that existing law is adequate to regulate LAWS is that existing international law applies to LAWS. To what weapon system would existing law not apply? Should we really be worried that entire weapons, means, or methods of warfare might somehow be unshackled from the law? If so, we have a bigger problem than how to deal with autonomous weapons.

This argument has perhaps become conflated with the idea that LAWS could potentially be programmed to respect international humanitarian law or human rights law. However, as Article 36 has argued before, “Processes of calculation and computation in a machine are not equivalent to deliberative human reasoning within a social framework. Machines do not make ‘legal judgments’ and ‘apply legal rules’.” Law is written by and for humans, as Ecuador argued on Monday. Without human deliberation, law does not retain its meaning.

A more credible approach to LAWS

In contrast to these arguments, most states addressing the meeting expressed concern with the idea of weapons operating without meaningful human control. Most also expressed support for the Convention on Certain Conventional Weapons (CCW) Review Conference in December to establish a Group of Governmental Experts (GGE) to formally address this issue in 2017. There appear to be a variety of expectations of what such a GGE could do, with a handful of states encouraging the development of transparency and confidence-building measures and clear definitions of technologies and terminologies. However, several states are supportive of initiating negotiations to prohibit autonomous weapons, citing the precedent of the preemptive prohibition on blinding laser weapons.

These states raise critical concerns associated with the potential development and deployment of LAWS. Sri Lanka highlighted the threats that LAWS pose to global peace and security, as well as the risks of proliferation and of lowering the threshold for warfare. Pakistan argued that introducing LAWS on the battlefield would be a step backwards from norms and laws of warfare that the international community has built over time. Ecuador outlined a number of concerns about ethics and morality, accountability, transparency, and compliance with IHL and human rights. These states, together with Costa Rica and the Holy See, argued for a prohibition on autonomous weapons. The Holy See noted that most weapon prohibitions come after mass devastation caused by that weapon and argued that prevention is the only viable approach with LAWS.

As we wrote in the CCW Report last year, the CCW affirms the “need to continue the codification and progressive development of the rules of international law applicable in armed conflict.” This recognition that the law is not static and that the general rules of armed conflict are not wholly sufficient to address the problems raised by certain weapon technologies is the cornerstone of the CCW regime. Yet as Austria noted on Monday, technology is outpacing diplomatic deliberations. Urgent action is needed now to agree on concrete measures to prevent the development and deployment of LAWS.
