
CCW Report, Vol. 10, No. 3

Intersessional meeting on autonomous weapons highlights two paths forward, and the urgency of action
28 April 2022


Ray Acheson | Women's International League for Peace and Freedom

On 26–27 April, the Chair of the Group of Governmental Experts (GGE) on autonomous weapon systems (AWS) convened the first of three informal, virtual intersessional discussions. States, international organisations, and civil society discussed the various proposals that have been tabled for consideration by the 2022 session of the GGE, with a focus on the three themes proposed by the Chair: possible challenges to the application of international law; possible prohibitions and regulations; and legal reviews. The discussions highlighted some convergences as well as remaining divergences in approach and perspective on AWS. While there are overlapping elements among the various proposals under consideration, states remain at odds over whether an international, legally binding instrument (LBI) is the best way forward, or whether voluntary national measures are sufficient. Given that the GGE has only five more days of formal discussion scheduled this year, and given Russia’s refusal to allow formal work to proceed at its first session, it is not clear what, if anything, states will manage to accomplish at the CCW on this issue.

Proposal for principles and “good practices”

In March, Australia, Canada, Japan, the Republic of Korea (ROK), the United Kingdom (UK), and the United States (US) tabled a joint proposal that outlines “principles and good practices on emerging technologies in the area of lethal autonomous weapon systems”. Designed as a code of conduct or set of voluntary national measures, the proposal is intended to strengthen implementation of international humanitarian law (IHL) and promote “responsible behaviour” in relation to the design, development, deployment, and use of AWS. It assumes the creation and use of these weapons; its proponents continue to assert that AWS offer benefits for enhancing compliance with IHL and protecting civilians, despite the objections of the majority of GGE participants.

During the informal discussions this week, for example, the ROK argued that AWS could reduce human mistakes and enhance targeting, which it said would help reduce civilian harm and help parties to armed conflict comply with IHL. Japan argued there may be cases where utilising AWS contributes to mitigation of harm to civilians or civilian objects; therefore, identifying or sharing good practices as outlined in their joint proposal would promote implementation of IHL.

Both delegations, as well as the US, also asserted that consensus is imperative to moving forward within the CCW and argued that this proposal has the best chance of achieving consensus. Australia said this proposal is a “realistic deliverable” for this year. The US delegation even argued that no other proposal currently under consideration is as “progressive” as theirs, alleging that the other documents are aimed at future work rather than something that can be achieved in 2022.

France, which together with Germany has called for a two-tier approach of prohibiting “fully” autonomous weapon systems and regulating “partially” autonomous ones, supported the US-led proposal. It believes that a code of conduct could be adopted this year and lead to the implementation of national policies and measures to regulate the development and use of AWS.

However, many other delegations opposed an exclusive focus on the US-suggested proposal. The Philippines, one of the authors of the joint proposal for a new CCW protocol on AWS, argued there is no consensus on the vision of the US-led proposal, and that the paper suggesting a roadmap for a new protocol does include work to be undertaken this year. The Philippines suggested that the Chair “marry” the two proposals, drawing elements from each and clearly stating that an LBI is the ultimate goal of this GGE process.

Austria also rejected the US demand to focus exclusively on its proposal, arguing that while it’s true the GGE needs to deliver results after so many years of discussions, states cannot just agree to anything. Whatever the outcome is, it must give meaning to past discussions and have real effects on the ground, especially now that AWS are starting to be developed and deployed. The NGO Article 36 similarly expressed concern that if states rush to agree on an outcome this year, it is not clear that the result will be conceptually solid.

Several other delegations argued that the sequencing being proposed by the US et al is backward. Ireland said it’s happy to work on codes of conduct and compilations, but first states must elaborate a framework for the prohibition and regulation of AWS. Palestine agreed that the only sequence that makes sense is the negotiation of an LBI followed by complementary tools to be adopted nationally. Focusing first on “good practices,” argued Palestine, is not a good use of time because good practices cannot be elaborated before a weapon system exists; they are voluntary; they would be based mainly on the experience of highly militarised countries, and victims would have much less of a say; and the term “good practices” itself suggests there are benefits to AWS, which is not a consensus view. The only good practice, Palestine argued, is a moratorium on the development of AWS until specific safeguards are put in place.

Panama similarly argued that it is not clear what the elaboration of “good practices” would be based on, given that the states developing these weapons assert they don’t yet exist. Panama also emphasised that the entire notion of good practices is linked to the idea that AWS will have “benefits,” which means the CCW would become a forum to promote weapons, contrary to its very objective.

Proposal for a legally binding instrument

Given these views, the majority of delegations participating in the informal discussions continued to call for an LBI. Uruguay noted that codes of conduct and exchanges of good practice are useful as confidence-building measures, but are not enough to fill the legal gap in IHL when it comes to these new weapon systems. Algeria said it’s not convinced that voluntary measures are adequate to meet the spirit or architecture of the CCW, while Argentina reiterated the call of the states behind the roadmap proposal for the CCW to provide a mandate for the next GGE to initiate open-ended negotiations of a legally binding instrument in 2023.

Most states supported the two-tiered approach that has been garnering increasing support over recent years of discussion in the GGE. The idea behind this structure is that weapons that by their nature cannot comply with IHL would be prohibited, while the use of other weapon systems would be regulated to ensure compliance with IHL, as well as international human rights law (IHRL), international criminal law (ICL), the UN Charter, and ethics. For Stop Killer Robots, this means prohibiting AWS that cannot be meaningfully controlled and AWS that target human beings; and providing restrictions on the use of other systems in order to ensure the level of human control required to mitigate ethical and legal hazards associated with autonomy.

Switzerland suggested clarifying which AWS are de facto illegal under IHL and would not be acceptable from an ethical standpoint. It said it is convinced of the need for specific rules to ensure that systems that are not de facto prohibited are used at all times in conformity with IHL and that meaningful human control is ensured. Prohibitions should predominantly capture systems whose functioning can’t be understood, whose effects can’t be predicted, or whose impacts can’t be limited in accordance with IHL. Such rules should be legally binding in order to have the authority they deserve. In addition, AWS that can in principle be used in accordance with IHL should also be subject to rules that ensure they comply with IHL at all times. These rules could be straightforward, suggested Switzerland, and more specific measures would be needed to implement them, possibly including codes of conduct, best practices, and risk mitigation measures.

Chile and Mexico jointly agreed that a distinction must be made between weapon systems that cannot by their nature comply with IHL and those that do not, through their use, comply with IHL. Austria suggested developing a list of “unacceptable features” for weapon systems, including features that would render an AWS uncontrollable, unpredictable, incapable of being used in accordance with IHL, or with which accountability cannot be assured.

The UK delegation, however, expressed concern with the idea of prohibiting weapon systems that contravene IHL. It asked what parameters of a particular weapon system, as opposed to its autonomous function, would make it automatically in violation of IHL. The UK said it finds challenging the idea that the functionality of a weapon system could be constrained.

Aotearoa/New Zealand responded to these concerns, arguing that there is a long lineage of regulating weapons in relation to their capabilities, but also in relation to behaviours connected to their use. It highlighted that under the Mine Ban Treaty, the Claymore explosive device cannot be used unless it is in command-detonated mode. This is an example where the weapon system itself is not banned but specific behaviour or use is. In the case of AWS, argued Aotearoa/New Zealand, code can alter the way weapon systems can be used. Drones already operate in a variety of modes, and we’re beginning to see systems that select and attack targets, which is the kind of behaviour many states and others find problematic.

Aotearoa/New Zealand urged the UK and others not to get hung up on the word “prohibition,” noting that even the US proposal includes language about preventing the development of AWS that cannot be used in compliance with IHL, including that “weapons systems are to be developed such that their effects in attacks can be anticipated and controlled, as may be required, in the circumstances of their use, by the principles of distinction and proportionality and such that attacks conducted with reliance upon their autonomous functions will be the responsibility of the human command under which the system was used”.

The NGO Article 36 underscored some of these comments, agreeing that every treaty regulating or prohibiting weapons has different rules and approaches, including within the CCW. Some are formulated around systems that are designed in certain ways or that have certain effects. Rather than making a distinction between fully and partially autonomous weapon systems, Article 36 argued it would be more constructive to see aspects of autonomy as part of the process of how a weapon functions, such as identifying targets based on sensor information and operating within a specific window of time and space, as the International Committee of the Red Cross has also articulated.

Weapon reviews

Both the code of conduct and LBI proposals reflect upon the role of weapon reviews (WRs), which are mandated by Article 36 of Additional Protocol I of the Geneva Conventions. While all participants agree upon the necessity of weapon reviews, those who support only voluntary measures in relation to AWS see them as the final arbiter of what weapon systems are acceptable or unacceptable. Those who see value in an LBI argue that WRs are subjective, based on national assessments, and therefore will not ensure standardised safeguards against the development or deployment of problematic weapon systems.

Argentina noted that there is a gap between countries that are producers of weapons, which have significant capacities to evaluate new weapon systems, and importing countries, which have less capacity to evaluate the consequences of their acquisitions. In the case of AWS, argued Argentina, WRs will require scientific and technological capabilities that will make the gaps between states even more visible. An LBI, as outlined in the joint proposal, contains precise clauses on the review of weapons that include autonomy, which Argentina suggested will help achieve standardisation of review mechanisms.

Austria also raised concerns with WRs, noting that clarity is needed about how IHL can be better implemented and how it applies to AWS. Legal clarity is needed to ensure that reviews do their job properly; without it, the baseline for assessments of new weapon systems will be quite low or will be gradually lowered. Austria also expressed concern that any exchanges of information will be based on older systems, not cutting-edge technology, which limits the amount of transparency offered through WR processes. Furthermore, the fact that AWS will use algorithms and machine-learning technology also raises questions for WRs, warned Austria. Will new WRs be triggered if a weapon system begins to function differently than when it was first reviewed?

Conclusion

These questions and concerns are vital for understanding the imperative of elaborating legally binding prohibitions and restrictions on AWS. While a code of conduct can provide guidance for implementation of a normative and operational framework, it cannot substitute for it, nor should it come first. Without clear and standardised rules about what weapons are morally, ethically, and legally unacceptable, national measures are likely to be implemented however the state in question sees fit. Given that these weapons are currently being developed by the most heavily militarised countries in the world, it is easily foreseeable that these countries will not voluntarily constrain their own behaviour without the pressure of international law. Otherwise, they would not be interested in developing and deploying weapons with autonomy in the first place.

The dichotomy between “acceptable” and “unacceptable” weapons is not unproblematic. All weapons, from the perspective of ethics, morality, and human rights, are unacceptable. Weapons are designed to kill and cause harm, to control and suppress, to impose “order” through violence. We are already operating in a framework in which weapons development and use are privileged over human life and well-being. The human and financial resources spent on weapons are unacceptable; investing in new ways to kill instead of providing care is unacceptable.

Just as the delegations of Palestine and Panama objected to the notion of “good practices” in relation to AWS development and use, we should be cautious about declaring certain levels of autonomy in weapon systems to be “acceptable”, or about declaring any weapon system “acceptable”. Negotiations of an LBI can help determine what limits to autonomy in weapon systems are necessary to ensure that humanitarian principles, human rights, and ethical perspectives are brought to bear, and to ensure that human life and dignity are privileged above the profits of weapons and war.
