
CCW Report, Vol. 12, No. 2

Report of the Informal Consultation with Observers to the CCW
6 June 2024


Laura Varella | Reaching Critical Will, Women's International League for Peace and Freedom


On 6 June 2024, the Chair of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), Ambassador Robert in den Bosch of the Netherlands, held an informal consultation with observers. This meeting did not follow the previous practice of holding consultations with High Contracting Parties (HCPs) and observers of the Convention on Conventional Weapons (CCW) together. This point was made by the International Committee of the Red Cross (ICRC), Stop Killer Robots, and Article 36, which highlighted that several states had expressed support for civil society participation during the work of the GGE. The Chair noted their remarks and said he would consider them for the future. He also announced his intention to organise the second formal session of the Group, scheduled for August, in the same format as the first session held in March, when observers participated in the deliberations.

The informal consultation focused on two topics: 1) a common understanding on the working characterisation of LAWS; and 2) the application of existing international humanitarian law (IHL) rules, and measures needed to ensure compliance with existing IHL and possible new rules.

Regarding the first topic, the Chair reported that in the consultation with HCPs on 7 May, states discussed text proposals building on the discussions held at the first session of the GGE in March. He said that they discussed two working characterisations of LAWS, the first being “LAWS is a weapons system that can select and engage a target without human intervention,” and the second that “LAWS is a fully autonomous unmanned technical means other than ordnance that is intended for carrying out combat and support missions without any involvement of the operator.” The Chair noted that the first formulation received more support than the second. He also said that states discussed adding a disclaimer before the characterisation saying, “Without prejudging any other options for measures, the definition and characteristics of LAWS are as follows,” and another one after it saying, “The above description does not affect High Contracting Parties’ future understanding and improvement or refinement of the definition when formulating an international instrument.”

The European Union (EU) did not comment on the two formulations but reiterated its commitment to the work of the Group. Article 36, the ICRC, and Stop Killer Robots expressed support for the first formulation, saying that it is sufficient to move forward with the work of the Group. The Geneva Centre for Security Policy (GCSP) stressed that selection is a key component of the characterisation of AWS.

Reacting to the second formulation, the ICRC expressed concern that notions like “fully autonomous” would narrow the scope of regulation to a small class of weapons, making it largely ineffective. It also reiterated its previous opposition to the word “lethal”.

Regarding the second topic, application of existing IHL rules, the ICRC encouraged the Group to move beyond restating IHL rules and to elaborate how these rules, and international law more broadly, require the prohibition of certain AWS and restrictions on the development and use of other AWS. The ICRC also said that the Group should recognise that its work should be guided not just by IHL, but also by ethical perspectives. Similarly, Stop Killer Robots stressed that while existing IHL applies to AWS, the identification of additional rules is needed, and that it should take the form of a legally binding instrument. It also said that beyond IHL, ethical norms and international human rights law (IHRL) should be considered.

On the issue of control, the ICRC expressed that control is an overarching concept that should guide the formulation of specific prohibitions and restrictions on AWS. Stop Killer Robots underscored that AWS should only be used with meaningful human control, and that systems that do not allow for meaningful human control should be prohibited. Stop Killer Robots also pointed out that the term “meaningful human control” has been the one most commonly used in the GGE to describe the human role in the use of force. Both Stop Killer Robots and the ICRC said that systems that target human beings should also be prohibited.

The ICRC also recommended further elaboration on the “second tier” of systems for regulation, for instance in reference to details of limits on geographic scope, duration, and types of targets. The ICRC recommended restricting targets to objects that are military objectives by nature, and in relation to geographic scope, it recommended not operating AWS in areas where civilians or civilian objects are present. Stop Killer Robots said that the second tier should be regulated through positive obligations to ensure that such systems are used with meaningful human control, by ensuring that weapon systems are predictable and understandable, that there are temporal and geographic limitations on their use, and that there are limits on the types of targets.

On the issue of weapons reviews, particularly the need to conduct a new weapon review when an AWS is modified, the ICRC recommended that states focus on modifications that would alter the system's function or effects.

The Chair clarified that the discussion around IHL aims to encourage states to debate which IHL rules and principles are most at risk of being violated by LAWS. In this way, states can move forward on how best to protect these rules and ensure they are upheld. The Chair said that in the upcoming formal session of the GGE, which will convene 26–30 August 2024, states will revisit the working characterisation and the issue of how to apply IHL.
