CCW Report, Vol. 5, No. 4
Do killer robots dream of eating sheep?
16 November 2017
Both sessions provided further insights into state positions on key issues, and the afternoon’s interactive format helpfully allowed delegations to respond to each other’s proposals. Whilst divisions over critical issues are far from resolved, a few positive themes emerged:
- Most states want to see the GGE continue its work next year, with a mandate to achieve concrete outcomes;
- The majority of states see a legally binding instrument as the best option for addressing the challenges posed by AWS, and at least 20 of these want this to be a prohibition on AWS;
- Every state participating in these discussions believes the lawfulness of the use of all weapons is subject to compliance with international humanitarian law (IHL), including any future weapon systems such as AWS, and almost all agree that international human rights law (IHRL) is also applicable;
- Every state believes that some degree of human control is necessary over weapon systems and the use of force—though views on the nature of that control, and the functions over which it should be exercised, still vary;
- Some states are starting to put on the table concrete suggestions for definitions of AWS and human control, providing an opportunity to start developing convergence around these key concepts; and
- There is a much higher level of thoughtful interaction with each other’s proposals and suggestions than at previous expert meetings, indicating that the time is ripe for real action.
This is positive news, as is the determination of those who clearly do want to prevent further automation of violence in our world. Whilst states previously seemed comfortable with a continued exploration of technical and philosophical discussions on what once seemed like a distant, unlikely nightmare, many are now sufficiently shaken by developments in research and development to want to take action now and, as China said, make sure they do not “build the fence after the sheep are eaten”.
That said, a critical difference in approach to this issue that needs to be resolved sooner rather than later centres on what degree of autonomy, or what characteristics of autonomy, are acceptable and which are not. There are some states who seem to view AWS that have some degree of human control as having the capacity to comply with IHL and ethical considerations. Then there are those that believe that an AWS will not have meaningful human control—that a weapon system operating with autonomy, especially over critical functions, crosses a legal, moral, and ethical line that must not be crossed.
The other issue is that a few states are not supportive of any particular way forward. The US delegation does not support either a political declaration or legally binding instrument. Israel does not even support trying to create a definition of AWS, saying it is “too early” to do so. The UK, on the other hand, says it is “too early” for a ban, before the weapons have been defined—setting up a catch-22 for those that demand consensus to make progress. Meanwhile Russia is concerned about the pace of deliberations, emphasising the importance of a step-by-step approach—and one that takes “little steps”.
But even a step-by-step approach requires the next step, not going around in circles or repeating panels and discussions year after year. Action is needed now, or the dystopian view of the future that China described for us on Wednesday afternoon (see below) will be upon us before we know it.
The following is a summary of the discussions that took place in both of Wednesday’s sessions; it is not a comprehensive record of all statements and perspectives.
General exchange of views, continued from Monday
Views on definitions of the technology or systems continue to differ. Most states agree that a good next step would be to at least outline a working definition of autonomous weapon systems, with some also urging this definition to focus on the characterisation of such a system’s critical functions, as well as concepts such as meaningful or adequate human control.
Argentina called for a precise and unequivocal definition, while Italy stressed that defining these systems is a central step to developing a common understanding upon which all other elements of the GGE’s work can be based. Sweden cautioned that a technical approach to a definition is problematic given the dual use nature of the technology. Israel said it is too difficult to define AWS right now.
Finland urged states to agree on critical functions of AWS, which should guide policy making before technical advances and the realisation of these weapons. It argued that due to the inherent difficulty with definitions, states could instead categorise the main components of autonomy. Ireland highlighted an urgent need for the elaboration of a working definition of technical aspects and principles of operation for AWS. Argentina and Mexico urged the delineation of a definition that could lead towards a prohibition on these systems.
The Campaign to Stop Killer Robots agreed “the time has come for states to make explicit where they draw the line in increasing autonomy in weapon systems and determine how to ensure the line into full autonomy is not crossed.” The International Committee for Robot Arms Control (ICRAC) suggested defining AWS as “weapon systems that once launched can select targets and apply violent force without meaningful human control,” or something similar.
Some states have already put down markers for a definition of autonomous weapons.
France and Italy argued that AWS are only related to future technology, while Egypt said they could also include existing technology or systems.
France also asserted that LAWS are those weapons that have no human supervision once activated.
The United Kingdom, calling for a collective definition, indicated its support for the understanding that an autonomous system is one that is able to understand a high level of intent, to take appropriate action to bring about a desired state, and to bring about a course of action without a high degree of human oversight or control, though these may still be present.
Norway said while it does not yet have a legal definition, it generally understands an AWS as a system that can “search for, identify, and engage targets using lethal force without human intervention”.
Russia expressed concern about the emerging conflicting definitions.
One thing that all states seem to agree on is that international humanitarian law (IHL) provides the most appropriate framework for guiding decisions about the legality of developing or using AWS. On Wednesday morning, Argentina, China, Croatia, Finland, France, Ireland, Israel, Norway, Russia, and Turkey all reiterated that IHL applies to these systems, while many also highlighted the relevance of international human rights law (IHRL).
France said it could not consider deploying such systems unless their ability to comply with IHL is proven. China said if AWS can’t meet the requirements of IHL then they should be prohibited.
Peru and Venezuela do not believe that AWS could comply with IHL, and Norway also expressed concern about this. Norway highlighted ethical concerns about targeting decisions and the human judgment necessary for compliance with IHL.
Turkey noted IHL provides a necessary basis regarding the possible development of LAWS, but said it would not disregard the possibility of studying the sufficiency of existing law to deal with these weapons.
Concerns with IHL compliance tend to lead to consideration of the necessary levels of human control over autonomous weapon systems. Most seem to agree that machines cannot be entrusted to make life or death “decisions” without human intervention. Venezuela argued that human life and values can’t be programmed into a machine, while Zambia cautioned that machines can’t replace human judgment.
To this end, Ireland agreed that all weapons should remain under effective and meaningful human control, while Croatia called for states to be guided by the preservation of the principle of full human control over deployment, arguing that “mankind [sic] is ethically obligated to ensure meaningful human control over the use of force.” Argentina specified that states need to agree on the extent of human control for the development and use of these weapons in order for them to comply with IHL, human rights, and ethical and moral considerations.
PAX called on states to “work towards a common understanding of which decisions and actions should be under human control and how to ensure that this control is meaningful, appropriate or adequate.”
Turkey suggested the concept of meaningful human control could provide the necessary baseline from which a common understanding of AWS can be developed. However, the baselines appear to vary. There seems to be a distinction between those that see LAWS as fundamentally incompatible with a concept of “meaningful human control,” such as Peru, and those that see some degree of autonomy in weapon systems as acceptable, but distinguish these from “fully” autonomous weapons, such as France.
As with definitions of the weapon systems, some states have begun to outline or even adopt policy on the nature and extent of human control.
France said humans must retain the ability to take the final decision when it comes to the use of lethal force.
The UK said human authority is always required for a decision to strike, and that it has no intention of ever developing lethal systems that could be used without human control.
Canada said it has “committed to maintaining appropriate human involvement in use of military capabilities that can exert lethal force”.
Finland suggested human control must be possible at an early stage of the weapon’s operating sequence, and asserted that humans totally out of the loop would pose serious legal and ethical risks.
For those states that see that meaningful human control is not compatible with an autonomous weapon system, many support a prohibition on the development and use of these weapons. On Wednesday morning, Algeria, Egypt, China, Mexico, Nicaragua, Peru, Sierra Leone, and Venezuela reiterated their support for a legally binding ban on AWS.
This is also the goal of the Campaign to Stop Killer Robots, which urged CCW high contracting parties to continue the GGE for at least four weeks in 2018 in order to lay the groundwork for the negotiation of a prohibition. Human Rights Watch specified that states should conclude a new legally binding protocol banning the development, production, and use of fully autonomous weapons by the end of 2019. Mines Action Canada highlighted the support for a prohibition amongst the scientific community, while ICRAC emphasised that a prohibition would not impact technological developments in the civilian sphere, explaining that a new instrument would only prohibit the development and use of autonomy in the critical functions of target selection and the application of violent force.
Egypt also called for a moratorium on the development of these weapons in the meantime, while Mexico called for the GGE to send a signal to “deter” the creation of these weapons. China called for a “fundamental solution from preventative diplomacy” rather than a non-proliferation initiative.
Zambia said it remains open to further discussions on the best way to deal with AWS, but based on bans on other weapons, it supports a legally binding instrument to regulate or ban AWS.
On the other hand, Israel, Russia, Turkey, the United Kingdom, and the United States reiterated their opposition to a ban. The US and the UK indicated it is “too early” to support a prohibition, while Israel said the “futuristic nature” of these weapons and the differing opinions on them mean a careful and incremental approach is the most useful way forward.
A handful of states have alternatively supported a political declaration on AWS. Italy said the proposal from France and Germany in this regard merits consideration. However, the United States said it does not support the development of a declaration at this time, while Zambia said it has reservations about the substance of this declaration and does not wish to engage in a process that does not produce conclusive discussions. The Nobel Women’s Initiative warned that a code of conduct is inappropriate and insufficient in light of serious concerns, and risks legitimising research and development towards fully autonomous weapons.
Some states continued to argue that national weapon reviews are the best way to deal with AWS, including Finland, Israel, Sweden, UK, and US. China, in contrast, said that while national reviews have a positive role, multilateral work is needed. Mines Action Canada asked how such reviews “would be able to assess bias in the data used in machine learning and how comportment with IHL would be ensured by systems that continue to learn after the review.”
Croatia, not commenting specifically on the above options, said the GGE should undertake all possible efforts to regulate AWS.
Regardless of the way forward, a number of delegations expressed concern about the issue of responsibility and accountability in the use of AWS.
Argentina outlined that responsibility for the use of these weapons is implicit in each link of the chain of command, but said it is not sure that military commanders can sufficiently understand the complex programming of AWS to make them criminally liable if they use them. If not, Argentina cautioned, responsibility becomes diffuse, guaranteeing impunity. Similarly, Venezuela argued that AWS would hamper accountability for IHL violations.
Norway also expressed concern about establishing individual and state responsibility for the use of AWS. Without accountability, it warned, deterring and preventing international crimes becomes harder. A robot does not have a legal personality and thus is excluded from accountability; if the weapon system has limited human intervention, it is easy to see situations when no one could be held accountable, which would erode substantial progress on criminal liability.
Some states do not seem to believe there is an accountability gap, however—though their opinions vary about where responsibility does lie. The UK asserted that responsibility lies with commanders and operators. Others have suggested different actors that could be held responsible in other discussions.
Accountability and liability are not the only possible casualties of the development of AWS. States continue to express concern about the implications for proliferation, terrorism, arms racing, and peace and security. Argentina worried that AWS could normalise conflict, foster terrorism, facilitate an arms race, and impact human rights. It also said it would be difficult for LAWS to comply with the 2030 Agenda for Sustainable Development, especially Goal 16 on promoting peaceful societies. ICRAC is concerned we are already seeing the beginning of an AWS arms race.
Finland warned that any potential deployment of AWS must not lead to lowering the threshold for the use of force, though Zambia highlighted this as a risk. Zambia also expressed concern that AWS would make future wars more inhumane and increase targeted killings and clandestine operations.
On the other hand, Israel and the US reiterated their perceived humanitarian and military benefits of AWS, such as less collateral damage or threats to friendly forces.
Ireland expressed concern with the potential use of AWS in law enforcement, outside the sphere of armed conflict, and said it sees value in exploring this in other forums such as the Human Rights Council. Sierra Leone also highlighted the HRC as a possible forum for action on AWS.
Canada raised the issue of gender dimensions of these weapons, including the lack of women’s voices in experts meetings such as this one.
Interactive discussion on the Chair’s food-for-thought paper
During the afternoon session, states started to focus on their recommendations for the way forward on the issue of AWS. Several delegations emphasised that we cannot wait to resolve all the complexities of this issue before we can take action, including among others Argentina, Brazil, Chile, Cuba, Egypt, Germany, and Pakistan.
However, delegations clearly hold different visions of that action.
Algeria, Argentina, Chile, Cuba, Egypt, and Pakistan reiterated their support for a prohibition on the development, production, and use of AWS. Chile said there is a critical mass for a legally binding instrument, while Argentina said there does seem to be a majority in favour of drafting a legal instrument to prohibit AWS.
Egypt does not see the lack of a definition as a reason to not start a prohibition process, noting the lack of a pre-definition of blinding laser weapons. Similarly, Algeria noted that the prohibition on blinding laser weapons did not affect the civilian laser industry, just as banning AWS would not limit civilian applications of autonomous or AI technology. Algeria also noted the relevance of public conscience for prohibiting AWS.
Referring to the Non-Aligned Movement (NAM)’s statement on Monday, Cuba said the final report of the GGE needs to reflect that a huge majority of states support the development of a legally binding instrument on AWS. Brazil said it supported the NAM’s position on a legally binding instrument.
France and Germany reiterated their belief that a prohibition is premature and outlined their preference for a political declaration and other possible “soft law” approaches, such as a politically binding code of conduct and the establishment of a body of technical experts in the CCW. Japan said the France-Germany working paper is “persuasive and valuable”.
Switzerland said it is in principle supportive of the general declaration proposed. Such a declaration could be a pragmatic and achievable next step with a number of benefits, said Switzerland; it could provide momentum and serve as a platform for concrete results. It could also spell out characteristics of AWS and a commitment to a certain level of human control and relevant legal requirements.
Cuba said it has concerns about a political declaration or code of conduct, and about a group of experts limited in composition. It urged the final report to show that there isn’t consensus when it comes to these proposals.
Brazil said that while it wants a legally binding instrument on AWS, it doesn’t entirely reject the idea of a political declaration if it could help achieve a definition and a commitment to the almost universal idea that IHL applies to AWS. It shared Cuba’s misgivings about an experts group of limited membership.
Algeria said in the absence of a legally binding instrument, it cannot envision voluntary measures like a political declaration. It suggested that after a legal protocol that prohibits the weapon system, other mechanisms could follow.
As in earlier discussions, some states continued to emphasise the importance of national weapon reviews, and increasing the transparency of such reviews. France, Germany, Japan, Netherlands, and Switzerland indicated support for this approach, and the Netherlands suggested the development of an interpretative guide for best practices.
Switzerland noted that some characteristics of AWS that will raise questions in a weapons review are autonomy in targeting and the weapon’s communications within a network. Embedding an AWS into a network of existing systems, Switzerland cautioned, will raise questions about how far these existing systems, which may not themselves be considered weapons and thus may not have been reviewed, may now be affected in terms of their suitability in respect to the law.
Brazil indicated support for further standardisation of the ways in which states conduct national assessments under article 36, but argued that this doesn’t solve the problem of a need for a multilateral instrument.
When it comes to human control, Argentina argued this is indispensable but that states need to have a debate on the limits of this as an approach. Switzerland agreed that most states agree meaningful human control is necessary and that there is no room for weapons without any human control, due to ethical and IHL considerations.
In this vein, Panama articulated that compliance with IHL and the Martens Clause requires a high level of human discernment, requiring control over weapon systems. Germany agreed that human control is necessary over life and death decisions.
The Netherlands stated that AWS without meaningful human control must not be allowed, though it suggested that AWS which are not fully autonomous and operate with meaningful human control may have military advantages. In this context, however, the Netherlands specified that deployment always needs meaningful human control over targeting. It does not anticipate that AWS would operate alone or take over the role of humans on a battlefield, but that they could be deployed alongside humans and complement existing machines in “man-machine” [sic] teaming. It also argued that as long as AWS remain under meaningful human control “on the wider loop,” there will be no need to ban these weapons as additional ethical issues will not arise.
Brazil indicated it is interested in the Netherlands’ suggested definition for meaningful human control, but more work will be needed next year on this issue. Brazil also pushed back on the concept of the “wider loop,” arguing that it’s an interesting concept if you’re talking about who is responsible, but not useful when you’re trying to establish which weapon systems would be compatible with IHL. For the latter, human control has to refer to the “inner loop,” said Brazil.
Cuba said human control is necessary over the selection of targets.
The United States said humans must retain control for decisions over the use of force, and asserted that the thought of uncontrollable weapons or machines is not a realistic issue for work in the CCW. The US prefers the term “appropriate levels of human judgment over use of force,” which requires that AWS be designed to allow commanders and operators this appropriate level. It argued that this formulation focuses on human beings, to whom international law applies, and that there is no fixed, one-size-fits-all level of human judgment or minimum level of control that should be applied to every weapon system.
In terms of international humanitarian law (IHL), the distinction persists between those that believe IHL applies to AWS and will govern their use, and those that believe that while IHL applies to AWS, these weapons cannot comply with it.
Brazil, Cuba, and Pakistan said it is impossible for AWS to comply with IHL, while Argentina cautioned that IHL may be weakened by trying to attribute some sort of legal personality to machines.
Germany, Netherlands, and Switzerland said IHL fully applies to AWS development and deployment. Switzerland also noted that while IHL is the most relevant body of law governing such weapons, other branches of international law may equally impose limits on use of force.
China reiterated its view that the legitimacy of AWS must be subject to the test of principles and norms of IHL. In the morning it had noted that those that can’t comply must be prohibited.
Despite these varying positions, the importance of IHL in considering the legality of AWS remains the main point of convergence for this GGE process.
Concerns about responsibility and accountability also persisted, with Panama asking if it would be the programmer, manufacturer, or user that would be held accountable for violations of law or humanitarian harm. The Netherlands insisted that it will remain states’ responsibility to ensure any weapon system complies with international law; it suggested it is sufficient to hold commanders responsible.
Switzerland said that as AWS lack a legal personality, at least for the foreseeable future, they cannot become agents in a human sense. For accountability in their use there are two dimensions, said Switzerland: that of state responsibility and of individual criminal responsibility. States remain legally responsible for unlawful acts and resulting harm caused by AWS they employ. Rules governing attribution of conduct to a state are pertinent to these weapons—this responsibility doesn’t turn on the nature of the weapons but on the person or entity that decides on their employment. Criminal accountability, in the meantime, focuses on responsibility of humans that are commanders, operators, programmers, engineers, and technicians.
The United States said any decision to use force would be made within bounds of law and a commander’s understanding. Through their experience, and testing of the weapon system, commanders are accountable and have responsibility for the weapons they release.
The fight over definitions also continued in the afternoon session. Argentina emphasised that precise terminology is necessary to define military doctrines and that states should seek to characterise elements of AWS, especially the concept of autonomy. Switzerland agreed that a working definition could be a useful next step, but said it is not yet advisable to establish a definition needed for a legal instrument or regulation.
France, Germany, and Russia reiterated their argument that only future systems should be included in a definition of fully AWS, while Brazil said we can’t exclude existing technologies from a working definition. Switzerland asserted that “most” agree AWS do not yet exist but may soon, while Russia argued that AWS will not be deployed “any time soon”. Algeria noted that in 2014 at least one expert asserted that AWS already exist.
The Netherlands suggested a definition could focus on technology that includes AI and urged states to focus on what requirements are necessary “to responsibly develop and use” AWS. Such weapons, it argued, should be tested, reliable, able to explain their actions in understandable terms, and lead to predictable and reliable outcomes, for which a human operator can always bear responsibility.
Brazil indicated it is interested in the Netherlands’ definition proposal, which could be a point of departure for further work.
Russia concurred that agreeing on primary characteristics of AWS is a primary task, but complained that the various working definitions are all “theoretical, contemplative, and incomplete”. It highlighted the disagreements over whether AWS already exist, will exist soon, or are far in the future as a case in point.
Russia also dismissed the International Committee of the Red Cross (ICRC)’s approach to defining critical functions of AWS, such as targeting, arguing that targeting is already being done better by machines and that if we define AWS on this basis then we are indicating we want to take this function away from machines and give it back to humans. Since Russia isn’t sure it wants to do that, it does not want targeting to be referred to in any definition of AWS.
Against the backdrop of these divergent views, some delegations started making recommendations on ways forward. Chile urged the Group’s report to include a recommendation that CCW high contracting parties commit to a moratorium on the development and use of AWS. Panama expressed interest in this proposal.
Pakistan said it is time to move to policy prescription and option oriented discussions in order to outline elements for a legally binding prohibition on AWS.
Drawing on Sierra Leone’s statement from the morning session, Brazil also highlighted that the CCW and IHL may not be sufficient for dealing with AWS, given the human rights dimensions that could be better drawn out in the Human Rights Council.
Russia suggested that as a way forward, all of the elements around which consensus is forming should be put on paper, but that this should be gradual in nature—this process requires “little steps” and should not be “hasty” or “excessively ambitious”.
China, in contrast, urged states to be ambitious and work hard for “fundamental solutions” rather than “interim placebos”. It called for consensus, arguing that moving work on this issue outside of the CCW without the “major players” would render any resulting treaty “meaningless”. That said, it called on states to prevent the worst possible scenario and not to “build the fence after the sheep have been eaten”.
In this context, China painted a dystopian picture of a future world in which AWS have been developed and deployed. It described “fearless” machines changing the composition of armed forces as human personnel are released from their military posts and equipment is replaced with AWS. “Unmanned war” will be the norm, in which robots will fight each other and humans. Outbreaks of war will be more frequent, warned China, and political pressure at home against war will be diminished because the economic and human costs will be lower. The “line between war and no war will be more and more blurred,” it asserted.