
Fully Autonomous Weapons


What are fully autonomous weapons?

Important questions raised

What efforts have been taken?

Campaign to Stop Killer Robots

Additional resources

What are fully autonomous weapons?

Fully autonomous weapons are weapon systems that can select and fire upon targets on their own, without any human intervention. Such systems would assess the situational context on a battlefield and decide on an attack on the basis of the processed information.

Fully autonomous weapons would act on the basis of “artificial intelligence”. This intelligence is the product of the robot’s programming and calculations; it lacks the features of human intelligence and human judgment that make humans subject and accountable to rules and norms. The use of artificial intelligence in armed conflict poses a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law.

Fully autonomous weapons are distinct from remote-controlled weapon systems such as drones: the latter are piloted remotely by a human, while fully autonomous weapons would operate without human guidance after being programmed. Although weapons with full lethal autonomy have not yet been deployed, precursors with varying degrees of autonomy and lethality are already in use. Several states support and fund research and development on fully autonomous weapons, among them China, Germany, India, Israel, the Republic of Korea, Russia, and the United Kingdom. Robotic systems with varying degrees of autonomy and lethality have already been deployed by the United States, the United Kingdom, Israel, and the Republic of Korea.

Important questions raised by the development of fully autonomous weapons

Ongoing research and development in the field of fully autonomous weapons has reached a critical stage, requiring in-depth reflection before such weapon systems are developed further. The debate on fully autonomous weapons raises the following fundamental ethical and legal questions:

  • Can the decision over life and death be left to a machine?
  • Can fully autonomous weapons function in an ethically “correct” manner?
  • Are machines capable of acting in accordance with international humanitarian law (IHL) and international human rights law (IHRL)?
  • Are these weapon systems able to differentiate between combatants on the one hand and defenceless and/or uninvolved persons on the other?
  • Can such systems evaluate the proportionality of attacks?
  • Who can be held accountable?

These issues call into question whether human abilities, such as the assessment of proportionality and military necessity and the capacity to distinguish between civilians and combatants, can be transferred to a machine.

Protection of civilians. Bearing in mind that most of today’s armed conflicts are intra-state conflicts without clear boundaries between the various armed groups and civilians, it is questionable how a robot could be programmed to avoid civilian casualties when humans themselves struggle to make such distinctions in these conflict settings.

Proportionality. In certain situations, military attacks are not carried out because of the risk of causing disproportionately high civilian harm. It is doubtful that a robotic system would be capable of making such judgments.

Accountability. With an autonomous weapon system, no individual human can be held accountable for the weapon’s actions in an armed conflict. Instead, responsibility is distributed among a larger, possibly unidentifiable group of persons, perhaps including the programmer or the manufacturer of the robot.

Increasing the risk of war. As the UN Special Rapporteur on extrajudicial, summary or arbitrary executions pointed out in his report to the Human Rights Council, the removal of humans from the selection and execution of attacks on targets marks a critical moment in a technology that has been described as a “revolution in modern warfare”. He urged states to think carefully about the implications of such weapon systems, noting that the technology makes states more likely to engage in armed conflict because it reduces the prospect of military casualties. Fully autonomous weapons could thus lower the threshold for war.

Cool calculators or tools of repression? Supporters of fully autonomous weapons argue that these systems would overcome human emotions such as panic, fear, or anger, which lead to misjudgment and poor choices in stressful situations. Opponents of their development point out that this supposed advantage could become a massive risk to people living under repressive regimes: fully autonomous weapons could be used to suppress opponents without fear of protest, conscientious objection, or mutiny within state security forces.

Proliferation. Finally, concerns have been expressed that fully autonomous weapon systems could fall into the hands of unauthorized persons.

What efforts have been taken to address the development of fully autonomous weapons?

During the Human Rights Council session in May 2013, 24 states attended the presentation of the report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, and discussed the issue of autonomous weapons. Participating states expressed concerns regarding the use of fully autonomous weapons and indicated an interest in continuing discussions.

Pakistan, Morocco, Mexico, Argentina (on behalf of GRULAC), Cuba, Sierra Leone, Switzerland, Algeria, and Egypt raised deep concerns about the future implications of such weapons and argued that they should be examined from the perspectives of both human rights and international humanitarian law. The European Union, several of its member states, the United States, and Brazil seemed more eager to frame the issue in arms control terms. Brazil and France specifically proposed the Convention on Conventional Weapons (CCW) as an appropriate body to deal with autonomous weapons; others suggested the First Committee. The UK said that it considers existing rules on fully autonomous weapons sufficient and that it does not support an international ban, although the UK government later clarified its position, affirming that fully autonomous weapons will not meet the requirements of international humanitarian law.

At the meeting of states parties to the Convention on Certain Conventional Weapons (CCW) in November 2013, governments decided to convene a four-day meeting of experts on the topic of fully autonomous weapons. The mandate is included in paragraph 32 of the Final Report.

Campaign to Stop Killer Robots

In April 2013, a group of non-governmental organizations launched the Campaign to Stop Killer Robots in London. The campaign seeks to establish a coordinated civil society call for a ban on the development of fully autonomous weapon systems and to address the challenges these weapons pose to civilians and to international law. It builds on previous experience from efforts to ban landmines, cluster munitions, and blinding lasers.

The campaign emphasizes the ethical implications of empowering machines to decide over the life and death of human beings. It urges states to negotiate a treaty that pre-emptively prohibits the development, production, and use of fully autonomous weapons, and stresses that this must be treated as an urgent concern, especially from a humanitarian perspective.

In addition to prohibition through an international treaty, the campaign also calls for prohibition at the national level through national laws and other policy measures.

Additional resources:

Reaching Critical Will (2013). CCW adopts mandate to discuss killer robots, 15 November 2013: http://www.reachingcriticalwill.org/news/latest-news/8583-ccw-adopts-mandate-to-discuss-killer-robots.

Article 36 (2013). UK says killer robots will not meet requirements of international law, 18 June 2013: www.article36.org/weapons-review/uk-says-killer-robots-will-not-meet-requirements-of-international-law/.

Campaign to Stop Killer Robots (2013). Urgent Action Needed to Ban Fully Autonomous Weapons: Non-governmental organizations convene to launch Campaign to Stop Killer Robots, 23 April 2013: www.stopkillerrobots.org/wp-content/uploads/2013/04/KRC_LaunchStatement_23Apr2013.pdf.

Docherty, Bonnie (2012). The Trouble with Killer Robots: Why we need to ban fully autonomous weapons systems, before it's too late, in: Foreign Policy, 19 November 2012: www.foreignpolicy.com/articles/2012/11/19/the_trouble_with_killer_robots?page=0,0.

Heyns, Christof (2013). Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, 9 April 2013: www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

Human Rights Watch (2012). Ban ‘Killer Robots’ Before It’s Too Late: Fully autonomous weapons would increase danger to civilians, 19 November 2012: www.hrw.org/news/2012/11/19/ban-killer-robots-it-s-too-late.

Human Rights Watch (2014). Shaking the Foundations: The Human Rights Implications of Killer Robots, 12 May 2014: https://www.hrw.org/node/125251.

Krishnan, Armin (2009). Killer Robots: Legality and Ethicality of Autonomous Weapons. Ashgate Publishing.

Pax (2014). Deadly Decisions - 8 objections to killer robots, February 2014: http://www.paxvoorvrede.nl/media/files/deadlydecisionsweb.pdf.

UNIDIR (2014). Framing Discussions on the Weaponization of Increasingly Autonomous Technologies, April 2014: http://www.unidir.org/files/publications/pdfs/framing-discussions-on-the-weaponization-of-increasingly-autonomous-technologies-en-606.pdf.

Reaching Critical Will (2013). Growing momentum to prevent killer robots, 30 May 2013: www.reachingcriticalwill.org/news/latest-news/7930-growing-momentum-to-prevent-killer-robots.