Group of governmental experts meets to discuss autonomous weapons in Geneva
Today in Geneva, the first Group of Governmental Experts on lethal autonomous weapon systems began discussions. Following three rounds of informal discussions under the auspices of the Convention on Certain Conventional Weapons (CCW), this more formal mode of discussion is geared toward concrete outcomes and recommendations. However, civil society is concerned about the meeting's lack of focus and clear goals.
The Campaign to Stop Killer Robots, of which WILPF is a steering group member, urges states to swiftly determine how and where to draw the boundaries of future autonomy in weapon systems by committing to negotiate a ban on lethal autonomous weapons systems. The CCW is a framework treaty that prohibits or restricts certain weapons and its 1995 protocol on blinding lasers is an example of a weapon being preemptively banned before it was acquired or used.
As the Campaign to Stop Killer Robots points out in its press release, this is the first-ever meeting of the CCW Group of Governmental Experts, but it marks the fourth time since 2014 that states have met at the CCW to consider lethal autonomous weapons systems. The Group of Governmental Experts was scheduled to meet twice in 2017, but the first week of talks earlier this year was postponed and then cancelled due to the implementation of a complex UN financial accounting system and the failure of certain states to pay their assessed CCW dues.
A “food for thought” paper prepared by the chair for the CCW meeting raises several technological and legal/ethical issues that do not directly relate to lethal autonomous weapons systems. The campaign believes that other mechanisms should be pursued to consider broader questions about artificial intelligence and its potential impact on society. Human rights considerations are missing from the provisional programme of work, and there appears to be insufficient time to consider proliferation and security concerns, as well as the human control needed in future weapons systems.
Several autonomous weapons systems with varying degrees of human control are currently in use by high-tech militaries, including CCW states the US, China, Israel, South Korea, Russia, and the UK. The concern is that low-cost sensors and advances in artificial intelligence are making it increasingly practical to design weapons systems that would target and attack without any meaningful human control. If the trend towards autonomy continues, humans will begin to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all.
The Campaign to Stop Killer Robots fundamentally objects to permitting machines to take a human life on the battlefield or in policing, border control, and other circumstances. A total of 19 countries, including the Holy See, now support the call to ban lethal autonomous weapons systems, as do more than two dozen Nobel Peace Laureates.
Many members of the artificial intelligence (AI) and robotics community have endorsed the call to ban lethal autonomous weapons systems. More than 137 founders and directors of AI and robotics companies from 28 countries endorsed an open letter in August 2017 demanding stronger UN action to “protect all of us” from the dangers posed by lethal autonomous weapons systems. This month, more than 120 Australian AI and robotics experts urged Prime Minister Malcolm Turnbull to take a strong stand against lethal autonomous weapons systems, while more than 135 Canadian AI and robotics experts appealed to Prime Minister Justin Trudeau to ban weapons systems that remove meaningful human control in the deployment of lethal force.
Reaching Critical Will resources
- CCW Report - subscribe to the "conventional weapons / emerging technologies of violence" mailing list to receive the report daily during the conference