
A “Compliance-Based” Approach to Autonomous Weapon Systems

Published on December 1, 2017

A Group of Governmental Experts (GGE) on the topic of Lethal Autonomous Weapons Systems (LAWS) concluded its first meeting in Geneva on 17 November 2017. The meeting was held under the auspices of the Convention on Certain Conventional Weapons (CCW) and built on three informal meetings of experts held between 2014 and 2016 (for reports of those meetings, see here). In December 2016, the Fifth Review Conference of the High Contracting Parties of the CCW had tasked the GGE “to explore and agree on possible recommendations on options related to emerging technologies in the area of LAWS” (see Decision 1 here and the agreed recommendations contained in this report).

At the heart of the debate is the question of how States should respond to the emergence of such weapons. While some highlight the legal, ethical and moral concerns of delegating life-and-death decisions to machines and advocate a preventive prohibition of autonomous weapon systems, others point to potential benefits for the way wars will be fought in the future and deem any policy options, including regulation, premature.

As is often the case in such multilateral discussions, it is hard to make progress and to get all States to agree on a common approach. The topic of autonomous weapon systems is no different. Indeed, it is perhaps particularly difficult because we do not yet fully understand what robotics and artificial intelligence truly hold for the future of warfare, and for humanity in general. As an initial step, the GGE in its first session affirmed that international humanitarian law (IHL) applies to all weapons, including the potential development and use of autonomous weapon systems, and that responsibility for their deployment remains with States (see report here). This is a welcome step but obviously cannot be understood to exhaust the topic.

In an effort to generate momentum and identify common denominators, Switzerland presented a working paper at the beginning of the GGE, arguing that ensuring compliance with international law, notably IHL, could and should be common ground among States and could form a constructive basis for further work. Accordingly, compliance should, at least as one element, be central to the GGE’s discussions of autonomous weapon systems and should figure prominently in its report as well as in the way forward. In what follows, we recapitulate the requirements for compliance with IHL and, on that basis, identify elements of a “compliance-based” approach aimed at advancing the debate within the CCW in an inclusive and constructive manner.

 

“Are you smarter than Professor Hawking?” Higher Forces and Gut-Feelings in the Debate on Lethal Autonomous Weapons Systems

Published on April 27, 2016

“Professor Hawking says that artificial intelligence without control may cause the extinction of the human race”, noted a Chinese delegate following a session on ‘mapping autonomy’ at the Convention on Certain Conventional Weapons (CCW) meeting of experts, which took place from 11-15 April 2016 at the United Nations in Geneva. The CCW convened its third meeting of experts to continue discussions on questions related to emerging technologies in the area of lethal autonomous weapons systems (LAWS), and I had the privilege of participating.

LAWS are most often described as weapons that are capable of selecting and attacking targets without human intervention; one of the key questions addressed at the meeting was what exactly this means. According to most of the commentators present, LAWS do not yet exist; however, the possibility of using autonomous weapons in targeting decisions raises multidisciplinary questions that touch upon moral and ethical, legal, policy, security and technical issues. The meeting addressed all of these, starting with a technical session aimed at mapping autonomy.

Without expressing a position on a ban, the six technical experts on the panel presented a nuanced view of the state of current autonomous weapons technology and the road ahead. The Chinese delegation was one of the first to respond to the panel, and the delegate seemed startled; some of what was said appeared to contradict the conclusions reached by Professor Hawking et al. China referred to the Open Letter issued by the Future of Life Institute (FLI) and signed by thousands of artificial intelligence (AI) and robotics researchers, as well as by a number of other endorsers, including the well-known Professor Stephen Hawking. The Open Letter calls for a ban on offensive autonomous weapons beyond meaningful human control, claiming that such weapons would be feasible within years, not decades. The letter attracted a good deal of attention, largely because it was signed by a number of well-regarded figures, including Tesla CEO Elon Musk, Apple co-founder Steve Wozniak and, as previously mentioned, Professor Stephen Hawking.

The expert panelists offered divergent views on the claims and predictions made in the Open Letter. In response, China asked the panelists, “do you think you are smarter than Professor Hawking?” A number of delegates, academics, NGO members and panelists seemed quite amused by the provocative question. Who dares to disagree with Hawking? Fortunately, some of the experts did. “Isn’t Hawking a physicist, and not an AI expert?”, asked one panelist. Another expert confidently said, “Yes, I am smarter than Stephen Hawking.” Why? “Because, like Plato, I know that I do not know.” The exchange is amusing, but also a little troubling: what is the effect of well-regarded figures on the discourse about autonomous weapon systems?

Filed under: Arms Control, EJIL Analysis