Autonomous weapon systems (AWS) raise profound legal, ethical and moral concerns. Scholars have asked, for example, whether AWS can comply with international humanitarian law (IHL); whether their use will lower the threshold for the use of force and undermine jus ad bellum rules; and whether their deployment will create an accountability gap in violation of victims' right to a remedy. While there is no agreed definition of AWS, the United Kingdom House of Lords' recent report carries definitions that generally describe AWS as robots that, once activated, are able to make targeting decisions without further human intervention.
At the recent United Nations Group of Governmental Experts (GGE) meeting (9-13 April) on Lethal Autonomous Weapon Systems, States reiterated the need to maintain human control over AWS. Notwithstanding this general consensus, however, there is no agreement on the nature of that human control or on how it should be defined.
Issues surrounding the concept of human control
The 2018 GGE meeting brought to the fore a number of questions on how human control should be defined, and States submitted a range of ideas and suggestions. Organisations like the International Committee of the Red Cross noted both legal and ethical reasons why human control must be maintained. Likewise, the International Panel on the Regulation of Autonomous Weapons discussed military and philosophical perspectives on the notion of human control.
Now that various disciplines – military, law, ethics, religion, philosophy – have standards that are relevant to the notion of human control over AWS, the paramount question is which standard(s) should determine an acceptable level of human control, and why. While States and scholars may cite innovative ideas and standards upon which to define the concept of human control, it is essential to distinguish between standards that are merely relevant and those that are obligatory or legally binding upon States. The latter ought to serve as the yardstick.