“Professor Hawking says that artificial intelligence without control may cause the extinction of the human race,” noted a Chinese delegate following a session on ‘mapping autonomy’ at the Convention on Conventional Weapons (CCW) meeting of experts, which took place from 11 to 15 April 2016 at the United Nations in Geneva. The CCW convened its third meeting of experts to continue discussions on questions related to emerging technologies in the area of lethal autonomous weapons systems (LAWS), and I had the privilege of participating.
LAWS are most often described as weapons capable of selecting and attacking targets without human intervention; one of the key questions addressed at the meeting was what exactly this means. According to most of the commentators present, LAWS do not yet exist; however, the possibility of delegating targeting decisions to autonomous weapons raises multidisciplinary questions touching upon moral and ethical, legal, policy, security and technical issues. The meeting addressed all of these, starting with a technical session aimed at mapping autonomy.
Without expressing a position on a ban, the six technical experts on the panel presented a nuanced view of the state of current autonomous weapons technology and the road that lies ahead. The Chinese delegation was one of the first to respond to the panel, and its delegate seemed startled: some of what was said appeared to contradict the conclusions reached by Professor Hawking et al. China had read the Open Letter issued by the Future of Life Institute (FLI) and signed by thousands of artificial intelligence (AI) and robotics researchers, as well as by a number of other endorsers. The Open Letter calls for a ban on offensive autonomous weapons beyond meaningful human control, claiming that such weapons would be feasible within years, not decades. It attracted a good deal of attention, largely because it is signed by a number of well-regarded figures, including Tesla CEO Elon Musk, Apple co-founder Steve Wozniak and Professor Stephen Hawking.
The expert panelists offered some divergent views on the claims and predictions made in the Open Letter. In response, China asked the panelists: “Do you think you are smarter than Professor Hawking?” A number of delegates, academics, NGO members and panelists seemed quite amused by this provocative question. Who dares to disagree with Hawking? Fortunately, some of the experts did. “Isn’t Hawking a physicist, and not an AI expert?” asked one panelist. Another expert confidently said, “Yes, I am smarter than Stephen Hawking.” Why? “Because, like Plato, I know that I do not know.” The exchange is amusing, but also a little troubling: what is the effect of well-regarded figures on the discourse about autonomous weapon systems?