Dr Heather Roff discusses the role of autonomous weapons systems within the international community. She provides a theoretical framework for defining and classifying these systems, examining the diplomatic and moral concerns that they pose.
For the past three years, the international community has convened a series of informal meetings of experts under the auspices of the United Nations' Convention on Certain Conventional Weapons (CCW) to consider whether or not to preemptively ban lethal autonomous weapons systems under an additional protocol to the Convention. The debate has circled the same set of concerns: what exactly lethal autonomous weapons systems (AWS) are, and whether it is incumbent upon states to ban them before they are developed. Without a definition, states argue, they cannot know what exactly they are supposed to ban. Yet after three years of expert testimony, there is no agreement on any meaningful definition. Diplomatic considerations are pressing, but Dr Heather Roff believes that this confusion stems from an antecedent and more profound concern, one inherently tied to the question of defining what constitutes an AWS. In plain terms, it is a concern with the authorization to make war and the subsequent delegation of that authority. Until now, humans have been the sole agents authorized to make and to wage war, and questions of authorization and war have never been technologically dependent; they have been moral considerations rather than empirical ones. She attempts to provide a theoretical framework for defining and classifying autonomous weapons systems. In so doing, she argues that the moral quandary over autonomous weapons has its roots in concerns over the delegation of a (moral) faculty: the authority to wage war.