The following is an abstract of a paper contributed by Mr. Jeremy England, head of the International Committee of the Red Cross (ICRC) regional delegation in New Delhi, to the ORF Issue Brief for December 2016:

There is deep discomfort with the idea of a weapons system that surrenders life-and-death decisions to machines. The ICRC has been playing an active part in discussions on Autonomous Weapons Systems (AWS). It argues that these debates should focus on determining the type and degree of human control required to ensure that the use of AWS is ethically acceptable and compliant with International Humanitarian Law (IHL).

Policymakers must consider variables such as predictability, control of escalation, command responsibility, and legal accountability. Technological evolution will only continue to accelerate, reshaping the way war is waged, and the imperative is clear: the international community must ensure that AWS do not put more civilian lives at risk. Stakeholders in Asia need to weigh in on this debate and raise the questions explored in this paper.

(The key elements of this essay were first shared by Mr. England at the Observer Research Foundation CyFy Conference Panel dedicated to ‘Sentient Technologies, Cyber Weapons and Autonomous Platforms’, held in New Delhi on September 29, 2016.)