Addressing a meeting of experts at the United Nations in Geneva this week, the International Committee of the Red Cross (ICRC) will urge governments to focus on the issue of human control over the use of force in their deliberations on autonomous weapons.
The four-day meeting, convened by States party to the Convention on Certain Conventional Weapons (CCW), will effectively set the stage for States to begin exploring the fundamental legal, ethical and societal issues raised by autonomous weapon systems. At the opening session tomorrow, Kathleen Lawand, head of the ICRC’s Arms Unit, will present highlights from a new ICRC summary report on the subject and lay out some of the key questions that States need to address.
“The development of autonomous weapon systems has profound implications for the future of warfare,” said Ms Lawand. “The central issue is the potential absence of human control over the critical functions of identifying and attacking targets, including human targets. There is a sense of deep discomfort with the idea of allowing machines to make life-and-death decisions on the battlefield with little or no human involvement.”
The ICRC is calling for new weapons with autonomous features to be subject to a thorough legal review, as States are required to carry out for any new weapon, to ensure they are capable of being used in accordance with international humanitarian law.
The new summary report, published tomorrow, reflects discussions at an ICRC-hosted meeting on autonomous weapons in March this year, which brought together military and civilian experts from government and civil society. Interest in such weapons stems from the potential for increased military capability with reduced risk to the forces deploying them.
However, major concerns persist over whether a fully autonomous weapon could make the complex, context-dependent judgements required by international humanitarian law. For instance, in the heat of battle, would an autonomous weapon be capable of distinguishing between a civilian and a combatant? Would it be capable of cancelling an attack that would have disproportionate incidental effects on civilians? “This represents a monumental programming challenge that may well prove impossible to achieve,” added Ms Lawand.
Weapon systems with autonomous features are already in use today. However, the summary report underlines that as weapon systems become more autonomous, they may also become less predictable, raising doubts about how they could be guaranteed to operate within the law.
The debate on autonomous weapons reaches far beyond legal and technical complexities, raising fundamental questions about the role of humans in taking life-and-death decisions in armed conflict. The decisive question may well be whether such weapons are acceptable under the principles of humanity, and if so, under what conditions.