I would like to thank the Austrian government, and His Excellency Minister Schallenberg in particular, for organizing this conference at such a critical juncture. It is both a privilege and a profound responsibility to address you all today, as we come together to confront one of the most pressing humanitarian challenges looming over the future of warfare.

We cannot ignore the allegations of the use of autonomous weapons in current conflicts. These reports are difficult to verify definitively, but they do indicate disturbing trends towards increasingly complex technology and expanding operational parameters.

This is why it is so important to act, and to act swiftly, to address the risks posed by these weapon systems.

These risks cross borders. The technology to develop these kinds of weapons exists and is accessible to state and non-state actors. The potential for proliferation is enormous. So too is the potential for diversion, including to those who might use these weapons to violate international humanitarian law (IHL). Addressing the threat of autonomous weapons is therefore relevant for all states.

Turning to our concerns: fundamentally, the unconstrained use of autonomous weapons risks a loss of control over the use of force.

Warfare is messy, complex and always horrible. Autonomous weapons add a layer of unpredictability, and with that, new dangers for civilians and combatants. These weapons fire themselves, based on data received through their sensors.

Legally, this can make it difficult for the user to comply with IHL. How can a user, for example, determine that an attack will not cause excessive incidental civilian harm, if they do not know exactly what, where or when an autonomous weapon will strike?

And ethically, ceding life-and-death decisions to machine sensors and software is a dehumanizing process that undermines our shared humanity.

Of course, autonomous weapon systems do not exist in a lawless space. We have the Geneva Conventions, universally ratified, and their Additional Protocols. All states are already obliged to ensure that their use of these kinds of weapons complies with IHL.

However, IHL is not static. States have recognised the need to progressively develop IHL, and in just the last three decades we have seen the adoption of seven new IHL treaties.

These treaties have been effective and continue to save lives, even if they are not ratified by all states. They have influenced government policies, and the choices of the defence industry sector worldwide.

As far as the ICRC is concerned, we now urgently need an effective legal instrument on autonomous weapon systems, to preserve control over the use of force in armed conflict.

In line with my joint appeal last year with the UN Secretary-General, I call on governments to negotiate, by 2026, a legally binding instrument with explicit prohibitions and restrictions.

  • In particular, autonomous weapons that target humans directly must be prohibited. The ethical risks are too stark, and the potential for harm to protected persons is too high.
  • It is also imperative to explicitly prohibit unpredictable autonomous weapons. We cannot allow force in armed conflict to be controlled by opaque machine learning algorithms.

And then, for all other autonomous weapon systems, clear restrictions must apply to their design and use.

With this conference, and the UN Secretary-General's consultation, we have a clear moment of opportunity. The ICRC was pleased to lodge its submission to the Secretary-General last month. I encourage states, as well as civil society, the scientific community, industry, academia and others, to submit their views too.

Let us seize this moment together, reaffirm our shared humanity and recommit to the protection of the most vulnerable, because that is the spirit of international humanitarian law. Bold and decisive action is needed now: the urgent negotiation and adoption of a legally binding instrument with clear prohibitions and restrictions on autonomous weapon systems.