Many will agree that the tremendous advancement of technology is intrinsically linked to the evolving means and methods of warfare. Indeed, throughout the history of humankind, new technology has been used to create new munitions. The development of autonomy in weapons has long been an incremental process, one that today is unfolding at breakneck speed.
Kathleen Lawand, Head of the Arms Unit at the ICRC, who was in New Delhi recently, points out that the development of new technology does not occur in a legal vacuum. All new weapons must be capable of being used, and must in fact be used, in accordance with International Humanitarian Law. This, she affirmed, makes it important for all States to carry out legal reviews of new weapons. It should be noted that cyber warfare and the use of robotics (unmanned automated systems as well as autonomous systems) can cause severe and irreparable damage to civilians and civilian infrastructure.
Lawand notes that several questions arise in this scenario, including what degree of human control must be maintained to ensure that ethical standards are upheld. What would happen in the case of self-learning machines that go beyond what they were initially programmed to do? Can an autonomous system be programmed to distinguish between military objectives and civilian objects, especially in situations where the two are intermingled, such as urban environments?
Lawand further explains that accountability in cases of violations becomes a complex issue, and it is for this reason that the ICRC continues to facilitate dialogue among States in order to arrive at interpretations that protect civilians.
More from Kathleen Lawand’s India visit:
Carnegie India organized a roundtable in New Delhi, led by Lawand, on New Technologies, Warfare, and International Humanitarian Law. The discussion was attended by senior government officials and representatives of the armed forces and civil society.