UN GGE on LAWS: Day 2 (morning)
19 Apr 2019 02:00h
Event report
This session focused on item 5(a) An exploration of the potential challenges posed by emerging technologies in the area of LAWS to international humanitarian law (IHL). The discussion was guided by the following questions:
- Does autonomy in the critical functions of weapons systems challenge the ability of states or parties to a conflict, commanders, and individual combatants to apply IHL principles on the conduct of hostilities (distinction, proportionality, precautions) in carrying out attacks in armed conflict?
- Does autonomy in the critical functions of weapons systems challenge the maintenance of combatant and commander responsibility for decisions to use force?
- What is the responsibility of states or parties to a conflict, commanders, and individual combatants in decisions to use force involving autonomous weapons systems, in light of the principles of international law derived from established custom, from the principles of humanity, and from the dictates of public conscience (Martens Clause)?
- How can legal reviews of weapons with autonomous functions contribute to compliance with IHL? What are past or potential challenges in conducting reviews of weapons with autonomy in their critical functions, and how can these challenges be addressed?
The session further discussed the application of existing legal frameworks to the deployment of LAWS, with a focus on the implementation of Art. 36 of Additional Protocol I to the Geneva Conventions, 1977 (API); on the possible need to negotiate new legal frameworks to meet the challenges posed by autonomous technologies; and on the recurring question of meaningful human control.
Additional points were raised arguing that responsibility for the use of weapon systems exists regardless of the degree of autonomy of the weapon being used. From the floor, the USA explained its working paper CCW/GGE.1/2019/WP.5 – Implementing International Humanitarian Law in the Use of Autonomy in Weapon Systems: existing IHL frameworks apply to the use of LAWS. Nonetheless, the delegation acknowledged that emerging technologies in the area of LAWS could strengthen the implementation of IHL by reducing the risk of civilian casualties and by facilitating the investigation or reporting of incidents involving potential violations. Another delegation sought to push the discussion a step further, arguing that the existing frameworks of international law and IHL should be complemented by criminal law as well. It also recalled the importance of trust and the role of confidence building measures (CBMs). Another intervention from the floor complemented this view by explaining that while IHL represents a good basis for discussions on the use of LAWS, additional legal instruments need to be developed to address the specificities of the new technology involved. The Non-Aligned Movement and Other States Parties to the CCW proposed regulating such systems through a legally binding instrument, in line with the working paper CCW/GGE.1/2018/WP.1 – General Principles on Lethal Autonomous Weapons Systems, submitted in April 2018.
Moreover, certain features of LAWS, namely lethality, autonomy, and machine learning, were stressed again. On the last point, the chair underlined the risks involved in the use of training datasets that have not been reviewed, which pose the greatest risks.
The delegations recalled the achievements of previous sessions in agreeing that international humanitarian law (IHL) applies to LAWS, with a particular focus on how Art. 36 API requires legal reviews of new weapons and systems, reviews conducted even by countries that have not ratified the Protocol. Expanding on the point, delegations recalled the need to establish mechanisms for sharing information and best practices to address the challenges posed by LAWS. Furthermore, a delegation proposed the creation of a compendium of best practices on the use of LAWS in compliance with IHL. A further proposal was put forth with the aim of strengthening Art. 36 API: the establishment of an annual reporting mechanism on the development of LAWS and the creation of a checklist, toolkit, or guiding principles to refer to. Nonetheless, a delegation raised concerns about such legal reviews: algorithms may produce different results in different environments, and there is therefore a crucial need to test them in realistic contexts and to rely on certification. Indeed, the self-learning capabilities of these systems pose important challenges for system reviewers. Finally, weapon systems always have a margin of error for which only the human being deploying the weapon can be held accountable. Another delegation reiterated that IHL is highly context-dependent, and that the critical functions of a system should therefore be assessed case by case. Regarding the wording 'critical functions', an intervention from the floor contested its abstract use, arguing that it might create misunderstandings over its meaning.
In order to ensure the full applicability of IHL, delegations stressed the crucial importance of human control as the only variable able to ensure respect for the principles of distinction, proportionality, and precaution, as well as accountability and responsibility. Following this line, one delegation said that it is currently impossible for a machine to replicate the human experience and capacity for understanding a conflict situation, while another argued that LAWS lack the ability to make proportionality decisions and to respect and comply with ethical values. The indispensable necessity of meaningful human control in the use and development of LAWS was also stressed. It was explained that whether in a narrow human-in-the-loop situation (in which the human action relates to the deployment of a single system) or a wider human-in-the-loop situation (in which the human actor controls a broader range of systems), it must always remain possible to control, modify, or abort the deployment of the weapon. To ensure this, reviews of new weapons need to satisfy high standards of predictability and reliability. On this last point, delegations stressed the importance of a multi-dimensional approach, as well as the need to develop AI systems with a holistic approach.
A last intervention underlined the qualitative measurement and judgement required to comply with the principles of proportionality, distinction, and precaution at the core of IHL, which can be assured only by human commanders and combatants. It also stressed the need for control by design in the development of new weapon systems, and for control in the use of those systems, as most important for ensuring that the conduct of hostilities complies with IHL. The intervention reiterated the necessity of always having human supervision and the ability to intervene; of predictability and reliability features embedded in the systems; and of the possibility of always imposing operational constraints.
Related event
Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS)
25 Mar 2019 14:30h - 29 Mar 2019 14:30h
Geneva, Switzerland