Thank you, Chair.
Human Rights Watch welcomes the Chair's background paper from last week and the latest revision of the rolling text dated May 12. We also welcome the opportunity to discuss these documents in an inclusive forum.
Human Rights Watch, along with Stop Killer Robots, which we co-founded, has consistently called for meaningful human control over the use of force. We therefore appreciate the background paper's examination of the related concept of "context-appropriate human judgement and control," and the questions posed for today's meeting. We note that these questions were first posed to participants at the GGE meeting on lethal autonomous weapons systems (LAWS) in March 2025.
The first two questions ask whether direct control is necessary, or whether indirect control, including control exercised before the selection and engagement of a target, can suffice in some situations.
We agree that absolute human control is not required, but it is difficult to answer questions about "direct" and "indirect" control without a clearer understanding of what those terms mean. The background paper implies that "kill switches" and pre-set parameters represent direct and indirect forms of human control, respectively, but it is unclear whether they are the only examples.
Regardless, the meaningfulness of control provided by a kill switch depends on how much time the operator has to intervene. As the paper notes, the value of pre-set parameters depends on their quality.
The parameters' effectiveness also depends on how far in advance they are set. To be meaningful, human control should be exercised at the time of attack, not long before. For example, to comply with the proportionality test, the person using the autonomous weapon system needs to determine at the time of a specific attack whether expected civilian harm is excessive in relation to the anticipated military advantage in a complex, rapidly changing environment.
The GGE Chair also asked participants about the contextual factors that should be taken into account when determining the appropriate level of human judgment and control. Certain contexts, such as populated areas, heighten the risks of using weapons, including autonomous ones.
Nevertheless, the argument that autonomous weapons systems could be safely used away from such areas still raises concerns. In particular, experience shows that once a certain weapon system exists there is a likelihood that it will proliferate or be misused.
Proponents of cluster munitions frequently contended it would be safe to use them against tanks in a desert. Once cluster munitions were in their arsenals, however, states used these indiscriminate weapons repeatedly in cities and towns. In every conflict zone where I have documented cluster munitions, and there have been many, I have seen first-hand the effects of their use in populated areas.
While context may be a factor to consider, it seems that the level of human judgment and control should determine whether a context is appropriate, not the other way around.
Finally, I wish to raise three additional points, each of which is relevant to both the Chair's background paper and the revised rolling text.
First, neither document mentions that autonomous weapons systems that target people pose serious legal and ethical risks. While such systems may not have been within the background paper's scope, the rolling text, which contains numerous other restrictions, fails to include a prohibition on antipersonnel autonomous weapons systems despite calls from many states, the ICRC, and civil society.
Second, the background paper discusses human control only as it applies to compliance with international humanitarian law. It does not address other relevant bodies of law, notably international human rights law, nor does it consider ethical, humanitarian, and security concerns. The broad participation and substantive engagement in this month's UN General Assembly meeting in New York, where those issues were raised, show that there is strong interest in addressing them. The rolling text takes a similarly narrow approach, focusing on international humanitarian law.
Finally, on a related note, the background paper and rolling text address the use of autonomous weapons systems in armed conflict, but such systems will likely also be used in law enforcement operations and other contexts. A legally binding instrument should prohibit and regulate autonomous weapons systems in all circumstances.
Thank you.