The Future of Warfare: National Positions on the Governance of Lethal Autonomous Weapons Systems

by Benjamin Perrin and Masoud Zamani | Feb 11, 2025


Lethal autonomous weapons systems (LAWS), such as drones and autonomous missile systems, are no longer a theoretical concern. Indeed, they are finding their way onto the battlefield. Amid growing international concern, States have articulated a range of positions on how LAWS should be governed: relying on existing international law (traditionalists), adopting a legal ban on LAWS (prohibitionists), or concluding a new treaty that would ban certain uses and regulate others to “clarify and strengthen existing law” (dualists). In this post, we analyze a flurry of recent international diplomatic efforts to address LAWS, focusing on these three main positions adopted by various States and identifying potential next steps.

Definitional Challenges

Despite progress, the definition of LAWS under international law currently lacks consensus. Simply put, LAWS are weapons systems that, once activated, “select targets and apply force without human intervention.”

Unfortunately, as this post discusses, the lack of a clear definition in States’ national positions has significantly influenced how different States approach this issue. Various conceptions of LAWS include fully autonomous weapon systems without the capability for human control, fully autonomous weapon systems with the capacity for human control, or other forms of autonomous weapons with varying degrees of human control. Adding to these uncertainties are differing understandings of key terms, such as “human control,” “intervention,” and the element of “lethality.” These ambiguities pose significant challenges in achieving the necessary consensus for progress in this area.

The International Committee of the Red Cross (ICRC) definition of LAWS provides a helpful starting point: “Any weapon system with autonomy in its critical functions. That is, a weapon system that can select (i.e., search for, detect, identify, track, select) and attack (i.e., use force against, neutralize, damage, or destroy) targets without human intervention.”

Closely resembling this definition, the Convention on Certain Conventional Weapons’ Group of Governmental Experts on Lethal Autonomous Weapons Systems (CCW Group of Experts) rolling text definition of LAWS states:

1. A lethal autonomous weapon system can be characterized as an integrated combination of one or more weapons and technological components that enable the system to identify and/or select, and engage a target, without intervention by a human user in the execution of these tasks.

2. The above description is without prejudice to any future understanding and the potential modification of this characterization, as well as the possible exclusion of certain types of systems.

National Positions

A synthesis of the positions taken by Member States and Observer States in the UN Secretary-General’s Report on LAWS (2024), the UN General Assembly First Committee, and the CCW Group of Experts working papers (2023 and 2024) reveals three core national positions on LAWS. First, there are traditionalists, or those States that consider that applying existing international law is sufficient to address LAWS. Second, certain States are prohibitionists, in that they are calling for a new treaty categorically prohibiting LAWS. And third, some States take a dualist position, calling for a new treaty prohibiting certain uses of LAWS while allowing others to continue on a regulated basis.

In the following section, after describing each position and identifying some of their leading proponents, we will briefly discuss potential areas of consensus and paths forward, given the divergence of views.

“Traditionalists” – Sufficiency of Existing International Law

Some States acknowledge the challenges posed by LAWS but assert that current international law is adequate to address them. They advocate regulating LAWS through adherence to existing legal frameworks without the need for new binding agreements. This position is championed by countries such as Israel, the United States, and Russia (see UN Secretary General Report, p. 61–63, 94–97, 113–15).

For instance, Russia argues “there are currently no convincing grounds for imposing any new limitations or restrictions on lethal autonomous weapons systems, or for updating or adapting international humanitarian law to address such weapons” (p. 95). Instead, Russia maintains that States should impose limitations and restrictions individually under Article 36 of Additional Protocol I, which says:

In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.

Russia adds that Article 36 does not impose specific obligations on how legal reviews of new weapons should be conducted, leaving the matter to the discretion of States. While Russia’s interpretation reflects an expansive reading of State discretion, others have argued that Article 36 inherently requires a significant restriction on LAWS, specifically the need for human control as part of compliance with International Humanitarian Law (IHL).

The United States emphasizes implementation strategies under the current IHL regime, asserting that IHL “does not prohibit the use of autonomy in weapon systems or the use of a weapon that can select and engage a target” (p. 114). The United States disputes the usefulness of the “meaningful human control” test, arguing “there is not a fixed, one-size-fits-all level of human judgment that should be applied to every context” (p. 115). This suggests that in the absence of clear treaty norms, issues such as human control over autonomous weapons systems vary based on national policies.

On human control requirements, Russia believes:

An important limitation is that humans should have control over the operation of lethal autonomous weapons systems. The control loop for such systems should therefore allow for a human operator or an upper-level control system to intervene to change the operating mode of such systems, including to partially or completely deactivate them. However, the specific forms and methods of human control should be left to the discretion of States, and direct control need not be the only option (p. 95).

Nevertheless, like the United States, Russia rejects the terms “meaningful human control” and “forms and degrees of human involvement” (p. 96). The absence of a specific threshold for human control—particularly in the form of “authorization, supervision, and intervention”—affords States broad discretion to define fully autonomous systems and creates ambiguities in their regulation.

“Prohibitionists” – A New Prohibition Treaty

A different group of States is calling for a total ban on the development, deployment, and use of LAWS, emphasizing ethical concerns and the necessity for human control over the use of force. Prohibitionist States, such as Serbia and Kiribati, advocate for a binding legal instrument to completely ban fully autonomous weapon systems and any autonomous weapons systems that are not susceptible to human control. They consistently highlight the necessity of human or meaningful human control, asserting that the absence of such control should render the use of autonomous weapons impermissible.

The debate over the element of “human control” is central to the regulation of autonomous weapon systems and represents a significant point of contention. Prohibitionist States frame the issue of human control as an ethical imperative. Serbia, for example, expresses concern over the removal of moral and ethical norms, warning that lethal autonomous weapons systems designed to maximize enemy losses may eventually be equipped with nuclear capabilities (p. 97).

This concern resonates particularly with States like Kiribati, which endured 33 nuclear weapons tests between 1957 and 1962. Kiribati underscores the urgency of banning autonomous weapon systems that operate “without humanitarian intervention,” advocating for a binding treaty to prohibit the development, acquisition, or use of any autonomous weapon system “designed or used in such a manner to be triggered by the presence, proximity, or contact of one or more persons, or the target profile of which otherwise represents one or more persons” (p. 68–70).

“Dualists” – A New Treaty to Prohibit Certain LAWS Uses While Regulating Others

To reconcile ethical concerns with pragmatic realities, a dualist approach offers a potential compromise. It seeks to prohibit some LAWS uses (e.g., unpredictable systems and those that explicitly target humans) while regulating others. This is often referred to as a “two-tier approach” and appears to enjoy significant support among States. Among its proponents are Austria, the Netherlands, Switzerland, Spain, Norway, France, Germany, Italy, and Luxembourg.

Such an approach distinguishes between weapon systems capable of being controlled by humans and those without such a capacity. The former category is governed by general rules of international law, while the latter should be outlawed. Among proposed impermissible LAWS uses, the Netherlands describes “inherent unpredictability” as “the ability to change task, assignment or goal, including the applicable rules of engagement that were delegated to them, without human approval” (p. 77). From this perspective, a binding treaty should prohibit such systems not only during operations but also at the stages of design and development.

Tests like “inherent unpredictability” purport to introduce a technical framework that lends greater precision to the concept of human control. However, such tests carry their own theoretical and practical complexities. For example, while the Netherlands frames inherent unpredictability as the capacity of a system to alter its mission parameters without human approval, this does not necessarily equate to a complete lack of human oversight. By contrast, France and Germany focus on ensuring that operators retain the ability to intervene in real-time decision-making processes (p. 48–51). The lack of clarity on what constitutes “human approval” remains a key challenge in distinguishing permissible and impermissible LAWS. Fundamentally, the test of “human approval” remains as ambiguous as “human control,” with no accepted definition in current regulatory discussions.

It is fair to argue that dualists advocate a two-tier approach that bans weapon systems that cannot comply with international law (e.g., systems incapable of distinguishing between combatants and civilians, or making proportionality assessments), while regulating systems that, despite featuring aspects of autonomous decision-making, can still comply with IHL. Italy describes the latter category as follows:

Such systems would include those whose compliance with international humanitarian law could be assessed by taking into account their existing capacities, by applying appropriate testing and training of human operators (to evaluate their reliability, understandability, and predictability), or by limiting the types of targets as well as the duration, geographical scope, and scale of operations (p. 64).

Next Steps

While the range of national positions on LAWS may prevent the adoption of a universal standard, there are areas where widespread consensus is possible. The CCW Group of Experts process, for example, is working on a text that articulates key statements on the regulation of LAWS. Moreover, prohibitionist and dualist advocates have a strong incentive to push for rigorous enforcement and a purposive interpretation of existing international rules and principles as applied to LAWS. Traditionalists and dualists also share common ground in that both envisage prohibiting some LAWS uses while regulating others (although they differ on whether existing law is adequate to achieve this).

Where existing law falls short, there remains for prohibitionist and dualist States at least one common set of concerns that a new treaty should address, that is, deciding which LAWS uses are permissible and which are not. Even if such a prospective treaty may not achieve universal adoption, it would nonetheless reflect the view held by most States: the law should prohibit the most egregious uses of LAWS. There are many precedents for binding international instruments prohibiting certain means and methods of warfare, such as anti-personnel landmines and nuclear weapons, that have achieved widespread, but not universal, adoption. A similar outcome for a LAWS treaty would not be surprising. Regional treaties are also a potential route for prohibitionists and dualists.

A parallel route for developing norms related to LAWS is through customary international law. The accumulation of soft-law approaches, including the articulation of statements such as those being generated through the CCW Group of Experts, would aid this. Other initiatives such as “voluntary norms of responsible behavior,” “voluntary exchange of best practices,” “codes of conduct,” “training and capacity-building,” and political declarations can guide an emerging normative framework.

While discussions on the governance of LAWS have yet to reach an impasse, the rapid pace of technological advancement makes temporary restrictions or non-binding measures a pragmatic option worth serious consideration. For instance, New Zealand has stated it is “supportive of interim measures, such as non-legally binding guidelines, declarations or norms, as steps towards a legally binding instrument and/or as practical implementation tools of that instrument” (p. 80). Interim non-binding measures, such as declarations or temporary agreements banning specific types of LAWS, can lay the groundwork for more comprehensive and binding agreements in the future. On the other hand, there is momentum for a new treaty that would ban some LAWS uses while regulating others, which could be a better focus than interim steps.

Conclusion

In the absence of clear, consistent, and enforceable international standards, the development and deployment of LAWS risk being decided in an ad hoc manner by States using their own discretion to interpret existing general IHL principles. The potential for widespread inconsistencies, heightened instability, and increased risks to civilians is already apparent. Given the growth of artificial intelligence-enabled military technology, the development of a new treaty on LAWS is becoming increasingly urgent.

Concerns about LAWS are real and will only increase as technological developments advance. There is momentum for international legal reform in this area that deserves encouragement and expeditious action with as much international support as possible.

***

Benjamin Perrin is Professor of Law at the University of British Columbia and a member of the UBC Centre for Artificial Intelligence Decision-Making and Action.

Masoud Zamani is a lecturer in International Law and International Relations at the University of British Columbia.

Photo credit: Senior Airman Raya Feltner
