Hunter 2-S Swarming Attack Drones: Legal & Ethical Dimensions

by Hitoshi Nasu | Mar 31, 2022

Hunter 2-S Drone

Halcon, a defense company based in the United Arab Emirates, unveiled its aerial fleet of swarming drones—Hunter 2-S—at the Unmanned System Exhibition and Conference in Abu Dhabi on February 21, 2022. The small-sized Hunter 2-S modular launching system is the latest in the series of unmanned aerial platforms leveraging advanced artificial intelligence technology. This post addresses legal and ethical considerations that are potentially relevant to the development and deployment of the Hunter 2-S, based on the limited information available to the public about this system.

The Hunter 2-S and Autonomous Swarming

The Hunter 2-S is a fixed-wing unmanned aerial vehicle, 1.25 m long with a wingspan of 1.44 m. It is designed to operate as part of a collective of drones that fly in formation to perform a coordinated targeting mission. According to Saeed Al Mansoori, the chief executive of Halcon, these drones are capable of communicating with each other and “[o]nce the target is identified, a decision is made among the swarm, and based on the target size, shape, and category, they decide how many drones are needed to destroy the target … and then they start diving towards it.”
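
Halcon has not published the swarm’s actual decision logic. Purely as an illustration, the quoted description is consistent with a simple fuse-and-allocate scheme along the following lines, in which the drones pool their individual classifications of the target and commit a number of airframes based on the consensus category. Every name, category, and figure in this sketch is a hypothetical assumption, not a description of the actual system.

```python
# Illustrative sketch only: majority-vote fusion of target classifications
# followed by a per-category allocation. All categories and counts are
# hypothetical; Halcon has not disclosed the actual algorithm.
from collections import Counter

# Hypothetical mapping from target category to drones committed to the attack.
DRONES_PER_CATEGORY = {"light_vehicle": 1, "parked_aircraft": 2, "armored_vehicle": 3}

def swarm_engagement_decision(classifications: list[str]) -> tuple[str, int]:
    """Fuse each drone's classification of the shared target by majority
    vote, then look up how many drones the consensus category calls for."""
    category, _ = Counter(classifications).most_common(1)[0]
    return category, DRONES_PER_CATEGORY.get(category, 1)

# Five drones report independent classifications of the same target.
category, n_committed = swarm_engagement_decision(
    ["armored_vehicle", "armored_vehicle", "light_vehicle",
     "armored_vehicle", "armored_vehicle"])
print(category, n_committed)  # armored_vehicle 3
```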

Unlike the Turkish-made Alpagu and Kargu-2, the Hunter 2-S appears to be built as an anti-materiel weapon, designed to destroy or cause damage to enemy military aircraft and armored vehicles. It is a type of “homing munition,” where autonomous functions are used to increase the accuracy and reliability of executing a strike against a target or target group selected by a human operator.

It is also distinct from existing classes of unmanned aerial vehicles and precision-guided munitions in that artificial intelligence (AI) is used to coordinate the swarming operation of these drones. The AI enables autonomous determinations and adjustments of each drone’s flight profile. The loitering capability of these munitions allows selective targeting and, potentially, permits attacks to be aborted later in a mission than with other conventional weapon systems.

The swarming of autonomous drones is increasingly recognized as a key means of penetrating highly defended and contested environments. Attritable, low-cost unmanned systems such as the Hunter 2-S are expected to perform a variety of tasks, including target attack with limited human intervention, especially in situations where satellite navigation and communication signals are disrupted by an adversary.

The Hunter 2-S and the Law of Weaponry

The legality of the Hunter 2-S, like that of any other weapon system, must be assessed against the relevant international law obligations that apply to States as they study, develop, acquire, or adopt it. These include the prohibition of inherently indiscriminate weapons. As a munition designed for selective targeting, the Hunter 2-S is unlikely to be prohibited on this basis as long as it is capable of operating with a sufficient degree of accuracy and reliability.

Because the Hunter 2-S is an anti-materiel weapon, weapons law rules applicable to anti-personnel weapons do not control the legal assessment. For example, it is prohibited under international law to employ a weapon that is calculated to cause superfluous injury or unnecessary suffering. This rule applies to weapons that are designed to increase the injury or suffering of the “persons” attacked beyond what is justified by military necessity (U.S. DoD Law of War Manual, § 6.6.2). As such, the unnecessary suffering rule has no application in assessing the legality of anti-materiel weapons, such as the Hunter 2-S, that are designed to cause damage to objects rather than persons.

The use of this weapon system may nonetheless be prohibited if it is equipped with ammunition that the State has agreed to ban, such as chemical or biological weapons and laser weapons that are specifically designed, as a combat function, to cause permanent blindness to unenhanced vision. However, there are no indications that the Hunter 2-S will be designed for such use. Rather, these drones appear to be kinetic-kill vehicles.

The Hunter 2-S and the Law of Targeting

The deployment of the Hunter 2-S system must also comply with various obligations that apply to the conduct of hostilities, such as the principle of discrimination, rules concerning proportionality, and the duty to undertake feasible precautions. This does not require the autonomous system itself to understand and apply those rules as it operates; rather, it means that commanders are required to ensure that the weapon platform operates within the bounds of targeting law.

The system’s ability to operate in compliance with targeting law hinges upon a number of operational, technological, and environmental factors.

Autonomous Combat Identification and the Principle of Discrimination

Reliance on sensors and image recognition algorithms has its limits as a means of distinguishing legitimate military targets from legally protected civilians and civilian objects. When the target is an object, the attack must be directed at objects making an effective contribution to military action by virtue of their nature, location, purpose, or use. The visual features of an object alone do not always offer a reliable basis for this legal assessment. A vehicle ordinarily used by civilians may well be classified as a legitimate military objective when, for example, it is actually used as a means of carrying an explosive to detonate at a security checkpoint.

This technological problem can be circumvented if, as reported in the media, the use of image processing is limited to the combat identification of a target that has been selected by a human operator. The selection and acquisition of military targets is an elaborate process involving the evaluation of mission objectives, intelligence analysis, target systems, and the operational environment, among other elements (JP 3-60 Joint Targeting, Chapter II). It is an intelligence-driven, human decision-making process. Artificial intelligence may significantly improve the speed and accuracy of intelligence collection and analysis, as demonstrated during the May 2021 military action in Gaza (Gaza Conflict 2021 Assessment, p. 31), yet the ultimate decision rests with human commanders.
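
If image processing is indeed confined to confirming a target already designated by a human operator, the engagement decision reduces to a match test between the live sensor picture and the designated target’s signature. The following sketch illustrates one way such a gate could work, using cosine similarity over feature vectors; the feature representation and the 0.9 threshold are assumptions, not details of the actual system.

```python
# Sketch of combat identification used only to confirm a human-selected
# target. The feature-vector representation and the threshold are
# hypothetical stand-ins for the system's actual recognition pipeline.
import math

MATCH_THRESHOLD = 0.9  # assumed minimum confidence before engagement

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def cleared_to_engage(live_features: list[float],
                      designated_features: list[float]) -> bool:
    """Engage only when the live sensor image matches the signature of the
    target designated by the human operator; otherwise hold or abort."""
    return cosine_similarity(live_features, designated_features) >= MATCH_THRESHOLD
```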

The decision to employ swarm drones, such as the Hunter 2-S, is integral to weaponeering—the process of determining the specific means required to create a desired effect on a given target. This decision involves an assessment of target characteristics (such as size and hardness), probability of damage calculations, and delivery parameters (such as altitudes, speeds, and dive angles), based on scientifically valid data (JP 3-60 Joint Targeting, II-15). The deployment of swarm drones like the Hunter 2-S would effectively mean delegating a portion of this process to the attack execution stage.
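
The “how many drones” question quoted earlier is, in weaponeering terms, a probability-of-damage calculation. Under the simplifying, and here purely illustrative, assumption of independent and identical single-munition kill probabilities, the cumulative probability of damage from n drones is 1 − (1 − p)^n, so the required number of drones follows directly:

```python
# Illustrative probability-of-damage arithmetic; assumes independent,
# identical single-drone kill probabilities, with purely notional numbers.
import math

def drones_required(p_single: float, p_desired: float) -> int:
    """Smallest n such that 1 - (1 - p_single)**n >= p_desired."""
    return math.ceil(math.log(1.0 - p_desired) / math.log(1.0 - p_single))

# e.g., a notional 0.6 single-drone kill probability and a 0.95 goal:
print(drones_required(0.6, 0.95))  # 4, since 1 - 0.4**4 = 0.9744
```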

Delegation of this aspect of targeting to an autonomous system does not necessarily offend the principle of discrimination. This holds even when a civilian object is mistakenly targeted as a result of human error in intelligence analysis or in the registration of GPS coordinates, as long as the autonomous system correctly tracks the characteristics of the misidentified object. It is otherwise, however, when image recognition algorithms cause the misidentification. It would amount to an indiscriminate attack if commanders employed such a system knowing of an intolerable risk of misidentification, or in an environment where the system was known to mischaracterize objects.

Autonomous Combat Maneuverability and Feasible Precautions

The autonomous execution of combat identification based on sensor feeds and image recognition algorithms would require a systematic consideration of feasible options in exercising precautions. Belligerent parties are required to take feasible steps to verify that the target is a legitimate military objective. For the Hunter 2-S, this would mean that prior to engagement, the weapon system must attain an accurate characterization of the target object in the operational environment and validate its targetability.

On the one hand, the autonomous combat identification function integrated into swarm drones provides a technological means of visual confirmation when no other intelligence, surveillance, and reconnaissance (ISR) platform is available to verify a target. It will help ensure that these drones operate in compliance with the principle of discrimination.

On the other hand, it opens the door to various feasibility questions. For example, are the Hunter 2-S drones programmed to suspend or cancel the targeting maneuver when the sensor image captured of the object does not match the physical characteristics of the target selected by the human operator? What if an object that was determined to be a legitimate military objective because of its military use no longer qualifies as such due to a change in the circumstances ruling at the time the drones engage in the attack?

It is not necessarily unlawful to employ weapon systems that lack the ability to suspend or cancel an attack where providing that ability is not feasible. However, the feasibility of “shift cold” as a post-release abort option may need to be considered if there are technologically viable solutions for it. Unlike laser-guided munitions, swarm drones are unlikely to cause any casualties or damage at a re-directed point of impact if the system can be programmed with a mission abort maneuver that reverts to a loitering flight position.
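
Whether such an abort is technologically viable is, in part, a software question. As a minimal sketch, assuming a two-mode flight controller, a “shift cold” option could be expressed as a state machine in which the terminal dive reverts to loitering whenever the target can no longer be confirmed or an abort is commanded. The modes and checks below are assumptions about how such a capability could be programmed, not a description of the Hunter 2-S.

```python
# Minimal sketch of "shift cold" as a flight-mode state machine. The modes,
# transitions, and checks are hypothetical assumptions, not system details.
from enum import Enum, auto

class FlightMode(Enum):
    LOITER = auto()  # holding pattern; no terminal engagement
    DIVE = auto()    # terminal attack maneuver

def next_mode(mode: FlightMode, target_confirmed: bool,
              abort_commanded: bool) -> FlightMode:
    """Revert from a terminal dive to loitering whenever the target can no
    longer be confirmed or an abort is commanded; otherwise dive only on a
    confirmed target."""
    if mode is FlightMode.DIVE and (abort_commanded or not target_confirmed):
        return FlightMode.LOITER  # abort to loiter: no re-directed impact point
    if mode is FlightMode.LOITER and target_confirmed and not abort_commanded:
        return FlightMode.DIVE
    return mode
```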

Weapons Effect and the Rule of Proportionality

Under the law of armed conflict, it is prohibited to launch an attack that is expected to cause incidental civilian harm which would be excessive in relation to the concrete and direct military advantage anticipated. The ability of autonomous systems to comply with this requirement is often questioned and cited as the most serious argument against the use of autonomy over considerable periods of time (p. 332). However, this problem arises only in situations where civilian casualties are a possibility. It is plausible to foresee situations where the deployment of swarm drones is restricted to a battlefield where no civilian presence is expected; in such cases, the question of proportionality is moot.

The need to consider collateral damage and evaluate whether expected civilian harm is excessive in relation to the concrete and direct military advantage anticipated from the attack also hinges upon the type of ammunition loaded on swarm drones. Assuming that the Hunter 2-S is developed as a kinetic-kill vehicle, its penetration capability relies on high kinetic energy and high-temperature effects, with a limited range of fragment dispersal. Autonomous drones carrying high-explosive blast munitions, by contrast, can inflict damage over a much wider area, with a correspondingly greater possibility of causing incidental civilian harm.

If civilian presence is expected, States are also required to exercise feasible precautions to prevent or mitigate civilian collateral damage. This means, for example, that the attacking force must monitor the battlefield for the presence of civilians and civilian objects in the vicinity of the target and set the level of civilian casualties deemed acceptable in light of the military value of the target systems selected by human operators. The aforementioned “shift cold” question also arises in this context: whether the Hunter 2-S can or should be equipped to scan for transients and programmed to abort the attack maneuver when an unexpected civilian presence is detected in the vicinity of the target.
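
Building on the state-machine sketch above, such a transient scan could be expressed as a per-frame guard run during the terminal dive. The detector interface here is, again, a hypothetical assumption rather than a feature of the actual system.

```python
# Hypothetical per-frame guard during the terminal dive: abort the moment a
# civilian transient is detected near the aim point. The detector callback
# stands in for whatever sensor pipeline a real system would use.
from typing import Callable, Iterable

def dive_completes(frames: Iterable[object],
                   civilian_detected: Callable[[object], bool]) -> bool:
    """Scan each sensor frame during the dive; return False (triggering the
    abort-to-loiter transition sketched above) on any detection."""
    for frame in frames:
        if civilian_detected(frame):
            return False
    return True
```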

Ethical Considerations

The ethical case for the development and deployment of the Hunter 2-S system is strong. It has the potential to reduce the number of rounds required to defeat a target or target system by enhancing the probability of creating a desired effect. The increased accuracy of the strike is also expected to reduce the risk of causing incidental civilian harm.

An even stronger case could be made if the Hunter 2-S had the ability to correct human errors, such as intelligence failures in target recognition or the registration of incorrect GPS coordinates, by suspending the attack maneuver and reverting to a loitering flight position. Such a capability would offer a technological solution to the ethical problem of unintended casualties caused by human error, which the law of armed conflict does not currently regulate beyond the duty to exercise feasible precautions.

As an anti-materiel weapon system, the Hunter 2-S is shielded, to a degree, from the ethical criticisms often leveled at lethal autonomous weapon systems, which some consider should be banned altogether. The limited autonomy delegated to the Hunter 2-S in the targeting cycle may also deflect attention from the muddied debate around the demand for meaningful human control.

Concluding Observations

The Hunter 2-S exemplifies the promise that AI holds for improving the speed and accuracy of warfighting. Its swarming technology leverages AI to enable targeting with greater accuracy and efficiency. As a kinetic-kill vehicle, it could plausibly be employed without any risk of causing incidental civilian harm and therefore without the need to evaluate the proportionality of the attack.

In addition, the limited delegation of targeting decisions to autonomy at the attack execution stage points to technologically viable ways of implementing “shift cold” as a means of preventing unintended civilian casualties. The ethical case for introducing such an autonomous system onto the future battlefield will be strengthened further if it can demonstrate a superior ability to correct human errors and respond to emergent events with a higher level of compliance with the law of armed conflict.

***

Hitoshi Nasu is a Professor of Law at the United States Military Academy.

Photo credit: Staff Sgt. Charles Rivezzo, U.S. Air Forces Central Command PAO
