Lieber Studies Big Data Volume – Algorithms of Care: Military AI, Digital Rights, and the Duty of Constant Care

Feb 13, 2024

Editors’ note: This post is based on the authors’ chapter in Big Data and Armed Conflict (Laura Dickinson and Ed Berg eds. 2024), the ninth volume of the Lieber Studies series published with Oxford University Press.

In Big Data and Armed Conflict: Legal Issues Above and Below the Armed Conflict Threshold, the new book co-edited by Laura Dickinson and Edward Berg, contributors were asked to examine the sociotechnical and legal metamorphosis of the modern battlefield. Evaluating a set of contemporary and near-future technologies, all relating to the collection, storage, processing, and dissemination of large datasets, the contributors considered the extent to which international legal regimes are sufficient for regulating these disruptive technologies.

To those observing from a distance, warfare may seem to still encompass a predominantly traditional set of military tactics and conventional armaments. Looking to both the ongoing wars in Gaza and Ukraine, an outside observer might say it is still all about airstrikes, missiles, and tanks. But as Laura Dickinson writes in the introduction to the book, “Big Data is radically reshaping the modern battlefield.” By leveraging big data, militaries can enhance their predictive capabilities about enemy operations, optimize resource allocation and streamline supply chain management, enhance surveillance through pattern recognition, and improve decision-making processes. This data-driven approach enables a deeper understanding of the battlefield, aiding warfighters in the development of more effective strategies.

But as Omar Shehabi and I have already demonstrated in our previous post, machine learning tools and big data analytics are not a panacea for all of war’s ills and misfortunes. Just because we don’t always fully understand algorithms does not mean they possess some magical characteristics. In fact, algorithms can produce more harm than good by scaling up and automating bias, dehumanizing military decision-making, and challenging effective oversight and accountability.

There is another category of harm that can result from the misuse and abuse of autonomous systems and big data tools. In my contribution to Laura and Ed’s book, I describe how with these new technologies, “militaries are now capable of effecting digital harms, thereby eroding individual rights to privacy, anonymity, access to information, online freedom of expression, digital autonomy, and dignity, and intellectual property.” The protection of digital rights in times of armed conflict has become a research agenda for me. I have written about it extensively (see for example here and here) and co-edited a book-length anthology on the subject, together with Russell Buchan.

The Geneva Conventions and Additional Protocols are not centered around digital rights protection. Partially, this is due to the fact that these documents were written in a pre-Internet age and before the cyber revolution. Additionally, however, these treaties were mostly responsive to the wars of the time, namely the First and Second World Wars for the Geneva Conventions of 1949, and the Vietnam War, the Algerian War of Independence, and certain internal wars in Africa, Latin America, and Asia for the Additional Protocols of 1977. Being motivated by a set of humanitarian concerns that did not encompass the kind of rights protection projects of importance in the digital age, the drafters simply had different priorities.

This creates a challenge for identifying existing international humanitarian law (IHL) rules that may be said to apply, by analogy or by extension, to data-intensive and invasive wartime practices. My chapter, entitled “The Duty of Constant Care and Data Protection in War,” identifies one IHL rule that may be capable of filling this role. I argue that of all the rules codified within both treaty and customary IHL, the duty of constant care offers the most promise as a wartime data protection regulator. Within the limits of this post, I highlight some of the core findings in my chapter. I discuss the core features of the duty of constant care as understood in contemporary interpretations and assess its implications for digital rights protection in war. I conclude by considering Israel’s use of artificial intelligence (AI) in the war in Gaza as a case study for assessing the findings in my chapter.

The Fundamentals of the Duty of Constant Care

Article 57(1) of Additional Protocol I (AP I) establishes that “in the conduct of military operations, constant care shall be taken to spare the civilian population, civilians, and civilian objects.” As Professor Michael Schmitt writes in one of his contributions to the book, “whether this article restates customary IHL is an open question . . . and even if the obligation applies to all military operations, there is uncertainty about its precise meaning in concrete situations.”

I share Professor Schmitt’s concerns. As the International Committee of the Red Cross (ICRC) Commentary to Article 57 reflects, the drafting of the Article “required lengthy discussions and difficult negotiations,” with the final wording being the “fruit of laborious compromise” (para. 2184). This drafting history means that, unless further codification efforts are attempted, there will always be some uncertainty about the exact contours of the duty of constant care. Nonetheless, my chapter seeks to add clarity to some of these debates. Relying on treaty interpretation, State practice, relevant case law, and secondary literature, the chapter attempts to answer five questions about the nature and scope of the duty of constant care. The five questions, and a summary of my answers to them, are as follows.

Is the Duty of Constant Care Legally Binding as Customary International Law?

The answer is definitively yes. As the International Criminal Tribunal for the former Yugoslavia (ICTY) explained in Prosecutor v. Kupreškić, not only does the duty of constant care embody “general pre-existing norms,” it also does “not appear to be contested by any State, including those which have not ratified the Protocol” (para. 524). The ICRC Customary International Humanitarian Law Study further confirms in rule 15 that the duty of constant care applies in both international and non-international armed conflicts.

What Military Activities Trigger the Duty?

The duty of constant care to spare civilians in military operations is triggered by a broad range of military activities, not just attacks. This expansive definition, as highlighted by Professor Eric Jensen, implies a general legal obligation which captures all military activities with a nexus to combat. In the context of digital-age warfare, this duty should extend to all informational operations supporting military activity. This includes intelligence gathering, data collection, and management activities, regardless of the actor involved (private contractors or civilian intelligence agencies), and as long as these activities are intended to advance combat.

The application of this rule involves assessing how closely related these informational activities are to military combat objectives. This “proximity test,” while subjective and potentially challenging due to the fluid nature of data collection and processing, is crucial. The complexity of data transfers and the multifaceted use of information in modern warfare add to the interpretative challenges. However, this interpretation aligns with the humanitarian intent of the Geneva Conventions to minimize civilian harm in armed conflict.

When Does the Duty Apply?

The duty of constant care, as its name suggests, is constant, meaning it applies continuously, without any temporal limitations. This makes it relevant in both peacetime and wartime. Moreover, as explained in Tallinn Manual 2.0 and other analyses, this duty necessitates ongoing situational awareness, ensuring that military operations do not adversely affect civilians or civilian objects. This obligation extends beyond active combat, covering periods before and after a conflict. The application of this duty, especially in the context of data collected for non-combat purposes that later becomes useful for military objectives, raises complex questions about the extent of data protection responsibilities. Determining when and how these standards apply requires a detailed, case-by-case analysis, adhering to the proposed “proximity test” to assess the relevance and applicability of these protections in various military scenarios.

What Harm is the Duty Meant to Prevent?

The duty of constant care aims to prevent a wide range of harms to civilians, extending beyond mere physical and kinetic damage. Relying on the parallel duty of defenders provided in Article 58 of AP I, the duty of constant care may be said to extend to a broader category of “dangers” that go beyond mere “damages.” A similar move is made in UN General Assembly Resolution 2675, which introduced an obligation on those engaging in military operations to make “every effort . . . to spare the civilian population from the ravages of war” (emphasis added). This broader interpretation includes protecting against dignitary harms and rights violations in the digital realm, such as privacy breaches, loss of anonymity, and freedom of expression infringements.

When is the Duty Breached?

The duty of constant care involves a delicate balance between humanitarian concerns and military objectives, sometimes necessitating that commanders accept greater risks to their own forces to spare the civilian population. In the chapter, I draw some analogies to the obligation of “due regard” in maritime law. Ultimately, the duty of constant care entails weighing various factors, including the nature and importance of both the rights at risk and the military activities involved, as well as the potential for alternative approaches.

A breach of the duty of constant care might occur when a commander fails to adequately consider these factors in the planning and execution of their operations. For instance, if a military occupier collects extensive personal data from civilians but neglects basic cybersecurity measures, leading to foreseeable data breaches and subsequent harms, this could indicate a breach of the duty of constant care. Such negligence in protecting civilian data, when feasible and reasonable measures could have been taken, demonstrates a failure to adhere to the duty of constant care in the context of modern warfare, where information handling and protection are crucial.

The Duty of Constant Care Retold as a Data Protection Rule

The ICTY expert committee established in 2000 to review the NATO bombing campaign in the former Yugoslavia, concluded in its report that Article 57(2)(a)(i) of AP I requires military commanders to “set up an effective intelligence gathering system” (para. 29). As I explain in the book chapter, there is an interesting correlation between this obligation and the duty of constant care provided in Article 57(1).

Article 57 mandates that militaries establish effective data collection, processing, verification, assessment, and dissemination frameworks and agencies. These data arms, formed in response to this requirement, operate year-round to produce data for all echelons of the military machine. The effectiveness of this apparatus will be determined by objectively examining the methodologies of data management it employs. In this data-intensive environment, which Article 57 itself helped erect, the duty of constant care stands as a lighthouse that could guide militaries in discharging their duties.

One example I use in the book chapter concerns “targeting banks,” where militaries store and update cards with information about potential targets for aerial strikes. These cards might identify the geographical coordinates of the target, the military value of it, nearby sensitive sites, and a preliminary assessment of proportionality. Producing the cards ahead of the armed conflict is necessary to generate an effective list of targets in the leadup to the war. Regular checks are essential to ensure that any inaccuracies are rectified. A building initially identified as a weapons storage site might later become a kindergarten. This ongoing verification requirement, built into both Articles 57(1) and 57(2)(a)(i) of AP I, reflects an early articulation by the drafters of the Protocol of a data protection rule.

The drafters might not have realized that they were doing so, but in essence they were planting the seeds for what the European General Data Protection Regulation (GDPR) now calls the “accuracy” principle. As the GDPR codifies, “Personal data shall be . . . accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay.”

If that is true for data accuracy, why not rely on the duty of constant care to incorporate other data protection principles into IHL? Truly, as I argue in the chapter, the duty of constant care might serve as a “temporary gap filler to fill the lacuna that exists around data protection in IHL,” at least until more substantive codification efforts evolve. Moreover, while I explicitly discuss certain principles in the chapter, such as legality, storage limitation, and data integrity, other data protection rules can also become relevant. Confidentiality, purpose limitation, fairness, integrity, availability, and accountability, for example, could also be picked up and implemented within this expansive understanding of the informational effects of the duty of constant care.

Conclusion: A Return to the War in Gaza

My previous post with Omar Shehabi discussed the severe and dramatic consequences of introducing AI as a decision-making support tool in targeting operations. Most of those consequences had to do with physical harms to civilians and civilian objects. Some might find it strange to discuss digital rights protection in war. As I have previously written, “When more visceral assaults on the human body become the norm, should we take the time to recognize the assaults on the human spirit?” I believe that the answer is that we should, not only because dignity and autonomy are values central to the project of IHL, but also because they serve as gatekeepers. With their demise, other bodily assaults could more easily follow.

Consider programs like “The Alchemist,” “The Gospel,” and “Depth of Wisdom,” which we discussed in our previous post. These AI programs are intelligence programs. Their very development depended on years of data collection, data processing, data analysis, and data storage. The rules Israel imposes on itself in the curation and management of the datasets necessary to produce its AI algorithms play an important role in the quality and nature of the ultimate tools it deploys. Embedding the duty of constant care into the design phase of wartime AI projects, including by way of routine and continuous weapons reviews and human rights impact assessments, is a crucial component in this story. In other words, the wartime protection of digital autonomy and digital ownership could lead to a corollary protection of bodily autonomy and physical property during armed conflict.

Considering the evolving nature of warfare and technology, the duty of constant care could play a vital role in shielding both analog and digital rights. The duty of constant care could offer a reminder to military commanders that they must adhere to both the laws of humanity and the dictates of public conscience, respecting not just physical safety, but also the human spirit and dignity in times of war.

***

Asaf Lubin is an Associate Professor of Law at Indiana University Maurer School of Law. He is additionally a Faculty Associate at the Berkman Klein Center for Internet and Society at Harvard University, an Affiliated Fellow at Yale Law School’s Information Society Project, and a Visiting Scholar at the Hebrew University of Jerusalem Federmann Cyber Security Research Center.

Photo credit: Maj. Christopher Vasquez
