Year Ahead – Emerging Technologies and the Collection of Battlefield Evidence

by Chris Jenks, Eric Talbot Jensen | Jan 13, 2023


In a recent series of posts on responsible artificial intelligence (AI), various authors (including Chris Jenks) have discussed the role of “responsibility” in the context of using AI-enabled systems, including weapon systems, on the battlefield. In each of these posts, the ultimate question was how to judge a situation in which emerging technology performs functions or tasks normally done by a human. While resolving the meaning and application of “responsibility” with respect to AI is notable in itself in 2023, the possible impact of emerging technologies on war crimes prosecutions is likely to be equally novel and challenging. This question will inevitably become a key point of discussion in 2023 as calls to prosecute war crimes arising from the current conflict in Ukraine increase.

Much has already been written about Bellingcat’s documentation of civilian harm in Ukraine. Other groups are also engaged in similar documentation and recording efforts. As already discussed elsewhere on Articles of War, the conflict between Ukraine and Russia has given the world a glimpse of modern “data-rich” combat.

Earlier posts have noted how this documentation effort is being carried out with a view to potential war crimes prosecution. While some criminal prosecutions have already taken place, many more cases will undoubtedly occur in 2023.

Emerging technologies will certainly play a role in those prosecutions. Prosecutors will inevitably seek to introduce video clips, camera images, live streams, and many other forms of “evidence” of war crimes. Authenticating and attesting to the reliability of evidence derived from those technologies will present evidentiary and logistical difficulties that should make the defense bar in any jurisdiction quite happy. However, these difficulties are unlikely to be insurmountable, even under today’s evidence rules and methods. Perhaps more interesting, and equally important, is what these prosecutions may demonstrate about the future of evidence, particularly evidence gathered by AI.

The expanding use of AI, including “responsible” AI, on the modern battlefield will only complicate future evidence-gathering issues, pushing the boundaries of admissibility and the weight courts give to such evidence, particularly in jurisdictions that emphasize the defendant’s right to confront accusers. Layered AI systems, particularly those that are classified or contain proprietary information, will present distinct challenges. Specifically, testimony will be required to explain how the different AI systems operate and to attest that their reliability is widely accepted.

Assume, for example, a situation in which visual data are captured by an unmanned and unmonitored AI sensor such as a drone. That drone is directed by another AI “control” system that, based on inputs from other AI sensors on the battlefield, determines the drone’s flight path and when it captures visual data. The “control” AI system then analyzes the visual data, applying pre-determined enhancements that increase its intelligence value. These enhancements might include improved visual resolution, the use of facial recognition technology, and the application of biometric data (perhaps collected by other non-human sensors in the area). Finally, a prosecutor seeks to introduce those data in a war crimes prosecution. Over time, all of these “evidentiary hurdles” might be overcome, but as with other technologies, initial determinations in diverse courtrooms may produce disparate results, both in terms of admissibility and, depending on the relative significance of the evidence, in terms of the overall trial outcome.

As this sort of “evidence” becomes more acceptable and more accepted in war crimes prosecutions, its use will only grow in volume and usefulness as emerging technologies continue to spread across the modern battlefield. This year may see the beginnings of rules governing how such evidence will be used in the future.

***

Chris Jenks is a Professor of Law at the SMU Dedman School of Law in Dallas, Texas.

Eric Talbot Jensen is a Professor of Law at Brigham Young University.

 

Photo credit: Mstyslav Chernov