Military Use of Biometrics Series – The Body Does Not Lie, or Does It? Towards a Disability-Inclusive Approach to Military Biometrics
Editors’ note: This post is part of a series relating to the law applicable to the military use of biometrics. It is drawn from the author’s article-length work, “The Military Fantasy of Biometrics: Neglecting the Risks of the Normalizing of Bodies During Armed Conflicts” appearing in the Journal of International Humanitarian Legal Studies. The introductory post is available here.
For decades, military strategists have expressed a keen interest in biometrics, owing to its promise of "objective" identification. The rationale seems straightforward: a fingerprint, an iris scan, and a facial map are all irrefutable measures of the human body. In theory, such systems could improve the accurate identification of potential threats and thereby enhance compliance with international humanitarian law (IHL).
Yet the very notion that the human body presents an immutable truth conceals a multitude of underlying assumptions. These assumptions influence various components of biometric systems, ranging from the dimensions of hand scanners to the statistical thresholds integrated into matching algorithms. As I explored in my article, when such design decisions are made without adequate consideration of human diversity within the civilian populations in which armed forces operate, especially the lived experiences of marginalised groups of individuals, the resultant technology may serve to exacerbate civilian harm rather than alleviate it.
In this post, I examine the need to design biometric systems grounded in a disability-inclusive understanding of the human body. I then outline specific measures that developers, militaries, and policymakers can implement to ensure that these systems safeguard, rather than endanger, civilians from marginalised groups, such as persons with disabilities, in contexts of armed conflict.
How Biometrics Can Exacerbate Risks for Civilians in Armed Conflicts
Biometric systems are constructed upon implicit assumptions about what a “normal” human body looks like and how it behaves. This is reflected in the hardware that captures biometric data, which is engineered for a narrow set of dimensions: hand scanners expect palms of a specific size; facial cameras are angled for typical heights and lighting conditions; and voice recognisers presuppose unimpeded speech. Similarly, the algorithms that analyse these inputs are trained on datasets that predominantly exclude atypical physiologies.
This is particularly concerning for persons occupying marginalised societal positions. Biometric systems malfunction more frequently when applied to children, women, people of colour, the elderly, and persons with disabilities, precisely because their bodies tend to diverge from these embedded expectations.
Let us consider the situation of persons with disabilities. Although they constitute the world's largest minority group, persons with disabilities are often forgotten and their needs overlooked, even more so in armed conflicts. This neglect is also reflected in biometric systems. Owing to factors such as a prosthetic limb, facial deformity, visual impairment, or cognitive condition, these systems tend either to fail to register the data of persons with disabilities or to produce only a low-confidence match.
The consequences of designing biometric systems around a singular notion of normality become particularly stark for persons with disabilities in armed conflicts. In this context, military personnel often continue to overlook the specific and disproportionate risks persons with disabilities face in the application of international humanitarian law. These risks are likely to be compounded by the military deployment of existing biometric systems.
For example, a soldier attempting to scan a civilian with autism may receive an ambiguous reading of their behaviour, prompting armed forces to regard the individual as a potential enemy threat. Conversely, a civilian with atypical physiology whose biometric profiles cannot be captured may be excluded from protective measures such as targeted humanitarian aid or early-warning alerts.
This occurs because, when the data used to train an algorithm does not include enough examples of, say, a particular disability, the system will treat individuals with that disability as outliers. Consequently, the algorithm either flags them as threats when they are not (false positives) or fails to identify real risks (false negatives). And where military personnel are not sufficiently aware of the diversity of civilians present in armed conflicts, such errors are likely to slip past scrutiny. In a combat setting, this can result in the misidentification of a non‑combatant as a combatant or the oversight of a real danger, both of which may prove fatal.
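To make this failure mode concrete, here is a minimal Python sketch of the statistical mechanism at work. Every number in it is an invented assumption for illustration (the score distributions, noise levels, and decision threshold come from no fielded system); the point is simply that a fixed threshold produces more of both error types once a model's scores become noisy for bodies it rarely saw in training.

```python
# Illustrative simulation of why underrepresentation in training data raises
# both error types. All distributions and parameters are invented assumptions.
import random

random.seed(0)

def error_rates(genuine_mu, impostor_mu, sigma, threshold=0.5, n=10_000):
    """Simulate match scores and return (false_negative_rate, false_positive_rate).

    genuine_mu:  mean score when the person really is who the system believes
    impostor_mu: mean score when they are not
    sigma:       score noise, larger for physiologies the model rarely saw
    """
    genuine = [random.gauss(genuine_mu, sigma) for _ in range(n)]
    impostor = [random.gauss(impostor_mu, sigma) for _ in range(n)]
    fnr = sum(s < threshold for s in genuine) / n    # real identity rejected
    fpr = sum(s >= threshold for s in impostor) / n  # wrong identity accepted
    return fnr, fpr

# Well-represented bodies: scores are well separated, so errors are rare.
print("typical body:     FNR=%.3f FPR=%.3f" % error_rates(0.8, 0.2, 0.10))

# Underrepresented bodies: same average scores, but noisier readings mean
# more false positives AND more false negatives at the very same threshold.
print("underrepresented: FNR=%.3f FPR=%.3f" % error_rates(0.8, 0.2, 0.30))
```

Under these invented parameters, both error rates climb from a fraction of a per cent to roughly sixteen per cent for the noisier group, which is the statistical shape of the outlier problem described above.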
Taken together, these failures can translate into misidentification, unnecessary uses of force, or denials of protection. Such failures may directly contravene the rules and principles of IHL, including distinction, proportionality, and the duty to take feasible precautions in attack. Consequently, the systematic exclusion of persons with disabilities (or other marginalised groups) from the development and use of biometric systems undermines the very aim of the technology: to improve accuracy and promote better compliance with IHL.
Using Biometrics for Enhanced Protection of All Civilians in Armed Conflicts
Insofar as biometric systems increasingly influence military decision-making related to IHL, a truly inclusive perspective must be integrated at a minimum of two levels: throughout every stage of system development, and within the structures of human oversight that govern their use.
The first level begins with building biometric systems that follow a disability-inclusive framework, one that anticipates diversity and thus dismantles the assumption that bodies conform to a single template. At least three interlocking pillars should be considered to achieve this shift. First, developers must involve persons with disabilities and other marginalised groups as co-creators rather than solely as end-users. Participatory workshops, iterative testing, and formal feedback loops enable individuals with diverse physiologies to shape hardware specifications and software parameters from the outset. Second, baseline datasets need to be disaggregated by disability, gender, age, and other relevant attributes. Recording the diversity of physiological functions, such as visual, auditory, mobility, or cognitive impairments, provides the algorithmic context necessary to recognise and correctly process a broader range of inputs. Third, algorithmic design should incorporate adaptive thresholds and multimodal fusion, allowing a system to fall back on alternative identifiers (such as voice, gait, or pressure patterns) when one modality proves unreliable. Transparency regarding dataset composition and model behaviour allows for ongoing bias monitoring. Simultaneously, explainability tools can help identify when data from a person with disabilities is being treated as an outlier, prompting human review before any operational decision is made.
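The third pillar lends itself to a brief illustration. The Python sketch below shows one way adaptive thresholds and multimodal fallback might fit together; the modality names, thresholds, and simple averaging rule are assumptions made for this example rather than features of any actual military system, and a real implementation would calibrate its thresholds per modality and per population.

```python
# Minimal sketch of multimodal fusion with an ambiguous "human review" band.
# All names, scores, and thresholds here are hypothetical, for illustration.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    name: str      # e.g. "face", "voice", "gait"
    score: float   # match confidence in [0, 1] from that modality's matcher
    usable: bool   # False if capture failed (occlusion, no speech, etc.)

def fuse(readings, accept=0.90, reject=0.70):
    """Combine the usable modalities and return "match", "no_match",
    or "human_review". The key design choice is the middle band: an
    ambiguous combined score is escalated to a person, never forced
    into a binary decision."""
    usable = [r for r in readings if r.usable]
    if not usable:
        # No modality captured anything: never guess, escalate instead.
        return "human_review"
    # Simple unweighted average; likelihood-based fusion is also common.
    combined = sum(r.score for r in usable) / len(usable)
    if combined >= accept:
        return "match"
    if combined < reject:
        return "no_match"
    return "human_review"

# Example: the face capture fails (say, because of a prosthesis or facial
# difference), so the system falls back on voice and gait instead of
# returning a misleading low-confidence face match.
print(fuse([
    ModalityReading("face", 0.0, usable=False),
    ModalityReading("voice", 0.82, usable=True),
    ModalityReading("gait", 0.88, usable=True),
]))  # -> "human_review": the combined 0.85 lands in the ambiguous band
```

The ambiguous middle band is the point of contact with the second level discussed below: rather than forcing a binary decision on a low-confidence reading, the system hands the case to a human reviewer.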
However, technical refinements alone are insufficient without proper human oversight. Therefore, at the second level, further corresponding changes in doctrine, training, and budgeting are essential. Military personnel at every level, from strategic planners to frontline operators, must receive comprehensive education on disability rights, the limits of biometric certainty, and the importance of human judgment when system outputs are ambiguous.
Scenario-based drills that simulate low-confidence readings for persons with disabilities or other atypical physiologies can cultivate a culture of cautious verification rather than blind reliance on biometric systems’ output. Institutional doctrine and policies should be revised to explicitly recognise persons with disabilities as a protected group, expanding beyond the current focus on women and children. Finally, dedicated budget lines are needed to fund adaptive hardware, ongoing engagement with persons with disabilities, and research into novel identification modalities suitable for diverse bodies. Through the implementation of these measures, armed forces may use biometric technology to improve operational effectiveness while minimising harm to all civilians present in armed conflicts.
Concluding Thoughts
Through the incorporation of a disability-inclusive perspective into the development and military deployment of biometric systems, we can transform these systems from new sources of risk into genuine means of protection. Biometrics could be used to enhance precautionary measures during the conduct of hostilities by identifying individuals with special needs in real time. This would help tailor effective warnings, establish safe corridors, and deploy rescue resources more efficiently.
Undoubtedly, the path ahead will not be straightforward, particularly as certain accompanying risks will inevitably materialise. Chief among these are concerns about privacy and safety in the collection of sensitive biometric data. It is thus imperative that oversight mechanisms continually evolve in tandem with the development and deployment of these systems.
Still, a shift towards a disability-inclusive approach to the military use of biometric systems is essential. As these technologies become ever more entrenched in modern armed conflicts, the responsibility to ensure they serve all persons present in armed conflicts (not just the statistically average) belongs to every stakeholder: investors, tech companies, developers, military leaders, policymakers, and civil society alike. Only through addressing the hidden assumptions in our society and the biometric systems we create can we improve the protection of often overlooked minority groups, both on the battlefield and beyond.
***
Anna Rosalie Greipl is a Researcher at the Academy of International Humanitarian Law and Human Rights (Geneva Academy).
The views expressed are those of the author, and do not necessarily reflect the official position of the United States Military Academy, Department of the Army, or Department of Defense.
Articles of War is a forum for professionals to share opinions and cultivate ideas. Articles of War does not screen articles to fit a particular editorial agenda, nor endorse or advocate material that is published. Authorship does not indicate affiliation with Articles of War, the Lieber Institute, or the United States Military Academy West Point.
Photo credit: ThisisEngineering via Unsplash