“The battlefield will become a space of impunity”: How AI challenges the laws of war
“We are going backwards in terms of protecting those who are more vulnerable on the battlefield: civilians,” says Father Afonso Seixas Nunes SJ, a Portuguese Jesuit and expert in the laws of war, including the legal and ethical implications of artificial intelligence (AI) in warfare.
What is an autonomous weapons system?
An autonomous weapons system is defined by its ability to identify, select, and engage military targets without human intervention. Unlike traditional systems such as Israel’s Iron Dome, which respond to pre-loaded targets, new AI-driven systems analyze vast, complex data streams to identify threats in real time.
Remote-controlled drones don’t qualify as autonomous because they require human input to act. Anti-personnel mines also fall outside the definition: they trigger mechanically, without discriminating between combatants and civilians or adapting to changing battlefield conditions.
Adaptability is a key feature. For example, if a legitimate military target enters a crowd of civilians, an autonomous system can suspend its attack and adjust accordingly.
The role of generative AI in modern warfare
Fully autonomous systems already exist but are rarely deployed, largely because of vulnerabilities such as susceptibility to cyber-attacks. Countries such as the U.S., South Korea, and Russia reportedly possess such systems, with China likely developing them as well.
Instead, many states invest in AI Decision Support Systems (AI-DSS) that provide commanders with enhanced situational awareness. For instance, Israel uses AI-DSS in Gaza to process data from satellites and drones, delivering real-time intelligence on enemy locations.
Legal and ethical challenges
The reliance on AI introduces significant risks. Military commanders must trust AI outputs without full insight into how data is processed. This creates what Father Seixas Nunes calls a “dissociation of communication,” where AI algorithms operate as opaque “black boxes.”
This opacity risks a double scapegoating: blaming AI systems for errors or holding commanders fully responsible for decisions made with limited information. Neither outcome fits well within existing international criminal law frameworks, raising concerns about an “accountability gap” and potential impunity on the battlefield.
New warfare, new legal anarchy?
Autonomous systems promise to reduce soldier casualties but risk creating legal chaos. The current conflict involving Israel illustrates how these technologies can undermine traditional laws of war, placing civilians at greater risk.
Prospects for a new legal framework
Father Seixas Nunes points to recent global conflicts—Ukraine, Gaza, Sudan—as signs that the protection of civilians under international law is regressing, not advancing. Even so, he remains cautiously hopeful: while political shifts and weakened international institutions challenge progress, he emphasizes the importance of hope and faith in moving forward.
Drones and swarms: The Ukraine conflict
The war in Ukraine is notable for the deployment of drone swarms, a significant evolution from earlier drone use. China’s demonstration of thousands of coordinated drones stunned Western militaries, showing how drone swarms can combine weaponized and surveillance roles.
Ukraine also showcases innovations like radar-invisible suits, highlighting how AI and technology are reshaping modern combat.
Targeting remote operators and civilian contributors
International law recognizes drone operators as legitimate combatants, even if they are far from the frontline. This means they can be targeted at home or elsewhere.
Similarly, civilians who directly support combat operations could be considered lawful targets—for example, SpaceX, Elon Musk’s company, whose Starlink satellite network supports Ukrainian military communications. This illustrates the blurring lines between traditional combatants, non-state actors, and private corporations.
Distance, desensitization, and moral risks
Since ancient times, weapons have extended the physical distance between combatants. Today’s AI-driven warfare introduces a new dimension: dissociation of communication and risk.
Historical concerns echo today. In 1139, under Pope Innocent II, the Second Lateran Council banned the crossbow amid fears that combatants would lose sight of the consequences of their actions—a concern mirrored by today’s use of drones and AI, which can desensitize soldiers to the impact of their strikes.
The continuing need for ethical reflection
Despite skepticism, ethical and moral reflection remains crucial. Distrust of international law is understandable given its perceived ineffectiveness, but moral accountability often influences political outcomes more than legal sanctions.
The Chilcot Inquiry into the UK’s Iraq invasion exemplifies how moral consequences can end political careers, underscoring the importance of ethics in warfare.
Just war theory in the age of AI warfare
Pope Francis’s statement that one can no longer speak of just war sparked debate but aligns with traditional Catholic teaching that war is never truly good.
Just war theory permits war only as a last resort to end catastrophic situations. In cases like Russia’s invasion of Ukraine, legitimate self-defense is the only moral option. However, the manner in which self-defense is exercised matters deeply, as seen in Israel’s current operations, which many view as disproportionate and destructive.
How a Jesuit priest became an expert in laws of war
Father Seixas Nunes’s path began with theological studies, inspired by Pope Benedict XVI’s call for Church experts in international humanitarian law. His academic journey led him to focus on autonomous weapons under international law.
Interactions with military and defense institutions
He has collaborated with Dutch, British, and Israeli defense bodies. In his experience, the U.S. military’s approach contrasts sharply, placing less emphasis on moral considerations and more on operational efficiency.
Recently, the U.S. Department of Defense made the study of international humanitarian law optional, a move seen by many as a setback for ethical military conduct.
Concerns about criticism and surveillance
Despite critical views on U.S. policies, Father Seixas Nunes has not faced sanctions, though colleagues in related fields have experienced investigations and publishing restrictions linked to their critiques.
Conclusion
AI’s integration into warfare presents complex legal and ethical challenges. The opacity of autonomous systems threatens accountability and risks turning battlefields into zones of impunity.
Protecting civilians and upholding international humanitarian law requires urgent attention to these emerging technologies, balancing military necessity with moral responsibility.