Death by Algorithm

Podcast Description
A series on autonomous weapons systems, drones, and AI in the military domain. Experts from various disciplines share their research and discuss the black box, responsibility, human-machine interaction, and the future of legal and ethical frameworks for AI in war. How is war regulated? Can the ethics of war be programmed into machines? Does it change how we fight? Can war be cleaned up by technology? How can soldiers understand the systems? Will AI systems be the commanders of tomorrow? Why not just let the robots fight? Episodes combine narration and interviews and are not chronological.
Podcast Insights
Content Themes
The podcast centers around the intersection of technology and warfare, with episodes exploring themes such as the ethical implications of AI in combat, the black box problem in military drones, and legal responsibilities around autonomous weapons. For instance, 'A Taste of Tragedy' examines the impact of autonomous systems through the lens of the Ukraine conflict, while 'Just War' questions the justification of machine-led combat and its philosophical underpinnings.

Practices create norms, and words can shape reality. This applies to the debate on autonomous weapons and AI in the military domain. It matters whether the technology precedes public deliberation and regulation. It also matters whether we refer to AI in the military as “decision support systems”, “emerging technology”, “autonomous weapons”, or “killer robots”.
Professor Ingvild Bode, recipient of the Danish Elite Research Prize 2025, describes her research project, AutoNorms, and how it tracks discourse and developments on autonomous weapons. She also shares her perspectives on the black box, meaningful human control, ethical machines, and the future of regulation.
Shownotes:
Producer and host: Sune With [email protected]
Coverart: Sebastian Gram
– AutoNorms, PI Ingvild Bode (Accessed May 13, 2025)
– Arai, Koki; Matsumoto, Masakazu, 2023, “Public perception of autonomous lethal weapons systems”, AI and Ethics (2024) 4:451–462
https://link.springer.com/article/10.1007/s43681-023-00282-9
– Bode, Ingvild. 2024. “Emergent Normativity: Communities of Practice, Technology, and Lethal Autonomous Weapons Systems”. Global Studies Quarterly 4(1), https://doi.org/10.1093/isagsq/ksad073
– Bode, Ingvild; Nadibaidze, Anna, 2024, “Autonomous Drones”. In J. P. Rogers (Ed.), De Gruyter Handbook on Drone Warfare, pp. 369-384, De Gruyter.
– Bode, Ingvild; Bhila, Ishmael, September 3, 2024, “The problem of algorithmic bias in AI-based military decision support”, Humanitarian Law and Policy, ICRC.
– Bode, Ingvild. 2023. “Practice-Based and Public-Deliberative Normativity: Retaining Human Control over the Use of Force.” European Journal of International Relations 29(4), 990-1016, https://doi.org/10.1177/13540661231163392
– Bode, Ingvild, and Tom Watts. 2023. Loitering Munitions and Unpredictability: Autonomy in Weapon Systems and Challenges to Human Control. Odense, London: SDU Center for War Studies, Royal Holloway Centre for International Security. Link
– Campaign to Stop Killer Robots, 2021, “Killer Robots: Survey Shows Opposition Remains Strong”, Human Rights Watch (Accessed May 14, 2025)
https://www.hrw.org/news/2021/02/02/killer-robots-survey-shows-opposition-remains-strong
– Deeney, Chris, 2019, “Six in Ten (61%) Respondents Across 26 Countries Oppose the Use of Lethal Autonomous Weapons Systems”, Ipsos (Accessed May 14, 2025).
https://www.ipsos.com/en-us/news-polls/human-rights-watch-six-in-ten-oppose-autonomous-weapons
– HuMach, PI Ingvild Bode (Accessed May 13, 2025)
https://www.sdu.dk/en/forskning/forskningsenheder/samf/cws/cws-activities/projects/humach
– IEEE Standards Association, A Research Group on Issues of Autonomy and AI in Defense Systems. (2024). “A Framework for Human Decision Making Through the Lifecycle of Autonomous and Intelligent Systems in Defense Applications”. New York, NY: IEEE SA (Accessed April 2, 2025)
https://ieeexplore.ieee.org/document/10707139
– IEEE Standards Association (Accessed April 2, 2025)
– Nadibaidze, Anna; Bode, Ingvild; Zhang, Qiaochu, 2024, “AI in Military Decision Support Systems: A Review of Developments and Debates”, Center for War Studies, SDU.
– Overton Window, Wikipedia (Accessed May 13, 2025)
https://en.wikipedia.org/wiki/Overton_window
– Renic, Neil; Christenson, Johan, 2024, “Drones, the Russo-Ukrainian War, and the Future of Armed Conflict”, CMS Report.
– The Overton Window, Mackinac Center for Public Policy (Accessed May 13, 2025)
https://www.mackinac.org/OvertonWindow
Music: Sofus Forsberg

Disclaimer
This podcast’s information is provided for general reference and was obtained from publicly accessible sources. The Podcast Collaborative neither produces nor verifies the content, accuracy, or suitability of this podcast. Views and opinions belong solely to the podcast creators and guests.
For a complete disclaimer, please see our Full Disclaimer on the archive page. The Podcast Collaborative bears no responsibility for the podcast’s themes, language, or overall content. Listener discretion is advised. Read our Terms of Use and Privacy Policy for more details.