AI and the Bomb

October 2023

Speaker: Dr. James Johnson (University of Aberdeen)

Date: 25 October 2023

Speaker Session Summary

Artificial intelligence (AI) is transforming how combat and conflict evolve and how actors conceptualize and manage their technological assets. This reconceptualization extends to how actors conduct deterrence. Dr. Johnson described deterrence as requiring an understanding of the other side’s “…interests, priorities, strategic objectives, and perceptions,” complex concepts that AI will likely never be able to understand. AI may even undermine deterrence by making it easier to locate an actor’s nuclear assets, by enabling cyber attacks on nuclear weapons systems, and by misinterpreting adversary signaling, among other risks. Such misinterpretations could even lead to an accidental nuclear war. Factors that may contribute to an accidental nuclear war include information complexity, information overload, and the low barrier to entry that AI and other cyber technologies give third-party and non-state actors.

While AI can pose threats to effective nuclear deterrence, there are also potential benefits to incorporating it into nuclear weapons systems, chief among them the speed and scale at which AI can act. Dr. Johnson described four areas in which AI could be applied to nuclear weapons in the future: 1) enhancing the safety of nuclear weapons; 2) upgrading command and control protocols; 3) creating robust safeguards to contain the consequences of errors; and 4) arms control, norms, behaviors, and strategic dialogue. If you would like to read more about how AI may impact the future of conventional and nuclear deterrence, Dr. Johnson’s book, AI and the Bomb: Nuclear Strategy and Risk in the Digital Age, is available for purchase.

Speaker Session Recording

Briefing Materials

Report: https://mwi.westpoint.edu/rethinking-nuclear-deterrence-in-the-age-of-artificial-intelligence/

Recommended reading:

Nuclear Brinkmanship in AI-Enabled Warfare: A Dangerous Algorithmic Game of Chicken – War on the Rocks

AI, Autonomy, and the Risk of Nuclear War – War on the Rocks

Bio: Dr. James Johnson is a Lecturer (Assistant Professor) in Strategic Studies in the Department of Politics and International Relations at the University of Aberdeen. He is also an Honorary Fellow at the University of Leicester, a Non-Resident Associate on the ERC-funded Towards a Third Nuclear Age Project, and a member of the Mid-Career Cadre with the Center for Strategic and International Studies (CSIS) Project on Nuclear Issues. Previously, he was an Assistant Professor at Dublin City University, a Non-Resident Fellow with the Modern War Institute at West Point, and a Postdoctoral Research Fellow at the James Martin Center for Nonproliferation Studies in Monterey, CA. He holds a PhD in Politics and International Relations from the University of Leicester. Before entering academia, he worked in the financial sector, mainly in China, and is fluent in Mandarin. His research examines the intersection of nuclear weapons, deterrence, great power competition, strategic stability, and emerging technology, especially artificial intelligence. His work has been featured in Journal of Strategic Studies, The Washington Quarterly, Strategic Studies Quarterly, Defence Studies, European Journal of International Security, Asian Security, Pacific Review, Journal for Peace & Nuclear Disarmament, Defense and Security Analysis, RUSI Journal, Journal of Cyber Policy, Journal of Military Ethics, War on the Rocks, and other outlets. He is the author of The US-China Military & Defense Relationship During the Obama Presidency (Palgrave Macmillan, 2018), Artificial Intelligence and the Future of Warfare: USA, China & Strategic Stability (Manchester University Press, 2021), and AI and the Bomb: Nuclear Strategy and Risk in the Digital Age (Oxford University Press, 2023). His latest book is The AI Commander: Centaur Teaming, Command, and Ethical Dilemmas (Oxford University Press, 2024).
