In the AI age, the world must work together to limit the risk of nuclear war

2021-09-08

By Dr James Johnson, Lecturer in Strategic Studies, University of Aberdeen

In a recent article published in The Washington Quarterly, I describe the significant (and growing) gap between what public opinion, policymakers, and global defense communities expect and fear from artificial intelligence (AI) and its actual military capabilities, particularly in the nuclear sphere. Today's misconceptions are largely caused by the hyperbolic depictions of AI in popular culture and science fiction, most prominently the Skynet system in The Terminator.

Misrepresentations of the potential opportunities and risks in the military sphere (or 'military AI') can obscure constructive and crucial debate on these topics, specifically, the challenge of balancing the potential operational, tactical, and strategic benefits of leveraging AI against the risks it poses to stability and nuclear security.

The article demystifies the hype surrounding AI in the context of nuclear weapons and, more broadly, future warfare. Specifically, it highlights the potential, multifaceted intersections of this disruptive technology with nuclear stability. The inherently destabilizing effects of military AI may exacerbate tension between nuclear-armed great powers, especially China and the United States, but not for the reasons you may think.

A Perfect Storm of Instability?

The article expounds on four interrelated themes. First, in isolation, AI has few discernible (or direct) effects on nuclear stability. AI does not exist in a vacuum. Put differently, as a stand-alone capability, AI is unlikely to be a strategic game-changer or silver bullet. Instead, much like cyberspace, it will likely reinforce the destabilizing effects of existing advanced weapon systems. In this way, AI is best understood as a manifestation and amplifier of an established trend in emerging technologies, associated with the increasing speed of warfare, the compression of decision-making timeframes, and the co-mingling of nuclear and conventional weapons.

Second, AI's impact on stability, deterrence, and escalation will likely be determined as much (if not more) by states' perceptions of its functionality as by what it is actually able to do, tactically or operationally. Even if autonomous drone swarms were neither intended for nor technically capable of a disarming nuclear first strike, the mere perception of such an operation would elicit distrust between nuclear-armed states and be destabilizing nonetheless.

The diplomatic stand-off that followed China's seizure of a US undersea drone in 2016 clearly demonstrated the risk of inadvertent escalation caused by strategic ambiguity about the deployment of new technology in territory contested between adversaries. Perceptions matter because, for the foreseeable future, AI will continue to incorporate a large degree of human agency, and will therefore exhibit the same cognitive biases and subjectivity that have been a core part of 'human' foreign policy decision-making since time immemorial.

Third, the simultaneous pursuit of AI by the great military powers, especially China, Russia, and the United States, will likely compound its destabilizing effects in the military domain. In a second nuclear age of nine nuclear powers with varying interests, capabilities, and perceptions of their place in the world order, AI will likely amplify the impact that emerging technology is already having on deterrence in peacetime and is likely to have during a crisis between nuclear-armed rivals.

Finally, and relatedly, against this inopportune geopolitical backdrop, the perceived strategic benefits of AI-powered weapons, particularly AI-enabled autonomy, may prove irresistible to states seeking to sustain, or, in the case of China and perhaps Russia, to capture, the technological upper hand and first-mover advantages vis-à-vis adversaries.

In sum, artificial intelligence is likely to exacerbate the destabilizing and escalatory effects of an increasingly complex interplay of advanced military technology in a multipolar nuclear world order. Nuclear-armed states leveraging AI to achieve or sustain first-mover advantages in this multipolar context will likely destabilize this fragile order with uncertain outcomes.

Managing an AI Future

Given the multifaceted interplay AI-augmented enabling capabilities might have with strategic weapons, it is critical for defense communities, academics, and decision-makers to understand the confluence of these diverse capabilities, to appreciate how rival strategic organizations view these dynamics, and to grasp the implications of these trends for military strategy and posture, arms races, arms control, crisis management, and deterrence.

Ultimately, success in these efforts will require all stakeholders to be convinced of the need for, and the potential mutual benefits of, steps toward a coherent governance architecture that institutionalizes, and ensures compliance in, the design and deployment of AI technology in the military sphere.


Dr James Johnson is a Lecturer in Strategic Studies in the Department of Politics and International Relations. James is also a Non-Resident Fellow at the University of Leicester and a member of the Mid-Career Cadre with the Center for Strategic and International Studies (CSIS) Project on Nuclear Issues. Previously, James was an Assistant Professor at Dublin City University, a Non-Resident Fellow with the Modern War Institute at West Point, and a Postdoctoral Research Fellow at the James Martin Center for Nonproliferation Studies in Monterey, CA. He is the author of The US-China Military & Defense Relationship During the Obama Presidency (Palgrave Macmillan, 2018) and Artificial Intelligence and the Future of Warfare: USA, China & Strategic Stability (Manchester University Press, 2021). His latest book project is entitled Artificial Intelligence & the Bomb: Nuclear Strategy and Risk in the Digital Age (under contract with Oxford University Press).

Published by News, University of Aberdeen
