Recent Advances in AI Increasing Risk of Nuclear Weapon Use: New SIPRI Report

Recent advances in artificial intelligence (AI) contribute to nuclear risk, according to a new report from the Stockholm International Peace Research Institute (SIPRI).
The authors warn that nuclear-armed states’ competition in military AI and premature adoption of AI in nuclear weapons and related capabilities could have a negative impact on strategic stability and increase the likelihood of nuclear weapon use. The report proposes AI-specific nuclear risk reduction measures. The report indicates that recent advances in AI, specifically machine learning and autonomy, could unlock new and varied possibilities in a wide array of nuclear weapons-related capabilities, ranging from early warning to command and control and weapon delivery.
Research nonetheless shows that all nuclear-armed states have made the military pursuit of AI a priority, with many determined to be world leaders in the field. The report warns that this could negatively affect strategic relations even before nuclear weapon–related applications are developed or deployed.

Premature adoption of military artificial intelligence could increase nuclear risk

The authors argue that it would be imprudent for nuclear-armed states to rush the adoption of AI technology for military purposes in general and for nuclear-related purposes in particular. Premature adoption of AI could increase the risk that nuclear weapons and related capabilities fail or are misused in ways that trigger an accidental or inadvertent escalation of a crisis or conflict to the nuclear level.

‘However, it is unlikely that AI technologies, which are enablers, will be the trigger for nuclear weapon use,’ says Dr Lora Saalman, Associate Senior Fellow on Armament and Disarmament, SIPRI. ‘Regional trends, geopolitical tensions and misinterpreted signalling must also be factored into understanding how AI technologies may contribute to escalation of a crisis to the nuclear level.’

The report suggests that transparency and confidence-building measures on national AI developments would help to mitigate such risks.

Challenges of artificial intelligence must be addressed in future nuclear risk reduction efforts

According to the report’s authors, the challenges of AI in the nuclear arena must be made a priority in future nuclear risk reduction discussions. ‘It is important that we do not overestimate the danger that AI poses to strategic stability and nuclear risk. However, we also must not underestimate the risk of doing nothing,’ says Dr Petr Topychkanov, Senior Researcher, Nuclear Disarmament, Arms Control and Non-proliferation Programme, SIPRI.

‘While the conversation on AI-related risks is still new and speculative, it is not too early for nuclear-armed states and the international security community to explore solutions to mitigate the risks that applying AI to nuclear weapon systems would pose to peace and stability,’ says Topychkanov.

The report proposes a number of measures for nuclear-armed states, such as collaborating on resolving fundamental AI safety and security problems, jointly exploring the use of AI for arms control, and agreeing on concrete limits to the use of AI in nuclear forces.

How may the negative impact of artificial intelligence #AI on strategic stability and #nuclear risk be addressed?
▫️Raise awareness
▫️Increase transparency on #AI
▫️Support collaborative solutions
▫️Discuss and agree on concrete limits
Full analysis ➡️ https://t.co/jX2ZPWVKr4
— SIPRI (@SIPRIorg) June 24, 2020