Category: Dangerous Activities

Potential Source of Harm: Autonomous Weapons

Updated July 2, 2024


Nature of Harm

Weapons are intended to cause harm, and some, such as nuclear and biological weapons, can potentially cause far more widespread harm than AI-based autonomous weapons. In considering autonomous weapons, then, it is important to focus on the ways they may increase the harm from existing methods of warfare and other conflict. Significant concerns include:


In addition, there have been claims that AI tools like large language models will enhance the ability of bad actors to develop dangerous weapons, such as nuclear, biological and chemical weapons, including ones based on novel custom pathogens or chemicals. We address these as separate harms.


In June 2024, the Center for Security and Emerging Technology at Georgetown University released a paper, "China's Military AI Roadblocks," on China's use of AI for warfare and the challenges it faces.


Regulatory and Governance Solutions

In contrast to the nascent efforts to regulate AI for commercial use, there are very few proposals to restrict AI for military use. Indeed, the EU AI Act contains an exception for AI systems "exclusively for military, defence or national security purposes" (Art. 2(3)), and the US Executive Order on AI largely exempts "national security systems" from its restrictions.


In an increasingly conflictual and competitive world, governments appear unwilling to limit their flexibility to use AI for military purposes. Any substantial control of such use would likely require a multilateral international agreement, as has been achieved in the past for nuclear, chemical and biological weapons.


There is some pressure on militaries to govern their use of AI systems properly, including maintaining appropriate human control of targeting decisions. Examples include scrutiny of Israel's use of the Lavender system in Gaza and of US targeting operations elsewhere in the Middle East.
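To make "appropriate human control" concrete, the sketch below shows one way to structure an engagement gate in which no AI-nominated target is acted on without explicit operator approval. This is a minimal illustration in Python; the names and interfaces (TargetNomination, request_human_review) are hypothetical and do not depict Lavender or any real military system.

```python
# Hypothetical human-in-the-loop gate for engagement decisions.
# All names and interfaces are illustrative assumptions; no real
# system's API is depicted.
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    APPROVE = auto()
    REJECT = auto()

@dataclass
class TargetNomination:
    target_id: str
    model_confidence: float  # AI system's confidence score, in [0, 1]

def request_human_review(nomination: TargetNomination) -> Decision:
    """Stand-in for routing a nomination to a qualified human operator
    with full context; here it simply prompts on the console."""
    answer = input(f"Approve engagement of {nomination.target_id} "
                   f"(confidence {nomination.model_confidence:.2f})? [y/N] ")
    return Decision.APPROVE if answer.strip().lower() == "y" else Decision.REJECT

def engagement_decision(nomination: TargetNomination) -> Decision:
    # Policy encoded structurally: every AI nomination requires explicit
    # human approval, regardless of model confidence.
    return request_human_review(nomination)
```

The design point is that human review is unconditional in the control flow, rather than triggered only when the model reports low confidence.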


Technical Solutions

It is difficult to obtain detailed information on technical approaches to controlling military AI systems. Technical solutions should include appropriate testing of such systems before they are put into use, as proposed in an academic paper published in January 2024.
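As one illustration of what such pre-deployment testing might look like, here is a minimal sketch of an evaluation gate that refuses to approve a target-recognition model unless its error rates on a held-out labeled test set fall below fixed thresholds. The function names, thresholds, and model interface are assumptions made for this example, not details from the cited paper or any fielded system.

```python
# Hypothetical pre-deployment evaluation gate for a classification model.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class EvalResult:
    false_positive_rate: float
    false_negative_rate: float
    approved: bool

def evaluate_before_deployment(
    model: Callable[[object], bool],             # True means "flagged as target"
    test_cases: Sequence[tuple[object, bool]],   # (input, ground-truth label) pairs
    max_false_positive_rate: float = 0.01,       # assumed acceptance thresholds
    max_false_negative_rate: float = 0.05,
) -> EvalResult:
    """Run the model over a labeled test set and approve deployment only
    if both error rates fall below the configured thresholds."""
    fp = fn = pos = neg = 0
    for sample, is_target in test_cases:
        predicted = model(sample)
        if is_target:
            pos += 1
            fn += int(not predicted)
        else:
            neg += 1
            fp += int(predicted)
    fpr = fp / neg if neg else 0.0
    fnr = fn / pos if pos else 0.0
    approved = fpr <= max_false_positive_rate and fnr <= max_false_negative_rate
    return EvalResult(fpr, fnr, approved)
```

A system that fails such a gate would not be fielded, or its outputs would be restricted to advisory roles with human review, consistent with the governance concerns above.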


Government and Private Entities

National militaries and other government bodies have obvious roles in controlling military use of AI. Various non-governmental organizations have advocated actions to limit or study the use of autonomous, AI-enabled weapons, including: