
The Road to Regulation: Challenges to International Efforts to Control Lethal Autonomous Weapons

By Syed Abdul Ahad Waseem (Published on August 28, 2024)

This project was the runner-up for the "Best Policy Governance Project" prize in our AI Governance (April 2024) course. The text below is an excerpt from the final project.

One of the most crucial challenges facing humanity today is the potential use of artificial intelligence (AI) in lethal autonomous weapons systems (LAWS). LAWS can be understood as “a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system.” Some see LAWS as the third revolution in military affairs, after the invention of gunpowder and nuclear weapons.

AI is at the heart of the modern debate on LAWS. LAWS have existed in some form for decades and offer many benefits, such as enabling military operations in communications-degraded or -denied environments where traditional systems may not be able to operate. Yet many fear that a weapons system that uses AI to operate without sufficient human control over its decision-making processes would be catastrophic on humanitarian, legal, security, ethical, and moral grounds. AI is not a prerequisite for autonomous weapons systems to function, but, when incorporated, it could further enable such systems. Many worry that rapid advances in AI, and its easy accessibility, could pose challenges to international security.

UN Secretary-General António Guterres has called for the prohibition of such systems under international law, urging states to conclude a legally binding instrument by 2026 that prohibits lethal autonomous weapons systems functioning without human control. The key global body dealing with LAWS is the Group of Governmental Experts (GGE) of the High Contracting Parties to the Convention on Certain Conventional Weapons (CCW) related to emerging technologies in the area of LAWS. However, other UN bodies have also taken up the question. In November 2023, the First Committee of the United Nations General Assembly passed its first-ever resolution on the subject (A/C.1/78/L.56), which called for international law to apply to LAWS. The resolution did not ban such systems, however, and several key states, notably China, Russia, India, Israel, Iran, and Saudi Arabia, did not support it.

This blog highlights the key challenges to implementing the UN Secretary-General's call for a legally binding instrument prohibiting or regulating LAWS.

To read the full project submission, click here.
