How should autonomous weapons be developed, used and regulated?

6 March 2023

The Artificial Intelligence in Weapons Systems Select Committee today publishes its call for evidence. The Committee is inviting the public to give their views on the use of artificial intelligence in weapons systems.

The deadline for responses is 4.00pm on Monday 8 May 2023.

The inquiry

The Committee is considering the use of artificial intelligence in weapons systems. Advances in robotics and digital technologies, including artificial intelligence (AI), have led to step changes in many sectors, defence included. One such area of advancement and heightened interest is the creation of Autonomous Weapons Systems (AWS).

Autonomous weapons systems have been defined as systems that can select and attack a target without human intervention. These systems could revolutionise warfare, with some suggesting that they would be faster, more accurate and more resilient than existing weapons systems and could limit the casualties of war.

However, concerns have arisen about the ethics of these systems, how they can be used safely and reliably, whether they risk escalating wars more quickly, and their compliance with international humanitarian law. Much of the international policymaking surrounding Autonomous Weapons Systems has been focused on restricting their use, either through limitations or outright bans.

The Committee will be looking at a wide range of issues surrounding AWS, including:

  • The challenges, risks and benefits associated with them.
  • The technical, legal and ethical safeguards that are necessary to ensure that they are used safely, reliably and accountably.
  • The sufficiency of current UK policy and the state of international policymaking on AWS.

Questions

The Committee is seeking written submissions addressing any or all of the following questions:

  1. What do you understand by the term autonomous weapons system (AWS)? Should the UK adopt an operative definition of AWS?
  2. What are the possible challenges, risks, benefits and ethical concerns of AWS? How would AWS change the makeup of defence forces and the nature of combat?
  3. What safeguards (technological, legal, procedural or otherwise) would be needed to ensure safe, reliable and accountable AWS?
  4. Is existing International Humanitarian Law (IHL) sufficient to ensure any AWS act safely and appropriately? What oversight or accountability measures are necessary to ensure compliance with IHL? If IHL is insufficient, what other mechanisms should be introduced to regulate AWS?
  5. What are your views on the Government's AI Defence Strategy and the policy statement ‘Ambitious, safe, responsible: our approach to the delivery of AI-enabled capability in Defence’? Are these sufficient in guiding the development and application of AWS? How does UK policy compare to that of other countries?
  6. Are existing legal provisions and regulations which seek to regulate AI and weapons systems sufficient to govern the use of AWS? If not, what reforms are needed nationally and internationally; and what are the barriers to making those reforms?

Chair's comments

Lord Lisvane, Chair of the Artificial Intelligence in Weapons Systems Select Committee, said:

“Artificial intelligence features in many areas of life, including armed conflict. One of the most controversial uses of AI in defence is the creation of autonomous weapon systems that can select and engage a target without the direct control or supervision of a human operator.

“We plan to examine the concerns that have arisen about the ethics of these systems, the practicalities of their use, whether they risk escalating wars more quickly, and their compliance with international humanitarian law.

“Our work relies on the input of a wide range of individuals and is most effective when it is informed by as diverse a range of perspectives and experiences as possible. We are inviting all those with views on this pressing and critical issue, including both experts and non-experts, to respond to our call for evidence by 8 May 2023.”
