Metropolitan Police Service Written evidence (NTL0031)

 

The Met’s submission of evidence:

1. In her RUSI speech in 2020, Commissioner Dame Cressida Dick recalled patrolling in 1983 with an Austin 110, a typewriter and fingerprints held on a card index system. The landscape has changed radically since then, and a more connected society has also changed the way crime is committed. In the same speech, the Commissioner estimated that around 50% of all crime is committed entirely online. Moreover, the effective capture, evaluation and exploitation of complex data is now a key aspect of many successful police investigations. Following the 7/7 terrorist atrocity, policing seized about 400 digital exhibits, amounting to around four terabytes of data. To put that in context, one terabyte of data could be the equivalent of 75 million pages of text, so four terabytes could contain War and Peace 250,000 times over. In 2018, a single investigation led by CT Command saw the recovery of 97 terabytes of data. The volume of data involved in police investigations means it is simply not viable to process the data manually within the timeframes required.
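As an aside, the arithmetic behind these scale figures can be checked with a short sketch. The page counts used here are illustrative assumptions (75 million pages per terabyte as quoted above, and War and Peace taken at roughly 1,200 pages):

```python
# Back-of-envelope check of the data volumes quoted in the text.
# Both constants are assumptions for illustration, not official figures.
PAGES_PER_TB = 75_000_000       # ~75 million pages of text per terabyte
WAR_AND_PEACE_PAGES = 1_200     # assumed page count for one copy

pages_in_4tb = 4 * PAGES_PER_TB              # 300 million pages
copies = pages_in_4tb // WAR_AND_PEACE_PAGES # ≈ 250,000 copies
print(copies)
```

On those assumptions the figure of 250,000 copies in four terabytes holds, which also illustrates why manual review at this scale is not feasible.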

Use case

2. In wider society many types of advanced technology are already used and accepted. Artificial intelligence has been used by the ground-breaking AlphaFold programme to predict protein structures. Biometric technologies have been used increasingly during the COVID-19 pandemic to facilitate secure access to buildings and make contactless payments.

3. The same technologies and expertise are necessary for a modern, effective and efficient police force. Artificial intelligence can assist officers to link offences or to evaluate vast amounts of app data on phones, collating leads across cases and platforms where the human task of identifying links to an ongoing investigation is vast and often infeasible without technology. Biometric technologies such as fingerprints and DNA have a track record of putting those who are dangerous to society behind bars. Equally important are functions like HR and vetting, where AI and automation can offer considerable savings and efficiencies.

4. Policing does not seek to use technology without due cause. When policing uses technology, it does so to meet a defined policing purpose which must justify any privacy intrusion. The technology’s effectiveness and demographic performance (where relevant) must be assured as fit for its intended purpose, its use must be transparent to the degree possible within a policing context, and there must be safeguards, supported by community engagement and suitable oversight.

5. To declare technologies ‘off limits’ to policing risks denying law enforcement the tools it needs to keep the public safe, whilst leaving those same tools readily available for criminals and commercial users to exploit. It also risks falling outside the public’s expectations[1].

Legal framework

6. The need to use technology to improve the speed and quality of investigations, to better evaluate crime, to protect officers and to improve the service policing delivers makes it critical that policing continues to benefit from a legal framework able to evolve in step with the pace of technological change.

7. Policing common law powers are a vital part of the legal framework. Over many years, the common law has evolved to allow the police to benefit from innovation. It has done so with the advent of fingerprints, data analytics and the use of algorithms - most recently facial recognition, where the Bridges decision recognised the duty to prevent and detect crime and empowered policing to use the technology.

8. The use of policing common law powers is not without its critics, even when supplemented by a world-leading framework of safeguards such as those provided by the Human Rights Act 1998, the Data Protection Act 2018, the Equality Act 2010 and the Protection of Freedoms Act 2012. These are backed by oversight, in the Met’s case, from the Mayor’s Office for Policing and Crime and various commissioners, including those responsible for information, investigatory powers, surveillance cameras and biometrics. In some instances, all can be engaged as part of the oversight process. Nevertheless, some still call for policing to be able to use specific technical capabilities only once enabling legislation has been passed. There are a number of issues with this approach, including:

    1. Stopping the research and development of new technology - at an early stage of development, it may not be known whether the technology will work well enough to justify legislation. Even if it does, it may not yet be clear how it will best apply to policing and what safeguards would be needed.
    2. Placing the Met in a position where it is aware of a risk to life, and aware that there may be a capability which could save someone, yet unable to use the technology whilst the law catches up.
    3. Putting criminals on notice as to where the safe spaces are, should policing need to await legislation.
    4. Impacting on advanced technologies which focus on interpreting sensor data, as well as those which relate to the collection, analysis and interpretation of information concerning personal data.

Assessing effectiveness

9. The legal framework places demanding requirements on police forces. The principle of necessity embedded in the Human Rights Act 1998 means a technology needs to be sufficiently effective to achieve a policing purpose. The Data Protection Act 2018 embeds statistical accuracy into lawful data processing. The Equality Act 2010 places a non-delegable Public Sector Equality Duty on each force to take reasonable steps to understand how a technology performs from a demographics perspective and to take steps to mitigate undue impact.

10. These standards do not usually require 100% accuracy. It does not take perfection to make a meaningful impact on solving crime or to bring efficiencies to policing, and holding out for perfect performance, which may never come, would carry unacceptable opportunity costs.

11. AI and machine learning technologies are statistically based and as such will always contain errors. The duty is to measure, understand, minimise and mitigate the errors and then be confident the technology is still effective for use. Challenges to doing this include:

    1. vendors being reluctant to share enough information, citing commercial confidentiality;
    2. the difficulty in establishing what ‘reasonable steps’ are to assess accuracy and demographic differential performance, when the nature of new technology means there is often a lack of precedent;
    3. having access to, or retaining, suitable datasets with which to test the efficacy of new technologies. This is particularly relevant where there is a need to do so in realistic operational conditions, going beyond tests in controlled environments with consenting and unrealistically compliant participants; and
    4. the cost of undertaking meaningful testing and of ensuring that the tests generalise to wider policing.

12. Even for the Met, which benefits from people with significant expertise, this is hard to do. The wider law enforcement community will find it harder still, and the challenge will grow as the complexity of technology increases. Policing would welcome support from experts in Government and beyond to help meet this challenge.

A code of practice for technology

13. The Met recognises that the use of technology needs to be both accessible and foreseeable to the public: it must be possible to work out how a technology is used, the rules which regulate that use, and how those rules apply in practice. The Met has demonstrated its commitment to this approach across a number of technologies, using published policy with the force of law to give shape to the common law, together with a programme of community engagement. However, as the Commissioner said in her RUSI speech, it would be very helpful to have a code of conduct to provide the guiding principles for the use of technology by policing.

14. Providing a framework for ethical decision making: A new code would help the public and policing by providing a consolidated, clear starting point for best practice, combining the variety of guidance, opinion, codes, directions and proposals for ethical frameworks which presently exist. Some of these documents apply to specific technologies, or even to a specific deployment methodology for a particular technology, and risk being applied by analogy or inference. The result is potential confusion and inconsistency in how forces use technology, how oversight bodies assure that use, and how the public understand and expect technology to be used.

15. A code of practice would provide a framework for ethical decision making when considering whether to use a new technology. Ideally, to ensure consistency, effective oversight, future-proofing and predictability, it would address technology by type rather than seeking to regulate a specific tool or deployment methodology. Areas it would be helpful to cover include artificial intelligence, advanced data analytics, sensor data, automation and biometrics, framed so that the code could apply to applications from drones and ANPR to a staff database or property management system. The approach would be based around the policing purposes pursued, people’s expectations of privacy informed by community engagement, alternatives to the intrusion, and the means by which effectiveness and demographic performance can be assessed.

16. Transparency: Transparency is important to policing – it is critical to effective community engagement when considering if and how to use technology. However, there are challenges when seeking to explain how particularly complex technologies work. This results in a tension between explainability and effectiveness. Some particularly complex algorithms may be known to have an effective output needed by policing but the process to get to that output can be hard to understand and explain.

17. There are also rare occasions when law enforcement must be able to guard its particular use of some technologies carefully in order to preserve their effectiveness. There is a further need to avoid publishing details about the output of technology where this could put the vulnerable at risk. This can present challenges when seeking to be accessible and foreseeable about the use of technology at a force level. A code of practice could provide guiding principles to help forces be transparent whilst also managing risks to protect capabilities and the vulnerable.

18. Working with industry, partners and the community: It is important to recognise that policing does not have all the answers when it comes to developing and using technology in a law enforcement context. Considerable cutting-edge expertise rests in wider industry and there is a need to harness this for public good. This means collaborating to research, develop and deliver the capabilities law enforcement needs – this can include the need to carefully share data to do this.

19. Fighting crime is also a whole-society effort. Technical capabilities which help solve crime are often fuelled by data shared by society, ranging from members of the public to those responsible for critical national infrastructure. It is vital that they have the confidence to share information with the police and that there are clear ways to do so. A code would also provide guiding principles for police to engage with others to develop, deliver and use technology to fight crime and keep the public safe.

20. In conclusion, the Met recognises the critical role technology will play in fighting crime and keeping Londoners safe. The Met believes a code of practice for new technology can provide the guiding principles to empower forces, embed consistent best practice and retain public confidence. We look forward to assisting the Committee further with this inquiry.

5 September 2021

 


[1] In 2019, the Information Commissioner instructed Harris Interactive to examine public awareness and perceptions of live facial recognition. 82% of respondents found it acceptable for the police to deploy the technology. Use by other organisations had much weaker support, with entertainment venues, retailers and social media websites gaining the support of 44%, 38% and 30% of respondents respectively.