
Written evidence submitted by the Metropolitan Police Service (MAC0045)

 

 

  1. I welcome the opportunity to reply to the evidence provided by Mr Grace, and in particular to correct some factual inaccuracies in his paper about the Metropolitan Police Service’s (MPS) use of Live Facial Recognition (LFR) technology. I also note his call for additional scrutiny of the police use of new technologies. Trust is essential if policing is to maintain the support of the public as we seek to use modern technology to meet today’s many challenges. As you are aware, the MPS strongly supports calls for a code of conduct for the use of technology in policing: a licence to operate.

LFR system accuracy

 

  2. I agree with Mr Grace that there is a concern about the poor accuracy of some facial recognition technologies. However, I disagree that this very general position applies to the facial recognition system used by the MPS. We have considered very carefully which algorithm we use, how accurate it is, and whether demographics such as gender or ethnicity cause a variation in the accuracy levels achieved.
  3. The algorithm used by the MPS is the latest released by NEC. NEC algorithms have repeatedly been tested by the USA’s National Institute of Standards and Technology (NIST). NIST has assessed their accuracy, and also whether demographics such as gender or ethnicity cause that accuracy to vary. Its independent evaluation has shown that NEC algorithms consistently achieve very high levels of accuracy.

 

  4. An evaluation of demographic effects published by NIST in December 2019 was a substantial undertaking. It used a total of 2.6 million known faces, with the images balanced with respect to gender and ethnicity - a scale beyond anything that could viably be replicated by an individual police force. In conducting its testing, NIST also evaluated around 200 other algorithms and found that not all of them show uniform accuracy levels across the different demographics.

 

 

  5. One of the algorithms tested by NIST, NEC-3, is from the same generation of NEC algorithms as the M30, the algorithm the MPS uses in LFR. Importantly, the NEC-3 algorithm was developed using the same NeoFace technology and the same training data set as the M30 algorithm used by the MPS. The MPS’s M30 algorithm is simply optimised for live facial recognition using a video feed, whereas NEC-3 is optimised for the NIST test, which uses still images rather than video.
  6. NIST concluded that NEC had “provided an algorithm for which the false positive differential was undetectable” and that the NEC-3 algorithm “is on many measures, the most accurate [NIST] have evaluated”.
  7. This gives the MPS confidence in the current generation of NEC algorithms, as the testing showed that the variation in the NEC algorithm’s accuracy between male, female, black and white individuals is imperceptible.
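
To illustrate what an undetectable false positive differential means in practice, the following is a minimal, hypothetical sketch of how a false match rate can be computed per demographic group from a labelled evaluation set and then compared across groups. It is illustrative only; the function name, groups and scores are invented, and this is not MPS or NIST code.

    # Hypothetical sketch: comparing false match rates (FMR) across demographic
    # groups on a labelled evaluation set. Illustrative only; not MPS or NIST code.
    from collections import defaultdict

    def false_match_rates(impostor_scores, threshold):
        """impostor_scores: iterable of (group, score) pairs for comparisons
        between images of *different* people. A score at or above the
        threshold counts as a false match."""
        totals = defaultdict(int)
        false_matches = defaultdict(int)
        for group, score in impostor_scores:
            totals[group] += 1
            if score >= threshold:
                false_matches[group] += 1
        return {group: false_matches[group] / totals[group] for group in totals}

    # Made-up scores. An "undetectable differential" means the per-group
    # rates are statistically indistinguishable from one another.
    scores = [("group_a", 0.12), ("group_a", 0.91),
              ("group_b", 0.08), ("group_b", 0.95)]
    print(false_match_rates(scores, threshold=0.90))
    # {'group_a': 0.5, 'group_b': 0.5}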

 


 

 


Public Sector Equality Duty (PSED)

 

  8. The MPS has published online a detailed Equality Impact Assessment for its use of LFR technology, explaining how we pay ongoing due regard to the PSED. The NIST evaluation is an important aspect of this, and it allows the MPS to take confidence in the technology, given that the NEC algorithm does not exhibit an inherent differential in performance across demographics.
  9. Notwithstanding this position, to ensure decision makers are ‘properly informed’ and empowered, the MPS has gone further and adopted the following measures (amongst others):

 

 

 

 

 

  10. In relation to the Biometrics Commissioner, Mr Grace’s account of this matter is not entirely accurate. As the MPS noted when this point was raised in the media, we kept the Biometrics Commissioner informed about our use of LFR, and we continue to welcome opportunities to work with him on the use of new biometrics in law enforcement. At the time, the MPS also updated its equality impact assessment to fully reflect the Biometrics Commissioner’s position.

 


 

 


Addressing disproportionality

 

  11. The decision not to include ethnicity data with watch list images reflects the MPS’s desire to process data for its law enforcement purposes only where it is strictly necessary to do so. In this respect the MPS seeks to minimise the data it processes and to tread lightly where intrusion is unavoidable but necessary to keep Londoners safe.
  12. Mr Grace is mistaken in a number of respects, most notably about how ongoing monitoring of system accuracy can be achieved, and about whether any particular demographic affects accuracy levels. Mr Grace assumes that the decisions made by the LFR system about those on the watch list as they pass the system can be used to monitor system performance. This misunderstands two key points, and does not consider other, less data-intrusive means the MPS draws on to show due regard to the PSED. The two key points are:
    1. One, to determine accuracy (and then whether accuracy varies by any particular demographic), it is necessary to consider every decision the LFR system makes: not just those where an alert is generated, but also those where no alert is generated. This allows the MPS to know how often the system falsely generates an alert against someone not on a watch list. To do this, the MPS would need to know the gender and ethnicity of all those who pass the LFR system, not just those on a watch list. That would not be lawful and would amount to a substantial intrusion against all members of the public passing the system. It is for this reason that the NIST testing, and other testing the MPS can undertake, is valuable; it means that this approach, without watch list ethnicity data, can assure the MPS of its public sector equality duties.
    2. Two, to determine accuracy (and then whether accuracy varies by any particular demographic), it is also necessary to determine how good an LFR system is at alerting against those on a watch list. To work this out, the MPS needs to know about all those on a watch list who pass an LFR system, and to work out against how many of them (as a proportion) the system generates an alert. By its very nature, a live deployment watch list is not the place to do this: if a person on a watch list is missed by the system, nobody is likely to realise it. Instead, the MPS uses ‘blue lists’: it gets known people (typically police officers) to join those passing the system. Since the MPS knows about all ‘blue list’ people passing the system, it can then work out how accurate the system is at alerting against those on a ‘blue list’ watch list (both measurements are illustrated in the sketch below). The MPS explains this process in its published Guidance Document as a key measure enabling it to monitor for, and address, any disproportionality it may observe.
  13. The post-deployment review process also offers the MPS a chance to monitor for issues by reviewing all alerts, including any incorrect ones, and monitoring for trends. Given the accuracy of the MPS’s LFR system and the very low number of incorrect alerts generated, this can be done by simply reviewing the alert images. Should a concern be identified, the MPS would then be in a position to explore it further and test for issues. Again, this would not require ethnicity or gender data to be associated with the watch list as a whole.
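
As an illustration of the two measurements described above, the following is a minimal, hypothetical sketch. The function names, inputs and figures are invented for illustration; this is not MPS code and the numbers are not MPS data.

    # Hypothetical sketch of the two measurements described above.
    # All names and numbers are invented; not MPS code or data.

    def false_alert_rate(alerts_against_non_watchlist, total_faces_seen):
        """False alerts per face processed. This needs the total number of
        faces that passed the system (every decision, not just the alerts),
        which is why alert data alone cannot establish accuracy."""
        return alerts_against_non_watchlist / total_faces_seen

    def blue_list_alert_rate(blue_list_passes):
        """True positive rate estimated from a 'blue list' of known people.
        blue_list_passes: list of booleans, True where the system alerted
        when that known person passed it."""
        return sum(blue_list_passes) / len(blue_list_passes)

    # Example with made-up numbers:
    print(false_alert_rate(2, 8600))                        # 2 false alerts in 8,600 faces seen
    print(blue_list_alert_rate([True, True, False, True]))  # alerted on 3 of 4 blue-list passes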

 


 

 


The purpose of LFR deployments

 

  14. The MPS’s published documents for the overt use of live facial recognition technology provide a detailed explanation of why and how the MPS uses LFR. This includes our strategic intentions, operational objectives, technological objectives and use case. It outlines that the MPS’s use of LFR at a strategic level will be to locate offenders in accordance with common law policing powers. This includes targeting those wanted for imprisonable offences, with a focus on serious crime, particularly knife and gun crime, child sexual exploitation and terrorism. The MPS also publishes its policy on watch list composition and on where a deployment may occur, to ensure the policies governing LFR are available to all. In addition, the MPS gives prior notice of its deployments and gives the public a reason for them. For example, at a deployment in Westminster in February, where an arrest for a serious assault on an emergency worker was made, the MPS advised the public via its LFR website and Twitter accounts that “We are using [LFR] to find people wanted by police for violent and serious crimes.”
  15. You will be aware of the recent judgment of the Court of Appeal in relation to South Wales Police’s use of facial recognition technology. The MPS’s approach to live facial recognition differs from that considered in the South Wales Police case. This reflects our policing needs, which, given the complexity of keeping London safe and the different crime issues affecting the capital, are very different from those in South Wales. We will carefully consider the judgment and act on any relevant points to ensure that we maintain our commitment to using facial recognition in a lawful, ethical and proportionate way.
  16. I hope this response has answered the questions raised and is helpful. The documents that I refer to throughout are all available here:

https://www.met.police.uk/advice/advice-and-information/facial-recognition/live-facial-recognition/

 

  17. In view of the ongoing interest in and discussion around LFR, privacy, oversight and human rights, I would like to invite the Committee to take part in a briefing and discussion on the tool. As COVID-19 restrictions continue to ease, you would also be very welcome to come and view the technology for yourselves. We would be delighted to organise this.
  18. If I can be of assistance, please do not hesitate to contact me.

 

 

Yours sincerely,

 

Lindsey Chiswick
Director of Intelligence

 

September 2020