Independent Office for Police Conduct — Written evidence (NTL0054)
I write following the request of the House of Lords Justice and Home Affairs Committee for any evidence the IOPC may be able to provide regarding the use of new technologies in law enforcement.
Following a conversation with the Clerk and Committee Specialist regarding the scope of your Committee’s inquiry, and having read your comments on the second reading of the Police, Crime, Sentencing and Courts Bill in the House of Lords, I understand the focus of your inquiry to be predominantly the application of algorithmic tools in policing, rather than the use of technology such as Taser or body-worn video. Our response therefore focuses on those issues.
The Independent Office for Police Conduct (IOPC) is a non-departmental public body sponsored by the Home Office. Our statutory duty is to secure and maintain public confidence in the police complaints system.
The IOPC is independent of both the police and the government and is charged with investigating the most serious and sensitive complaints and incidents. We also handle some appeals (reviews) from members of the public who are unhappy with how the police have dealt with their complaint, and we set standards for, and monitor the performance of, the complaints system.
In order to ensure the IOPC keeps pace with the advancing technological and regulatory landscape surrounding the identification, extraction and evidential presentation of digital evidence, the IOPC has established a Digital Investigations Unit (DIU). The DIU provides support to our investigations to ensure the increasing volume of information technology equipment processed as part of our investigations is handled quickly and proportionately in a way that maximises the evidential opportunities and minimises the risks of collateral intrusion.
Depending on the circumstances of the particular case, the DIU may use algorithmic tools to search digital devices for the presence of particular images or files. Such tools have a number of benefits, including the speed with which large-capacity devices can be processed for the evidence sought, the minimisation of collateral intrusion through the reduction in the need for human review of all files, and, in cases where harmful images are sought, a reduction in the need for investigators to view the images sought, since each image is converted to a digital code that can be matched automatically.
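The matching described above can be illustrated with a minimal sketch. This is not the DIU's actual tooling; it assumes a simple cryptographic-hash approach (SHA-256) and a hypothetical `KNOWN_HASHES` database, purely to show how a device can be scanned for known files without any human viewing their contents.

```python
import hashlib
from pathlib import Path

# Hypothetical database of SHA-256 digests of files of interest.
# (This example entry is the digest of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_known_files(root: Path) -> list[Path]:
    """Return files under `root` whose digest matches a known entry,
    so that only flagged files ever need human attention."""
    return [
        p for p in root.rglob("*")
        if p.is_file() and sha256_of_file(p) in KNOWN_HASHES
    ]
```

In practice, tools of this kind may also use perceptual hashes, which tolerate resizing or re-encoding of an image, whereas a cryptographic hash such as SHA-256 only matches byte-identical files.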
The use of algorithms and algorithmic tools to detect, deter or prevent crime is wide-ranging. Accordingly, the regulatory landscape for issues relating to the application of algorithms in law enforcement is broad and complex; there are a number of bodies besides the IOPC that may have a role to play, either directly in complaints or in setting the standards for, and monitoring, how data is used and whether such tools are used fairly. To assist the Committee, I have set out some of the bodies besides the IOPC that may be involved in the consideration of complaints or the development of policy and procedure at Annex A.
In order to categorise complaints, conduct matters, and death and serious injury matters, we apply factors to the issues brought to our attention for referral or review/appeal. At present we do not have a factor relating to the use of algorithms in policing or the use of live facial recognition. Factors are created where a particular common feature in the work of the IOPC, or an issue of public concern, becomes apparent. To date, we have not observed particularly high levels of concern about these issues expressed through the police complaints system.
To satisfy ourselves we did not hold any cases that may be relevant to the Committee’s inquiry, we applied search terms to the IOPC case management system for referrals, appeals/reviews and independent investigations across the case description field. The search was applied to cases opened between 1 April 2018 and 31 March 2021. The following terms were searched for:
“algorithm”
“Live facial recognition”
“Facial recognition”
“Gang’s matrix”[1]
“Gangs matrix”
“Gang matrix”
“hotspot”
The search returned three hits on the term “hotspot”. All of these related to Death and Serious Injury (DSI) referrals. Of the referrals, one was returned to force and in two the IOPC required the force to carry out a local investigation. None of the referrals attracted a public complaint.
The term “hotspot” was not relevant in one case, as it referred to locations of significance to the deceased rather than areas identified by the police as of interest owing to criminal activity. Of the other two cases, one referred to a “local operation to deal with knife crime hotspots” and the other to a “known crime hotspot”. In neither case is it clear whether, and if so to what extent, algorithms or machine learning played a part in identifying the hotspot.
The search returned one appeal following a public complaint regarding “facial recognition”. The appeal was not upheld.
The IOPC has responsibility for setting the standards for, and monitoring the performance of, the police complaints system in England and Wales. We publish quarterly bulletins and annual statistics on police complaints for all 43 Home Office forces and the British Transport Police. Since the changes to the police complaints system pursuant to the Policing and Crime Act 2017 were commenced in February 2020, reporting on the complaints system has been paused to allow for upgrades to computer systems. We anticipate publishing the annual statistics for the financial year 2020/21 in the autumn.
The IOPC requires police forces to use definitions to log the root cause of the dissatisfaction expressed by a complainant.[2] While there is no definition covering the use of algorithms and machine learning per se, there is a category (category D) covering the “access and/or disclosure of information”, which is further sub-categorised. However, police forces themselves may be better placed to advise on complaints relating to specific issues, such as live facial recognition, in those areas where it has been piloted.
The police super-complaints system allows designated organisations to raise issues on behalf of the public about harmful patterns or trends in policing.
The system is designed to examine problems of local, regional or national significance that may not be addressed by existing complaints systems. The process for making and considering super-complaints is set out in the Police Super-complaints (Designation and Procedure) Regulations 2018.[3]
The system of super-complaints was created through the Policing and Crime Act 2017 and began operating in November 2018.
Super-complaints are considered jointly by HMICFRS, College of Policing and the IOPC. At present, no super-complaints have been submitted or accepted regarding either the use of algorithmic tools in policing, or facial recognition.
Whilst the use of algorithmic tools has not featured, discrimination in policing and the use of data have featured in a number of the super-complaints accepted for consideration. For example:
- A super-complaint accepted from the Criminal Justice Alliance regarding discrimination in the application of section 60 stops and searches, which highlights the harmful impact of disproportionality in the investigation of section 60 stop and search complaints.
- A super-complaint accepted from Liberty and Southall Black Sisters regarding policing and immigration status.[4]
Where significant concerns regarding new technologies introducing harmful patterns or trends in policing emerge, it will now be possible for those civil society organisations with designated status to raise a super-complaint.
As with any innovation in policing or change in the law, in order to maintain public confidence, those responsible for implementation must ensure it is, and is seen to be, free from bias. In light of significant concerns regarding disproportionality in the use of stop and search, we issued 11 learning recommendations to the Metropolitan Police Service in August 2020 regarding its use of the tactic.[5] We share your concern that the police service should make every effort to eliminate the potential for bias to be built into any algorithm or system, and should explain how new tools are to be deployed.
The Commission for Race and Ethnic Disparities recently supported recommendations by the Centre for Data Ethics and Innovation to both the Government and the EHRC regarding discrimination and algorithmic decision-making:
• [the Government] place a mandatory transparency obligation on all public sector organisations applying algorithms that have an impact on significant decisions affecting individuals
• ask the Equality and Human Rights Commission to issue guidance that clarifies how to apply the Equality Act to algorithmic decision-making, which should include guidance on the collection of data to measure bias, and the lawfulness of bias mitigation techniques[6]
We support the aims of these recommendations: to ensure, and provide assurance, that bias does not feature in the architecture and application of algorithmic tools.
10 October 2021
Annex A

Policing:
Police Forces
The Police Reform Act 2002 was designed on the principle that complaints regarding the police should be handled at the lowest level appropriate. Accordingly, the majority of police complaints are handled by police forces themselves[7] in line with the Statutory Guidance issued by the IOPC and approved by the Secretary of State for the Home Department.
Since the amendments to the Police Reform Act 2002 contained in the Policing and Crime Act 2017 were commenced in February 2020, local policing bodies (LPBs) have been the relevant review body for complaints that do not meet the threshold for review by the IOPC.
College of Policing

The College of Policing is the professional body for everyone who works for the police service in England and Wales. The College sets the standards in policing for forces and individuals, for example through Authorised Professional Practice (APP). The CoP is currently developing APP for the overt use of live facial recognition and launched a public consultation on this in May 2021.[8]
HMICFRS

HMICFRS’s primary statutory responsibility is to inspect and report on the efficiency and effectiveness of every police force maintained for a police area, and of fire and rescue services, in England.
The IOPC, CoP and HMICFRS have a concordat which sets out how they will work together.[9] We are also jointly responsible for the Police Supercomplaints system.
Information Commissioner’s Office (ICO)
As you will be aware, the Information Commissioner’s Office (ICO) is the UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
Where an individual’s data has been used inappropriately, it may be more appropriate for a complaint to be made to the ICO or for the ICO to take enforcement action. The ICO is cited as part of the governance structure for the South Wales Police trial of facial recognition technology, alongside the Surveillance Camera Commissioner and the Biometrics Commissioner.[10] A recent example of the ICO issuing an enforcement notice to the police service was in relation to the Metropolitan Police Service Gangs Matrix: in November 2018, the ICO found the MPS had breached data protection laws.[11]
Surveillance Camera Commissioner (SCC)

The SCC is responsible for encouraging compliance with the surveillance camera code of practice. In December 2020, the Surveillance Camera Commissioner issued guidance for the police on the use of live facial recognition.[12]
Equality and Human Rights Commission (EHRC)
The Equality and Human Rights Commission is Great Britain’s national equality body. It has responsibility for enforcing the public sector equality duty.
Financial regulators

Banks and regulatory bodies concerned with banking may have a role to play in considering fairness in the application of algorithmic tools relating to fraud. For example, disgruntled customers may take complaints to the Financial Ombudsman Service, or the Financial Conduct Authority (FCA) may consider how such tools affect markets and banking.
[1] The IOPC has not had cause to look in detail at the Gangs Matrix. However, it is not clear, from the available literature, whether algorithmic decision-making is involved in its construction or use.
[2] Guidance_on_capturing_data_about_police_complaints_Jan2021.pdf (policeconduct.gov.uk)
[3] The Police Super-complaints (Designation and Procedure) Regulations 2018 (legislation.gov.uk)
[4] Safe to share? Report on Liberty and Southall Black Sisters’ super-complaint on policing and immigration status (publishing.service.gov.uk)
[5] Stop_and_Search_Response_to_IOPC_Learning_Recommendations_2018112898.pdf (policeconduct.gov.uk)
[6] Recommendation 3, Commission for Race and Ethnic Disparities report, Foreword, introduction, and full recommendations - GOV.UK (www.gov.uk)
[7] Or LPBs where they have chosen to take on this function.
[8] Police use of live facial recognition technology – have your say | College of Policing
[9] concordat-between-hmicfrs-college-of-policing-and-iopc-accessible.pdf (justiceinspectorates.gov.uk)
[10] See the FAQs on the judicial review: What is AFR? | AFR | South Wales Police (south-wales.police.uk)
[11] ICO finds Metropolitan Police Service’s Gangs Matrix breached data protection laws | ICO
[12] Surveillance Camera Commissioner releases guidance for police on use of Live Facial Recognition - GOV.UK (www.gov.uk)