Dr Miri Zilka, Research Associate in Machine Learning at the University of Cambridge; Detective Sergeant Laurence Cartwright, Data Analytics lead at Sussex Police; and Dr Adrian Weller, Principal Research Fellow in Machine Learning at the University of Cambridge — Written evidence (NTL0040)
1. Executive Summary
1.1 We present a short, non-exhaustive survey of tools used in the criminal justice system in the UK, in the following categories: data infrastructure, data analysis, and risk prediction. We provide a summary table (Table 1) and a figure illustrating where in the criminal justice pipeline the tools are deployed (Figure 1).
1.2 A large number of tools are currently in deployment, offering potential benefits including improved efficiency and consistency. However, there are also important concerns. Transparent information about these tools, their purpose, how they are used and by whom is difficult to obtain. Even when information is available, it is often insufficient to enable a satisfactory evaluation.
1.3 Tools are deployed by many different parties across the country. It would be helpful to facilitate better ways for parties to collaborate in order to share good practice, and to enable appropriate data sharing to help improve outcomes.
1.4 More work is needed to establish governance mechanisms to ensure that tools are deployed in a safe and ethical way. This should include more engagement with stakeholders, documentation of the intended goal of using a tool, how it will achieve the goal compared to other options, and how it will be monitored in deployment.
1.5 We highlight additional points to consider when evaluating the trustworthiness of deployed tools and suggest policy directions for consideration.
2. Definitions used in this response
In this response, we focus on specific technologies that are deployed throughout the criminal justice system in the UK, in the following categories: data infrastructure, data analysis, and risk prediction. All information presented is based on reliable sources, to the best of our knowledge.
There are other technologies that may be of interest to the committee which we do not cover. These include facial recognition technologies, gait identification, number plate recognition, speaker identification, speech identification, lip-reading technologies, gunshot detection algorithms and social media monitoring.
The criminal justice system in the UK has many players and components. Here we focus on four main components: law enforcement agencies, specifically the 43 police forces operating in England and Wales, and GCHQ; the Crown Prosecution Service (CPS); the courts, primarily referring to the magistrates’ court and the crown court; and HM Prison and Probation Service.
3. A brief survey of tools that are in use
This section includes representative examples of the tools and technologies described in Section 2 a., b., and c., and examines how they are used across the criminal justice pipeline. This response does not include a complete list of such tools -- there are many more tools in use even just by the police forces. The information we present was curated from government websites, published reports, peer-reviewed articles, news articles, freedom of information requests and conversations with police officers. We provide a reference for each tool. All tools we list have been in use, though some may no longer be in active use.
Table 1 provides a summary of the tools discussed in this response. Figure 1 illustrates where in the criminal justice pipeline these tools are deployed.
[Table 1 appears here. For each tool, the table lists its category and purpose (e.g. command and control, data analytic dashboards, predicting future crime in space or time, vulnerability risk scores), the deploying organisation (e.g. several police forces, West Midlands Police, West Yorkshire Police, HM Prison and Probation Service, the courts), and how it was developed (in house using proprietary or open-source tools, in collaboration with academia, or by a private company).]
Table 1: A summary of the tools discussed in this response. Observe that private companies are often involved in tool development.
Figure 1: A figure illustrating stages in the criminal justice pipeline where predictive tools are used. In the top layer, tools are used to help decide where to look for crime. In all other layers, tools are used to help decide how to process an individual.
3.1 Law enforcement agencies
3.1.1 Data infrastructure. The 43 police forces in England and Wales do not use a single primary data collection tool. For recording incidents as they are reported, and managing first response, many forces use a platform called STORM. If the reported incident is a crime or a recordable incident, it is also logged and managed in a record management system; for example, several of the forces use Niche RMS. The data collected in relation to crime and reported incidents follows comprehensive national standards set by the Home Office. To our knowledge, there are no national standards specifying which additional data fields may be collected by the forces, who use their discretion. Information regarding which additional data fields are collected by each force is not immediately available. In some cases, information may exist within published police policies or can be accessed through freedom of information requests. The raw data collected by each force is not routinely accessible to other forces, but data sharing and data-based collaborations exist between forces. All forces make a series of annual data returns to the Home Office, containing aggregated data about crime, police workforce, arrests and stop and search (The Home Office, 2021).
The forces have access to several national datasets:
3.1.2 Data analysis. As with data collection tools, police forces do not all use the same data analytics software. Most data analysis tools are developed by private companies, sometimes in collaboration with the force.
a. Mapinfo uses crime and incident data to map where crimes occur so police can monitor these 'hotspots'. West Midlands Police reported using this tool in an answer to a freedom of information request from 2016 (The Law Society, 2019).
b. Sussex police use customisable data analytic dashboards, developed for them by a private company. For example, two dedicated dashboards were created in response to the Covid-19 pandemic: one to track employee self-isolation and one to track the daily changes in reported crimes, incidents, calls and arrests (Sussex Police, 2021).
c. The National Data Analytics Solution (NDAS) is a project sponsored by the Home Office and led by West Midlands Police in collaboration with Accenture. The project aims, amongst other ambitions, to introduce a new shared, central data and analytics capability that is owned and directed proportionately by participating UK law enforcement agencies (West Midlands Police).
3.1.3 Predicting future crime:
a. Patrol-Wise is an algorithm developed by University College London (UCL) in a research collaboration with West Yorkshire Police to predict burglaries on a street-by-street level (The Newsroom, 2017).
b. iHotSpot is an AI-based predictive analytics engine used to predict daily crime incident hotspots. The tool was developed by SpaceTimeAI, a spin-out company of UCL’s SpaceTimeLab. The system is integrated into the London Metropolitan Police as part of an EPSRC-funded research collaboration (SpaceTimeAI; Cheng, 2012).
c. The Evidence-Based Investigative Tool (EBIT) is a tool deployed by Kent Police (UK) to predict investigative success for minor, non-domestic assault and public order offences. The solvability assessment uses a logistic regression model, followed by a two-step review of the case. Using software built by Kent Police, the EBIT user answers eight questions and is provided with one of the following recommendations: further investigation, close the case pending further evidence, or further review by a supervisor (McFadzien et al., 2020).
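For illustration, a logistic-regression triage process of the kind described above can be sketched as follows. The question names, coefficients and decision thresholds here are hypothetical and are not taken from Kent Police's deployed model.

```python
import math

# Hypothetical coefficients for eight yes/no solvability questions;
# the real EBIT questions and weights are not public in this form.
WEIGHTS = {
    "suspect_named": 2.1,
    "cctv_available": 1.4,
    "witness_present": 1.2,
    "forensic_evidence": 1.6,
    "suspect_description": 0.8,
    "vehicle_identified": 0.7,
    "property_traceable": 0.5,
    "repeat_location": 0.3,
}
INTERCEPT = -3.0

def solvability(answers: dict) -> float:
    """Logistic-regression probability of investigative success
    from eight yes/no answers (missing answers default to no)."""
    z = INTERCEPT + sum(WEIGHTS[q] * int(answers.get(q, False)) for q in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def recommend(answers: dict, low=0.1, high=0.5) -> str:
    """Map the score to one of three EBIT-style recommendations."""
    p = solvability(answers)
    if p >= high:
        return "further investigation"
    if p < low:
        return "close pending further evidence"
    return "refer to supervisor for review"

print(recommend({"suspect_named": True, "cctv_available": True}))
```

The two cut-offs implement the triage described above: clearly solvable cases proceed, clearly weak cases close pending further evidence, and borderline cases go to a supervisor.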
3.1.4 Predicting future risk associated with an individual:
a. Not all individual assessment tools involve the use of computational tools. Two manual risk assessment tools used nationally by the forces are THRIVE and DASH. THRIVE stands for Threat, Harm, Risk, Investigation Opportunities, Vulnerability of the victim and the Engagement level required to resolve the issue. It is used to assign a priority level to an incident (HMICFRS, 2019). DASH stands for Domestic Abuse, Stalking and Harassment. It is a checklist tool used to assess the risk of cases involving domestic abuse, stalking, harassment and so-called honour-based violence (HMICFRS, 2019).
b. The gangs violence matrix (GVM) is a dataset of suspected London based gang members used by the Metropolitan Police. It is used as an intelligence tool to reduce gang-related violence, safeguard those exploited by gangs and prevent young lives being lost. Each individual in the dataset is assigned a harm score and a victim score to indicate if they are likely to deliver or receive harm, respectively. Both scores are graded as Red, Amber or Green. Based on a report published by Amnesty International in 2018, the scores are automated risk scores, but no additional details are provided (The Metropolitan Police, Amnesty International, 2018).
c. West Yorkshire Police uses an Integrated Offender Management (IOM) software called Corvus IOM Case. The system draws data from other sources including STORM and Niche RMS (mentioned in Section 3.1.1), analysing intelligence, crimes, arrests and substance misuse in order to derive an individualised score aimed at providing an indication of an individual’s likelihood to reoffend. Different crime types are scored differently and the Risk of Re-Offending Cohort scores are categorically displayed as low, medium and high (West Yorkshire Police, 2020).
d. Durham Constabulary uses an individualised Harm Assessment Risk Tool (HART). The tool was designed in collaboration with the University of Cambridge as part of the ‘Checkpoint’ program aimed at offering an alternative to prosecution for individuals classified as ‘medium risk’ -- likely to reoffend, but not in a serious violent manner. Qualifying individuals are offered a four-month intervention program tailored to their needs, and are not charged if they successfully complete it (Oswald et al., 2018, The Law Society, 2019).
e. Avon and Somerset Police use a number of bespoke risk assessment tools. These tools are built within Qlik Sense, a data analytics platform developed by a private company. The tools produce an offender risk score in the range 0-100, which combines two predictions (the likelihood of reoffending and the potential harm if reoffending does occur), and a vulnerability risk score predicting the likelihood of an individual becoming a victim of crime (Dencik et al., 2018).
3.1.5 AI within GCHQ. GCHQ have released a paper that details how AI tools help them tackle key priorities: cyber threats, children’s safety, foreign state disinformation and drugs, weapons and human trafficking (GCHQ, 2021).
3.2 The Crown Prosecution Service and the Courts
3.2.1 Data infrastructure. A 2016 report by the National Audit Office on efficiency in the criminal justice system highlighted that the system’s reliance on paper builds in inefficiency. The report also mentions that the ambitious Courts Reform Programme plans to tackle many of these issues by reducing reliance on paper records and enabling more flexible digital working (National Audit Office, 2016). The aforementioned reform is still ongoing and HMCTS has begun rolling out the new digital platform for managing criminal cases (HM Courts & Tribunals Service, 2021).
3.2.2 Data analysis.
a. The MoJ Analytical Platform is a data analysis environment, providing modern tools and key datasets for MoJ analysts. The platform provides infrastructure for analysts to use open-source analytical tools such as RStudio and JupyterLab in a secure environment. It is built on Amazon Web Services and the Kubernetes system. Using the platform, MoJ analysts have produced statistics for the ONS, created visualisations to aid internal decision making and developed machine learning based tools to analyse parliamentary questions and their responses (Linacre, 2018, Ministry of Justice).
3.2.3 Predicting future risk associated with an individual:
If a defendant pleads guilty or is found guilty during trial, the court can request a Pre-Sentence Report (PSR) to be prepared. The report is written by the probation service after a private interview with the offender. Reports for magistrates’ court cases are usually same-day PSRs, presented orally or in written format. Crown court cases may require a written report containing a full Offender Assessment System (OASys) assessment (see Section 3.3.1), which requires the court to adjourn for three weeks. All reports include the following predictive scores: Offender Group Reconviction Score (OGRS); Risk of Serious Recidivism (RSR) Score; and a Risk of Serious Harm (RoSH) screening (National Offender Management Service, 2017).
a. Offender Group Reconviction Score (OGRS) is a predictor of proven reoffending within one and two years of a non-custodial sentence or discharge from custody. The OGRS4/G includes all recordable offending and does not indicate the severity of predicted reoffending. Offenders with high OGRS4/G scores may reoffend rapidly but are comparatively unlikely to be involved in serious further offending. OGRS4/V includes violent proven reoffending only and outperforms OGRS4/G in predicting violent reoffending. Proven reoffending only measures reoffending known to the authorities within the measured time frame and does not include arrests and offences pending prosecution. OGRS prediction scores are based on a limited number of static risk factors including age, gender and criminal history (National Offender Management Service, 2015).
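For illustration, a purely static-factor actuarial score of the kind described above can be sketched as a simple points table. The bands, points and cut-offs below are hypothetical and are not taken from OGRS.

```python
# Hypothetical static-factor actuarial score: only variables fixed at
# sentence (age, gender, criminal history) contribute; no dynamic factors.

def static_risk_points(age: int, male: bool, prior_convictions: int) -> int:
    """Sum points from static risk factors; all bands are invented."""
    points = 0
    if age < 21:
        points += 3          # younger age bands score higher
    elif age < 30:
        points += 2
    elif age < 40:
        points += 1
    if male:
        points += 1
    points += min(prior_convictions, 10)  # cap the criminal-history contribution
    return points

def risk_band(points: int) -> str:
    """Map the raw score to a coarse band, as such tools typically report."""
    if points >= 10:
        return "high"
    if points >= 5:
        return "medium"
    return "low"

print(risk_band(static_risk_points(age=19, male=True, prior_convictions=7)))
```

Because every input is static, the score cannot respond to changes in an individual's circumstances; this is the limitation that the dynamic factors in tools such as OASys are intended to address.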
b. The Risk of Serious Recidivism (RSR) was introduced in 2014 to predict the likelihood of an offender committing a seriously harmful reoffence within two years. The RSR tool provides three sub-scores, one for contact sexual reoffending, one for indecent image reoffending and one for non-sexual violence. All scores can be calculated from static variables, but the non-sexual violence has an extended static and dynamic version that performs better than the brief static score (National Offender Management Service, 2015).
c. Risk of Serious Harm (RoSH). Serious harm is defined in this context as an event which is life-threatening and/or traumatic, and from which recovery, whether physical or psychological, can be expected to be difficult or impossible. RoSH screening is done in all cases to indicate if a full analysis should be completed. The levels of RoSH are Low, Medium, High and Very high. Medium level indicates serious harm is unlikely unless there is a change in the offender’s circumstances. Very high indicates a high and imminent risk of serious harm. In the full RoSH analysis, there are separate scores indicating risk to children, known adults, the general public, staff and other prisoners. For all but the latter, there is a separate risk level for community and custodial settings.
3.3 HM Prison and Probation Service
3.3.1 Predicting future risk associated with an individual:
The prison and probation services in England and Wales use the actuarial risk and needs assessment tool OASys. All offenders undergo a basic screening and offenders with a sentence of 12 months or more undergo a full OASys assessment. OASys generates several risk scores based on static and dynamic risk factors to assess the likelihood of reoffending and risk of harm to self and others. The OGRS and RoSH scores mentioned in section 3.2.3 form part of the full OASys assessment (Prison Reform Trust, 2018, National Offender Management Service, 2015).
In addition to the 18 tools described above, we estimate the total number of similar tools in deployment to be significantly larger. To our knowledge, no exhaustive list of such tools exists, though such a list would clearly be valuable.
Further information on how they operate would also be helpful, especially as significant concerns have been expressed about some of the tools, e.g. see (Amnesty International, 2018).
The large number of different tools used by the police alone, including some developed by private companies, poses a significant regulatory challenge. With 43 self-regulating police forces, each accountable to elected mayors and commissioners, there is no simple path to building a national police data analytics software platform. Developing mechanisms to improve data sharing between the forces, across the criminal justice system, and with researchers is a key step towards national data-centric efforts.
The use of algorithmic systems -- whether computerised or manual -- raises the hope for significant benefits in terms of efficiency and consistency, beyond those achievable by standard human methods. However, while these systems have some advantages over humans, they also have significant drawbacks, such as a lack of contextual understanding or common sense reasoning, leading to important concerns. Points to consider when examining the trustworthy use of the tools discussed in this response include:
Fit for purpose. To decide if a tool is fit for purpose first requires that a purpose is clearly specified, yet this is often lacking. The use of any algorithmic tool should be justified by expected outcomes that yield clear societal benefits, to be established in consultation with stakeholders, including the general public. Without clearly stated, measurable outcomes, it is not possible to evaluate if a tool is functioning as expected in deployment. It is not sufficient to evaluate a tool as a stand-alone component: the appropriateness of its use must be assessed with respect to how it performs in deployment. As an example, in the early 2000s, the state of Virginia, USA, introduced the use of a non-violent risk assessment tool as a judicial aid with the aim of diverting a large share (25%) of low-risk nonviolent offenders from jail or prison. However, even though judges used the risk assessment tool, questions were raised by researchers about whether decisions were improved or worsened (Stevenson & Doleac, 2019).
Data quality. The data used in tools, both for training and in deployment, is often collected in a way that involves some form of selection bias, and without due consideration of its final task. The quality of data and the design choices made at the collection stage can have a dramatic impact on an algorithm’s ability to function as intended. For example, consider the way information about race and ethnicity is collected by law enforcement. In the USA, data collected includes separate fields for race and ethnicity, yet not all agencies fill in the ethnicity field (Fogliato et al., 2021). This creates a bias in the data which can be very difficult to fix post hoc. In addition, race and ethnicity as defined by the individual might differ from the race and ethnicity observed and recorded by the arresting officer. In the UK, it is clearly stated whether the race and ethnicity field is based on the individual’s statement or assumed by the officer, with preference given to the former.
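The UK recording practice described above can be sketched as a simple resolution rule: prefer the individual's self-defined ethnicity over the officer's observation, and record which source was used. The field names here are invented for illustration.

```python
def resolve_ethnicity(record: dict) -> dict:
    """Return the ethnicity value to use and the source it came from.
    Self-defined ethnicity takes precedence over officer-observed."""
    if record.get("self_defined_ethnicity"):
        return {"value": record["self_defined_ethnicity"], "source": "self-defined"}
    if record.get("officer_observed_ethnicity"):
        return {"value": record["officer_observed_ethnicity"], "source": "officer-observed"}
    return {"value": None, "source": "missing"}

print(resolve_ethnicity({"officer_observed_ethnicity": "Group B"}))
```

Recording the source alongside the value keeps downstream analysts aware of which records rest on the weaker, officer-observed signal.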
Fairness. There is an extensive literature reporting bias against minority groups within algorithms deployed in the criminal justice system (Selbst, 2017, Chouldechova, 2017). Many concerns focus on tools deployed in the USA (ProPublica, 2016), but there is also significant worry about tools deployed in the UK (Amnesty International, 2018, Big Brother Watch, 2019). A common concern is that an algorithm trained on biased data will learn to repeat or even reinforce the same bias unless steps are taken to avoid it. There is potential for bias to be further aggravated by algorithmic tools -- for example, via feedback loops (Ensign et al., 2018). The performance of tools in deployment should be tested with respect to different subgroups, and results should be made available to the public.
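The subgroup testing we recommend can be sketched as follows: compute an error rate (here, the false positive rate) separately for each group and compare the results. The records below are synthetic.

```python
from collections import defaultdict

def rates_by_group(records):
    """records: (group, predicted_high_risk, actually_reoffended) triples.
    Returns, per group, the false positive rate: the fraction of people who
    did NOT reoffend but were nevertheless flagged high risk."""
    fp = defaultdict(int)   # flagged high risk, did not reoffend
    neg = defaultdict(int)  # all who did not reoffend
    for group, predicted, actual in records:
        if not actual:
            neg[group] += 1
            if predicted:
                fp[group] += 1
    return {g: fp[g] / neg[g] for g in neg if neg[g]}

# Synthetic records for two groups, A and B.
synthetic = [
    ("A", True, False), ("A", False, False), ("A", False, False), ("A", True, True),
    ("B", True, False), ("B", True, False), ("B", False, False), ("B", False, True),
]
print(rates_by_group(synthetic))  # a large gap between groups warrants scrutiny
```

In this synthetic example the false positive rate for group B is twice that for group A; in deployment, such gaps across protected groups are precisely what should be measured and published.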
Even if a tool itself is unbiased, bias might still occur when the tool is used as a decision aid for human decision-makers. The user may decide, sometimes for good reasons, to ignore the suggestion of the aid tool in some settings. By selectively accepting or rejecting the tool’s recommendations, even a ‘fair’ tool can lead to greater unfairness in the ultimate decision taken by the human (Stevenson & Doleac, 2019).
Transparency and accountability. It is not always feasible or even desirable to make algorithms in criminal justice fully transparent. Concerns about IP, or worries that the system could be ‘gamed’ by criminals might prevent publishing the full model for general scrutiny. It is important, however, that users and stakeholders have sufficient understanding of the underlying mechanisms. In some cases, this might involve mandatory training of human users. It is key that accountability for decisions always remains with human decision-makers.
We argue that it should be possible for an objective party to evaluate a system, with results made public whenever feasible. In this respect, a lack of transparency due to the use of proprietary tools provided by private companies is a concern (though we appreciate there are arguments to allow this in order to encourage the private sector to build useful tools).
Privacy. The availability of data on many individuals enables useful patterns to be discovered by statistical or machine learning methods. However, it will often not be appropriate to access all this data due to concerns over privacy. We note that recent developments in privacy enhancing technologies, including differential privacy and secure multi-party computation, can sometimes enable useful information to be discovered without inappropriate access to private data.
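As a minimal sketch of one such technology, the Laplace mechanism from differential privacy releases a noisy count so that the output reveals only a bounded amount about any single individual; the parameter epsilon controls the privacy/accuracy trade-off. The query and numbers below are illustrative only.

```python
import random

def laplace_noisy_count(true_count: int, epsilon: float) -> float:
    """Add Laplace(scale=1/epsilon) noise to a count query (sensitivity 1).
    The difference of two Exponential(rate=epsilon) draws is Laplace(1/epsilon)."""
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# A larger epsilon means less noise and weaker privacy; a smaller epsilon,
# more noise and stronger privacy.
print(laplace_noisy_count(100, epsilon=1.0))
```

Individual releases are noisy, but aggregate patterns survive: the noise averages out over many queries or large counts, which is why such mechanisms can support statistical analysis without exposing individual records.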
Robustness. If an algorithmic system performs well in some settings, it can create the impression that it will perform well across all settings. However, algorithms can behave in fragile, hard to predict ways, such that small changes in input features can lead to unexpectedly large changes in model outputs (Szegedy et al., 2014). Ongoing work aims to improve the robustness of tools, but model fragility remains an open problem. This further emphasizes the need for users to be properly trained to understand the limitations, as well as the benefits, of algorithmic systems.
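As a toy illustration of this fragility, consider a hypothetical thresholded risk score: a small change in one input moves the score across the cut-off and flips the output category entirely. The model below is invented and is not an adversarial example in the formal sense, but it shows how hard cut-offs amplify small input changes.

```python
def toy_risk_model(age: float, priors: float) -> str:
    """Hypothetical linear score with a hard decision threshold."""
    score = 0.04 * (40 - age) + 0.9 * priors
    return "high" if score >= 2.0 else "low"

# A one-year change in age flips the categorical output.
print(toy_risk_model(age=30.0, priors=1.8))
print(toy_risk_model(age=31.0, priors=1.8))
```

Near the threshold, the categorical output is far more sensitive than the underlying score, which is one reason users need training on a tool's limitations and why borderline cases merit human review.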
We present a non-exhaustive survey of tools used in the application of the law within the UK, in the following categories: data infrastructure, data analysis, and risk prediction. Many more tools are likely in deployment. Transparent information about these tools, their purpose, how they are used and by whom is not easily available. Despite earlier efforts and recommendations (The Law Society, 2019), to our knowledge no official registry of algorithmic tools in deployment in law enforcement exists, nor is there an official body designated for the regulation and evaluation of these tools.
Human decision makers are typically neither transparent nor unbiased, hence there is hope that algorithmic tools could provide significant benefits. However, it is not clear if the tools currently deployed were carefully designed according to the needs of practitioners, or if their use results in clear societal benefits. We suggest that appropriate governance mechanisms such as standards or regulation could help to ensure that tools are deployed in a safe and ethical way. Several police forces have established ethics committees which they consult when considering the use of new technologies. While welcome, these committees include volunteers and are not established regulatory bodies. In section 4, we discuss appropriate requirements for trustworthiness. These include, but are not limited to, determining in advance what deployment of the tool is aiming to achieve, how it will do it compared to other options, and how the tool will be monitored in deployment. Progress on these requirements will not be easy, but is much needed if we are to realize the benefits of algorithmic tools for society.
13 September 2021
Amnesty International. (2018). Trapped in the Matrix. https://www.amnesty.org.uk/files/reports/Trapped%20in%20the%20Matrix%20Amnes ty%20report.pdf
Babuta, A., & Oswald, M. (2018). Machine Learning Algorithms and Police Decision-Making: Legal, Ethical and Regulatory Challenges. RUSI.
Big Brother Watch. (2019). Big Brother Watch’s written evidence on algorithms in the justice system for the Law Society’s Technology and the Law Policy Commission. https://bigbrotherwatch.org.uk/wp-content/uploads/2019/02/Big-Brother-Watch-written
Cheng, T. (2012). Crime, Policing and Citizenship (CPC) - Space-Time Interactions of Dynamic Networks [Research Grant]. EPSRC. Retrieved 2021, from https://gtr.ukri.org/projects?ref=EP%2FJ004197%2F1
Chouldechova, A. (2017). Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments. Big Data, 5(2), 153-163. http://doi.org/10.1089/big.2016.0047
Dencik, L., Hintz, A., Redden, J., & Warne, H. (2018). Data Scores as Governance: Investigating uses of citizen scoring in public services [Project Report]. Data Justice Lab, Cardiff University. https://orca.cardiff.ac.uk/117517/1/data-scores-as-governance-project-report2.pdf
Devon and Cornwall Police. (2020). Force Call Handling and Contact Policy.
Dorset Police. (2020). Freedom of Information Act Request No: 2020-799. https://www.dorset.police.uk/media/64900/record-1-2020-396.doc
Ensign, D., Friedler, S. A., Neville, S., Scheidegger, C., & Venkatasubramanian, S. (2018). Runaway Feedback Loops in Predictive Policing. Proceedings of Machine Learning Research, 81, 1–12. http://proceedings.mlr.press/v81/ensign18a/ensign18a.pdf
Essex Police. (2020). D0503 Procedure - Responding to Incidents. https://www.essex.police.uk/foi-ai/essex-police/our-policies-and-procedures/d/d0503- procedure---responding-to-incidents/
Fazel, S., & Wolf, A. (2018). Selecting a risk assessment tool to use in practice:a 10-point guide. Evidence-based mental health, 21(2), 41–43.
Fogliato, R., Xiang, A., Lipton, Z., Nagin, D., & Chouldechova, A. (2021). On the Validity of Arrest as a Proxy for Offense: Race and the Likelihood of Arrest for Violent Crimes. AIES '21: Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society. https://doi.org/10.1145/3461702.3462538
GCHQ. (2021). Pioneering a New National Security. https://www.gchq.gov.uk/files/GCHQAIPaper.pdf
HM Courts & Tribunals Service. (2021, May 14). The HMCTS reform programme. Retrieved 2021, from https://www.gov.uk/guidance/the-hmcts-reform-programme#progress-so-far
HMICFRS. (2019). DASH. https://www.justiceinspectorates.gov.uk/hmicfrs/glossary/dash/
HMICFRS. (2019). THRIVE. https://www.justiceinspectorates.gov.uk/hmicfrs/glossary/thrive/
The Home Office. (2021). Home Office Annual Data Requirement (ADR) data – Privacy Information Notice.
The Law Society. (2019). Algorithms in the Criminal Justice System. https://www.lawsociety.org.uk/en/topics/research/algorithm-use-in-the-criminal-justice-system-report
Leicestershire Police. (2020). Freedom of Information 003528/20. https://www.leics.police.uk/SysSiteAssets/foi-media/leicestershire/disclosure_2020/11.-november/3528-20-iccs-and-cad-systems.pdf
Liberty. (2019). Policing by Machine.
Linacre, R. (2018). Pushing the boundaries of data science with the MOJ Analytical Platform. MOJ Digital & Technology. https://mojdigital.blog.gov.uk/2018/04/05/pushing-the-boundaries-of-data-science-with-the-moj-analytical-platform/
McFadzien, K., Pughsley, A., Featherstone, A. M., & Phillips, J. M. (2020). The Evidence-Based Investigative Tool (EBIT): a Legitimacy-Conscious Statistical Triage Process for High-Volume Crimes. Cambridge Journal of Evidence-Based Policing, 4, 218–232. https://link.springer.com/content/pdf/10.1007/s41887-020-00050-3.pdf
The Metropolitan Police. Gangs violence matrix. Retrieved 2021, from https://www.met.police.uk/police-forces/metropolitan-police/areas/about-us/about-the-met/gangs-violence-matrix/
Ministry of Justice. MoJ Analytical Platform. Retrieved 2021, from https://user-guidance.services.alpha.mojanalytics.xyz/#content
National Audit Office. (2016). Efficiency in the criminal justice system. https://www.nao.org.uk/wp-content/uploads/2016/03/Efficiency-in-the-criminal-justice-system.pdf
National Offender Management Service. (2015). A compendium of research and analysis on the Offender Assessment System (OASys) [Appendix H: The Risk of Serious Recidivism (RSR) tool]. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/449357/research-analysis-offender-assessment-system.pdf
National Offender Management Service. (2017). Determining Pre Sentence Reports - Sentencing within the new framework. PI 04/2016
The Newsroom. (2017, October). I predict a break-in: Yorkshire police use cutting-edge technology to deter burglars. Yorkshire Post. https://www.yorkshirepost.co.uk/news/crime/i-predict-break-yorkshire-police-use-cutting-edge-technology-deter-burglars-595904
Niche Technology. Niche RMS Customer Profiles. Retrieved 2021, from https://nicherms.com/region/uk/
Oswald, M., Grace, J., Urwin, S., & Barnes, G. C. (2018). Algorithmic risk assessment policing models: lessons from the Durham HART model and ‘Experimental’ proportionality. Information & Communications Technology Law, 27(2), 223-250. https://doi.org/10.1080/13600834.2018.1458455
Prison Reform Trust. (2018). Offender Management and Sentence Plan.
ProPublica. (2016). Machine Bias.
The Royal Society. (2021). Privacy Enhancing Technologies. https://royalsociety.org/topics-policy/projects/privacy-enhancing-technologies/
Selbst, A. D. (2017). Disparate impact in big data policing. Georgia Law Review, 52(1), 109-196.
Sopra Steria. (2019). Merseyside Police sign contract for STORM Command and Control system. https://www.soprasteria.co.uk/thinking/blogs/details/merseyside-police-sign-contract-for-storm-command-and-control-system
SpaceTimeAI. iHotSpot. Retrieved 2021, from http://spacetimeai.com/iHotSpot.html
Staffordshire Police. (2020). Freedom of Information request: reference 12562.
Stevenson, M. T., & Doleac, J. L. (2019). Algorithmic Risk Assessment in the Hands of Humans. Econometrics: Econometric & Statistical Methods - Special Topics eJournal. https://ssrn.com/abstract=3489440
Suffolk Police. (2020). Freedom of Information Request Reference No: FOI 003542/20. https://www.suffolk.police.uk/sites/suffolk/files/003542-20_-_iccs_and_cad_contract.pdf
Sussex Police. (2019). Crime and incident disposal recording and auditing policy. https://www.sussex.police.uk/SysSiteAssets/foi-media/sussex/policies/crime-and-incident-disposal-recording-and-auditing-policy-7572019.pdf
Sussex Police. (2021). Freedom of Information ref 0162/21. https://www.sussex.police.uk/SysSiteAssets/foi-media/sussex/other_information/freedom-of-information---foi-0162.21-covid.pdf
Szegedy, C., Zaremba, W., Sutskever, I., Bruna, J., Erhan, D., Goodfellow, I., & Fergus, R. (2014). Intriguing properties of neural networks. https://arxiv.org/abs/1312.6199
Weller, A. (2019). Transparency: Motivations and Challenges. In Explainable AI: Interpreting, Explaining and Visualizing Deep Learning. Springer Lecture Notes in Computer Science.
West Midlands Police. National Data Analytics Solution. Retrieved 2021, from
West Yorkshire Police. (2019). Information and Data Management. https://www.westyorkshire.police.uk/sites/default/files/2019-09/information_and_data
West Yorkshire Police. (2020). Integrated Offender Management (IOM). https://www.westyorkshire.police.uk/sites/default/files/2020-07/integrated_offender_management_iom.pdf
 For recording all incidents reported and Computer Aided Dispatch (CAD), several forces, including Devon and Cornwall Police (Devon and Cornwall Police, 2020), Dorset Police (Dorset Police, 2020), Essex Police (Essex Police, 2020), Leicestershire Police (Leicestershire Police, 2020), Merseyside Police (Sopra Steria, 2019), Staffordshire Police (Staffordshire Police, 2020), Suffolk Police (Suffolk Police, 2020), Sussex Police (Sussex Police, 2019) and West Yorkshire Police (West Yorkshire Police, 2019) use the STORM Command and Control system. STORM is a software from Sopra Steria designed for emergency services, see https://www.soprasteria.co.uk/industries/public-safety
 For recording crime and other recordable incidents, several forces, including East Midlands Forces (Niche Technology), Nottinghamshire Police (Babuta and Oswald, 2018), North Wales Police (Niche Technology), South Wales Police (Niche Technology), Sussex Police (Sussex Police, 2019) and West Yorkshire Police (Niche Technology) use Niche RMS. Niche RMS is a private Canadian software company, see https://nicherms.com/
 The national standards for incidents and crime records set by the Home Office include the National Standard for Incident Recording (NSIR), the National Crime Recording Standard (NCRS) and the Home Office Counting Rules (HOCR) (Sussex Police, 2019).
 The collaboration with Accenture on this project is assumed based on information from Accenture regarding their large-scale collaboration with West Midlands Police. https://www.accenture.com/gb-en/case-studies/public-service/transforming-west-midlands-police
 Whenever we use italics, it is a direct quote. This information is stated in an ESRC grant proposal ‘Applying data analytics to comprehensive linked police records.’ https://gtr.ukri.org/projects?ref=studentship-2120524
 All forces have access to Microsoft Office 365, including Excel, which may be used for data analysis
 Accenture is a multinational company that provides consulting and professional services, see https://www.accenture.com/gb-en
 The tool is provided by Bluestar, a UK based company, see https://bluestar-software.co.uk/products/offender-management-iom/
 Qlik Sense is developed by QlikTech International AB, see https://www.qlik.com/us/products/qlik-sense
 Kubernetes is an open-source system for automating the deployment, scaling and management of containerised applications.
 OGRS4 is the fourth version of the tool. OGRS3 may still be in deployment
 Static risk factors are factors that do not change or which change in only one direction. Examples include age and past criminal offences
 Dynamic risk factors are factors about individuals or their environments that can change in either direction
 Attempts to map the tools used by the 43 police forces in England in Wales include (The Law Society, 2019, Liberty, 2019).
 Many of the points we mention have been discussed previously, e.g. see the ALGO-CARE framework for algorithms in policing (Oswald et al., 2018) and a 10-point guide for evaluating if risk assessment tools are fit for purpose (Fazel & Wolf, 2018). There is significant ongoing work to advance the trustworthiness of AI systems broadly, including by the authors of this report
 A growing body of research seeks to identify and mitigate various forms of algorithmic bias. It is often easier to adjust an algorithm to avoid repeating bias, than to adjust human behaviour
 For further discussion of the challenges surrounding algorithmic transparency, see (Weller, 2019).
 For more information, please see (The Royal Society, 2021).
 Ethics committees have been established for West Midlands Police, Sussex Police and West Yorkshire Police