Dr Jamie Grace, Sheffield Hallam University – Written evidence (NTL0001)

Summary

1. Introduction

1.1. I am a Senior Lecturer in Law at Sheffield Hallam University, and a former Visiting Fellow at the Information Law and Policy Centre, Institute of Advanced Legal Studies (University of London). I am also currently vice-Chair of the Independent Data Analytics Ethics Committee established by the Office of the Police and Crime Commissioner for the West Midlands. This written evidence submission is adapted from an unpublished research paper, which is freely available in full on the Social Sciences Research Network (SSRN)[1]. I have authored and co-authored a number of peer-reviewed publications on the legalities of data technologies in the UK criminal justice system[2]. Along with a number of co-authors, I created the 'ALGO-CARE' framework or checklist[3], which is endorsed by the Business Change Council of the National Police Chiefs' Council (NPCC) as a form of guidance for police forces in England and Wales to follow in the absence of more formal policing-specific regulation on the use of data capabilities[4].

1.2. This written evidence submission aims to highlight the way that, with appropriate care and planning, predictive data analytics could increasingly make a meaningful difference to the protection of the human rights of women and girls facing gendered violence.

2. Data-driven technologies in the criminal justice system

2.1. Data-driven technology can be split into two main groups: explanatory models or dashboards, i.e. 'what has been going on, where and why?', and predictive models or dashboards that risk-score individuals as perpetrators or victims (or both), or that try to predict the location and patterns of offending geospatially, i.e. 'what will be going on where?'. An example of the explanatory approach is the way that the West Midlands Police (WMP) Data Analytics Lab (DAL) has been using complex statistical analysis to determine the relationship between different factors that affect or lead to an outcome of 'no further action' against suspected rapists and/or domestic abuse perpetrators, with the aim of the project being the more targeted use of police investigative resources and the improvement of officer training[5]. An example of the risk-scoring or predictive approach is the RFG[6]-based model used to guide interventions in policing known domestic violence offenders, as used by Northumbria Police and discussed in research by Pamela Davies and Paul Biddle[7].

2.2. Davies and Biddle have researched how a machine learning tool can be used to try to risk-score and predict a small number of the most serious future offenders, based on the amalgamated data of many offenders who perpetrate domestic violence in the force area. The police can then co-ordinate proactive monitoring and support, or enforcement, against the riskiest handful of offenders on a rolling basis, working within the resources available in the most preventive way possible; that is to say, in a manner informed by a particular data science approach. Davies and Biddle write of the RFG tool-informed approach that:

"It identifies and targets repeat domestic violence perpetrators using a scoring mechanism that identifies the recency, frequency and gravity of offending. Based on a range of specific and weighted criteria (e.g. previous offences, number of victims, interpersonal relationships, health issues and substance misuse) the RFG scores each perpetrator from 0–100 (100 being the most harmful). The top-four highest scoring perpetrators identified are selected for discussion at each [multi-agency] area meeting…"[8]
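To make the scoring mechanism quoted above concrete, the short sketch below (in Python) illustrates, in purely hypothetical terms, how an RFG-style weighted score and 'top-four' selection might be computed. The field names, weights and scaling are my own illustrative assumptions and are not drawn from the Northumbria tool itself.

```python
# Illustrative sketch only: an RFG-style (recency-frequency-gravity) score on a
# 0-100 scale, with the four highest-scoring perpetrators selected for
# multi-agency discussion. Weights, field names and scaling are hypothetical
# assumptions and are NOT taken from the Northumbria tool.
from dataclasses import dataclass

@dataclass
class PerpetratorRecord:
    perpetrator_id: str
    days_since_last_report: int    # recency: a lower figure means more recent offending
    reports_last_12_months: int    # frequency of reported offending
    max_offence_gravity: float     # gravity of reported offences on a 0-10 harm scale

def rfg_score(rec: PerpetratorRecord,
              w_recency: float = 0.3,
              w_frequency: float = 0.3,
              w_gravity: float = 0.4) -> float:
    """Return a 0-100 score, 100 being assessed as the most harmful."""
    # Each component is scaled to the 0-1 range before weighting (illustrative scaling).
    recency = max(0.0, 1.0 - rec.days_since_last_report / 365.0)
    frequency = min(1.0, rec.reports_last_12_months / 12.0)
    gravity = min(1.0, rec.max_offence_gravity / 10.0)
    return 100.0 * (w_recency * recency + w_frequency * frequency + w_gravity * gravity)

def top_four(records: list[PerpetratorRecord]) -> list[tuple[str, float]]:
    """Select the four highest-scoring perpetrators for discussion at an area meeting."""
    scored = [(r.perpetrator_id, round(rfg_score(r), 1)) for r in records]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:4]
```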

 

2.3. Davies and Biddle conclude of the RFG tool, which is used in Northumbria and draws on police intelligence and wider sources of information, that:

"The refinement, use and effectiveness of the RFG relies not only upon information already available to the police, but also, crucially, on information from MATAC partners about perpetrators and victims. The model targets individuals whose profile resembles that of a domestic violence perpetrator. It facilitates a focus on the coercive and harmful activities of serial perpetrators who are not yet [more formally] monitored for their violent inter-personal behaviour… An important outcome of the [multi-agency approach] therefore has been the shift towards prevention and early intervention via a multi-agency focus on perpetrators not previously known to police as presenting a significant risk to women."[9]

 

2.4. In addition to the Northumbria RFG study by Davies and Biddle, researchers at the LSE have recently produced evidence showing that the application of machine learning to information about offenders' and victims' criminal histories as 'dyads' has great potential for predicting the risk of harm to victims of domestic abuse[10]. However, in terms of policing and the management of risk to public protection, it is clearly preferable that algorithms should augment police investigative discretion, not replace it.

3. Human rights obligations and data-driven investigations

3.1. In the UK Supreme Court, in the landmark case of Commissioner of Police of the Metropolis v DSD and another [2018] UKSC 11, Lord Kerr [at 29] wrote that "... errors in investigation, to give rise to a breach of article 3, must be egregious and significant." Better use of data-driven technology in the policing of domestic abuse might help avoid risk assessments, or deployments of police investigative resources, that contain 'egregious and significant' errors, i.e. overlooking higher-risk 'dyads', or wrongly rating the risk posed by an offender as too low to warrant continued monitoring.

3.2. DSD means that the police must be proactive in pursuing serial offenders displaying violent criminality in their behaviour if they are to meet their Article 3 ECHR obligations to take preventive steps to stop repeated abuse, and this consideration is to some degree disconnected from the question of whether a conviction is eventually secured. The question that then arises, in the context of growth in the sophistication of machine learning-informed policing, is whether data-driven technology can help police forces meet their Article 3 ECHR investigative duties in particular, following DSD, as well as their wider obligations under Article 2 ECHR (which includes the positive obligation to take reasonable steps to protect a member of the public from a real and immediate risk to their life).

4. Proper use and process considerations

4.1. Algorithmically fully automated decision-making would raise a number of public law-related concerns; but on a practical basis, it would stifle the use of professional expertise and experience in keeping victims safer. As Robinson and Clancy explain in relation to police forces adopting 'Priority Perpetrator Identification Tools', one anonymous force they studied made multi-agency risk assessment conference (MARAC) "referrals entirely using an algorithm on police data (one domestic incident in the current month and two in the previous month)."[11] This is a very simple algorithm, given what is possible when we think about the wide range of multi-agency data that could be experimented upon for predictive power purposes[12], but it also raises a question about stifled professional discretion - and professional concerns about particular victims being swept away by an overly coded approach. Indeed, as Robinson and Clancy go on to observe in their work: "Following feedback from staff, this [referral process] was subsequently expanded to also enable Domestic Abuse Officers to refer cases using their professional knowledge."[13]
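Purely for illustration, the very simple referral rule that Robinson and Clancy describe (one domestic incident in the current month and two in the previous month) could be expressed in a few lines of code; the function and parameter names below are hypothetical, and this is not the force's actual implementation.

```python
# Illustrative sketch of the simple auto-referral rule described by Robinson and
# Clancy: one recorded domestic incident in the current month and two in the
# previous month. Function and parameter names are hypothetical.
def meets_auto_referral_rule(incidents_current_month: int,
                             incidents_previous_month: int) -> bool:
    """Return True where a case meets the simple two-month incident threshold."""
    return incidents_current_month >= 1 and incidents_previous_month >= 2
```

The brevity of such a rule underlines the point above: a great deal of potentially relevant multi-agency information, and professional judgement, sits outside it.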

4.2. Whether an algorithm is lawfully used where it can make 'auto-referrals' to a MARAC, target police interventions, or otherwise automate investigative decisions is a question of interpretation in relation to sections 49 and 50 of the Data Protection Act 2018 (DPA 2018). These statutory provisions prohibit the fully automated taking of decisions that significantly affect a data subject or which produce an adverse legal effect for them, save for those automated decisions required or authorised by law - and it is not clear that existing police common law or statutory powers to share information with other agencies, on the basis of a need to do so, would enable the automation of such decisions.

4.3. Proponents of automated referrals to multi-agency risk assessment processes for offenders, based on the outputs of machine learning tools, might argue, however, that the referral itself is not a significant decision per se. The MARAC itself actually makes the meaningful decision to intervene with an offender, after all; and a referral does not necessarily produce an adverse effect, since the outcome of a multi-agency approach can mean, for example, that an offender's addiction issues are better tackled, or that separate accommodation, away from their potential or current victim, is provided. But the simplest means of meeting the requirements of sections 49 and 50 of the 2018 Act is a strict rule that police forces should only develop profiling tools that create risk scores which inform or guide policing practices in a given area, and where the final decision to take any step in the process is taken by an expert, or well-trained, human decision-maker.
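As a minimal sketch of that 'score informs, human decides' safeguard (all names, thresholds and structures below are hypothetical, and not drawn from any force's system), a profiling tool could be confined to producing an advisory output, with the referral decision itself recorded as that of a named, trained officer:

```python
# Illustrative sketch of a human-in-the-loop arrangement consistent with the
# approach to sections 49 and 50 DPA 2018 suggested above: the tool's output is
# advisory only, and the significant decision is taken by a human decision-maker.
# All names and the threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class AdvisoryOutput:
    risk_score: float      # e.g. produced by an RFG-style or machine learning model
    recommendation: str    # advisory wording only; it does not trigger any action

@dataclass
class ReferralDecision:
    referred: bool
    decided_by: str            # the named officer who owns the decision
    rationale: str             # the officer's recorded professional reasoning
    informed_by_score: float   # the score that informed, but did not make, the decision

def advise(risk_score: float, threshold: float = 75.0) -> AdvisoryOutput:
    """Produce an advisory recommendation; no referral is made automatically."""
    wording = "consider MARAC referral" if risk_score >= threshold else "no referral indicated"
    return AdvisoryOutput(risk_score, wording)

def record_decision(advisory: AdvisoryOutput, officer: str,
                    referred: bool, rationale: str) -> ReferralDecision:
    """Record the final decision as that of the human decision-maker, not the tool."""
    return ReferralDecision(referred=referred, decided_by=officer,
                            rationale=rationale, informed_by_score=advisory.risk_score)
```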

5. Impact on the rule of law and equality issues

5.1. There are some legal safeguards that should be noted as relating to predictive profiling and more advanced analytics. Section 47(3) of the DPA 2018 requires that data processing about any individual should be restricted (that is, halted) where "it is not possible to ascertain whether it is accurate or not", while, due to the effect of section 205(1) of the DPA 2018, "inaccurate", in relation to personal data, means incorrect or misleading as to any matter of fact. Of course, incorrect data might well be recorded and continue to be held about individuals in any number of criminal justice data systems, because that data is incomplete, stale, or even the result of systemic or institutionalised police prejudices. Successful legal challenges to the accuracy of a predictive system or similar analytical tools, on the basis of data protection principles such as these, would shake confidence in the contribution that data technology might make to the policing of domestic abuse.

5.2. Bias in the use of data is driven by biased data itself. As such, the public sector equality duty (PSED) under section 149 of the Equality Act 2010 is a key standard which must be seen as guiding the use of data-driven technology in the public sector[14]. The PSED requires authorities to be 'properly informed' of the consequences of their measures, in order to show 'due regard' to the impact on different groups with protected characteristics[15]. Monitoring, through the collection of evidence and data, in order to establish potential bias in terms of discrimination across protected characteristics under the 2010 Act can therefore be a vital component of compliance with the PSED. For example, with regard to the pilot use of live facial recognition technology in public spaces in Cardiff, the Court of Appeal recently highlighted in R (Bridges) v South Wales Police [2020] EWCA Civ 1058 at 182 that: "...there was no evidence... that there is any reason to think that the particular AFR [automated facial recognition] technology used in this case did have any bias on racial or gender grounds. That, however, it seems to us, was to put the cart before the horse."

5.3. South Wales Police should have been monitoring for bias throughout the pilot in order for that piloted use of a live facial recognition system to have been lawful with regard to the PSED under section 149 of the 2010 Act. To 'not do your homework' on the PSED, as a public body, is to run the risk of a successful judicial review claim being made against you on the basis of the shortcomings in your own self-monitoring as a policymaking body, or a policy actor. To draw on the explanation of Knowles J in R (DMA and others) v Home Secretary [2020] EWHC 3416 (Admin), at 324, in relation to the finding of a breach of the PSED[16], sufficient self-evaluation to show compliance with the PSED is not possible if "… there is no monitoring (including collection of data and evaluation) that would enable that." Of course, sex is a protected characteristic listed under section 149(7) of the Equality Act 2010. So there is a need to have 'due regard' to the need to prevent the discrimination suffered by women through violence from men, in considering the way that violence and abuse is policed, deterred and prevented by public bodies connected to the criminal justice system, and related agencies, in England and Wales. And as Calvert-Smith J observed in R (Hajrula) v London Councils [2011] EWHC 448 (Admin) at 69, "…where large numbers of vulnerable people, many of whom fall within one or more of the protected groups, are affected, the due regard necessary is very high." There is then a duty to ensure that due regard is had to the way that a policy of adopting or deploying any particular use of technology in the policing context, for example, contributes to the prevention of discriminatory violence against women in society. However, it is also vital to have due regard to the way that the planned use of data technology might have an impact on groups of people who share other protected characteristics besides sex, such as race, age and disability.

6. Conclusions

6.1. Human rights duties mean that, ultimately, well-established (and admittedly important) concerns cannot be allowed to derail the potential of technology to protect victims of violence, while that technology must still not be allowed to exacerbate intersectional unfairness[17]. Momentum in this area of police practice and data science is considerable. There is a need for a legal duty on police forces, and/or the Home Secretary, to have 'due regard' to the potential for data analytics to better direct investigative and preventive resources to protect the victims most vulnerable to domestic violence. Plenty of commentators, and some UK police forces, have called for a 'rulebook' on the police use of technology, including data-driven predictive technologies; so there is scope for the degree of guidance in policing circles to go beyond a combination of ICO toolkit and data protection code of practice[18] - perhaps it is time, instead, for a statutory code from the College of Policing and the Home Office itself.

 

28 July 2021


[1] Grace, Jamie, Female Victims of Gendered Violence, Their Human Rights and the Innovative Use of Data Technology to Predict, Prevent and Pursue Harms (January 29, 2021). Available at SSRN: https://ssrn.com/abstract=3761207 or http://dx.doi.org/10.2139/ssrn.3761207

[2] J. Grace and R. Bamford, ''AI Theory of Justice': Using Rawlsian approaches to legislate better on machine learning in government', (2020) (3) Amicus Curiae; J. Grace, 'Algorithmic impropriety in UK policing?', (2019) Journal of Information Rights, Policy and Practice, Vol. 3, Issue 1; M. Oswald, J. Grace, S. Urwin & G. C. Barnes, 'Algorithmic risk assessment policing models: lessons from the Durham HART model and 'Experimental' proportionality', (2018) Information & Communications Technology Law, Vol. 27, Issue 2, 223-250; M. Oswald and J. Grace, 'Intelligence, policing and the use of algorithmic analysis: a freedom of information-based study', (2016) Journal of Information Rights, Policy and Practice, 1(1) (online).

[3] 'ALGO-CARE' is a checklist of more than 30 legal, ethical and policy prompts, published as an appendix to this peer-reviewed, open access research: M. Oswald, J. Grace, S. Urwin & G. C. Barnes, 'Algorithmic risk assessment policing models: lessons from the Durham HART model and 'Experimental' proportionality', (2018) Information & Communications Technology Law, Vol. 27, Issue 2, 223-250: https://www.tandfonline.com/doi/full/10.1080/13600834.2018.1458455

[4] Police forces in the West Midlands, Essex, County Durham, North Wales, West Yorkshire, Wiltshire, Lancashire, Avon & Somerset and Kent have built the ALGO-CARE model into their data analytics projects.

[5] See West Midlands Police (WMP) ethics committee papers on data analytics projects concerning Rape and Serious Sexual Offences (RASSO) and Domestic Abuse: https://www.westmidlands-pcc.gov.uk/ethics-committee/ethics-committee-reports-and-minutes/

[6] 'Recency-frequency-gravity' - the tool uses data about how recent a perpetrator's domestic violence offending has been; the frequency with which it has been reported; and the gravity of the reported offences.

[7] Pamela Davies and Paul Biddle, 'Implementing a perpetrator-focused partnership approach to tackling domestic abuse: The opportunities and challenges of criminal justice localism', Criminology & Criminal Justice (2018) Vol. 18(4) 468–487.

[8] Davies and Biddle, pp. 474-5.

[9] Davies and Biddle, p. 481.

[10] See Grogger, Ivandic and Kirchmaier, 'Comparing Conventional and Machine-Learning Approaches to Risk Assessment in Domestic Abuse Cases', https://cep.lse.ac.uk/pubs/download/dp1676.pdf

 

[11] Robinson, A. L., & Clancy, A. (2020). Systematically identifying and prioritising domestic abuse perpetrators for targeted intervention. Criminology & Criminal Justice, 1748895820914380. p.6

[12] Bland and Ariel found in 2015 that 80% of harms caused by violence in relationships, measured using the Cambridge Crime Harm Index, stemmed from just 2% of relationships where violence was reported, in a sample of 36,000 domestic abuse reports to Suffolk Constabulary over approximately five years. Additionally, once a violent relationship had been the subject of three reports of violence to the police, there was more than a 50% likelihood that a fourth report would follow. This picture suggests that a focus on predicting the manner in which violence will progress in some relationships would be beneficial. However, as the authors note, drawing on intelligence from other agencies would be crucial, given their finding that

"over half of the most harmful cases were not known to the police for domestic abuse should prompt a review of how forces and their partners engage with potential victims and how they use their data to proactively identify risk. The status quo of using a nonactuarial, nonevidence-based, reactive risk assessment is untenable. An alternative needs to be developed which takes into account that much of the harm caused to domestic abuse victims comes from cases that have never even been subject to risk assessment."

Matthew Bland and Barak Ariel, 'Targeting Escalation in Reported Domestic Abuse: Evidence From 36,000 Callouts', International Criminal Justice Review, 2015, Vol. 25(1) 30-53, 48 and 49.

[13] Robinson, A. L., & Clancy, A. (2020). Systematically identifying and prioritising domestic abuse perpetrators for targeted intervention. Criminology & Criminal Justice, 1748895820914380. p.6

[14] Section 149(1) Equality Act 2010: "A public authority must, in the exercise of its functions, have due regard to the need to—

(a) eliminate discrimination, harassment, victimisation and any other conduct that is prohibited by or under this Act;

(b) advance equality of opportunity between persons who share a relevant protected characteristic and persons who do not share it;

(c) foster good relations between persons who share a relevant protected characteristic and persons who do not share it."

[Section 149(7) EA 2010 lists the following protected characteristics: "age; disability; gender reassignment; pregnancy and maternity; race; religion or belief; sex; sexual orientation"...]

[15] See for example R (LH) v Shropshire County Council [2014] EWCA Civ 404

[16] In this case, because of a failure to properly scrutinise the contracted-out system and standards of providing accommodation to asylum claimants with disabilities, who would otherwise be destitute.

 

[17] For a Rawlsian argument for this position, see, for example, J. Grace and R. Bamford, ''AI Theory of Justice': Using Rawlsian approaches to legislate better on machine learning in government', (2020) (3) Amicus Curiae.

[18] See https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/12/ico-launches-tool-to-help-police-forces-using-data-analytics/# and https://ico.org.uk/for-organisations/guide-to-dp/guide-to-law-enforcement-processing/