Dr Christopher Lawless, Associate Professor at Durham University – Written evidence (NTL0029)
 

1. I am an Associate Professor within the Department of Sociology at Durham University.  During the last sixteen years I have specialized in academic research on forensic science and biometric technology from a social scientific and ethical perspective.  I provided written and oral evidence to the House of Lords Science and Technology Committee Inquiry into forensic science, which took place between 2018 and 2019.  I have also provided written and oral evidence to Scottish Parliamentary Inquiries into the Scottish Biometrics Commissioner Bill and facial recognition (FR).  I am submitting this evidence in an individual capacity.

Do you know of technologies being used in the application of the law? Where? By whom? For what purpose?

2. This submission is informed by issues concerning biometric technologies such as facial recognition, DNA profiling and voice analysis.  Such technologies may rely on datasets of persons categorized on the basis of appearance or other physical features.  They may utilize automated systems, algorithms or data mining methods.  Many of these technologies have already been used in the UK for law enforcement or for other purposes, such as assessing asylum claims (Home Office 2018).[1]  Different data forms could in the future be combined to assess the risk profile of individuals in public spaces.

3. Elsewhere, new forms of DNA analysis have been used or considered for use in law enforcement.  These include forensic DNA phenotyping, namely methods which claim to infer physical characteristics on the basis of unknown DNA profiles collected from crime scenes.  In the USA, police have used commercially available genealogy (‘family tree’) databases to pursue suspects by comparing unknown DNA profiles with these databases.
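
To illustrate the principle behind such database comparisons, the following is a minimal sketch in Python, assuming highly simplified STR (short tandem repeat) profiles.  The loci, scoring method and threshold are hypothetical simplifications; operational systems instead use validated likelihood-ratio statistics over population allele frequencies.

```python
# Minimal illustrative sketch of profile comparison for familial searching.
# Profiles map STR loci to allele pairs; all names and thresholds here are
# hypothetical simplifications, not an operational forensic method.

def shared_alleles(profile_a: dict, profile_b: dict) -> int:
    """Count alleles shared across the loci common to both profiles."""
    return sum(
        len(set(profile_a[locus]) & set(profile_b[locus]))
        for locus in profile_a.keys() & profile_b.keys()
    )

def candidate_relatives(scene_profile: dict, database: dict, threshold: int = 4):
    """Rank database entries by allele overlap with a crime-scene profile.

    A complete match suggests the database subject themselves; a high
    partial match may suggest a close relative, which is where the privacy
    and equality concerns discussed in this submission arise.
    """
    scores = [(shared_alleles(scene_profile, p), name)
              for name, p in database.items()]
    return sorted((s, n) for s, n in scores if s >= threshold)[::-1]

# Hypothetical example using two loci only, for brevity.
scene = {"D3S1358": (15, 17), "vWA": (16, 18)}
database = {
    "subject_A": {"D3S1358": (15, 17), "vWA": (16, 18)},  # full match
    "subject_B": {"D3S1358": (15, 16), "vWA": (16, 19)},  # partial match
}
print(candidate_relatives(scene, database, threshold=2))
# [(4, 'subject_A'), (2, 'subject_B')]
```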

What should new technologies used for the application of the law achieve?

4. New technologies should achieve outcomes which can be readily measured and evaluated against clearly stated a priori aims and objectives.  This is necessary to ensure that technology is used proportionately by providing clear assessments of its claimed effectiveness.  Without clear aims and objectives specified in advance, the term ‘effectiveness’ becomes open to too much interpretation and impedes comparison across cases. 

5. Proportionality is often invoked in debates concerning forensics and biometrics, to describe the balance between individual privacy and public safety.  Questions remain over the extent to which biometric systems can be demonstrated to deliver public safety and security.  How is effectiveness currently understood and measured? Might notions of ‘effectiveness’ be relative to different emerging technologies, whose limitations and risks may only become apparent over time in certain use-contexts? The Metropolitan Police Service, for example, reportedly used facial recognition for a diffuse array of functions: to deter, disperse, detect and disrupt criminal activity.[2]  It was unclear how the uses of FR in these different situations were compared or evaluated.

6. New technologies should achieve accountable outcomes, which can be readily scrutinized in appropriate fora, including by courts, government, regulators and commissioners.  Previous concerns have been raised by the UK Biometrics Commissioner over the so-called ‘black box’ problem, namely the difficulties that artificial intelligence and machine learning technologies may present to judicial scrutiny.[3]  Technology producers, users, the law and academia should work together to ensure these emerging technologies can be appropriately scrutinized.  Source code, and information about the basis on which algorithms have been trained, should be made available.
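
As a concrete illustration of what such disclosure might contain, the sketch below defines a simple, hypothetical disclosure record.  It follows no existing standard, and all structure and field names are assumptions, but it captures the categories of information (code provenance, training basis, validation, limitations) that this paragraph argues should be open to scrutiny.

```python
# Hypothetical sketch of a disclosure record for an algorithmic system
# presented in legal proceedings. No existing standard is implied; the
# structure and field names are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class AlgorithmDisclosure:
    system_name: str
    version: str
    source_code_reference: str       # e.g. an escrowed repository identifier
    training_data_description: str   # provenance and demographic composition
    validation_studies: list[str] = field(default_factory=list)
    known_limitations: list[str] = field(default_factory=list)

record = AlgorithmDisclosure(
    system_name="ExampleFR",  # hypothetical system
    version="2.1",
    source_code_reference="escrow://example-vendor/fr-engine@abc123",
    training_data_description="1.2M face images; demographic breakdown published",
    validation_studies=["Independent laboratory evaluation, 2021"],
    known_limitations=["Elevated false-match rate in low-light conditions"],
)
print(record)
```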

In what instances is it acceptable for them to be used?

7. Technologies should only be used when law enforcement users are fully cognizant of all legal requirements and can demonstrably meet them.  This includes statutory data protection and equality obligations.  The case Bridges v South Wales Police highlighted failures by police to meet legal obligations when using facial recognition technology.  Yet while police are required by law to produce a Data Protection Impact Assessment (DPIA) before using FR, they are not required to publish it.[4] 

8. In England and Wales, the Forensic Science Regulator Act 2021 bestows statutory powers on the Regulator and mandates a Code of Practice.  In Scotland, the Scottish Biometrics Commissioner (SBC) Act, passed in 2020, obligates the Commissioner to draft a Code of Practice subject to review and possible revision.  New technology should only be used when it is compliant with these Codes.  These Codes will take time to prepare and may evolve over time, but that should be no excuse for deploying technology prematurely.

9. Technology should only be used once producers and users can clearly demonstrate that it is suitably fit for purpose and can be scientifically validated.  Limitations of technology should be fully understood by all stakeholders and clearly communicated.  Technology should not be deployed unless it can be clearly demonstrated that it does not risk unduly discriminating against individuals.  Performance data, together with any reference data used to assess possible matches, such as the pre-existing facial images used in facial recognition systems, should be made readily available.

10. Technology should only be used if it has been appropriately tested and trialled.  The Metropolitan Police Service was criticised for trialling facial recognition technology in operational settings.[5]  Mixing the testing of FR with operational deployment obscures an important distinction between an individual’s right to withhold consent to participate in research, and their consent to the use of technology in police operations.  From the point of view of research ethics, avoiding cameras may indicate that an individual is withholding consent to participate in a trial or upholding their right to privacy.  This activity may, however, be interpreted differently by police.  Instead of trialling technology in operational settings, the possibility of trialling in simulated settings should be considered.

Do new technologies used in the application of the law produce reliable outputs, and consistently so?

11. Concerns have arisen over the accuracy and reliability of facial recognition used by the Metropolitan Police and South Wales Police.[6]  The social justice impact of inaccurate technology is a recognized concern.[7] Facial recognition has raised much concern regarding the risk of misidentifying and discriminating against certain ethnic groups and women.[8]  While the accuracy of algorithms may improve, facial recognition still raises concerns due to the way in which it may be used by police, for example in the context of stop and search practices.[9]
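
The kind of evaluation needed to detect such differential error is straightforward to state: error rates must be computed and published separately for each demographic group, not only in aggregate.  The following is a minimal sketch using entirely hypothetical data; a real evaluation would use properly sampled ground-truth records and report confidence intervals.

```python
# Minimal sketch: false-match rates disaggregated by demographic group.
# All data are hypothetical.
from collections import defaultdict

def false_match_rates(alert_events):
    """alert_events: iterable of (group, was_true_match) pairs, one per alert."""
    alerts = defaultdict(int)
    false_alerts = defaultdict(int)
    for group, was_true_match in alert_events:
        alerts[group] += 1
        if not was_true_match:
            false_alerts[group] += 1
    return {g: false_alerts[g] / alerts[g] for g in alerts}

# Hypothetical alerts: an aggregate rate would hide the disparity below.
events = [("group_x", True)] * 90 + [("group_x", False)] * 10 \
       + [("group_y", True)] * 60 + [("group_y", False)] * 40
print(false_match_rates(events))  # {'group_x': 0.1, 'group_y': 0.4}
```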

12. Voice and linguistic analysis has been used to verify identity in government settings and in asylum cases to establish a claimant’s place of origin.  Such analysis requires population-level data to establish the probability that a linguistic or phonetic feature may match with an individual (Parliamentary Office of Science and Technology 2015).[10]  Language may not, however, map directly onto geographic boundaries, which risks misidentification.[11]
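
The underlying inference is commonly expressed as a likelihood ratio.  The notation below is a standard textbook illustration, not a description of any specific system:

\[
LR = \frac{P(E \mid H_s)}{P(E \mid H_d)}
\]

where \(E\) is the observed linguistic or phonetic evidence, \(H_s\) is the proposition that the sample comes from the claimed speaker or place of origin, and \(H_d\) is the proposition that it comes from someone, or somewhere, else.  The denominator is where population-level data become essential: the evidential weight of a feature cannot be assessed without knowing how common that feature is in the relevant population.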

13. A critical eye with regard to reliability may assist in shaping Codes which benefit all.  Reliability of technology should be monitored over time.  Human-technology interactions may also shape how proficiently outputs are interpreted and understood.  For example, lack of training on using facial recognition systems emerged as a key concern in a study of South Wales Police’s use of the technology.[12]

14. Like effectiveness, reliability is another term which risks being open to interpretation in the context of new law enforcement technology.  Notions of ‘reliability’ may be dependent on specific use-contexts.  This includes, for example, the kind of questions forensic science may be used to pursue in criminal investigations.  Such questions may relate to identifying an individual, or establishing what kind of activity occurred at a particular scene, and when.  It is important that investigators understand the extent to which science and technology can address certain questions while being less informative in the context of others.  While DNA can link a person to a scene, a DNA match may not alone be sufficient to establish exactly what that person did to deposit their genetic material at that scene. 
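
This distinction is often framed through what forensic statisticians call the hierarchy of propositions.  The notation below is an illustrative sketch of the contrast, not a formula drawn from any particular case:

\[
LR_{\text{source}} = \frac{P(E \mid \text{the suspect is the source of the DNA})}{P(E \mid \text{an unknown person is the source})}
\qquad
LR_{\text{activity}} = \frac{P(E \mid \text{the suspect performed the alleged activity})}{P(E \mid \text{the suspect did not})}
\]

A strong source-level likelihood ratio does not by itself yield a strong activity-level one: evaluating the latter also requires information about the probabilities of transfer, persistence and innocent background presence of DNA, which a match statistic alone does not supply.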

How do technologies impact upon the rule of law and trust in the rule of law and its application?  Your answer could refer, for example, to issues of equality.  How could any negative impacts be mitigated?

15. The Information Commissioner’s Office has raised concerns over the retention of custody images of individuals not subsequently convicted of an offence (ICO 2019).[13]  Custody images may be used to construct watchlists against which FR technology may compare images scanned in real time. 

16. The Forensic Science Regulator Act bestows statutory powers on the Regulator and compels them to enforce a Code of Practice.  It is not yet clear whether or how the Forensic Science Regulator Act may apply to FR.  FR also exists within the remit of the UK Biometrics and Surveillance Camera Commissioner (UKBSCC).  The UKBSCC has statutory powers in relation to decision-making over the sampling and retention of DNA and fingerprints, but not facial data.  There thus appears to be a potentially significant lacuna, in which it is unclear who is statutorily responsible for the regulation and oversight of FR.

17. Genetic methods such as DNA phenotyping, familial searching and genealogical analysis present concerns around equality.  Familial searching and genealogical analysis seek to ascertain whether unknown DNA profiles from crime scenes may come from persons related to those on genetic databases.  Familial searching of police databases has been used in the UK.  Analysis of commercial genealogical databases has been used by US law enforcement.  The use of such methods raises ethical and legal concerns.  They may reveal previously unknown genetic links, such as those arising through adultery or incest, and so engage data protection legislation.  They may unduly target minority and disadvantaged groups in society, who may be disproportionately over-represented on DNA databases.

18. DNA phenotyping has already been used in the USA and is being considered in countries such as Germany.[14]  DNA phenotyping risks reversing the presumption of innocence.  By grouping people on the basis of appearance, it puts them under suspicion and pressures them to prove their innocence.

19. Methods such as DNA phenotyping may group genetic data on the basis of ethnic classifiers (e.g. ‘white Caucasian’ etc.).  These are however cultural labels and do not reflect underlying genetic reality.[15]  Certain terms such as ‘Afro-Caribbean’ or ‘Caucasian’, or those formerly used as ‘ethnic appearance’ classifiers by police, are social conventions, not scientifically established categories.  Such labels are not essential features of DNA profiles, and thus risk reinforcing a tautologous relationship between culturally assumed labels and claims to ethnic linkages with DNA.  Connecting biometric data to socio-cultural labels in this way risks creating erroneous and prejudicial assumptions of links between appearance, behaviour and bodily data. 

With regards to the use of these technologies what costs could arise? Do the benefits outweigh the costs? Are safeguards needed to ensure that technologies cannot be used to serve purposes incompatible with a democratic society?

20. Concerns have been expressed about the impact of using live FR in certain spaces and for certain purposes.  These include concerns over the so-called ‘chilling effect’ on people’s behaviour.  Aston (2017) conducted a study into the use of FR at political demonstrations.  Public privacy has been claimed to be an important factor in activities such as political protests.[16]  An awareness of prior anonymity enables people to decide when, how and with whom they share aspects of themselves.  Political demonstrations depend on social networks and social capital to plan and organize them (Aston 2017, Feldman 2002).[17]  Aston (2017), however, reported protestors feeling unable to share social links with others on demonstrations if they knew they were under surveillance, for fear of incriminating others.

21. As before, measuring the effectiveness of systems such as FR is highly challenging, particularly if the police employ such technology for different purposes.  The costs of misidentification and discrimination should however be regarded as extremely high.  Use of faulty or inaccurate technology has the potential to seriously erode trust, particularly among some communities who already distrust bodies such as the police.[18]  More inclusive debate is needed on what uses of facial recognition and other such surveillance technology are acceptable.

What mechanisms should be introduced to monitor the deployment of new technologies? How can their performance be evaluated prior to deployment and while in use? Who should be accountable for the use of new technologies, and what accountability arrangements should be in place? What governance and oversight mechanisms should be in place?

22. In England and Wales, governance and oversight are subject to a fragmented arrangement which risks regulatory gaps (paragraph 16).  A number of bodies, including the Forensic Science Regulator, the UK Biometrics Commissioner, the Surveillance Camera Commissioner (now merged posts), the Information Commissioner’s Office and the Biometrics and Forensics Ethics Group, were mentioned in the 2018 Home Office Biometrics Strategy as playing key roles in oversight.  These bodies vary, however, in their remit and the extent of their powers.

23. Scotland has recently enacted a more unified, principles-based approach in the form of the Scottish Biometrics Commissioner.  It is perhaps too early to assess this regime, and whether it may be appropriate for England and Wales, but close attention should be paid to how it evolves.

24. Previous Forensic Science Regulators and UK Biometrics Commissioners actively engaged in horizon-scanning activity around emerging biometric innovations.  This enabled them to anticipate issues and possible limitations of technology at early stages.  The Forensic Science Regulator should continue to work closely with stakeholders, including the UKBSCC and Scottish Biometrics Commissioner, in order to anticipate technological developments.  This is important to ensure that Codes of Practice are suitably responsive (paragraph 8).

25. Ultimately, technology producers and users should be accountable for new technologies.  Producers should be open and clear about the performance and limitations of their systems.  Producers should work closely with user communities to ensure technology is fit for purpose both at the testing phase and in operational deployment.  Social scientific research has drawn attention to how innovation is not a simple linear process.  Instead, this work has shown how users play an active role in innovation, albeit in ways not necessarily anticipated by producers.  Regarding law enforcement, however, users must be clear about what aims and objectives are being pursued.  More formal procedures to compel law enforcement to publish details, such as FR watchlist data or performance data, could be considered.

How far does the existing legal framework around new technologies used in the application of the law support their ethical and effective use, now and in the future? What (if any) new legislation is required? How appropriate are current legal frameworks?

26. The legislative coverage for FR is a matter of some contention.  In 2019, the Commons Science and Technology Select Committee expressed concern over the ‘lack of a clear legislative framework for this technology’.[19]  It has been claimed that a number of pieces of legislation engage with FR, including the Human Rights Act 1998, the Freedom of Information Act 2000, the Regulation of Investigatory Powers Act 2000, the Protection of Freedoms Act 2012 and the Data Protection Act 2018.[20]  However, in response to a written parliamentary question from Layla Moran MP, Nick Hurd, then UK Minister of State for Policing, stated that ‘there is no legislation regulating the use of CCTV cameras with FR’.[21]

27. An important question is whether legislation should be targeted towards specific technologies such as FR, or whether a more general, principles-based approach, such as the Scottish Biometrics Commissioner Act, might be sufficient to address all technological possibilities now and in the future.  While the latter might prevent constant legislating, it is still open to question how effective the SBC Act will be.  On the other hand, technology-specific legislation may provide clarity, but legislating for multiple specific forms of technology and data as they emerge may be burdensome.    

How can transparency be ensured when it comes to the use of these new technologies, including regarding how they are purchased, how their results are interpreted, and in what ways are they used?

28. A key challenge, in addition to those already discussed, concerns the fragmented policing landscape of England and Wales.  Monitoring use, interpretation and purchasing across 43 forces in a multitude of use-contexts presents a challenging task.  There may be a role for programmes such as the Transforming Forensics initiative, which seeks some coordination of procurement.  Police forces are not, however, obliged to participate in Transforming Forensics.[22]

This Committee aims to establish some guiding principles for the use of technologies in the application of the law. What principles would you recommend?

29. Transparency, equality and accountability should be key principles in emerging regulatory regimes, such as those overseen by the Forensic Science Regulator, the UKBSCC and Scottish Biometrics Commissioner.  These principles should be reflected in the respective Codes of Practice.  Clarity of purpose and fitness for purpose are two additional principles which should inform the development and use of technology.

 

5 September 2021

 


[1] Home Office (2018) Biometrics Strategy: Better Public Services Maintaining Public Trust.  London: Home Office.

[2] Fussey, P. and Murray, D. (2019) Independent Report on the London Metropolitan Police Service’s Trial of Live Facial Recognition Technology.  Human Rights Centre, University of Essex.  July 2019.

[3] Office of the Biometrics Commissioner (OBC) (2017) Commissioner for the Retention and Use of Biometric Material: Annual Report 2016. London: Her Majesty’s Stationery Office, p.85.

[4] Purshouse, J. and Campbell, L. (2019) ‘Privacy, crime control and police use of automated facial recognition technology’, Criminal Law Review, 3, pp.188-204.

[5] Fussey and Murray (2019), no.2, pp.7-8.

[6] Big Brother Watch (2018) Face Off: The Lawless Growth of Facial Recognition in UK Policing.  London: Big Brother Watch.

[7] Buolamwini, J. and Gebru, T. (2018) ‘Gender shades: Intersectional accuracy disparities in commercial gender classification’, Proceedings of Machine Learning Research, 81, pp.1-15; Gebru, T. (2020) ‘Race and gender’ in M.D. Dubber, F. Pasquale and S. Das (eds) The Oxford Handbook of Ethics of AI.  New York: Oxford University Press, pp.253-70.

[8] Big Brother Watch (2018), no.6; Hamidi, F., Scheuerman, K.M. and Branham, S.M. (2018) ‘Gender Recognition or Gender Reductionism? The Social Implications of Automatic Gender Recognition Systems’, Computer Human Interaction 2018 Conference, April 21-26 2018, Montreal, Canada.

[9] Big Brother Watch (2018), no.6; Chowdhury, A. (2020) Unmasking Facial Recognition: An Exploration of the Racial Bias Implications of Facial Recognition Surveillance in the United Kingdom.  London: Webroots Democracy.

[10] Parliamentary Office of Science and Technology (2015) Forensic Language Analysis.  POSTnote No.509, September 2015.

[11] Eades, D. (2010) ‘Nationality claims: language analysis and asylum cases’, in M. Coulthard and A. Johnson (eds), The Routledge Handbook of Forensic Linguistics. London: Routledge, pp. 411–22.

[12] Davies, B., Innes, M. and Dawson, A. (2018) An Evaluation of South Wales Police’s Use of Automated Facial Recognition.  Universities’ Police Science Institute & Crime & Security Research Institute, University of Cardiff.  September 2018.

[13] Information Commissioner’s Office (ICO) (2019) Information Commissioner’s Opinion: The Use of Live Facial Recognition Technology by Law Enforcement in Public Places.  31 October 2019.

[14] Amelung, N. and Machado, H. (2021) ‘Governing expectations of forensic innovations in society: the case of FDP in Germany’, New Genetics and Society, first published online 20 January 2021.  DOI: 10.1080/14636778.2020.1868987

[15] Ossorio, P. and Duster, T. (2005) ‘Race and genetics: controversies in biomedical, behavioral and forensic sciences’, American Psychologist, 60 (1), pp.115-28.

[16] Aston, V. (2017) ‘State surveillance of protest and the rights to privacy and freedom of assembly: A comparison of judicial and protestor perspectives’, European Journal of Law and Technology, 8 (1), pp.1-19; Gavison, R. (1980) 'Privacy and the Limits of Law', Yale Law Journal, 89 (3), pp.421-71; Nissenbaum, H. (2010) Privacy in Context: Technology, Policy and the Integrity of Social Life. Stanford, CA: Stanford University Press.

[17] Aston, V. (2017), no.16; Feldman, D. (2002) Civil Liberties and Human Rights in England and Wales, 2nd edition. Oxford: Oxford University Press.

[18] Chowdhury, A. (2020), no.9.

 

[19] House of Commons Science and Technology Select Committee (2019) The Work of the Biometrics Commissioner and The Forensic Science Regulator: 19th Report of Session 2017-19.  London: HMSO, p.14.

[20] Surveillance Camera Commissioner (2019) The Police Use of Automated Facial Recognition Technology with Surveillance Camera Systems.  March 2019. 

[21] Quoted in Big Brother Watch (2018), no.6, p.9.

[22] House of Lords Science and Technology Select Committee (2019) Forensic Science and the Criminal Justice System: A Blueprint for Change: 3rd Report of Session 2017-19.  London: HMSO, p.12.