Sjors Ligthart LLM (Tilburg University), Dr Lisa Forsberg (University of Oxford), Professor Gerben Meynen (Utrecht University and VU University Amsterdam) & Professor Thomas Douglas (University of Oxford) — Written evidence (NTL0013)
Neurotechnological ‘brain-reading’ in criminal justice
Introduction and summary
1. We are academics specialising in criminal law, health law, ethics and (forensic) psychiatry from the universities of Tilburg, Oxford, Utrecht and Amsterdam.
2. This evidence submission considers legal and ethical issues concerning neurotechnological brain-reading in criminal justice. It covers the relevance of brain-reading to criminal law, the technologies and their applications, the ethical and legal issues they raise – in particular concerning consent, privacy and the privilege against self-incrimination – and our recommendations and guiding principles.
3. For the legal part, our focus is on the human rights enshrined in the European Convention on Human Rights (ECHR), incorporated into the law of England and Wales via the Human Rights Act 1998.
4. Our summary conclusions are that non-consensual forensic brain-reading will often infringe the right to privacy and may engage the privilege against self-incrimination; that its ethical and legal acceptability should nevertheless be considered chiefly under the head of privacy, since there is no clear and coherent justification for regulating it under the privilege against self-incrimination; that in making this assessment the UK Government should reject ‘neuroexceptionalism’ and appraise brain-reading as it would other technologies employed for similar purposes; and that the Government should determine the existing level of legal protection and decide whether further protection is needed, for example through a specific right to mental privacy.
The relevance of ‘brain-reading’ to criminal law
5. In the last ten years, neuroscientists, psychiatrists, lawyers and ethicists have been debating the potential contribution of neurotechnologies to the law, especially to criminal law.[1] A particular area of research concerns the forensic use of neurotechnologies that analyse the brain with the aim of obtaining information about brain features and mental states. These technologies are sometimes referred to as ‘brain-reading’, or, when used in criminal justice contexts, ‘forensic brain-reading’.
6. It is unsurprising that the value of brain-reading technologies for criminal justice is being considered. Subjective mental states of defendants and convicted offenders are often essential to answering central questions of criminal law. For example, does the defendant know something about the crime? Did he lie about his alibi? Did he really hear commanding voices at the time of the offence? And how does the convicted offender feel in various stressful situations?
7. Importantly, however, in the course of criminal justice, defendants and convicted offenders may well be reluctant to disclose their knowledge, memories, or other mental states. This reluctance often manifests in lying, feigning, misleading, or invoking the right to silence. As a consequence, there is demand among the police, judges, lawyers, parole officers, and forensic psychiatrists and psychologists for measures that enable a more objective assessment of a person’s subjective mental states. It is expected that brain-reading technologies could provide objective evidence relevant to answering central questions in criminal law, especially regarding guilt, legal responsibility and factors relevant to offenders’ risk of recidivism.[2]
8. However, the legal status of forensic brain-reading is currently unclear, as it raises fundamental questions regarding privacy and the privilege against self-incrimination.[3] Recently, researchers from different disciplines have stressed the urgency of addressing legal concerns regarding emerging neurotechnologies such as brain-reading.[4] Their message is clear: we must consider the legal status of brain-reading technology and ensure adequate legal protection, before the technology is developed further.
9. In the international literature, non-consensual brain-reading has been discussed for over a decade. In some countries, including Chile and Spain, specific ‘neurorights’ are being considered to protect citizens from far-reaching brain-reading technologies.[5] Moreover, the Committee on Bioethics of the Council of Europe is currently assessing the relevance and sufficiency of the existing human rights framework to address the issues raised by applications of neurotechnologies such as brain-reading.[6] In the same vein, the Organisation for Economic Co-operation and Development has stressed the need to safeguard personal brain data obtained through neurotechnology.[7]
10. We recommend that the UK Government consider the ethical and legal issues that are raised by the use of brain-reading in criminal justice, in order to inform the development of law and policy in England and Wales. Policy-makers should anticipate developments and consider potential legal and ethical implications of non-consensual brain-reading in criminal justice. To this end, the present submission provides evidence with respect to the right to privacy and the privilege against self-incrimination.
Technologies and applications
11. Different technologies can be used to examine people’s brains. Examples include magnetic resonance imaging (MRI), functional magnetic resonance imaging (fMRI), computed tomography (CT), electroencephalography (EEG) and positron emission tomography (PET). Several of these technologies are already being deployed in criminal justice contexts. A 2015 study found that brain-imaging technologies were being used in criminal proceedings in England and Wales,[8] mainly in appeals against conviction and/or sentence. Forensic brain-imaging has also been used in the United States, the Netherlands, Slovenia, Italy, Australia and Canada.[9] The use of such technologies is likely to grow, since recent years have seen considerable progress in brain-reading technology, often supported by artificial intelligence (AI). Technologies that can shed light on, for example, intentions, capacities and emotions are currently under development; if they reach a high level of precision, they could be considered ‘brain-reading’ technologies. In what follows, we use this term to refer to both current and emerging brain-imaging technologies.
12. Brain-reading technologies are likely to be employed for at least three purposes. First, they may be used to assess mental capacity or illness (especially in the context of the insanity defence), for example by diagnosing conditions such as epilepsy, dementia and brain trauma that may diminish capacity. A recent study reported the use of brain-reading by the Court of Protection when determining whether a person lacks ‘mental capacity’ within the meaning of the Mental Capacity Act 2005 in England and Wales.[10]
13. Second, brain-reading technologies may be used, as an adjunct to more traditional risk assessment tools, in the assessment of ‘dangerousness’ or recidivism risk. Such assessments often inform decisions regarding pre-trial detention, sentencing, parole, and—in some jurisdictions—post-sentence preventive detention. Brain-reading studies have reported that specific brain features are correlated with aggressive behaviour.[11] In a recent study, researchers used both traditional risk assessment tools and brain-reading data to assess the recidivism risk of forensic psychiatric patients. They found that including brain-imaging data increased the proportion of correct predictions (either ‘will recidivate’ or ‘will not recidivate’) from 64% to 82%.[12]
14. Third, brain-reading technologies may be employed in the determination of guilt. Brain-derived data have enabled researchers to make inferences about individual mental states, such as whether a person is lying or whether she recognises something.[13] There is considerable agreement across studies on the specific brain areas that are more active during lying in a laboratory setting than during truth-telling.[14] There is also a specific brain signal, which can be measured easily and cheaply via EEG, that is known to correlate with the recognition of an item (e.g., a murder weapon). This test of recognition has been considered “highly accurate in differentiating between knowledgeable and unknowledgeable individuals.”[15] In Japan, recognition-detection is used on a regular basis in criminal justice, although it is administered by measuring physiological responses with a polygraph.[16]
15. Brain-reading technology is also the focus of many companies, including Facebook and Elon Musk's Neuralink. These companies aim to develop consumer brain-reading devices (brain-computer interfaces) that enable users to communicate through brain signals alone, for example to play games or to control a smartphone. Such technologies may also prove useful within the criminal justice system.
Ethical and legal issues
16. Applying brain-reading technologies in criminal justice raises a number of issues. Some of these are scientific or technical. For example, many scientific studies on brain-reading provide findings at group level that cannot be straightforwardly applied to individuals. Moreover, there is a risk of defendants or convicted offenders actively sabotaging certain measurements, for example, by moving their heads in the brain scanner or by recalling emotional memories. Other challenges are psychological or social. For instance, concerns have been raised about the potential of brain-reading evidence to be overly persuasive to judges and juries.[17] This evidence submission, however, focuses on what we see as the most important ethical and legal issues.
Consent
17. In medicine—where brain-imaging technologies are routinely used, for example to diagnose traumatic brain injury or a brain tumour—patients normally consent to their use, unless they are deemed to lack decision-making capacity. In criminal justice, however, defendants and convicted offenders may well be reluctant to cooperate in a brain-reading assessment, because it could jeopardise their interests. At the same time, these technologies could be particularly valuable with respect to uncooperative defendants or prisoners who are unwilling to share information. This raises the question of whether it could be ethically and legally permissible to apply brain-reading technologies without the valid consent of a suspect or convicted offender, even when the person has decision-making capacity.
18. Under section 28 of the Offender Management Act 2007, the Secretary of State may include a polygraph condition in the licence of a person to whom that section applies.[18] Recently, such a polygraph test led to the conviction of an offender for rape in Harwich. In response, Detective Inspector Nathan Hutchinson of Essex Police noted that the polygraph “is a fantastic tool and tactic to gather information to provide an accurate assessment of the risk of re-offending and allows us to manage the risk posed by convicted sex offenders.”[19] Similarly, the law allows fingerprinting and DNA testing to be performed on defendants without their consent, and offenders can be required to complete questionnaire-based risk assessment instruments.
19. If non-consensual deception detection and risk assessment with a polygraph are ethically and legally acceptable, then there is reason to assume that the use of novel and more accurate forms of brain-reading for the same purposes would be acceptable too. In fact, however, both the legal and the ethical status of non-consensual brain-reading are open to question.
20. From a legal perspective, the absence of valid consent will have significant implications. It is likely that non-consensual forensic brain-reading infringes the right to privacy pursuant to Article 8(1) ECHR.[20] Such infringements can potentially be justified under Article 8(2) ECHR, but only when they are proportionate to a legitimate aim and ‘in accordance with the law’, which requires an accessible and foreseeable legal basis in domestic law that guarantees adequate safeguards to those who could be subjected to non-consensual brain-reading. For example, under this criterion of Article 8(2) ECHR, the legal basis for brain-reading should indicate very clearly the conditions under which brain-reading may be deployed without consent, for which exact purposes, to whom, and for how long the obtained brain data will be stored.[21] These are not obstacles in principle, but they do require the law to be changed or specified before non-consensual brain-reading can be employed ‘in accordance with the law’ within the meaning of Article 8(2) ECHR.
21. Importantly, the absence of consent to brain-reading can have different legal and ethical implications for defendants and for convicted offenders. Unlike convicted offenders, defendants enjoy a privilege against self-incrimination under Article 6 ECHR. Where brain-reading requires the active cooperation of a defendant, its non-consensual use is likely to infringe this privilege.[22] Whereas the right to privacy under Article 8 ECHR is a qualified right – infringements can be justified under paragraph 2 – the privilege against self-incrimination is strict: public-interest concerns cannot justify measures that extinguish the very essence of the privilege.[23] Moreover, evidence obtained in violation of the privilege against self-incrimination must be excluded from the trial.[24]
22. From an ethical perspective, too, there may be important differences between defendants and convicted offenders. Convicted offenders may be regarded as having rendered themselves morally liable to non-consensual brain-reading, just as they are often regarded as having made themselves liable to detention and other restrictions on free movement. However, the same cannot be said about defendants. Thus, the non-consensual use of brain-reading technologies may be easier to justify in convicted offenders than in defendants.
23. Although it seems clear that non-consensual brain-reading in criminal justice will often infringe the right to privacy and, sometimes, the privilege against self-incrimination, it has been argued that these rights are in fact unable to offer robust legal protection against the use of forensic brain-reading without consent. This issue is considered in the following sections.
Privacy
24. The main ethical and legal issues raised by forensic brain-reading concern privacy. Does non-consensually subjecting a person to forensic brain-reading constitute an infringement of their (moral or legal) right to privacy? And if so, could this infringement nevertheless be justified?
25. From an ethical perspective, it can be expected that non-consensual brain-reading would be acceptable only if the infringement of privacy that it entails is proportionate to the expected benefit. Both the benefit and the seriousness of the privacy infringement depend on the accuracy of the technology, in the context in which it is used. A more accurate technology can be expected to have greater benefits. However, since it provides a more certain attribution of mental states, it may also constitute a more serious infringement of privacy.
26. Since accuracy depends on the specific features of a technology and the context and population in which it is used, brain-reading technologies will need to be ethically appraised on a case-by-case basis. However, there is no reason to suppose that the considerations bearing on this appraisal would differ from those used to assess extant, cruder techniques employed for similar purposes, such as the polygraph.
27. In law, Article 8 ECHR protects ‘personal data’: information that relates to an identified or identifiable individual. Brain-reading data clearly constitute personal data, as they relate to an identified or identifiable individual,[25] and are therefore protected by Article 8 ECHR.[26] The level of legal protection, however, depends strongly on the privacy sensitivity of the data obtained in an individual case.[27]
28. The sensitivity of data collected by present brain-reading applications depends on the exact test and technology that are used.[28]
29. For example, detecting recognition with EEG yields only a limited amount of information: it identifies a single brain response to specific pictures, which enables a determination of whether the subject recognises a particular set of images.
30. In our view, there are in principle no compelling reasons to consider the data obtained in the course of forensic brain-reading – such as information about a suspect’s recidivism risk, or about his deceptiveness about an alibi – to be special or more sensitive than other types of personal data gathered in the course of criminal justice, such as a person’s DNA,[29] which carries an enormous amount of highly sensitive information.[30]
31. Therefore, we assume that non-consensual forensic brain-reading could comply with human rights law in much the same way as traditional means of non-consensual data acquisition in criminal justice, such as DNA testing and fingerprinting.
32. However, it has been argued that the protection of ‘personal data’ offered by Article 8 ECHR is in fact unable to protect the personal interests that are at stake in the course of brain-reading. Although forensic brain-reading will not normally yield personal data that are special or highly sensitive in terms of data protection, it does provide access to the intimate, mental sphere of an individual.[31] Mental aspects of life, it is argued, should be given strong and specific legal protection.[32]
33. In order to offer adequate legal protection to people’s intimate mental spheres and to protect them against neurotechnologies such as brain-reading – both within and beyond the context of criminal justice – academics and policy-makers, including the Council of Europe and legislators in Chile and Spain, are now considering the development of novel rights that protect people’s brains and mental states, often referred to as ‘neurorights’.[33] One of the proposed rights is a right to ‘mental privacy’, which would protect people against illegitimate access to their brain information and prevent the indiscriminate leakage of brain data.[34]
34. In our view, the UK Government should consider whether the protection of personal data guaranteed under Article 8 ECHR suffices to address the issues raised by non-consensual forensic brain-reading. If it does, non-consensual brain-reading in criminal justice is likely to be permissible in some cases, as DNA testing and fingerprinting already are. If it does not, the broader question to be considered is how the law of England and Wales can guarantee effective protection of the privacy interests at stake in non-consensual brain-reading, for example by recognising a specific right to mental privacy.
The privilege against self-incrimination
35. According to the European Court of Human Rights (ECtHR): “The right to remain silent under police questioning and the privilege against self-incrimination are generally recognised international standards which lie at the heart of the notion of a fair procedure under Article 6 [ECHR].”[35]
36. In the course of brain-reading, defendants will not be required to speak. Therefore, the right to silence is unlikely to offer significant legal protection in this regard. However, present case law of the ECtHR suggests that obliging defendants to participate in brain-reading has the potential to breach the broader right not to incriminate oneself.[36]
37. According to the ECtHR, the rationale of the privilege against self-incrimination lies chiefly in (1) protecting the accused against improper compulsion and (2) avoiding miscarriages of justice, for example due to false confessions.[37] The first rationale relates historically to human dignity and to the prohibition of torture and ill-treatment as means of criminal investigation.[38] Providing suspects with a right not to incriminate themselves delegitimises the use of harmful methods to extract self-incriminating evidence.[39] This idea links closely to the second rationale: the use of coercion or even ill-treatment to elicit confessions from suspects entails a serious risk of producing unreliable evidence, and using such evidence in court increases the risk of miscarriages of justice.
38. Neither of these rationales seems compelling with respect to neurotechnological brain-reading. Brain scans and EEG are generally safe. Even if the law were to oblige a suspect to participate in a brain-reading assessment, it is unlikely that this would qualify as torture, inhuman or degrading treatment, or otherwise infringe human dignity. In general, non-consensual brain-reading will not involve ‘improper compulsion’ within the meaning of the first rationale.
39. Regarding the second rationale of avoiding miscarriages of justice: brain-reading does not increase the risk that suspects produce false information. In fact, by providing a more objective assessment of a person’s subjective mental states, brain-reading would, if anything, reduce the risk of unreliable testimony.
40. In the literature, a third rationale has been proposed to justify the privilege against self-incrimination: respecting the suspect’s autonomy in criminal proceedings.[40] Non-consensual brain-reading will indeed interfere with the suspect’s autonomous choice about how to defend himself against the charges. However, we believe that this is insufficient to justify a right against self-incrimination in the case of brain-reading. Autonomy is an important value. It is a central consideration when interpreting the rights and freedoms enshrined in the European Convention on Human Rights and a central notion in shaping criminal law. At the same time, in the context of criminal proceedings, restricting autonomy is often legitimate:[41] suspects can, for example, be required to provide breath for a breathalyser test,[42] to participate in a line-up,[43] or to give blood for DNA analysis.[44] Although these means of criminal investigation clearly restrict the autonomy of suspects and defendants, they are not precluded by the privilege against self-incrimination.[45] There is no clear reason to assume that this should be different with respect to forensic brain-reading.
41. In sum, in light of present ECtHR case law, non-consensual brain-reading may have the potential to violate the privilege against self-incrimination. However, the primary rationales for the privilege—preventing improper compulsion and avoiding miscarriages of justice—do not appear to apply to non-consensual brain-reading, and the proposed autonomy rationale alone does not appear sufficiently compelling to justify the strict legal protection offered by the right against self-incrimination. Therefore, there is no clear and coherent justification for regulating brain-reading under the privilege against self-incrimination.
Recommendations and guiding principles
42. Given the potentially far-reaching nature of forensic brain-reading (invading and disclosing private mental states) and the legal complexities it raises (e.g. regarding the privilege against self-incrimination), there is a need for the UK Government to consider the possibilities and limitations of brain-reading for the law of England and Wales.
43. First, the existing level of protection should be determined. This is a legal matter. Next, the UK Government should decide whether more protection is needed, for example through the development of specific legal protections for mental privacy.
44. There is no clear and coherent justification for regulating brain-reading under the right against self-incrimination.
45. Therefore, the ethical and legal acceptability of brain-reading in criminal justice should be considered chiefly under the head of privacy.
46. In assessing the ethical and legal acceptability of brain-reading under the head of privacy, the UK Government should reject ‘neuroexceptionalism’. That is, it should assess neurotechnologies for brain-reading in the same way that it would assess other technologies and techniques that are employed for similar purposes and have a similar impact on privacy. Depending on the form of brain-reading in question, these may include polygraph tests, DNA testing, blood alcohol testing, and fingerprinting.
47. Relatedly, the application of a technology, rather than the nature of the technique, should be paramount in determining the ethical and legal status of forensic brain-reading. For the most part, what matters is (1) how much and (2) what kind of information is obtained and (3) what use it is put to – not through which technology the information is obtained.
48. There is a need to determine whether, and with what qualifications, the commission of a serious crime can make one morally liable to non-consensual forensic brain-reading.
3 September 2021
[2] Meynen 2020; Delfin et al. 2019; Meijer et al. 2016; Farah et al. 2014.
[6] Committee on Bioethics 2020-2025.
[7] Organisation for Economic Co-operation and Development 2019.
[8] Catley & Claydon 2015. This especially concerned the results of electroencephalography (EEG), magnetic resonance imaging (MRI), and computed tomography (CT).
[9] Farahany 2015; De Kogel & Westgeest 2015; Hafner 2019; Ferrua 2020; Alimardani & Chin 2019; Chandler 2015.
[12] Delfin et al. 2019. Cf. Kiehl et al. 2018; Aharoni et al. 2013.
[13] Farah et al. 2014; Meijer et al. 2016.
[14] Farah et al. 2014, p. 124; Ganis 2018, p. 150-152.
[15] Meijer et al. 2014, p. 881.
[17] Aono, Yaffe & Kober 2019; Shen et al. 2017.
[18] Offender Management Act 2007, s. 28 (legislation.gov.uk).
[19] Szasz 2020. See also the video released by Essex Police.
[23] Cf. ECtHR, Bykov/Russia, § 93.
[24] Pitcher 2018, p. 39-59.
[28] Ligthart et al. 2020, p. 7.
[29] Meegan 2008; Rainey et al. 2020; Ligthart et al. 2020.
[30] ECtHR, S. & Marper/UK, § 72-76; ECtHR, Gaughran/UK, § 81.
[31] Goering et al. 2021; Ienca & Andorno 2017, p. 15.
[32] Ienca & Andorno 2017, p. 14.
[33] Committee on Bioethics, Council of Europe 2020-2025, p. 9; Ienca & Andorno 2017; Ligthart et al. 2020; La Tercera 2020; Nadal 2020.
[34] Ienca & Andorno 2017, p. 15.
[35] ECtHR, Ibrahim and others/UK, § 266.
[36] Johnston 2016; Ligthart et al. 2020.
[37] ECtHR, Ibrahim and others/UK, § 266.
[38] Waldron 2005, p. 1731; Alschuler 1997, p. 190-192.
[39] Dissenting opinion of Judges Martens and Kuris in ECtHR, Saunders/UK, § 9.
[40] Choo 2013, p. 8; Harris et al. 2018, p. 423.
[41] Galligan 1988, p. 87-88; Oshana 2003, p. 112-114; Jackson & Summers 2012, p. 269.
[42] ECtHR, Tirado Ortiz and Lozano Martin/Spain.
[43] ECtHR, Laska and Lika/Albania.
[45] ECtHR, Saunders/UK, § 69.