Written evidence submitted by Dr Dimitris Xenos (OSB0157)
Certain parts of the Online Safety Bill 2021 (hereinafter OSB 2021) are identified as areas that require more detailed examination and renewed focus. Due attention is needed in relation to the concept of duty of care, which belongs to the system of civil liability but is now presented and imposed as a general approach for every harm – an approach that sits uneasily with serious types of online harm for which criminal, not civil, liability is required, and for which only the Crown Prosecution Service, and not the non-relevant regulator OFCOM that is currently being pushed, can be involved. In addition, the Bill’s all-in-one-basket approach bypasses the requirement for an in-depth examination and evaluation of the negative impact of serious online harms, as it relies only on some basic statistical information in its White Paper and on too much technical legal information from European case-law, even in contexts where the country’s moral standards can never be challenged. Lastly, there is a possibility of legal challenges and continuous resistance regarding the arbitrary (over/under-)classification of various types of harm, which could undermine the whole effort.
a) The concept of duty of care
The Bill certainly aims high and is an ambitious effort to tackle known problems of online harms and the direct or indirect responsibility of key mass communication platforms, such as internet service providers, search engines and social media organisations (hereinafter internet platforms).
The Bill is largely based on the Online Harms White Paper (2019), which I understand was influenced by the study/proposal of the well-known and very respectable Professor of Law, Lorna Woods, and the Carnegie UK Trust trustee, William Perrin, entitled ‘Online harm reduction – a statutory duty of care and regulator’ (2019) (hereinafter the Carnegie UK report).
The new concept/approach of the duty of care that is introduced has largely been based on the practice of the Health and Safety at Work etc. Act 1974 and the Occupiers’ Liability Acts.
It should be noted that the concept of duty of care (also as a condition of liability) is not self-standing and does not come first in order of priority. It is the notion and scope of harm that comes first and to which the duty of care relates. Although it is possible to have different approaches to and determinations (e.g. liability thresholds) of the relevant duties and their monitoring, both ex ante and ex post (i.e. to prevent the publication of harmful content and/or to remove it after it has been published), according to the different types of online harms that are targeted, the fact remains that some harms are very serious.
In civil liability, to which the notion of duty of care relates, harm and the victim’s access to a remedy are personal to the victim’s suffering. A good number of the harms that the OSB 2021 targets do not involve personal harm to individual victims, and, in general, ex ante civil liability is either rare or non-existent.
Where serious harm can be appreciated and ex ante (preventive) liability is involved (i.e. where an actual or potential individual victim cannot be identified), the law employs criminal law for its design. The design of what is or ought to be criminal has flexibilities that civil liability and the duty of care concept do not possess. However, for the criminalisation of certain behaviours and acts, criminal law requires a relatively higher degree of harm and public interest.
For example, extreme pornography and child pornography are an established part of criminal law. If the duty of care terminology is employed, in which a mild regulator, OFCOM, will be involved (as opposed to the Crown Prosecution Service), it will first distort the basic concepts of civil liability referred to above and will lead to softer responses (some fines for the super-rich internet platforms), rather than targeting specific persons (i.e. the directors of these mega-corporations) for allowing very harmful content (e.g. extreme and child pornography) on their platforms in circumstances where they have previously been notified of the existence of such materials and have failed to check and remove them. In other words, serious harms that are or ought to be classified as criminal, for which a criminal remedy will be required (in order to have the requisite deterrent effect), will be downgraded to civil liability, where some fines are imposed and some annual reports are produced by relaxed regulators under the all-encompassing legal framework that characterises the OSB 2021 (i.e. all things in one basket). Thus, the regulator may do some great work in some harm areas, at the expense of a more focused approach in which criminal matters are dealt with by criminal law and criminal law enforcement authorities. Is it seriously suggested that the Crown Prosecution Service can ever be replaced in critical criminal areas by OFCOM, the soft regulator of commercial broadcasting content?
Another implication of using civil liability notions of duty of care for serious criminal content is that it will close down the debate about what constitutes serious online harm that can be criminalised. In recent years, we have seen the expansion of the categories of pornography that can be classified as extreme or child pornography, following international law developments and the campaigns of serious civil groups and academics, which correspondingly expand the scope of operation of the Crown Prosecution Service. If criminal liability and criminalisation debates are replaced by relaxed legal notions of duty of care and an all-things-in-one-basket regulator, a new permanent system and practice will emerge that will virtually overshadow the more specific focus that some serious online harms need. In particular, although there has been some progress in expanding criminal liability to new types of extreme pornography, it comes as a surprise that realistic (i.e. real and simulated) depictions of rape took almost 20 years to be classified as extreme pornography and thus to be regulated by criminal law and prosecuted by the Crown Prosecution Service. Yet the applicable criminal framework remains largely ineffective, as it targets ‘possession’ of such extreme pornography and not its production and distribution, including publication. So, instead of debating a new Bill that could provide a robust criminal framework to tackle extreme pornography, we are having an all-in-one-basket debate about a more relaxed approach of civil liability (duty of care) without even the provision of a civil liability remedy (which is now replaced by the softer operation of a regulator).
The call for evidence also asks whether there are any types of content that are omitted from the scope of the Bill. I have checked the White Paper, the OSB 2021 and the Carnegie UK report, and I do not see any reference to torture. Depictions of this type of content are particularly disturbing and perverse and should be tackled in the same way as extreme pornography. As rape amounts to torture and indecent or inhuman treatment, and recent developments in criminal law have been made (referred to above), there is no reason to treat torture (and excessive torture) in a different manner. This point can be illustrated with the example of the well-known TV series Game of Thrones, which various critics have labelled pornography or soft pornography. Indeed, the series included simulated depictions of sex lasting a few seconds, occurring occasionally in some episodes across a few seasons. Although sex is part of the human experience, torture is not, and the prohibition of torture and inhuman or degrading treatment is a fundamental international human right that is protected in absolute terms (i.e. an absolute prohibition), even in wartime. Season 3 of the series contains prolonged depictions of simulated torture (including male genital mutilation) involving the character Theon, extending to psychological breakdown and the gradual development of Stockholm syndrome. As this is sick and perverse imagery, and prolonged as well, how does it escape public scrutiny when compared with the very short depictions of simulated sex scenes?
b) The use and choice of regulator
In the previous section, I criticised the all-in-one-basket approach of the OSB 2021 because certain serious online harms are criminal or require additional criminalisation and, therefore, should not be placed under a general, civil liability duty of care. In this regard, they should have a criminal regulator operating under the established norms of criminal liability, which should extend expressly to internet platforms for hosting/publishing content that is prohibited by criminal law and of which they are expressly aware or ought to be aware (through a failure to develop and implement a defence mechanism that can reasonably detect and remove such criminal content).
In this regard, it goes without saying that it is the Crown Prosecution Service that should monitor and enforce compliance with criminal offences.
In this respect, also, the proposal of the Internet Watch Foundation (IWF) for co-operation with OFCOM seems irrelevant, as the latter is a soft regulator and has not been involved in the monitoring or enforcement of criminal offences. As the IWF operates in the area of child pornography, it is a more appropriate organisation than OFCOM, a clear fact that further exposes the all-in-one-basket approach of the current Bill.
As a general comment about OFCOM, it should be said that the Office is a soft moderator operating in the easy context of broadcasting regulation, in what is virtually an oligopoly sector controlled by a few powerful corporations that dominate broadcasting programmes on a massive scale. For films and TV series, it largely relies on the standards of the British Board of Film Classification (BBFC). It is relevant to ask whether we have seen OFCOM criticising the BBFC for the constant erosion of age classification of audio-visual content that the latter has long been practising. To return to the example of season 3 of the US TV series Game of Thrones, the current BBFC page describes the series only as containing ‘very strong language’, while both season 3 and the extreme torture episode referred to above attracted only a 15 age rating. There is also an R18 category for ‘explicit works of consenting sex or strong fetish material involving adults’ that can only be ‘shown to adults in specially licensed cinemas’. So films like Law Abiding Citizen, a film that can easily be characterised as a festival or visual encyclopaedia of torture, have lawfully been shown in cinemas and streamed on Netflix, although its content is much more harmful than the few occasional sex scenes of some seconds in the Game of Thrones series. Of course, OFCOM is fully aware that the BBFC is the regulator that the film industry set up to regulate itself and, hence, is not an independent body.
The issue of regulators should also take into account the growing phenomenon of the so-called independent regulators who align themselves with the interests of the relevant industries, a practice that is known as ‘regulatory capture’.
A brief definition of the relevant terms has been given, inter alia, by D. Carpenter and D. A. Moss, Preventing Regulatory Capture (CUP (The Tobin Project), 2014), and needs careful consideration:
‘Regulatory capture is the result or process by which regulation, in law or application, is consistently or repeatedly directed away from the public interest and toward the interests of the regulated industry, by the intent and action of the industry itself.’ (p. 13)
‘… Weak capture, by contrast, occurs when special interest influence compromises the capacity of regulation to enhance the public interest, but the public is still being served by regulation, relative to the baseline of no regulation. In other words, weak capture prevails when the net social benefits of regulation are diminished as a result of special interest influence, but remain positive overall.’ (p. 12)
Whether one applies the general definition of regulatory capture or the lighter version describing a compromised but still operational function to the BBFC and OFCOM, the fact remains that some types of online harm are very serious and need a more robust monitoring regulator and enforcement mechanism. In this respect, some types of harm, especially those relating to extreme and child pornography, torture and serious violence, should be organised under a different regulatory framework with different legal obligations (criminal liability) and more robust monitoring bodies, such as the Crown Prosecution Service.
The other point, which has circulated in various online comments criticising the involvement and oversight of state authorities and ministers, cannot be of universal application. For the serious harms referred to above, the determination of morals and of what is harmful is not exactly an occasion for an ‘independent’ body (independent as an official label rather than in actual function) but for a political/institutional one that feels the pressure and demands of society – a democratic operation that provides the requisite flexibility for changes of views and practices.
c) The one-size-fits-all approach
This point has been covered above under a similar terminology, the all-in-one-basket approach.
Some additions can be made. The White Paper on which the OSB 2021 is based said that consultation was sought as to ‘which enforcement powers the regulator should have at its disposal, particularly to ensure a level playing field between companies that have a legal presence in the UK, and those who operate entirely from overseas.’ The consideration of ‘a level playing field between companies’ is certainly relevant, but there are, of course, limits in relation to serious types of harm and the moral standards of British society. Although the current call for evidence expressly asks for comparative practices, the UK has different standards of freedom of speech, and especially of limitations to free speech for the serious harmful content referred to above. This is a reflection of society’s morals, its tradition, its educational policies and its vision about the well-being of its people. In this respect, the more relaxed approach practised in the US does not need to be considered in order to ensure ‘a level playing field between companies’ in relation to serious online harms.
Similarly, in the Carnegie UK report, there are numerous references to and detailed analysis of EU and ECHR case-law. There are two key points that need to be mentioned.
First, there is too much discussion of technical legal information when the basic concepts and scope of some types of harm, such as extreme pornography, have not been covered with reference to (non-legal) studies and reports about the impact of that type of harm on children and on the development of their personality and relationships with others. Do we need the opinion of a lawyer only, or that of an expert body of psychologists? In a recent report, OFCOM presented statistics that included various tables showing what children think about the effect of pornography on them. Is this the correct approach?
It seems to me that the OSB 2021 lacks a serious impact assessment study regarding the negative effect that serious harmful content, such as pornography and extreme pornography, has on children and on society in general – an effect that is currently emerging from massive exposure to unlimited harmful content.
For this purpose, a relevant passage of such a study can be included here:
Even if pornography can become addictive, the question remains for some, can it be harmful? The content of the most popular pornography currently consumed does appear to overwhelmingly portray aggression toward women (Bridges, Wosnitzer, Scharrer, Chyng, & Liberman, 2010), and, in homosexual pornography, men (Kendall, 2007). The Hald meta-analysis supports the premise that pornography does indeed increase attitudes of aggression toward women (Hald, Malamuth, & Yuen, 2010), as does the paper from Foubert and colleagues (Foubert, Brosi, & Bannon, 2011).
… potential harms to an individual’s mental and emotional well-being are never discussed. Since these young people, through the brain’s mirror systems, ‘resonate with the motivational state of individuals depicted’ in these films (Mouras et al., 2008), the aggression increasingly inherent in pornography may portend negative emotional, cultural, and demographic effects. (p. 6)
The need for a scientific approach for the assessment of serious harms and the impact on children further justifies a more focused and separate legal framework for serious online harms – as opposed to the current plan that squeezes everything in one basket for a quick fix.
Second, one of the biggest contributions of the UK’s jurisprudence to the development of international human rights law is seen in the case of Handyside v United Kingdom (1979–80) 1 EHRR 737, which concerned the Attorney-General’s banning order against a Danish educational book that also covered the issue of sex. The material was found not to fit with the UK’s moral standards, and publication was only allowed with necessary revisions of some parts of the book. The publisher’s free speech arguments failed, as the ECtHR accepted that ‘it is not possible to find . . . a uniform European concept of morals’ (at 753).
As a result, for matters of serious online harms that concern the well-being and healthy mental development of children, there is no reason to consider ‘a level playing field’ with reference to the practices of other countries; rather, we need to look at the traditions and moral standards that are needed in the UK – a question that is for elected politicians and Parliament (as opposed to unconcerned, unqualified, industry-friendly regulators) to determine and renew accordingly.
What is also important to note is that the government’s arguments in Handyside were mostly non-legal, despite the fact that there was a legal case that the government had to defend. The government presented a report explaining how the various views expressed in the Danish book were approached with method and rigour by its own educational system, expanding to wider evaluations and views regarding intimate relationships. This is the kind of non-legal, scientific argumentation and study that, regrettably, the White Paper on which the current Bill is based did not have.
d) Possible legal challenges
There is strong criticism of the OSB 2021 regarding its likely impact on political speech and I am sure the Committee’s call will be answered by various media organisations, NGOs, and other colleagues.
It is worth citing the comment of the well-known Tory MP David Davis, who said:
‘The Online Safety Bill is a censor’s charter. Lobby groups will be able to push social networks to take down content they view as not politically correct, even though the content is legal. The idea we should force Silicon Valley companies to police Briton’s speech online, seems out of Orwell’s 1984, and is not what our voters expect of us.’
It is also a paradox to allow the creation of codes of practice targeting alleged misinformation or disinformation when the so-called ‘harms’ can hardly be defined as such, and especially when political demagogues and institutional misinformation or disinformation have long existed and been channelled by the very institutions that advocate such unprecedented control of political speech.
The all-in-one-basket approach also risks legal challenges regarding obvious violations of freedom of speech, which means that there can be a shift in focus to these issues and major media coverage, thereby bypassing other more worthy parts of the current Bill that do contribute to the effort against online harms. Such challenges can be initiated by civil society groups or even MPs (as Mr Davis did in the past when he subjected the Data Retention and Investigatory Powers Act 2014 to constitutional review, a challenge that turned out to be successful).
In sum, it might be better to remove certain types of harm from the Bill and have a more focused and manageable approach that can tackle some online harms, rather than attempt an all-encompassing framework with weak regulators and disputed notions of harm that can generate negative public reception and continuous legal challenges.
26 October 2021
 Optional Protocol to the Convention on the Rights of the Child on the sale of children, child prostitution and child pornography (OPSC); the Convention on Cybercrime of the Council of Europe (CETS No.185).
 Fiona Vera-Gray, Clare McGlynn, Ibad Kureshi, Kate Butterby, ‘Sexual violence as a sexual script in mainstream online pornography’ The British Journal of Criminology, Volume 61, Issue 5, September 2021, Pages 1243–1260, https://doi.org/10.1093/bjc/azab035
 D. L. Hilton, 'Pornography addiction a supranormal stimulus considered in the context of neuroplasticity' (2013) 3(1) Socioaffective Neuroscience & Psychology.
 https://www.daviddavismp.com/david-davis-mp-comments-on-the-online-saftey-bill/ (as published by the Daily Mail)
 I have spoken about this at a relevant event at the Institute of Advanced Legal Studies regarding similar efforts by the European Union, and my position is summarised, with reference to relevant links, here: https://infolawcentre.blogs.sas.ac.uk/2018/05/09/ilpc-seminar-eu-report-on-fake-news-and-online-disinformation-30th-april-2018/