Written evidence: report of Roundtable organized by LSE, Department of Media and Communications (OSB0247)
Roundtable on the Draft Online Safety Bill
Report[1]
Wednesday, November 3, 2021, 2:00 PM – 4:00 PM
Event Chairs: Damian Collins MP (Select Committee), Dr Damian Tambini (LSE)
Participants: Dr Rachael Craufurd Smith (Edinburgh Law School), Dr Jennifer Cobbe (University of Cambridge), Dr David Erdos (University of Cambridge), Lulu Freemont (Tech UK), Dr Nicholas Hoggard (Law Commission), Dr Brent Mittelstadt (University of Oxford), Dr Martin Moore (University College London), Professor Andrew Przybylski (University of Oxford), Professor Jacob Rowbottom (University of Oxford), Pia Sarma (Times Newspapers), Reema Selhi (DACS), Ruth Smeeth (Index on Censorship), Michelle Stanistreet (National Union of Journalists), Michael Tunks (Internet Watch Foundation), Rebecca Vincent (Reporters Without Borders)
Committee Members: Lord Black of Brentwood, Lord Clement-Jones CBE, Lord Gilbert of Panteg, Darren Jones MP, Baroness Beeban Kidron OBE, Dean Russell MP, Lord Wilf Stevenson
Observers: Andrea Dowsett (Committee Clerk), Chloe Grant (Ofcom), Elizabeth Holloway (Ofcom), Zoe Hayes, Jacquie Hughes (Committee Specialist Adviser), Jilian Luke (House of Commons Scrutiny Unit), Katie Morris (DCMS), Bruce Sinclair, Anna Sophie, Hannah Stewart, Becky Woods
Key Findings
Potential amendments to the Draft OSB suggested by participants
Amendment proposal | Relevant Clause(s)/sections | Raised by |
Tighten up wording to avoid legal challenges | Clause 13 (3) | Dr Rachael Craufurd Smith |
Two-tiered system of speech could have unintended consequences, might protect harm rather than specific speech categories | Clause 13 (6) (b) | Ruth Smeeth, Dr Damian Tambini |
Change wording; Clause 14 still seems to be grounded in the logic of legacy media; is ambiguous as to its principal scope of protection | Clause 14 | Dr David Erdos |
Remove clause; powers of Secretary of State to call in review are problematic; will pose risks and create uncertainty for a diverse set of businesses and undermine public trust | Clause 33 | Professor Jacob Rowbottom, Lulu Freemont |
Introduce an expedited appeals process for election-related content | Clauses 13-14 | Professor Jacob Rowbottom |
Extend and substantiate the regulator’s obligations to consult with the broader public on critical segments of the Bill; change wording of ‘consult’ to make it stronger | Clauses 100/101 | Professor Lee Edwards |
Main discussion themes
Panel 1 – Freedom of expression (Chair: Damian Collins MP)
Illegal content vs legal but harmful content
Damian Collins MP started the discussion by asking participants whether the regulator should focus only on offences in law or whether it should be able to make subjective judgments on content deemed “legal but harmful.”
Professor Jacob Rowbottom pointed out that basing the regulator’s enforcement powers on what is or is not legal would require the regulator to translate legal standards into an online context. The regulator would therefore have to publish highly detailed codes on what is deemed to be unlawful content, which would move its scope beyond the types of content that would typically be looked at by the police or the courts. Ultimately, the purpose of a regulator is often to address those problems where the law is too blunt a tool. This, Rowbottom said, is not a bad thing provided the right checks and balances are in place. Still, he stated that the distinction between illegal and legal but harmful content is less helpful in an online context, which is one of the critical challenges that the Bill in its current form might face.
Nicholas Hoggard noted that the Law Commission’s position was that the criminal law was quite a blunt and expensive tool, unsuitable for forming the basis of a regulatory regime. The criminal law can be slow and cumbersome, mainly because it is reserved for the most culpable behaviour. As a result, a regulatory regime that merely mirrors criminal law would be very narrow. The language used in the Bill to define ‘illegal content’ (Clause 41) currently mirrors criminal law and therefore creates the conditions for this kind of narrow regulatory regime.
Hoggard went on to state that criminal law has a very high threshold, i.e. ‘reasonable grounds to believe’. If the regulatory regime mirrored this, platforms would be asked to work out, with little information, whether there are reasonable grounds to believe that a piece of content is illegal. He drew on an example from criminal law to highlight the problems that arise when attempting to define objectively what might constitute ‘harmful’. He stated that the Law Commission ran into problems when trying to define gross offensiveness due to the intensely subjective nature of determining standards of moral appropriateness. He suggested that it might be more helpful to focus on the harm caused by content instead of creating objective regulatory standards for what is a subjective evaluation.
Dr Rachael Craufurd Smith agreed that asking platforms to take on regulatory responsibilities related only to illegal content would prove difficult, primarily due to the difficulty of proving intent in an online environment. She also stressed the need for clarity on the types of harms within scope, asking whether the definition of harm was broad enough and whether it should be widened to capture the collective social harms that can be caused by content online.
However, several participants raised concerns over the idea that a regulatory regime aimed at intervening in matters of speech would be able to go beyond the bounds of criminal law. Ruth Smeeth highlighted what she referred to as the unintended consequences of this legislation. One such consequence could be outsourcing what is or is not appropriate speech to either a regulator or to Silicon Valley, as opposed to Parliament.
Professor Andrew Przybylski argued that providing the regulator with powers to enforce beyond the law could be a “profound mistake” because we do not currently possess sufficient knowledge about the effects these technologies have on society. His remark echoed a point made by Smeeth that we are talking about regulating culture without thinking about the changes that such regulation might make to culture. Przybylski suggested that our knowledge of this area could be increased by facilitating ways in which the regulator or independent researchers might access data from the platforms (see below). Brent Mittelstadt echoed this view, stating that we are attempting to do evidence-based policymaking without sufficient data on the efficacy of different content moderation techniques.
Dr David Erdos agreed that the focus of the legislation should be on illegal content. He said it was profoundly dangerous to get into the realm of private actors determining things that are divorced from the legality threshold. However, Erdos recognised that the aim of applying the law fairly and impartially at scale was a very challenging one. Simply choosing the legal route by default is not a plausible solution due to the scale of the issue that needs to be tackled.
Pia Sarma stressed that there was a need for platforms to take greater action in preventing the spread of illegal content online. However, she said that giving platforms the power to make nuanced decisions about what constitutes harmful content was almost as damaging as the spread of harmful or illegal content itself when considered in the context of freedom of expression. Sarma went on to say that it would be difficult to set in stone what is regarded as harmful content.
Responding to previous speakers, Dean Russell wondered whether freedom of speech was the right way to frame the issue at stake. He suggested that the question was perhaps less about what people can or cannot say online and more about how algorithms decide how certain types of speech and content are shared and amplified. In that sense, ‘freedom of reach’ might be the more appropriate concept for grasping communicative dynamics online.
Lulu Freemont expressed general support for the objectives of the Bill but stated that businesses deemed to be within the scope of the legislation needed more clarity on the definition of harm and the scope of harms. Freemont proposed the creation of an independent committee with evidence-led processes for determining harm to address this lack of clarity.
Concluding the first panel, Dr Damian Tambini questioned whether the platforms themselves wanted the power to determine what is considered harmful online. He pointed out that platforms want to know what society considers harmful so that they can point to criteria outside their own discretion. If the Bill works as intended, then greater transparency around how each platform defines harmful content should allow users to make more informed decisions about which platforms they choose to use. However, Tambini noted that issues around user lock-in might prevent users from being able to switch away from the larger platforms.
Panel 2 – Exemptions, effective regulation, trust & transparency (Chair: Dr Damian Tambini)
Protections & exemptions
Dr Damian Tambini began the discussion by asking participants to comment on categories of services and content that enjoy special protection in the Draft OSB, such as content of democratic importance and journalistic content.
Participants made several proposals with regard to Clauses 12-14 of the Bill. Professor Jacob Rowbottom remarked that the definition of ‘content of democratic importance’, i.e. content that is or appears to be ‘specifically intended to contribute to democratic political debate’ in the UK (Clause 13), should be broadened to reflect the content receiving heightened protection under Article 10 of the ECHR. It would then include a wider range of public interest content (such as scrutiny of large private actors), although Rowbottom questioned whether such a definition might prove too broad. He also asserted that the exemption was, in fact, more of an affirmative duty imposed on Category 1 services. Furthermore, Rowbottom proposed introducing an expedited appeals process for election-related content, similar to the provisions for journalistic content contained in Clause 14.
Dr Rachael Craufurd Smith called into question Clause 13 (3) of the Draft OSB. The scope of the formulation ‘diversity of political opinion’ is somewhat unclear, as are the obligations that follow for companies. Legal challenges could arise if individual groups and organisations feel disadvantaged.
Ruth Smeeth expressed concerns about the categories of people in Clause 13, warning that there was a danger of creating a complicated two-tier categorisation of speech. She gave examples of individuals running for political office and questioned the extent to which the Bill would unfairly protect their speech. Clause 13 (6) (b) in particular affords some people much more power than others and could end up protecting harm rather than the specific categories of speech it is intended to protect.
Replying to Smeeth, Dr David Erdos said that he was more concerned about Clause 14 and related clauses (39-40). He opined that the wording of certain sections (such as the definition of ‘recognised news publisher’, cf. Clause 40) mirrors an understanding of journalism that was dominant in the era of web 1.0, with disproportionate institutional privileges for established media outlets. Moreover, under current provisions, material that is not necessarily subject to editorial control (comments, discussion forums etc.) falls outside the scope of regulation, which could prove problematic. Erdos questioned the overall usefulness of protection for specific types of actors rather than publication purposes or functions.
Echoing previous remarks, Dr Martin Moore expressed dissatisfaction with the Draft OSB’s overall lack of definitional clarity and ambiguity regarding the purpose of protection. It is unclear whether the Bill’s purpose is to protect institutions, individuals, or particular types of content, and with what aims. He gave examples of cases blurring the lines, such as hyperlocal publishing by an individual or journalists publishing on social media platforms. In its current form, the framing of exemptions lacks clarity and depth of thought.
Speaking for Reporters Without Borders, Rebecca Vincent stated that she had mixed feelings about the framing of exemptions. Like previous speakers, she emphasised the need to rethink the scope of protections and focus on publications in the public interest rather than content of democratic importance, which would include NGOs. Moreover, the definition of ‘recognised news publisher’ leaves room for application to outlets with a clear political slant and questionable relationship to false and misleading information, such as Russia Today or CGTN. She also highlighted the risk of the current wording of Clause 40 forcing unregulated media towards the state-approved regulator.
On the question of Article 10 ECHR, Nicholas Hoggard remarked that the wording of the article was intended to cover more than just political debate and to encompass contributions to matters of public interest more broadly. He also speculated about the legal meaning of democratic and non-democratic political debate. In a later comment, Hoggard said that the distinction between freedom of speech and freedom of reach has a certain purchase but might create problems regarding people’s rights under Article 10, for example where one perspective is minimised vis-à-vis another during an online discussion. However, Hoggard also said the application of ECHR principles would be highly context-dependent.
Pia Sarma’s primary concerns were the breadth of the Bill, the incentives for platforms to avoid penalties, and the overall lack of transparency. She wanted to avoid a situation in which platforms have the unchallenged power to strip out information that is useful to society. Sarma also echoed prior remarks by other participants about ‘political debate’ being broader than what the Draft OSB currently addresses. She reiterated that although journalism is and has always been a difficult concept to define, its contribution to public debate is a hallmark of the practice. Hence, if the Bill’s aim is to limit platform power, there must be a case for (institutional) exemptions, but this should be accompanied by a wider debate about their nature and scope.
Ensuring regulatory independence, public involvement & trust
Prof. Rowbottom stressed that regulatory effectiveness would be determined mainly by the codes of practice devised by Ofcom. Regarding the role of the Secretary of State, Rowbottom opined that Clause 33 (1)(a), which gives the Secretary powers to direct the regulator to modify codes of practice, should be removed or a different process for modification introduced.
Speaking for Tech UK, Lulu Freemont reiterated Professor Rowbottom’s call to delete Clause 33, as it creates an open door for the regulatory regime to be overturned by government intervention, which creates considerable uncertainty for a diverse set of businesses across the UK.
Michael Tunks pointed out the need for more clarity regarding co-designation and for consultation with industry experts, Parliament, and the tech industry itself. Reema Selhi highlighted the lessons to be learned from the intellectual property sector, where many rights holders have found the online marketplace to contain a great deal of harmful content that infringes on intellectual property rights.
Touching on the issue of trust, Professor Lee Edwards contended that beyond questions of foreseeability and clarity for businesses, it is vital that the new regulatory regime is legitimate in the eyes of the broader citizenry. She called for an amendment that would extend and substantiate the regulator’s obligations to regularly consult with the public about critical factors like definitions of harm or categories of service providers.
Dr Martin Moore stated that one consequence of the new regime would be increased intervention by platforms, which necessitates clarity from platforms and Ofcom regarding how decisions are made. However, there is a lack of provisions for control and scrutiny by independent researchers beyond Clause 101, which places a duty on Ofcom to prepare a report within two years on the conditions under which researchers can access platform data. Moore highlighted that public trust and legitimacy are a direct function of the degree to which the Bill allows for genuine independent research into the platforms themselves and the effectiveness of regulation. His comments were echoed by Professor Edwards, who explained that access to company data is currently severely limited unless the research is paid for by the platforms themselves, which compromises the independence of the scientific process.
Summing up, Dr Tambini asked whether introducing a new scrutiny committee that monitors research and potentially accesses platform data for research purposes was a way of dealing with these cross-cutting issues and addressing the monitoring function of the regulatory regime.
In this regard, Lulu Freemont suggested that a scrutiny committee could assess as yet unidentified types of harm, provide guidance regarding the changing nature of online harms, and scrutinise the efficacy of the regulatory regime on a continuous basis.
Baroness Kidron questioned the usefulness of ‘harmful content’ as a concept and asked whether the focus should be broadened to include ‘harmful activity.’ She also wondered whether Parliament itself should be tasked with the oversight of Ofcom.
Responding to this point, Dr Erdos voiced concerns about groups looking for their particular concerns to be strongly vindicated through regulation and law. Many of the things going wrong online, he argued, have to do with echo-chamber understandings of what constitutes racism or sexism. According to Erdos, these are not questions that the law can resolve for society, and the law should not impose on society definitions that form organically.
Cross-cutting themes
Business models and algorithmic design
As in the previous roundtable, algorithmic design and business incentives figured prominently among the points raised by participants. Several speakers pointed out that two distinct issues were being discussed: what constitutes harmful content and how harmful content is amplified online. There was a broad consensus that addressing amplification requires greater transparency around the development and impact of the algorithms used by the tech platforms.
Dr Jennifer Cobbe argued that the Draft Bill is far too limited to regulate the algorithmic systems underlying online content circulation. Instead of regulating platforms and how they amplify harmful content algorithmically, the Bill mostly tries, indirectly, to regulate what ordinary people can say and do, resulting in an institutionalisation of platforms as a form of “speech police overseen by a regulator.” Ultimately, Cobbe argued that the Bill wrongly targets ordinary people’s actions online instead of focusing on how platforms disseminate and amplify content, which would result in outsourcing tremendous powers to companies. Damian Collins pointed out that recommender systems fall within the scope of the Bill, but Cobbe reiterated that what has been included is nowhere near sufficient. She suggested that different types of intervention are required, including radical transparency around how algorithms are developed and used in practice. She also stated that platforms which show themselves systematically incapable of reducing the dissemination of harmful content should face serious consequences, including the option of preventing them from using recommender algorithms at all. Pia Sarma also highlighted the risk of over-censorship and of information of wide interest to society being taken down by platforms.
Baroness Kidron agreed that there was a need for greater algorithmic oversight and raised the question of how we should set our sights on achieving this. Kidron asked whether we should consider creating a publishers’ code for recommendation loops or a form of editorial responsibility that is subject to the law.
Access to data by independent researchers & civil society
Several speakers stressed the need for access to platform data by independent auditors and researchers. Dr Martin Moore emphasised the need for genuine independent research and scrutiny, which is lacking under current provisions. Professor Lee Edwards pointed out that researchers are presently only granted limited access to platform data unless platforms are funding research, which compromises the integrity of the scientific research process.
Professor Andrew Przybylski raised the importance of allowing for a scientific correction mechanism in evaluating how we regulate online content. Part of this, Przybylski said, involves providing a meaningful way for citizens to donate their data either to Ofcom or to independent researchers. Such a mechanism would allow society to gain more insight into the impact online systems have on citizens’ wellbeing and therefore to make more informed decisions on how to regulate the online space. Przybylski also stated that allowing the regulator to go beyond what is written in law could be a “grave mistake” due to our current lack of insight into what we are regulating.
Ruth Smeeth suggested establishing a legal framework for a “digital evidence locker”. This would allow civil society to determine whether any over-censorship is taking place. She pointed out that there could be reasons why Parliament, journalists or security services would want to access some of the content that had been taken down. However, in the Bill's current version, this would mean asking platforms to hold onto and store illegal content when they do not currently possess the legal means to do so.
Annex I: audio transcript [on request]
Annex II: chat transcript
14:28:32 From Damian Tambini : Can I invite participants to mention specific sections or suggested amendments where possible. This will help us prepare the report. thanks.
14:58:47 From Damian Tambini : On these points about harmful algorithms/ recommendation rather than content: the bill refers repeatedly to "harmful content". Should it refer to "harmful services"?
15:00:23 From Stephen Gilbert : chair my virtual hand is broken. Can I come in
15:01:42 From Beeban Kidron : @Tambini the white paper talked of harmful content and activity. In the draft bill the word activity was left out.
15:02:02 From Jennifer Cobbe : Damian - I think it's difficult to define certain services as harmful or not, but there are certain practices by platforms - such as their use of recommender algorithms to disseminate and amplify content, or certain design decisions that offer possibilities of acting in certain ways that might not otherwise be possible - that do bring well-documented risks of increasing the potential for harms to users. In my view, the law should focus more heavily on platforms' practices, and the systems and processes they use, rather than on trying to define certain content or services as "harmful"
15:05:40 From Jake Rowbottom : I think the issue of reach is relevant to the question of proportionality - ie it may be more proportionate to stop content getting a high profile than forbidding outright. But there are free speech issues with both.
15:12:01 From Stephen Gilbert : I think Jake is right about ‘reach’. There are free speech issues with platforms restricting the reach of content given that they are monopolies and in reducing the reach of content they are in a very powerful position but there are some mitigations that could be part of the answer and the bit that is missing from this is the application of competition policy to give much more power to users to determine their own online environment
15:30:20 From Damian Collins : There is a vote in the Commons now, but we’ll hopefully be back shortly
15:31:50 From Alfie : Journalism has always been regarded as privileged speech and protected;some witnesses have suggested a broader definition of journalism to be content created ‘in the public interest’
15:41:17 From Rachael Craufurd Smith : If this is to do with promotion and findability then I think this could be expressed more clearly.
15:57:15 From Rachael Craufurd Smith : There is also a question about the link with the prosecution service and the contexts in which information - eg publisher details should be released and processes. This is of relevance for the issue of anonymity.
15:59:12 From Lee Edwards : @Rachael yes this came up in last week’s discussion on anonymity as well, as an area that needs consideration.
16:12:18 From David Erdos : Thank you very much to Damian as well for all your work!
16:12:26 From Beeban Kidron : Thank you all
16:12:38 From Tim Clement-Jones : Many thankis all. Much appreciated.
Annex III: follow-up email by Professor Jacob Rowbottom
From: Jacob Rowbottom jacob.rowbottom@univ.ox.ac.uk
Date: 3 November 2021 at 16:37:56 GMT
To: "Tambini,D" D.Tambini@lse.ac.uk
Subject: follow up
Dear Damian,
Thanks for organising the seminar.
Just to clarify on some of the points made in the second hour, in case it helps with the summary (most of these points I put in the evidence to the Joint Committee: https://committees.parliament.uk/writtenevidence/39290/html/)
On the definition of content of democratic importance - ‘Content of democratic importance’ is defined as that which appears or is ‘specifically intended to contribute to democratic political debate’ in the UK. I think this could be broadened to reflect the content receiving heightened protection under Art 10 of ECHR, or at least clarified to ensure that a wider range of content (such as scrutiny of large private actors) is covered.
On the elections point – an expedited complaints mechanism (similar to that in clause 14) for electoral content should be considered. This could arise where a person wishes to challenge a decision to remove content during an election campaign.
On the minister’s powers - clause 33(1)(a) gives the Secretary of State the power to direct the regulator to modify a submitted code of practice to ensure that it ‘reflects government policy’. I think the clause should be removed (or at least provide a different process for modification).
Best wishes,
Jake
2 December 2021
[1] This report was prepared by LSE researchers Jacob Angeli and George Gangar. It represents a summary of issues discussed, not a consensus.