Written evidence submitted by The Department of Media and Communications (Media@LSE) at the London School of Economics and Political Science (OSB0001)
Background
This document is submitted following the Online Safety Bill (OSB) Briefing organised by The Department of Media and Communications (Media@LSE) at the London School of Economics and Political Science. The briefing took place on Thursday 8 July 2021, 9:00am–1:30pm, as a virtual event over Zoom, and was attended by 108 people from a range of sectors. Following an introduction to the Bill by a representative from DCMS, three panel discussions addressed regulatory independence and the ‘Duty of Care’; collective harm, individual harm and exceptions; and the public and their responsibilities in the Bill.
Speakers included: Imran Ahmed (Center for Countering Digital Hate), Dr Edina Harbinja (Aston), Professor Sonia Livingstone OBE (LSE), Graham Smith (Bird & Bird), Dr Damian Tambini (LSE), Glen Tarman (Full Fact), Professor Lorna Woods (University of Essex), Richard Wronka (Ofcom), and a representative from DCMS’ Security and Online Harms Directorate, who introduced the Bill to attendees.
This submission summarises the key concerns about the Bill that arose during the briefing, for consideration by the Joint Committee during pre-legislative scrutiny. A full recording of the briefing can be viewed here, and we are happy to provide further information to the Committee or its staff if required.
Priorities for pre-legislative scrutiny
Several areas of concern were identified during the briefing for further consideration by the Committee. We order the points below according to the categories of questions that the Committee sets out in its call for evidence.
Objectives
- Online Harm: The Bill leaves definitions of what constitutes online harm open to subjective and political interpretation, exacerbating the risk of infringements on the right to freedom of speech. Harms that arise from the interaction between the distribution/amplification of content and the content itself, as a result of platforms prioritising attention and reach over rights and harm prevention, are not currently addressed in the Bill. Further clarification is needed to define harms and to ensure that all harms arising from digital infrastructures and business models are addressed in the Bill.
- The ‘Duty of Care’: Doubts were expressed over the suitability of a health and safety framework for policing and regulating speech; over the potential scale of the regulatory burden placed on companies, which could have negative consequences for freedom of speech and privacy; and over the Bill’s lack of attention to providers’ business models and their potential effect on the implementation of the Duty of Care. Further consideration should be given to how proportionality will be ensured, how tensions between the Duty and freedom of speech will be resolved through the Bill, and ways in which the Bill can support providers in prioritising the duty over their business imperatives, when required.
- Regulating for adults and children of “ordinary sensibilities”: There is a lack of clarity about how this legal term will be defined and applied in practice. It appears to be an attempt to introduce objectivity and create homogeneity among users, but it would be extremely challenging for Ofcom and/or private companies to define as a standard. Further clarification is needed to define the user and ‘ordinary sensibilities’.
- Journalistic / democratic content: There is a lack of clarity about the criteria for identifying these types of content, and about the minimum requirements for protecting such content. Further clarification is needed to define these terms and to set out what is expected of providers in protecting such content.
Content and services in scope
- Collective Harm: As noted in the call for evidence, the Bill does not address societal/collective harms. Briefing participants were concerned that by prioritising specific harms to individuals, the Bill underplays the key question of who has power in the digital world and the impact of that power on the potential for collective harm. Individual and collective harms are linked, and collective harms such as disinformation, radicalisation and manipulation of elections are some of the most prevalent and concerning forms of harm generated through online services. Consider re-introducing the mitigation of collective harms as part of the Bill’s objectives, and regulating systems and processes that permit the production and circulation of mis/disinformation within the Bill.
- Advertising content exemption: The exemption for advertising content leaves a loophole for potential fraud, abuse, or legal harm, e.g. via scams. Given the scale of the problem, and given that platforms generate significant revenue from the advertising that underpins these scams, further consideration should be given to including these types of online harm in the Bill.
The role of Ofcom and regulatory responsibility
- Independence of the regulator: The relationship between the Secretary of State and the regulator is a cause for concern, with panellists questioning the ability of the regulator to remain independent of the changing strategic priorities of different administrations. For example, the draft Bill specifically allows the Secretary of State to direct Ofcom to modify the codes of practice for platforms that it will be instructed to set out, as well as to give Ofcom guidance in other areas. In the context of freedom of expression and diversity of media, when and how such directions might be deployed are particularly sensitive issues. Further consideration should be given to the parameters for regulatory independence in the Bill and the separation of state and regulator. Consider reviewing the European Commission’s Digital Services Act, which sets out a very high standard for the independence of regulators.
- Scope of regulatory responsibilities: Concerns were raised about the scope of the regulator’s role and its power to determine business practice, and it is unclear whether a market regulator is ideally placed to take responsibility for public education in relation to media literacy. Give further consideration to the checks and balances in the Bill, to ensure they are sufficient to manage the regulator’s power and will be effective in practice. Give further consideration to the resources and partnerships required for effective media literacy education, and build an obligation for resource provision into the Bill.
- Delegation of regulatory practice and protecting freedom of speech / expression: There is a perceived shift in regulatory responsibility for freedom of speech and definitions of harm from government to platforms and the private sector. There were concerns that platforms, in order to avoid sanctions, will err towards over-zealous removal of content, resulting in collateral censorship; this risk is particularly acute for legal but harmful content, where user redress is currently only required to be implemented post hoc. Further consideration is needed of how the Bill might proactively protect against disproportionate responses to the new regulatory regime.
- Media literacy: The Bill shifts the focus of media literacy, as outlined in the Communications Act 2003, towards the prevention of online harms, with the implication that the public, rather than platforms, is responsible for protecting itself from harm. Media literacy is couched in terms of user/consumer protection rather than as a mode of effective citizenship. Further clarification is needed about the value of media literacy for citizenship and user agency in the context of the Bill’s objectives.
Algorithms and user agency
- Public involvement: The Bill makes no provision for consulting the public about measures taken by platforms or the regulator, and there is no mechanism for empowering users in the Bill: users are treated primarily as consumers, rather than citizens. Greater involvement of civil society, for example through citizens’ juries and citizen representation, would increase the legitimacy of the regulatory regime in the eyes of the public. It would also contribute to redressing the current imbalance between democratic, ‘people’ power and the power of platforms. Consider more explicit integration of mechanisms for user consultation and involvement in determining key issues relating to the Bill (e.g. legal but harmful content, mechanisms for redress, digital rights).
- Consideration of rights: The Bill prioritises risk at the expense of foregrounding digital and other user rights; the tensions between rights and protection that implementation may produce are not addressed, and the Bill may therefore ultimately disempower, rather than empower, users vis-à-vis platforms and the regulator. Further clarification of the user rights underpinning the Bill is needed. Consider integrating a statement of users’ digital rights in the Bill.
- Algorithmic Fairness: The Bill does not address algorithmic biases or the targeting of vulnerable communities (e.g. based on race, gender, age) via predatory algorithms. Algorithmic fairness should be explicitly addressed in the Bill, because platforms currently have little incentive to minimise harm arising from algorithmic processes. Consider reviewing the EU’s proposed Artificial Intelligence Act, which may offer a more forward-looking model of regulation in this area.
- Anonymity: The Bill does not address the need to tackle online harm facilitated through anonymity, while also preserving anonymity when required to protect a user’s safety and right to privacy. An approach to challenging anonymity in order to prevent harm, while preserving its benefits for non-harmful use, should be incorporated into the Bill.
Further Reading
Center for Countering Digital Hate: Stop Funding Fake News Report
Why is media literacy prominent in the UK’s draft Online Safety Bill 2021? – Sonia Livingstone, Media@LSE blog
LSE and Ofcom: Rapid Evidence Assessment on Online Misinformation and Media Literacy – Final report for Ofcom
The education data governance vacuum: why it matters and what to do about it – Emma Day, 5Rights
The Online Safety Bill is being opposed on “free speech” grounds—but we urgently need protection from platforms – Damian Tambini, Prospect Magazine
U.K.’s Online Safety Bill: Not That Safe, After All? – Edina Harbinja, Lawfare