Written Evidence Submitted by Dr Francesca Sobande (Cardiff University) (OSB0144)
This document is my written evidence regarding the Draft Online Safety Bill. It is informed by six years of scholarly research and my expertise as a Lecturer in Digital Media Studies (Cardiff University) whose work focuses on issues related to identity, inequality, ideology, and the internet. My publications relevant to this call for evidence include my monograph The Digital Lives of Black Women in Britain (Palgrave Macmillan, 2020)[1], which discusses the impact of online abuse that is both racist and sexist in nature. Relatedly, below I outline issues concerning the Draft Online Safety Bill.
***
Key omissions from the Draft Online Safety Bill include a lack of reference to various forms of online abuse and harms, such as online racism, sexism, misogyny, xenophobia, Islamophobia, homophobia, and transphobia. If the proposed Online Safety Bill does not explicitly reference a range of types of online abuse and harms, it risks promoting an insufficient definition and understanding of online abuse and harms, one that would not adequately address the full breadth of such experiences. Put briefly, the Draft Online Safety Bill has a focus that is too narrow.
It is important that the terms “online abuse” and “harms” are effectively defined and discussed as part of the Bill’s overall focus on online safety. Ideas about what constitutes online abuse are, arguably, ambiguous in the current Draft Online Safety Bill. The word “abuse” seldom features throughout the main body of the Draft Online Safety Bill and is particularly scarce beyond “CHAPTER 4: USE OF TECHNOLOGY IN RELATION TO TERRORISM CONTENT AND CHILD SEXUAL EXPLOITATION AND ABUSE CONTENT”.
Contemporary notions of safety do not exist without interconnected understandings of what constitutes abuse. The decoupling of discussions and definitions of online safety from discussions and definitions of online abuse is a significant limitation of the Draft Online Safety Bill. Before the Bill is finalised it should include a more detailed explanation of online abuse and harms to appropriately contextualise how online safety is understood, and to ensure that a broad range of forms of online abuse are acknowledged (e.g. including, but not limited to, ableism, ageism, racism, sexism, misogyny, xenophobia, Islamophobia, homophobia, and transphobia).
Whereas the word “abuse” appears only 13 times in the Draft Online Safety Bill, mainly in the “CONTENTS” pages and page headers, the phrase “freedom of expression” appears 36 times and is discussed and defined in much more depth. The finalised Bill should address this imbalance and ensure that reference to “abuse” is not predominantly relegated to the “CONTENTS” pages and page headers, nor confined to “CHAPTER 4: USE OF TECHNOLOGY IN RELATION TO TERRORISM CONTENT AND CHILD SEXUAL EXPLOITATION AND ABUSE CONTENT” and “SCHEDULE 3: CHILD SEXUAL EXPLOITATION AND ABUSE OFFENCES”.
Without addressing the issues outlined in this written evidence, there is a considerable risk that the Draft Online Safety Bill’s explanation of “rights to freedom of expression” may be interpreted in ways that legitimise forms of online abuse and harms, including the examples highlighted in this written evidence that are currently not named in the Bill (e.g. racism).
If the finalised Online Safety Bill does not pay due attention to an extensive range of forms of online abuse and harms, the Bill is at great risk of doing a disservice to victims and survivors of online abuse and harms whose experiences cannot be adequately grasped by solely focusing on the notion of “online safety”.
27 September 2021
[1] Sobande, F. (2020) The Digital Lives of Black Women in Britain. Cham, Switzerland: Palgrave Macmillan.