Written evidence submitted by Stonewall (OSB0083)
1. Stonewall warmly welcomes the opportunity to respond to this call for evidence. Rapid advancements in online technologies have delivered many benefits for LGBTQ+ communities, helping many to explore and understand their identities, speak up for their rights, forge communities and build solidarity. However, marginalised communities, including LGBTQ+ people, experience concerningly high levels of abuse online, which can cause significant harm. This call for evidence presents a key opportunity to highlight successes and drawbacks of the Bill in its current form, and to work towards legislation that ensures online platforms effectively protect marginalised communities from online harms.
2. Many LGBTQ+ people experience significant levels of harm in online spaces. A report by Ofcom this year found that roughly one in three respondents (32 per cent) had seen or been subject to hate speech on online video platforms in the previous three months. Of those:
3. Stonewall’s research with YouGov into the experiences of 5,375 LGBTQ+ adults across Britain, LGBT in Britain: Hate Crime and Discrimination (2017), found that:
4. LGBTQ+ children and young people are at particular risk. While the internet can act as a lifeline to many LGBTQ+ children and young people, many of whom are not able to be open about their LGBTQ+ identity at home or school, two in five - including 58 per cent of trans young people - have been the target of homophobic, biphobic or transphobic online abuse, while nearly all (97 per cent) have witnessed it (School Report 2017).
5. Many LGBTQ+ people do not report the harm they experience online, often because they fear they will not be taken seriously. Galop’s Online Hate Crime Report 2020 found that less than half of LGBTQ+ victims of online abuse (44 per cent) reported their experiences to social media platforms, and less than one in ten (seven per cent) reported to the police. Similarly, 65 per cent of LGBTQ+ young people believe that online platforms are unlikely to do anything when it comes to tackling homophobic, biphobic and transphobic content or incidents reported to them (School Report, 2017).
6. This abuse can cause profound harms to LGBTQ+ communities, including negatively impacting their mental health and creating fears over their safety.
7. Crucially, this also impacts LGBTQ+ individuals’ rights to freedom of expression. Homophobia, biphobia and transphobia online deters LGBTQ+ people from expressing themselves freely online (particularly about LGBTQ+ matters) and moving through online spaces. This is evidenced in Galop’s Online Hate Crime Report 2020:
8. Several participants reported being frequently targeted for posting about LGBTQ+ matters and for correcting misinformation they witnessed. Many felt negative comments and abuse they received was an attempt to silence them and restrict their freedom to talk about LGBTQ+ matters.
‘I stopped posting in the hope that people would lose interest in sending me abuse.’
Respondent – Online Hate Crime Report (2020)
‘I don’t have any social media because it is a place where hate can be freely expressed and not be controlled and I am scared to use it. People think they can do and say anything online.’
Joseph, 13, secondary school (South West) - Stonewall School Report (2017)
9. An online safety regime must therefore be robust, comprehensive and alive to the specific experiences of marginalised communities – including those who are LGBTQ+.
Areas of Interest
Duty of Care Approach in the Bill
10. Stonewall supports a duty of care approach as put forward in the Bill. We support affirmative duties that place positive obligations on platforms to create an environment that is safe for all communities – including LGBTQ+ people.
Legal but Harmful
11. We support the duty to regulate content that is ‘legal but harmful’ in order to create a positive, proactive system that encourages safe spaces online.
12. We would like to highlight to the Committee a report by Hope not Hate, published this month, which shows strong support across the human rights and equalities sector for this approach, as well as demonstrating public support. The report drew on a poll of 1,512 people conducted in July 2021 to explore public attitudes towards harmful content on social media:
13. We do not believe that harmful content should be required to meet the threshold of illegality before it can be regulated online. Other legislation, such as the Equality Act 2010, sets high thresholds that must be met before a member of a marginalised community can be protected. A system of regulation that captures harmful content falling below this threshold is crucial to the emotional and physical safety of marginalised communities.
Freedom of Expression
14. We understand that concerns have been raised around the extent to which the Bill represents a threat to freedom of expression – especially in relation to the duty to regulate content that is legal but harmful.
15. However, we believe that appropriately regulating harmful online content is a necessary component to ensure that the rights to freedom of expression of LGBTQ+ communities is upheld and enhanced. Homophobia, biphobia and transphobia online deters LGBTQ+ people from expressing themselves freely online (particularly about LGBTQ+ matters) and moving through online spaces.
16. While the right to freedom of expression is universal, LGBTQ+ people and other marginalised communities do not necessarily have equal access to this right, as many cannot freely and meaningfully express themselves online. Regulation of legal but harmful content is a crucial step in ensuring that online abuse does not have a chilling effect on the ability of marginalised groups, including LGBTQ+ people, to meaningfully exercise their freedom of expression rights.
17. We are aware of concerns that the Online Safety Bill may result in censorship of LGBTQ+ people. The Government and Ofcom must develop a regulatory approach that does not position anything related to LGBTQ+ identities as ‘harmful’ to children. There is a high risk that an online safety framework could be misused to target LGBTQ+ content, and we believe robust and inclusive Codes of Practice could play a vital role in mitigating this risk.
18. The Committee has asked whether the Bill makes adequate provisions for people who are more likely to experience harm online. We understand that there has been some confusion around the lack of a clear and coherent definition of ‘harm’ put forward in the Bill.
19. LGBTQ+ communities experience specific online harms that are informed by homophobia, biphobia and transphobia. This includes concerningly high levels of homophobic, biphobic and transphobic abuse and harassment. Stonewall’s 2017 Hate Crime and Discrimination research with YouGov found that one in ten LGBT+ people (10 per cent) had experienced homophobic, biphobic or transphobic abuse directed towards them personally in the month preceding the survey, while almost half had witnessed it directed against others. These figures are even higher for LGBTQ+ people who are Black and people of colour, as well as for LGBTQ+ young people.
20. Galop’s Online Hate Crime Report found that the most common form of online abuse experienced by LGBTQ+ communities was insults (97 per cent), followed by threats of physical violence (63 per cent), threats of sexual assault (41 per cent), death threats (39 per cent), and doxing (34 per cent). The impact of this abuse cannot be overstated: it causes profound harms to LGBTQ+ communities, including negatively impacting their mental health and creating profound fears over their safety. The report found that victims experienced a range of negative reactions to these online harms, including fear, anxiety, self-blame and suicidal thoughts. Some feared for their physical safety following victimisation. This also impacted on LGBTQ+ individuals’ rights to freedom of expression by deterring them from expressing themselves freely online (particularly about LGBTQ+ matters) and moving through online and offline spaces.
21. Trans communities (particularly trans women) have faced an escalating campaign of abuse and misinformation in the media and online in recent years. Open Democracy stated that social media platform policies ‘are not working for LGBTQ+ people’ and that there is a ‘deluge of hatred against trans women’. Despite social media platforms having policies against hateful content, trans women continue to experience transphobia online – such as deadnaming and misinformation (e.g. being misrepresented as abusive men) – and when such content is reported, platforms frequently respond that it has not violated their policies. Open Democracy also found that “more than 700 leaked screenshots from a ‘secret’ Facebook group called Campaign Against the Takeover (CATT) also featured transphobic posts including some ridiculing specific trans individuals”.
22. We understand that ‘priority harms’ will be reflected in secondary legislation. It is important that online abuse and harassment that meets the legal thresholds of hate crime legislation, communications offences and the Equality Act 2010, and is fuelled by prejudice against LGBTQ+ communities, is prioritised as illegal harm.
23. However, harms that do not meet these legal thresholds must also be covered – including specific recognition of how homophobic, biphobic and transphobic abuse and harassment manifests online.
24. Priority harms must also be drafted in a manner that explicitly recognises the specific way in which people who belong to multiply marginalised groups experience specific forms of harm due to their intersecting identities.
25. The specific needs and experiences of marginalised communities so often impacted by online harms – including LGBTQ+ people – must be considered when developing a definition of harm. Online harms that are specific to LGBTQ+ communities must be meaningfully engaged with, understood and reflected in the definition of primary harms and in the secondary legislation introduced. This must also be accompanied by guidance (whether through explanatory notes or codes of practice) giving further explanation of the types and impact of these distinct harms. Guidance should also be developed in consultation with marginalised groups and the organisations that represent them. The Government and/or Ofcom must also commit to regularly reviewing these harms to ensure the law reflects the contemporary ways in which our communities experience harm.
26. We support a definition of priority harms that is inclusive of LGBTQ+ needs and experiences being accompanied by a flexible, open-ended understanding of harm in the Bill – namely if a service provider has ‘reasonable grounds to believe that the content has a material risk of having or indirectly having a significant adverse physical or psychological impact on a child or adult of ordinary sensibilities.’ We recognise the value of a non-exhaustive definition, rather than a prescriptive list in legislation, in helping to future-proof the law and capture the variety of ways in which certain online communications may evolve to harm LGBTQ+ communities. However, this should also be accompanied by explanatory guidance developed with impacted communities so that this threshold may be interpreted and applied consistently. The Government must commit to regularly reviewing this legal threshold and accompanying explanatory notes to ensure that statute and accompanying documents are able to capture the ways our communities experience harm in the present day.
27. We understand that when assessing the impact of harmful content on an individual with ‘ordinary sensibilities’, the Bill states:
“if the content would particularly affect an adult with a particular characteristic or who is a member of a particular group, then the adult of ordinary sensibilities should be assumed to have that characteristic or be in that group”
28. We welcome this insertion. However, we believe that more guidance must be made available as to how the ‘ordinary sensibilities’ test will be applied to those with protected characteristics to ensure they are protected.
29. Lastly, the Bill conceptualises harm in an individualised manner. However, it is important to note that online harms against LGBTQ+ communities and other marginalised groups also result in significant societal harms. As demonstrated above, online harms have a chilling effect at a societal level as marginalised groups feel less able to express themselves and participate in public life.
30. A further societal impact of such harms is that they normalise abuse and perpetuate homophobia, biphobia and transphobia. If homophobic, biphobic and transphobic language is acceptable online, then it will become acceptable offline. Online abuse does not happen in a vacuum and has real consequences in the offline world – including for the safety of marginalised communities and the overall tenor of our society.
31. We echo the concerns raised by Hope not Hate that section 13 of the Bill may in fact serve to protect content that is harmful and informed by prejudice under the guise of being of ‘democratic importance’ – thereby undermining the very purpose of the Bill: to minimise harms experienced online.
32. As raised by Stonewall’s CEO, Nancy Kelley, in oral evidence to the Joint Committee on the Draft Online Safety Bill, it is crucial that the Government pay due regard to situations whereby a policy argument is couched in discriminatory terms. For example, we have seen those opposing LGBTQ-inclusive education describe it as a gateway to paedophilia: a view inextricably linked to homophobic and harmful attitudes. We must ensure that there are clearer definitions in the Bill so that hateful speech and other online harms are not rendered exempt from its remit.
33. We are concerned that the duty to protect journalistic content creates exemptions in the Bill that rest on the assumption that anything can be journalism, resulting in harmful content online falling outside its regulatory remit.
34. The Bill defines journalistic content tautologically as ‘content generated for the purposes of journalism’. More thought is needed as to what constitutes journalism in a rapidly changing online world to ensure that a blanket journalistic exemption does not legitimise misinformation and hate speech.
35. Stonewall echoes the concerns put forward by Hacked Off that the Bill creates several exemptions that leave marginalised communities vulnerable to harm, such as the social media accounts of newspapers and newspaper comment sections being exempt from the remit of the Bill.
36. This is concerning given that seriously abusive content – including that informed by homophobia, biphobia and transphobia – may appear in the comment sections of newspapers. Disinformation is also prevalent in these comment sections – a phenomenon the Bill specifically seeks to address. Other review- and comment-based websites will also benefit from these exemptions – including conspiracy news websites, which may host seriously harmful content.
Services in Scope
37. We are concerned that there are gaps in protection in the Bill. Concerns in relation to placing ‘disproportionate risks on small businesses’ mean only a small proportion of companies will be designated as Category 1 services and therefore under a duty to regulate legal but harmful content. As such, platforms that are heavily implicated in encouraging online harms against LGBTQ+ communities will not be covered. We believe the Bill must be amended to capture smaller, alternative platforms within its remit, or the Committee must give due regard to how they will ensure that this activity is somehow minimised in a regulatory approach.
Role of Ofcom
38. It is crucial that any regulator has the capability to regulate digital spaces: an arena that is significantly different to traditional media. This will be a challenge Ofcom will face – as would any other possible regulator.
39. As raised by Stonewall’s CEO Nancy Kelley in an oral evidence session on the Bill before the Joint Committee on the Draft Online Safety Bill, effective regulation by Ofcom will require that they are able to operate with a strong technical understanding of how platforms operate, and the extraordinary pace at which they innovate. It is highly likely Ofcom will experience resistance from services falling under the Bill’s remit, and that they will be told the changes they wish to see are ‘not possible’ within the current technical processes of platforms. It is crucial that Ofcom have sufficient technical capabilities to interrogate and assess these answers for regulation to be effective.
40. We understand that Ofcom have stated their intention to consult those who represent people suffering from harm across protected characteristics when producing its Codes. It is important that these Codes of Practice and the wider regulatory approach are developed through meaningful consultation with LGBTQ+ communities.
Anonymity
42. Stonewall understands and appreciates the arguments in favour of a regulatory approach that removes anonymity online. However, we would like to highlight to the Committee that this approach may have the unintended consequence of creating barriers for marginalised people to freely access public platforms – specifically if removing anonymity requires individuals to share their legal identity documents.
43. There are various reasons why LGBTQ+ individuals may not use their legal name online. An LGBTQ+ person may not be ‘out’ to their family and friends, and anonymity may provide them with the freedom to explore their identity in a way that is safe and comfortable. For example, Stonewall’s School Report 2017 found that nine in ten LGBTQ+ young people felt they could be themselves online. LGBTQ+ individuals may also live in unsupportive or hostile environments, where being ‘out’ online can compromise their safety and wellbeing. For example, just two in five LGBTQ+ children and young people have an adult at home that they can talk to about being LGBTQ+ (School Report, 2017). Lastly, many trans individuals do not use the name given on their birth certificate. As of 2018, fewer than 5,000 trans people across the UK possessed a gender recognition certificate out of an estimated 200,000 – 500,000 trans people living in the UK (Government Equalities Office, 2018).
44. Many LGBTQ+ people will not have legal documentation of their lived name and gender, even if they wanted to provide it. Stonewall and LGBT Foundation are currently undertaking a survey to gather UK-specific data on ID ownership among LGBTQ+ people, including whether ID matches an individual’s gender identity and presentation, and whether they have experienced problems having ID accepted in the past. We also seek qualitative information about the barriers or concerns that make respondents less likely to acquire or use ID – such as cost, bureaucracy and lack of other documentation – as well as which concerns are most significant. We launched the survey on 11th August 2021 and had received 1,033 responses as of 31st August; our interim findings show that 21 per cent of respondents said it is difficult or impossible to get ID that reflects their chosen name and gender.
45. In countries where it is illegal to be gay, people face real risks to their wellbeing and safety in relation to identity documents. This is also the case should they need to reveal their nationality. Many LGBTQ+ people live in parts of the world where it is unsafe to be themselves. For example, we know that in Egypt 57 per cent of charges against LGBTQ+ individuals featured the use of websites and social media to entrap or incriminate them (LGBTQ Online Summary Report – Article 19 (2018)). Research has also shown that 86 per cent of respondents to a study across Egypt, Iran and Lebanon were anxious about sharing their real name on social media apps. A regulatory approach which precludes anonymity will therefore have a chilling effect on LGBTQ+ participation globally.
46. For many LGBTQ+ people, as well as vulnerable people more broadly, anonymity is what enables authentic and honest expression and thus their ability to meaningfully exercise their rights to freedom of expression online. Tackling harassment and abuse does not need to occur at the expense of the safety of those most at risk of harm. Should anonymity provisions be included in the Bill at a later date, it is crucial that any regulatory approach impacting anonymity is developed through meaningful consultation with marginalised groups.
47. We understand that a middle-ground solution has been put forward by Clean up the Internet, among others, suggesting a twin-track approach that would give all users the option of ‘verified’ account status and the ability to filter out of their newsfeed and DMs any account that is unverified. However, such solutions still involve disclosing personal information such as name and nationality – whether to end users or to back-end processes. They therefore still put many LGBTQ+ people at risk of harm and may still have a chilling effect on many LGBTQ+ online users.
48. Stonewall would be happy to talk further with the Committee about the impact of various proposals to regulate anonymity, as well as any other elements of our submission.
21 September 2021
Joint Pre-legislative Scrutiny Committee on the Draft Online Safety Bill