Written evidence submitted by Stonewall
Stonewall Response: Online Safety and Online Harms Inquiry
Introduction
- Stonewall warmly welcomes the opportunity to respond to this consultation. Rapid advancements in online technologies have delivered many benefits for LGBTQ+ communities, helping many to explore and understand their identities, speak up for their rights, forge communities and build solidarity. However, marginalised communities, including LGBTQ+ people, experience concerningly high levels of abuse online, which can cause significant harm. This inquiry presents a key opportunity to highlight the successes and drawbacks of the Bill in its current form, and to work towards legislation that ensures online platforms effectively protect marginalised communities from online harms.
- Many LGBTQ+ people experience significant levels of harm in online spaces. A report by Ofcom this year found that one in three people (32 per cent) had seen or been subject to hate speech on online video platforms in the previous three months. Of those:
- 25 per cent said they had seen it directed towards transgender people
- 23 per cent said they had seen it directed against people of a specific sexual orientation
- Stonewall’s research with YouGov into the experiences of 5,375 LGBTQ+ adults across Britain, LGBT in Britain: Hate Crime and Discrimination (2017), found that:
- One in ten LGBTQ+ people (10 per cent) – including 26 per cent of trans people and 20 per cent of LGBTQ+ people who are Black and of colour – had experienced homophobic, biphobic or transphobic abuse online directed towards them personally in the month preceding the survey.
- In the month preceding the survey, almost half of LGBTQ+ people (45 per cent) had witnessed homophobic, biphobic and transphobic abuse or behaviour online.
- LGBTQ+ children and young people are at particular risk. The internet can act as a lifeline for LGBTQ+ children and young people, many of whom are not able to be open about their LGBTQ+ identity at home or at school. However, two in five LGBTQ+ young people – including 58 per cent of trans young people – have been the target of homophobic, biphobic or transphobic online abuse, while nearly all (97 per cent) have witnessed it (School Report, 2017).
- Many LGBTQ+ people do not report the harm they experience online, often because they fear they will not be taken seriously. Galop’s Online Hate Crime Report 2020 found that fewer than half of LGBTQ+ victims of online abuse (44 per cent) reported their experiences to social media platforms, and fewer than one in ten (seven per cent) reported them to the police. Similarly, 65 per cent of LGBTQ+ young people believe that online platforms are unlikely to act on homophobic, biphobic or transphobic content or incidents reported to them (School Report, 2017).
- This abuse can cause profound harms to LGBTQ+ communities, including negatively impacting their mental health and creating fears over their safety.
- Crucially, this abuse also impacts LGBTQ+ individuals’ right to freedom of expression (Article 10 HRA/ECHR). Homophobia, biphobia and transphobia online deter LGBTQ+ people from expressing themselves freely online (particularly about LGBTQ+ matters) and from moving through online spaces. This is evidenced in Galop’s Online Hate Crime Report 2020:
- Two in five LGBTQ+ victims of online abuse (38 per cent) used their online accounts less as a result of the abuse.
- One in five (22 per cent) removed LGBTQ+ information from their profiles or left social media sites altogether.
- Several participants reported being frequently targeted for posting about LGBTQ+ matters and for correcting misinformation they witnessed. Many felt the negative comments and abuse they received were an attempt to silence them and restrict their freedom to talk about LGBTQ+ matters.
‘I stopped posting in the hope that people would lose interest in sending me abuse.’
Respondent – Online Hate Crime Report (2020)
‘I don’t have any social media because it is a place where hate can be freely expressed and not be controlled and I am scared to use it. People think they can do and say anything online.’
Joseph, 13, secondary school (South West) - Stonewall School Report (2017)
- An online safety regime must therefore be robust, comprehensive and alive to the specific experiences of marginalised communities – including those who are LGBTQ+.
Is it necessary to have an explicit definition and process for determining harm to children and adults in the Online Safety Bill, and what should it be?
- Stonewall supports the requirement that some companies take action to combat activity on their services which is ‘legal but harmful’, with Ofcom ensuring that tech platforms enforce their own terms of service. This approach can be flexible and responsive to the needs of an individual platform’s users, and can evolve in light of the changing external context around ‘legal but harmful’ behaviour.
- We understand that there has been some confusion arising from the Bill’s lack of a clear and coherent definition of ‘harm’. We urge the Government to ensure that the specific needs and experiences of marginalised communities so often impacted by online harms – including LGBTQ+ people – are considered when developing a definition of harm.
- We also understand that there has been criticism of regulating content that is legal but harmful, specifically in light of freedom of expression concerns. However, we do not believe that content should be required to meet the threshold of illegality before it can be regulated online. Other legislation, such as the Equality Act 2010, sets high thresholds that must be met before a member of a marginalised community is protected. A system of regulation that captures harmful content falling below this threshold is crucial to the emotional and physical safety of marginalised communities.
- We also believe that appropriate regulation of harmful online content is necessary to ensure that the right to freedom of expression of LGBTQ+ communities is upheld and enhanced.
- As set out above, homophobia, biphobia and transphobia online deter LGBTQ+ people from expressing themselves freely online (particularly about LGBTQ+ matters) and from moving through online spaces.
- While the right to freedom of expression is universal, LGBTQ+ people and other marginalised communities do not necessarily have equal access to this right, as many cannot freely and meaningfully express themselves online. Regulation of legal but harmful content is a crucial step in ensuring that online abuse does not have a chilling effect on the ability of marginalised groups, including LGBTQ+ people, to meaningfully exercise their freedom of expression rights.
- It has also been highlighted that the online harms regime maintains a distinction between protections for children and protections for adults. We echo concerns raised in the sector that robust protections from harm must be established for all individuals, regardless of age.
What are the key omissions from the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?
Newspaper Exemptions
- Stonewall echoes the concerns put forward by Hacked Off that the Bill creates several exemptions that leave marginalised communities vulnerable to harm, such as the exemption of newspapers’ social media accounts and newspaper comment sections from the Bill’s remit. This is concerning given that seriously abusive content – including content informed by homophobia, biphobia and transphobia – may appear in newspaper comment sections. Disinformation, a phenomenon the Bill specifically seeks to address, is also prevalent in these comment sections. Other review- and comment-based websites will also benefit from these exemptions, including conspiracy news websites which may host seriously harmful content.
- We also note the concerns raised by Hope not Hate that Section 14 of the Bill de facto exempts journalists from its remit.
Anonymity
- We understand there have been conflicting viewpoints around the inclusion of anonymity provisions in the Bill.
- Stonewall understands and appreciates the arguments in favour of a regulatory approach that removes anonymity online. However, we would like to highlight to the Committee that this approach may have the unintended consequence of creating barriers for marginalised people to freely access public platforms – specifically if removing anonymity requires individuals to share their legal identity documents.
- There are various reasons why LGBTQ+ individuals may not use their legal name online. An LGBTQ+ person may not be ‘out’ to their family and friends, and anonymity may provide them with the freedom to explore their identity in a way that is safe and comfortable; the School Report 2017 found that nine in ten LGBTQ+ young people felt they could be themselves online. LGBTQ+ individuals may also live in unsupportive or hostile environments, where being ‘out’ online can compromise their safety and wellbeing: just two in five LGBTQ+ children and young people have an adult at home they can talk to about being LGBTQ+ (School Report, 2017). Lastly, many trans individuals do not use the name given on their birth certificate. As of 2018, fewer than 5,000 trans people across the UK possessed a Gender Recognition Certificate, out of an estimated 200,000–500,000 trans people living in the UK (Government Equalities Office, 2018).
- Any approach that requires individuals to verify their identity to a third party with a birth certificate, or other forms of personal ID, raises significant privacy and security concerns.
- Furthermore, many LGBTQ+ people will not have legal documentation of their lived name and gender, even if they wanted to provide it. Stonewall and LGBT Foundation are currently undertaking a survey to gather UK-specific data on ID ownership among LGBTQ+ people, including whether their ID matches their gender identity and presentation, and whether they have experienced problems having ID accepted in the past. We also seek qualitative information about the barriers or concerns that make respondents less likely to acquire or use ID – such as cost, bureaucracy and lack of other documentation – and which of these concerns are most significant. We launched the survey on 11 August 2021 and had received 1,033 responses as of 31 August. Our interim findings show that:
- Nearly one in four (23 per cent) had experienced problems getting their ID accepted in the past
- 21 per cent said that it is difficult or impossible to get ID that reflects their chosen name and gender
- Finally, other marginalised communities, including survivors of domestic abuse, refugees and asylum seekers, may be endangered by requirements to provide identity documents.
- For many LGBTQ+ people, as well as vulnerable people more broadly, anonymity is what enables authentic and honest expression and thus their ability to meaningfully exercise their rights to freedom of expression online. Tackling harassment and abuse does not need to occur at the expense of the safety of those most at risk of harm. Should anonymity provisions be included in the Bill at a later date, it is crucial that any regulatory approach impacting anonymity is developed through meaningful consultation with marginalised groups.
Are there any contested inclusions, tensions or contradictions in the draft Bill that need to be more carefully considered before the final Bill is put to Parliament?
- We echo the concerns raised by Hope not Hate that Section 13 of the Bill may in fact serve to protect content that is harmful and informed by prejudice under the guise of being of ‘democratic importance’ – thereby undermining the very purpose of the Bill: to minimise the harms people experience online.
- It is also important to recognise that genuinely democratic online spaces must include the voices and experiences of marginalised communities. As stated by Hope Not Hate:
“We will only create a genuinely democratic online space by broadening out the definition of ‘democratically important’ to include not just content that is often removed, but also content that is missing in the first place. It cannot just protect existing “democratically important” speech, it must also create a safe and pluralistic online space that encourages and empowers diverse and marginalised voices, enabling them to be heard”
- Greater clarity is therefore needed about what constitutes content with ‘democratic importance’ so that the personhood of marginalised communities is not inadvertently diminished through an attempt to foster pluralistic debate. Furthermore, marginalised voices – including those of the LGBTQ+ community – must be protected and enhanced as a crucial pillar to achieving genuinely democratic online spaces.