1.1 Samaritans welcomes the opportunity to respond to the Joint pre-legislative scrutiny Committee’s call for written evidence regarding the draft Online Safety Bill. Samaritans is the UK and Ireland’s largest suicide and self-harm prevention charity. Through over 16,000 listening volunteers, we respond to around 10,000 calls for help every day. We believe in the power of compassionate and non-judgemental listening to give people a safe place to work through their problems.
1.2 In 2019, in collaboration with government and some of the largest tech platforms, Samaritans established an Online Excellence Programme with the aim of promoting good practice around self-harm and suicide content online. We provide an advisory service for professionals and platforms dealing with self-harm and suicide content online; have published best-practice guidance[1] for platforms hosting user-generated self-harm and suicide content; have a programme of research to better understand the risks and benefits for users accessing this material; and provide user resources to support individuals to talk about suicide and self-harm safely online.
1.3 Suicide remains a major public health problem, with the highest rates among men aged 45-49. It is the biggest killer of young people aged 16-24, and the suicide rate for young women is now at its highest on record.[2] Self-harm, a major risk factor for suicide, is also becoming much more prevalent, having tripled among young people over the last 15 years.[3] One in 15 (6.4%) adults in England report that they have self-harmed.[4]
1.4 The internet can be an invaluable resource for individuals experiencing feelings of self-harm and suicide, allowing them to speak openly and engage with people with similar experiences[5]. However, it can also provide access to content that can be distressing, triggering[6] and instructive[7], and may act to encourage, maintain or exacerbate self-harm and suicidal behaviours.[8] Other risks include contagion effects caused by over-identification with the user who posts the content, and imitative or ‘copycat’ suicides when detailed information about methods is presented.[9]
1.5 Whilst suicide and self-harm are complex and rarely caused by one thing, in many cases the internet is involved: a 2017 inquiry into suicides of young people found suicide-related internet use in 26% of deaths of under-20s and in 13% of deaths of 20-24 year olds[10]. Samaritans is passionate about promoting positive and helpful suicide and self-harm related content and ensuring that harmful content is minimised and effectively managed.
1.6 There is a clear imperative to tackle suicide and self-harm content online. Taking a partial approach will undermine the UK Government’s efforts to prevent suicide and achieve the aims of the cross-government National Suicide Prevention Strategy in England[11]. A key aspect of suicide prevention is the reduction of access to means, and reducing the availability of harmful and instructive information is one way of achieving this. No caveats (such as size of platform or age of user) should be established that would diminish the legislation’s ability to tackle harmful suicide and self-harm content.
1.7 We welcome the opportunity to give evidence to the Committee and have focused on the questions most relevant to our work and area of expertise. We have indicated where this document contains descriptions of self-harm and methods of suicide which could be harmful if shared in the public domain. We suggest reading our resource on supporting the wellbeing of staff working with self-harm and suicide content, which gives advice on managing wellbeing when viewing potentially distressing content.[12] We would particularly welcome the opportunity to give oral evidence to the Committee, alongside Samaritans’ supporters with lived experience of suicide and the online environment.
2.1 Prioritising illegal content and content that is harmful to children, and requiring only ‘Category 1’ services to address content that is ‘legal but harmful’ to adults, means the opportunity to safeguard adults from harmful suicide content will be missed. In its current form, the Bill will not require services that are not ‘Category 1’ to adequately moderate the content on their platforms or to protect users who confirm they are over 18.
2.2 In this response we refer to ‘harmful self-harm and suicide content’, by which we mean material that has the potential to cause or exacerbate self-harm and suicidal behaviours. The types of suicide and self-harm content that we consider unequivocally harmful to all include (but are not limited to):
2.2.1 Information and depictions of methods of self-harm and suicide, especially with instructions or advice on how to hurt yourself, including links and information that enable people to buy products intended for use as a means of suicide, comparisons of the effectiveness of methods, and suicide pacts.
2.2.2 Content that portrays self-harm and suicide as positive or desirable.
2.2.3 Graphic descriptions or depictions of self-harm and suicide.
2.3 We know, for example, that there are online spaces where users share information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK, and Regulation 28 reports have been issued in relation to it to DCMS and DHSC to prevent future comparable deaths.[13][14] The forthcoming Bill is an opportunity to introduce greater regulation of poorly moderated smaller[15] sites where harmful suicide content is shared; however, in its current form it does not do so.
“Anyone and everyone who is at risk of even considering suicide needs the online help to prevent them finding the information or impetus they may be looking for to take their own life. I know that every attempt my brother considered at ending his life - from his early 20s to when he died in April aged 40 - was based on extensive online research. It was all too easy for him to find step by step instructions so he could evaluate the effectiveness and potential impact of various approaches, and most recently - given he had no medical background - it was purely his ability to work out the quantities of various drugs, and likely impact of taking them in combination, that equipped him to end his life” - Samaritans supporter who has responded to the Committee’s call for evidence
Some parts of the following sections 2.4 – 2.11 could be harmful or instructive to individuals experiencing thoughts of self-harm or suicide and have been redacted for publication.
***
2.4 We are particularly concerned about [redacted]. Whilst it is important that suicidal people are supported without judgement, that there are peer-to-peer spaces for support, and that we destigmatise talking about suicide, it is also critical that suicide isn’t seen as a common or successful way out: people who see suicide as a way of overcoming adversity are more likely to plan a suicide attempt.[16]
2.5 [This section has been redacted in the interest of public safety]
2.6 [This section has been redacted in the interest of public safety]
2.7 [This section has been redacted in the interest of public safety]
2.8 [This section has been redacted in the interest of public safety]
2.9 [This section has been redacted in the interest of public safety]
2.10 [Redacted] provides ‘success stories’ of people who have died [redacted]. This not only validates the method as effective but also glorifies the suicides and puts users at risk of contagion through identification with the individuals.
2.11 Graphic images relating to suicide and self-harm are easily accessible online via a range of smaller sites that are not required to implement robust moderation standards [redacted]. These sites are distressing, raise awareness of methods of harm and may normalise the idea of self-harm and suicide to viewers.[17] [Redacted]. This content is extremely concerning from a suicide prevention perspective, and under the Bill these platforms should be subject to regulation in respect of users aged over 18.
***
2.12 Other smaller sites, including those with a focus on support and suicide prevention, may also become harmful spaces for users, particularly those in distress, because they are poorly moderated (and will continue to be if the Bill is passed in its current form). Furthermore, users on smaller sites that are not required to implement robust moderation standards can be directed by algorithms to other harmful suicide and self-harm content.
3.1 In Samaritans’ view, all sites should have a duty of care to all their users, regardless of their age. We are calling for the Bill to specify that suicide and self-harm content is an area the legislation intends to tackle, and that all user-to-user services, regardless of reach and functionality, be required to remove suicide and self-harm content that is harmful to children and adults.
3.2 Whilst it is imperative that children are kept safe online, and young people are at greater risk of suicide contagion[18], suicide and self-harm content affects people of all ages: between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 124 of them were aged over 25[19]. This data is based on clinical reports and is likely to underestimate the true extent to which the internet plays a role in suicides.
3.3 In a population survey of 21-year-olds, conducted by Samaritans and the University of Bristol, almost 75% of participants who had attempted suicide reported using the internet for a suicide-related purpose; whilst most were seeking help and support, one in five had accessed sites that provided information about methods of harm.[20]
3.4 Research looking at online support groups found that associations between the use of these spaces and suicidal feelings are not limited to younger users but are also present among people aged 30 to 59[21]. It is also important to consider that older people will find their way online in greater numbers in future.
“Harmful and accessible suicide and self-harm online content can be harmful at any age. I am in my fifties and would be tempted to act on this information if I felt suicidal again” – Samaritans supporter who has responded to the Committee’s call for evidence
4.1 Individuals experiencing feelings related to suicide and self-harm are likely to be more vulnerable and may be in crisis. The removal of suicide and self-harm as specific categories of harm in the draft Bill is concerning and may mean that sites and platforms do not take them seriously.
4.2 People experiencing suicidal feelings or struggling with self-harm are likely to be more vulnerable and at greater risk of harm from legal but harmful suicide and self-harm content. However, suicidal ideation can fluctuate quickly, sometimes over the course of a single day,[22] meaning that it is difficult to identify who is more vulnerable to suicide and self-harm content: for this reason, legal but harmful suicide and self-harm content needs to be regulated across all platforms for people of all ages.
4.3 Samaritans’ research with the University of Bristol found that people who use the internet to find out about suicide are likely to be vulnerable and in need of support at that point. Those experiencing high levels of distress show purposeful browsing, looking specifically for information on methods of harm.[23] Suicidal people who use the internet for suicide-related purposes experience higher levels of suicidality and depression than suicidal people who do not[24]. The Bill is an opportunity to minimise these vulnerable users’ access to harmful content whilst ensuring they can still access supportive online spaces with high-quality signposting to support.
4.4 Samaritans recently conducted research (not yet published) with people with lived experience of self-harm and suicide which gave some insight into the impact harmful online self-harm and suicide content can have on adults:
“People need a place where they can express how they feel without backlash. [A place where] I can share how I feel while getting support from peers and not being bombarded with triggering images of open wounds when I am at my most vulnerable” (Adult aged 25-34).[25]
4.5 Respondents consistently shared the view that regulating the most widely used sites and platforms would be insufficient for safeguarding them and others with lived experience:
“The people using the bigger sites will just flood the smaller sites if their content starts getting removed. The standard needs to be the same across all sites.”
“Smaller sites are just as dangerous especially as content is usually easier to find.”
[Someone with lived experience of suicide attempts]: “If suicidal people can't find what they are looking for at large sites they will just go onto the smaller sites so it doesn't solve the problem.”
5.1 The Republic of Ireland’s Online Safety and Media Regulation Bill General Scheme (presently under pre-legislative scrutiny pending publication of a draft bill) makes no distinction between children and adults or between ‘large’ and ‘small’ platforms; all platforms will need to tackle harmful suicide and self-harm content regardless of whether it can be found by those over or under 18. The UK’s Online Safety Bill must keep pace with this and ensure it protects users of all ages.
6.1 Encouraging or assisting suicide is a criminal offence in England and Wales under the Suicide Act 1961 (as amended by the Coroners and Justice Act 2009) and in Northern Ireland under the Criminal Justice Act (Northern Ireland) 1966. There is no specific crime of assisting suicide in Scotland; however, it is possible that a person assisting another to take their own life could face prosecution for murder, culpable homicide or reckless endangerment. Encouraging or assisting suicide should be included as priority illegal content in the Bill.
6.2 Given the rise in prominence of online ‘suicide challenges’ that often include elements of self-harm, such as the Blue Whale challenge[26], the lack of legislation around encouraging or assisting self-harm is a striking omission. Alongside a duty of care, there should be accountability for individuals or groups who, with malicious intent, deliberately encourage others to harm themselves or assist self-harm.
6.3 Samaritans is therefore calling for the Bill to implement the Law Commission’s recent recommendation to introduce a new offence of encouraging or assisting serious self-harm as part of reforming communications offences. As we outlined in our response to the Law Commission’s consultation on the topic, it is essential that the offence does not criminalise vulnerable users or lead to greater stigmatisation of individuals who speak openly about self-harm online. Prior to the creation of the offence, it is essential that police and the Crown Prosecution Service are made aware of the risks of over-criminalising and are confident in discerning between vulnerable individuals in distress and intentional bad actors.
7.1 Communicating online about feelings of suicide or self-harm can be part of a person’s recovery[27], offering support[28] and allowing feelings to be shared without judgement; it is important that supportive content in these spaces is not inadvertently removed. Samaritans looks forward to working with the regulator to establish thresholds for harm, which will require further research and engagement with users and experts.
7.2 Samaritans responded to the Law Commission’s consultation on updating malicious communications offences, including the proposed creation of an assisting self-harm offence, recommending that care be taken to avoid criminalising vulnerable users who may share content that could be harmful or triggering to others. The Law Commission’s threshold of ‘likely to cause harm’ amounting to at least ‘serious distress’ may also be applicable in the context of suicide and self-harm content and would allow for consistency.[29]
8.1 Through our Online Excellence Programme we have gained valuable insights into how platform design, systems and processes can be shaped to enhance the safety of users, and, in conjunction with academics, experts and individuals with lived experience, we have developed guidelines for the tech industry on managing user-generated suicide and self-harm content.[30] These include processes for removing detailed information on suicide/self-harm methods; turning off algorithms that push harmful content related to suicide/self-harm; using age and sensitivity warnings; prioritising and promoting positive and helpful content; and effective moderation processes.
8.2 However, some of the most harmful suicide and self-harm content exists on smaller platforms, i.e. those that would not be considered ‘Category 1’. A recent systematic review looking at the impact of suicide and self-harm-related videos and photographs found that potentially harmful content was concentrated on sites with poor moderation and anonymity.[31]
8.3 We regularly receive emails from members of the public concerned about smaller platforms, including from bereaved parents whose children accessed these platforms before dying by suicide, and from practitioners concerned that children as young as 12 are accessing these spaces. We are usually already aware of these platforms and share their frustration, but we have to respond that we do not have the authority to remove or reduce the content, and neither does anyone else in the UK. The Online Safety Bill is a once-in-a-lifetime opportunity to hold these platforms to account.
8.4 There is a need to protect users from content on these platforms. Whilst the proposed regulation, as it currently stands, may make it harder for users to find harmful suicide and self-harm content accidentally, it will not protect users in distress who purposely search for information about methods of harm on these types of smaller platforms, because under current plans these platforms would only be required to remove legal but harmful content as it applies to children. The legislation must ensure that all platforms have a duty to protect their users that goes beyond requiring them to confirm they are over 18: platforms must adequately moderate their content and adapt their design, systems and processes so that the risk of harm is minimised.
8.5 Some suicide and self-harm content falls into a ‘grey’ area and is not easily defined. Samaritans looks forward to working with the regulator to ensure that supportive suicide and self-harm content is not inadvertently removed, which will require further research and engagement with users and experts. Any technological interventions to tackle harmful suicide and self-harm content must be underpinned by effective and nuanced human moderation.
9.1 As already stated, Samaritans does not believe that allowing smaller sites to continue to expose over-18s to harmful suicide and self-harm content will help meet the Government’s objectives around suicide prevention. Under the current definitions it is unclear whether Wikipedia and other online ‘wikis’ or encyclopaedias based on user contributions will be required to tackle suicide and self-harm content that is harmful to adults, despite their widespread popularity and use. In relation to self-harm and suicide content, it makes sense for all user-to-user platforms to be in scope of the Bill.
Some parts of the following sections 9.2 – 9.3 could be harmful or instructive to individuals experiencing thoughts of self-harm or suicide and have been redacted for publication.
***
9.2 For example, Wikipedia’s page on [redacted]: this may act as a guide for users in distress. [Redacted].
9.3 [Redacted]. To avoid ambiguity, the Bill should be clear that harmful suicide and self-harm content must be tackled across all platforms.
***
10.1 It is important that users are equipped with the skills they need to stay safe online. However, services hosting user-generated content must take responsibility for ensuring the safety of their users, taking appropriate action on self-harm and suicide content that could be harmful.
10.2 From our user research and advisory service engagement, we are aware that users reporting content often receive poor responses, with limited support provided and little or slow action to remove or address the reported content. Regulation will help ensure appropriate action is taken against harmful self-harm and suicide content, but it must apply across all platforms and protect both children and adults.
11.1 It is essential that Ofcom works closely with subject matter experts to inform the upcoming codes of practice relating to self-harm and suicide, reflecting the nuance needed when considering some of this content and ensuring that helpful suicide and self-harm content is not inadvertently removed. It is also essential that these codes of practice are regularly reviewed to ensure they reflect the latest evidence base, emerging online trends and issues, and changes in platform technologies.
12.1 User education and media literacy are a key facet of online safety, and Samaritans welcomes their inclusion in the Bill, particularly the promotion of “an awareness of the impact material may have”: this is a key principle of speaking safely about suicide and self-harm online. Ofcom has conducted high-quality research into the media literacy of the UK public for many years and is well placed to perform this role. It will be important for Ofcom to engage with relevant stakeholders when promoting media literacy in particular areas; Samaritans has co-produced a range of user resources with young people with lived experience and welcomes the opportunity to engage further in this area.[32]
24 September 2021
[1] Samaritans (2020) Samaritans' industry guidelines: Guidelines for sites and platforms hosting user-generated content. https://www.samaritans.org/about-samaritans/research-policy/internet-suicide/guidelines-tech-industry/
[2] Samaritans (2019) Suicide statistics report.
[3] McManus, S., Bebbington, P., Jenkins, R. and Brugha, T. (2016) Mental health and wellbeing in England: Adult Psychiatric Morbidity Survey 2014. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/556596/apms-2014-full-rpt.pdf
[4] Ibid.
[5] Lavis, A. and Winter, R. (2020) #Online harms or benefits? An ethnographic analysis of the positives and negatives of peer-support around self-harm on social media. Journal of Child Psychology and Psychiatry, 61(8), 842-854.
[6] Arendt, F., Scherr, S. and Romer, D. (2019) Effects of exposure to self-harm on social media: Evidence from a two-wave panel study among young adults. New Media & Society, 21, 2422-2442.
[7] Biddle, L., Derges, J., Mars, B., Heron, J., Donovan, J. L., Potokar, J., Piper, M., Wyllie, C. and Gunnell, D. (2016) Suicide and the internet: Changes in the accessibility of suicide-related information between 2007 and 2014. Journal of Affective Disorders, 190, 370-375.
[8] Arendt et al., 2019; Marchant et al., 2017; Niedzwiedz et al., 2014; Biddle et al., 2012; Lavis and Winter, 2020.
[9] Niederkrotenthaler, T. et al. (2020) Association between suicide reporting in the media and suicide: Systematic review and meta-analysis.
[10] Appleby, L. et al. (2017) Suicide by children and young people. National Confidential Inquiry into Suicide and Homicide by People with Mental Illness (NCISH). Manchester: University of Manchester.
[11] Department of Health (2012) Preventing suicide in England: A cross-government outcomes strategy to save lives.
[12] Samaritans, Supporting the wellbeing of staff working with self-harm and suicide content. https://www.media.samaritans.org/documents/Supporting_the_wellbeing_of_staff_working_with_self-harm_and_suicide_content_FINAL.pdf
[13] Office of the Chief Coroner (2019) Report to Prevent Future Deaths. https://www.judiciary.uk/wp-content/uploads/2019/12/Callie-Lewis-2019-0414_Redacted.pdf
[14] Office of the Chief Coroner (2020) Report to Prevent Future Deaths. https://www.judiciary.uk/wp-content/uploads/2020/11/Joseph-Nihill-2020-0175_Redacted-1.pdf
[15] Throughout this submission, "smaller" is used as shorthand for platforms that clearly are not Category 1 platforms under the Bill's current definitions.
[16] Joe, S., Romer, D. and Jamieson, P. E. (2007) Suicide acceptability is related to suicide planning in US adolescents and young adults. Suicide and Life-Threatening Behavior, 37(2), 165-178.
[17] Khayambashi, S. (2021) Blood and guts in living color: A study of the internet death video community. OMEGA - Journal of Death and Dying, 83(3), 390-406. doi:10.1177/0030222819855883
[18] Niedzwiedz, C., Haw, C., Hawton, K. and Platt, S. (2014) The definition and epidemiology of clusters of suicidal behavior: A systematic review. Suicide and Life-Threatening Behavior, 44(5), 569-581.
[19] National Confidential Inquiry into Suicide and Homicide by People with Mental Illness (NCISH) (2017).
[20] Biddle, L., Derges, J., Gunnell, D., Stace, S. and Morrissey, J. (2016) Priorities for suicide prevention: Balancing the risks and opportunities of internet use. University of Bristol/Samaritans.
[21] Scherr, S. and Reinemann, C. (2016) First do no harm: Cross-sectional and longitudinal evidence for the impact of individual suicidality on the use of online health forums and support groups.
[22] Kleiman, E. M., Turner, B. J., Fedor, S., Beale, E. E., Huffman, J. C. and Nock, M. K. (2017) Examination of real-time fluctuations in suicidal ideation and its risk factors: Results from two ecological momentary assessment studies. Journal of Abnormal Psychology, 126(6), 726.
[23] Biddle, L. et al. (2017) Suicide and self-harm related internet use: A cross-sectional study and clinician focus groups.
[24] Niederkrotenthaler, T., Haider, A., Till, B., Mok, K. and Pirkis, J. (2016) Comparison of suicidal people who use the internet for suicide-related reasons and those who do not. Crisis.
[25] Samaritans (2021) Unpublished research. Further information available on request.
[26] Khasawneh, A., Madathil, K. C., Dixon, E., Wiśniewski, P., Zinzow, H. and Roth, R. (2020) Examining the self-harm and suicide contagion effects of the Blue Whale Challenge on YouTube and Twitter: Qualitative study. JMIR Mental Health, 7(6), e15973.
[27] Brown, R. C. et al. (2020) "I just finally wanted to belong somewhere": Qualitative analysis of experiences with posting pictures of self-injury on Instagram. Frontiers in Psychiatry, 11.
[28] Lavis, A. and Winter, R. (2020) #Online harms or benefits? An ethnographic analysis of the positives and negatives of peer-support around self-harm on social media. Journal of Child Psychology and Psychiatry.
[29] Law Commission (2021) Modernising Communications Offences: A final report. https://www.lawcom.gov.uk/project/reform-of-the-communications-offences/
[30] Samaritans' industry guidelines for managing self-harm and suicide content. https://www.samaritans.org/about-samaritans/research-policy/internet-suicide/guidelines-tech-industry/
[31] Marchant, A., Hawton, K., Burns, L., Stewart, A. and John, A. (2021) Impact of web-based sharing and viewing of self-harm-related videos and photographs on young people: Systematic review. Journal of Medical Internet Research, 23(3).
[32] https://www.samaritans.org/about-samaritans/research-policy/internet-suicide/online-safety-resources/