Online Safety (Re-committed Clauses and Schedules) Bill

Written evidence submitted by Samaritans (OSB114)

House of Commons Online Safety Bill Public Bill Committee (re-committed Clauses and Schedules)

1. Executive Summary

1.1 The online environment can be a vital source of information and support. For people experiencing self-harm and suicidal feelings, it can be a place to go without fear of stigma. However, the online environment can also be harmful, hosting content that can maintain, exacerbate or encourage self-harm and suicidal behaviour. The Online Safety Bill presents a once-in-a-generation opportunity to preserve helpful content and spaces online, whilst urgently addressing harmful suicide and self-harm content.  

1.2 All too often, families speak of missed opportunities that could potentially have prevented a person from taking their own life. There is a clear imperative to tackle suicide and self-harm content online. Samaritans strongly opposes the decision to remove regulation of legal but harmful content from the Bill. Furthermore, the proposed new user empowerment duties are inadequate, as they rely on the individual actively choosing to opt out of viewing harmful suicide and self-harm content. We know that people experiencing suicidal thoughts and ideation often actively seek out harmful content because they are looking for ways to take their own life when they feel that suicide is the only option.

1.3 Taking a partial approach to tackling suicide and self-harm content will undermine the UK Government’s efforts to prevent suicide and to achieve the aims of the cross-government National Suicide Prevention Strategy in England. A key aspect of suicide prevention is the reduction of access to means, and reducing the availability of harmful and instructive information is one way of achieving this.

1.4 The Government must regulate to protect people of all ages from all extremely dangerous suicide and self-harm content on large and small platforms.

2 Introduction

2.1 Samaritans is the UK and Ireland’s largest suicide prevention charity. Through over 22,000 volunteers, we respond to a call for help every 10 seconds.   

2.2 Over the last three years we have developed a hub of excellence in suicide prevention and the online environment with the aim of minimising access to harmful content and maximising opportunities for support. Our Online Excellence Programme includes industry guidelines for responding to self-harm and suicide content, an advisory service for sites and platforms offering advice on responding to self-harm and suicide content, a research programme exploring what makes self-harm and suicide content harmful and for whom, and a hub of resources helping people to stay safe online.  

2.3 Suicide is the leading cause of death in males under 50 years and females under 35 years in the UK, with the latest available figures confirming that 5,583 people in England and Wales tragically took their own lives in 2021. These figures show the largest increase in suicide for females under 24 since records began. Self-harm - a strong risk factor for future suicide [1] [2] - has also increased among young people since 2000 and is more common among young people than any other age group.

2.4 The internet can be an invaluable space for individuals experiencing self-harm and suicidal feelings, providing opportunities for users to speak openly and to access support. [3] In Samaritans’ research with people who self-harm, two-thirds said online forums and advice were helpful to them. [4] However, the internet can also provide access to content that may act to encourage, maintain or exacerbate self-harm and suicidal behaviours. Detailed information about methods can also increase the likelihood of imitative and copycat suicides, with risks such as contagion effects [5] also present in the online environment.

2.5 Whilst suicide and self-harm are complex and rarely caused by one thing, in many cases the internet is involved: a 2017 inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s, and 13% of deaths in 20–24-year-olds [6]. Samaritans’ own research has shown that at least a quarter of patients who had self-harmed with high suicidal intent had used the internet in connection with their self-harm. [7]

2.6 Furthermore, Samaritans’ new research with Swansea University found that the experiences of people viewing harmful online suicide and self-harm content can be devastating: three-quarters of people who took part in the research said they had harmed themselves more severely after viewing self-harm content online. [8]

2.7 The Online Safety Bill is a crucial opportunity to reduce access to harmful suicide and self-harm content, helping to create a suicide-safer internet. The Government must keep its promise to make the UK the safest place in the world to go online.

3 Amendments 6, 7 and 41 - Protecting people of all ages on all platforms

3.1 Suicide and self-harm content affects people of all ages: between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of these patients were aged over 25 [9]. [1] We strongly oppose the Government’s decision to remove regulation of legal-but-harmful content from the Bill. Whilst Samaritans supports the Government’s commitment to protecting children online, susceptibility to harm from suicide and self-harm content does not end when people reach the age of 18. The Government must protect people of all ages from all extremely dangerous suicide and self-harm content on large and small platforms.

3.2 Encouraging or assisting suicide is a criminal offence in England and Wales under the Suicide Act 1961 (as amended by the Coroners and Justice Act 2009). Content encouraging or assisting someone to take their own life is illegal and has been included as ‘priority illegal content’ in the Bill, meaning that all platforms will be required to proactively and reactively prevent individuals from encountering it. Search engines will also need to structure their services to minimise the risk of individuals encountering this content. However, this is by definition a partial approach.

3.3 Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include (but are not limited to):

· Information, depictions, instructions, and advice on methods of self-harm and suicide 

· Content that portrays self-harm and suicide as positive or desirable  

· Graphic descriptions or depictions of self-harm and suicide  

3.4 We also know from our own work with technology platforms and from our discussions with policymakers around the Bill that there will inevitably be some content that may have the effect of encouraging or assisting suicide without meeting the threshold of an offence. There is not yet clarity on the practical parameters of what is legal and illegal content for the purposes of the online safety regime, but we know that simply having offences of encouraging or assisting suicide and encouraging or assisting self-harm in the Bill will not, on their own, go far enough. A comprehensive approach to all dangerous suicide and self-harm content is vital to meet the aims of the Bill.

3.5 Removing regulation of legal but extremely harmful suicide and self-harm content will mean that platforms will not even need to consider the risk that such content could pose to adult users. This will leave huge amounts of dangerous content widely available online, and completely undermine the online safety regime from the outset.

3.6 Samaritans supporters with lived experience of suicide and self-harm have highlighted the need to protect all ages from harmful suicide and self-harm content:

3.6.1 "Harmful and accessible suicide and self-harm online content can be harmful at any age. I am in my fifties and would be tempted to act on this information if I felt suicidal again"

3.6.2 "Anyone and everyone who is at risk of even considering suicide needs the online help to prevent them finding the information or impetus they may be looking for to take their own life. I know that every attempt my brother considered at ending his life - from his early 20s to when he died in April aged 40 - was based on extensive online research. It was all too easy for him to find step by step instructions so he could evaluate the effectiveness and potential impact of various approaches"

3.7 In research that we commissioned from Swansea University, we found that whilst users generally support age verification and restrictions across social media and online platforms, these are easily bypassed by children [10]. In a sample of over 5,200 participants, over three-quarters saw self-harm content online for the first time at age 14 or younger, with nearly a fifth saying that they were 10 or younger. Participants highlighted that date of birth alone is not sufficient for age verification, as entering a fake birthday is a simple way to get around it. A more comprehensive online safety regime for all ages will of course also increase protections for children: the current proposals create a two-tier approach to safety which opens the possibility of children being able to circumvent safety controls.

4 Amendments 8-17 - User Empowerment

4.1 The Government has specified that it is up to individual adults to protect themselves from suicide and self-harm content through new user empowerment duties, as per Amendment 12. This is inadequate, as it still relies on the individual actively choosing to opt out of viewing harmful suicide and self-harm content. We know that people experiencing suicidal thoughts and ideation often actively seek out harmful content because they are looking for ways to take their own life when they feel that suicide is the only option. A study from the University of Bristol found that participants with severe suicidal thoughts actively used the internet to research an effective method, and often found clear suggestions. [11] Increasing the controls that people have is no replacement for holding sites to account through the law.

4.2 Furthermore, research also shows that the way people use the internet varies depending on their current level of distress. For example, people experiencing low levels of distress typically browse content to explore different support options and to hear other people’s stories in order to gain a greater understanding of their own experience. By contrast, people in high levels of distress tend to browse purposefully, such as specifically looking for information on methods of harm [12]. Whilst people experiencing intense suicidal feelings may actively avoid online help, individuals who are less distressed are often receptive to sites hosting helpful and positive content as well as support groups and information. An individual’s emotional state can fluctuate and change over time, including within the same day, and their viewing and searching habits can change accordingly. Providing individuals with the option of whether they see such content will therefore not address the damage that is caused by viewing harmful suicide and self-harm content.

4.3 We also know from our research with people with lived experience of self-harm and suicidal thoughts that, whilst there is a good understanding of the purpose of online community guidelines ("to keep users safe"), very few online users have seen or read them. Many participants said that they would only check the community guidelines in response to having their own content removed by the platform, and there were low levels of awareness of guidelines specifically relating to suicide and self-harm content [13]. A requirement to offer user empowerment tools does not in itself equate to users understanding the tools that are available to them and changing their settings accordingly. User empowerment duties are no substitute for regulating access to dangerous suicide and self-harm content online through the law.

December 2022


[1] This data is based on clinical reports and is likely to underestimate the true extent to which the internet plays a role in suicides. 


[1] Klonsky, E.D., May, A.M., & Glenn, C.R. (2013). The relationship between non-suicidal self-injury and attempted suicide: Converging evidence from four samples . Journal of Abnormal Psychology, 122(1), 231.

[2] Mars, B., Heron, J., Klonsky, D.E., Moran, P., O’Connor, R, C., Tilling, K., Gunnell, D. (2019). Predictors of future suicide attempt among adolescents with suicidal thoughts or nonsuicidal self-harm: a population-based birth cohort study , The Lancet Psychiatry, 6, 327-337.

[3] Brennan, C et al. (2022) ‘Self-harm and suicide content online, helpful or harmful? A systematic review of the recent evidence’ Journal of Public Mental Health Vol 21 No 1: 57-69; Dyson, M.P et al (2016) ‘A systematic review of social media use to discuss and view deliberate self-harm acts’ PloS one 11(5), e0155813;  Lavis, A. and Winter, R. (2020).  ‘Online harms or benefits? An ethnographic analysis of the positives and negatives of peer‐support around self‐harm on social media’ , Journal of Child Psychology and Psychiatry, 61(8), 842-854. 

[4] Samaritans. (2020).  Pushed from Pillar to Post: improving the availability and quality of support after self-harm in England .  

[5] Niederkrotenthaler et al , Association between suicide reporting in the media and suicide: systematic review and meta-analysis , 2020.

[6] Appleby, L. et al., (2017). Suicide by Children and Young People. National Confidential Inquiry into Suicide and Homicide by People with Mental Illness (NCISH). (Manchester: University of Manchester, 2017).

[7] Biddle L, Derges J, Gunnell D, Stace S, Morrissey J, (2016) Priorities for suicide prevention: balancing the risks and opportunities of internet use , University of Bristol/Samaritans. 

[8] Samaritans (2022) How social media users experience self-harm and suicide content, available here https://media.samaritans.org/documents/Samaritans_How_social_media_users_experience_self-harm_and_suicide_content_WEB_v3.pdf

[9] The National Confidential Inquiry into Suicide and Homicide by People with Mental Illness (NCISH) (2017)

[10] Samaritans (2022) How social media users experience self-harm and suicide content, available here https://media.samaritans.org/documents/Samaritans_How_social_media_users_experience_self-harm_and_suicide_content_WEB_v3.pdf

[11] Biddle L, Derges J, Goldsmith C, Donovan JL, Gunnell D (2018) Using the internet for suicide related purposes: Contrasting findings from young people in the community and self-harm patients admitted to hospital. PLoS ONE 13(5): e0197712. https://doi.org/10.1371/journal.pone.0197712

[12] Samaritans (2022) Towards a suicide-safer internet https://media.samaritans.org/documents/Samaritans_WhatASafeInternetLooksLike_2022.pdf

[13] Samaritans (2021) Understanding user views of online platform messaging around self-harm and suicide content


Prepared 15th December 2022