Written supplementary evidence submitted by NSPCC (OSB0228)


Strengthening the draft Online Safety Bill: NSPCC recommendations to ensure an effective response to harms against children


This note has been produced by the NSPCC for the Joint Committee on the Draft Online Safety Bill and sets out a number of core areas where we consider it essential that the draft legislation is strengthened to ensure it effectively addresses the dynamics of inherently preventable child abuse and wider harms. This note should be read alongside our written evidence to the Committee.

We identify five key areas where the Bill must be strengthened to ensure it delivers a response that maps onto the scale and complexity of the child abuse threat and the wider risks of online harm to children:


For each area, we suggest below some draft amendments for the Committee to consider and build upon.


Cross-platform risks

Online services should be required to consider how the design and operation of their platforms contribute to cross-platform child abuse risks, and should be made to risk assess on this basis.

The issue:

The NSPCC is strongly concerned that the draft Bill fails to place requirements on online services to consider the ways in which the design and operation of their services contribute to inherently cross-platform child abuse dynamics.

Unless the legislation responds to the dynamics of online child abuse and the spread of content that is harmful to children, we do not see how the Bill can effectively address the substantive child abuse threat, or the broader risks of harmful content. For example:

-          Abusers use well-established grooming pathways[1] to exploit the design features of social networks, making effortless contact with children before migrating them to livestreaming, chat or messaging sites;

-          Recent whistleblower allegations submitted to US regulators highlight how Facebook Groups are used to facilitate abuse and signpost to child abuse material on third party sites, a problem that has not been effectively addressed[2];

-          Harmful content is able to spread with considerable velocity and virality across online services, resulting in children and young people accessing age-inappropriate content or being subject to unacceptable levels of harassment.


The NSPCC has secured legal advice that cross-platform cooperation, which might otherwise be constrained by competition law, is likely to be impeded unless there is a clear statutory basis to enable or require such collaboration[3]. The absence of such a statutory provision risks being a substantive constraint on the regulatory framework.

Lilly’s story

Lilly is a highly vulnerable 16-year-old with ADHD and an autism diagnosis. She had previously been sexually assaulted and bullied by an old friend, who had filmed the assault and threatened to post it online.

In January this year, Lilly’s mum Hannah was told to look on TikTok. She found the video had been posted, and with it, a video of a young boy being abused by an older girl. Lilly’s ‘friend’ falsely claimed it was Lilly committing the abuse.

From that moment, Hannah’s phone didn’t stop ringing and beeping with messages calling Lilly a paedophile and threatening to kill them both. Hannah had 450 calls in 6 hours. Vigilantes tried to track them down. Within weeks, on TikTok there were over 600,000 views of a hashtag of Lilly’s name. Abusive content spread on Snapchat and YouTube. An online petition that Lilly be put in jail received 30,000 signatures.

Hannah and Lilly’s story underlines that platforms don’t have effective systemic processes in place to identify harmful content and to alert other sites before it spreads across them with alarming velocity and virality. If online services were subject to cross platform duties, and had a requirement to put in place ‘rapid response’ arrangements, platforms could have acted sooner to prevent unimaginable suffering to Hannah and Lilly.

Some platforms did take action, but only after the NSPCC intervened on Hannah’s behalf. Hannah and her friends spent six weeks reporting multiple accounts saying things like ‘Lilly is a paedophile’ and ‘Lilly must die’, staying up until midnight every night to do it. Hannah describes it as being like trying to put out a huge wildfire with a bottle of water.

Hannah has been left with PTSD. She gets upset by her phone ringing because it could be someone telling her they’ve seen more things online. 

Lilly’s mental health and personality has been affected. She’s quieter, more anxious and constantly fearful. Her self-esteem has dropped. Hannah knows she’s holding onto a lot after everything that has happened to them both.


How the Bill can be strengthened:


In order to effectively address the cross-platform nature of risks, and the ways in which the design and operation of sites contribute towards an ecosystem of online harms, the legislation should:

-          explicitly set out that platforms should consider cross-platform risks as part of meeting their illegal and child safety duties;

-          require online services to consider how the design and operation of their services may contribute to cross-platform harms through the risk assessment process, and to take proportionate measures to address all reasonably foreseeable harms;

-          in meeting the illegal and child safety duties, ensure platforms demonstrate they have effective arrangements in place to share information on highly agile and constantly evolving threat vectors;

-          ensure platforms have ‘rapid response’ arrangements in place to identify and mitigate harms to children which spread with considerable velocity, such as in Lilly’s story. 


Suggested amendments:


INSERT in clause 7(8):

‘(h) the extent to which and the means by which the service can be used in conjunction with other services to give rise to the risk of illegal content and behaviour to children’

AMEND clause 7(8)

‘(i) where the service is likely to be accessed by children, the level of risk of the service facilitating the presence or dissemination of illegal content or behaviour that is harmful to children, identifying functionalities that present higher levels of risk, including functionalities-

(i)                  enabling adults to search for other users of the service (including children), and

(ii)                enabling adults to contact other users (including children) by means of the service’


INSERT IN clause 7(9):

‘(h) the extent to which and the means by which the service can be used in conjunction with other services to give rise to the risk of content that is harmful to children’

AMEND clause 7(9)

‘(i) where the service is likely to be accessed by children, the level of risk of the service facilitating the presence or dissemination of illegal content or behaviour, identifying functionalities that present higher levels of risk, including functionalities-

(i)                  enabling adults to search for other users of the service (including children), and

(ii)                enabling adults to contact other users (including children) by means of the service’


Explanatory note:

These amendments require services to explicitly risk assess how the design and operation of their services may contribute to cross-platform risks, including harms that may arise on the site and contribute towards harms that extend elsewhere, for example grooming pathways.


AMEND CLAUSE 9(2)             

‘a duty, in relation to a service, to take proportionate steps to mitigate and effectively manage the risks of harm that result through the operation of the service, as identified in the most recent illegal content risk assessment of the service’


(i)                  ‘a duty to mitigate and effectively manage the risks of harms posed to individuals through the extent and means by which the service can be used in conjunction with other services to give rise to the risk of illegal content and behaviour, particularly to children’

(ii)                ‘a duty to co-operate and collaborate with other service providers and establish proportionate mechanisms to mitigate and effectively manage the dissemination or prevalence of illegal content, including measures specified through the discharge of clause 29.’


We would also propose similar amendments to the safety duties for services likely to be accessed by children, set out in clause 11.


AMEND CLAUSE 30 (2)(b)

‘to design and assess the service with a view to protecting United Kingdom users, particularly children, from harm, including with regard to

(i)                  algorithms used by the service

(ii)                functionalities of the service

(iii)              the extent and means by which the service can be used in conjunction with other services


Explanatory note:

These amendments explicitly set out expectations that services should take reasonable, proportionate steps to remove, mitigate and effectively manage cross-platform risks in the discharge of their safety duties.

The amendments also place requirements on services to take proportionate steps to collaborate on measures that enable them to mitigate and respond to cross-platform risks, including any measures which the regulator may deem necessary.

In doing so, this mitigates a potential adverse interplay with competition law: the inclusion of a specific duty on platforms to co-operate on cross-platform risks addresses a potential major constraint on the regulatory framework, and provides Ofcom with a clear and explicit platform from which to develop an effective cross-platform regulatory scheme that is proportionate to the nature and extent of risks[4].


Content that directly facilitates online sexual abuse

The draft Bill should be strengthened to ensure an upstream approach to child protection, including through requiring online services to tackle harmful forms of content where these directly facilitate online abuse and survivor re-victimisation.


The issue:

We have substantive concerns that the draft legislation fails to adequately address content that directly facilitates illegal behaviour, including child abuse.

Abusers produce abuse material that may not meet the current criminal threshold, but which can facilitate access to illegal images; act as ‘digital breadcrumbs’ that allow abusers to identify and form networks with each other; and allow children to be actively re-victimised through the sharing and viewing of carefully edited abuse sequences.

There is growing evidence of the harm caused by so-called ‘tribute sites’, in which offenders create online profiles that misappropriate the identities of known survivors[5]. These fraudulent accounts, which typically adopt survivors’ names and feature non-harmful imagery at the account/profile level, are then used by offender communities to connect with like-minded perpetrators, primarily to exchange contact information, form offender networks, and signpost to child abuse material on the dark web. In Q1 2021, there were 6 million user interactions with content referencing known survivors or commercial websites[6].


How the Bill can be strengthened:


As it stands, the Bill fails to appropriately address the risks of content that directly facilitates online abuse and that actively re-victimises children. The draft Bill would not require platforms to address such material as part of the discharge of their safety duties, and based on discussions with Ofcom and government, it also appears unlikely that the content is substantively distinct enough to be classified as a priority harm in its own right.

Given the clearly egregious nature of such material, and its direct contribution to driving illegal activity, we recommend that the scope of the illegal content safety duty be amended, granting the regulator powers to treat content that facilitates child abuse with the same severity as illegal material.

This will ensure the legislation enables a fully upstream approach to the disruption and detection of abusive behaviour, and would give regulatory certainty to companies that at present either do not do enough to tackle this type of content, or adopt highly differentiated approaches to it.


Suggested amendment:

INSERT IN Clause 9(3):

(e) minimise the presence of content that may facilitate the dissemination of or access to priority illegal content, particularly with relation to children


User advocacy arrangements

Children should benefit from statutory user advocacy arrangements to ensure their needs are represented in the context of powerful and well-resourced industry interventions.

The issue:

The draft Bill does not contain provisions for user advocacy arrangements, and will therefore create a regulatory regime without a permanent and effective children’s voice to balance against the power and resources of the regulated companies.

Although the Government committed to introduce proposals for user advocacy during the pre-legislative scrutiny process, such proposals have not been forthcoming.

Effective user advocacy is integral to the success of the regulatory regime. The Bill must make provision for a statutory user advocate for children, which should be funded by the industry levy. Statutory user advocacy is vital to ensure there is effective counterbalance to well-resourced industry interventions, and to enable civil society to offer credible and authoritative support and challenge. This will ultimately lead to better long-term regulatory outcomes.

Statutory user advocacy is a central component of strong, user-centred regulatory settlements across multiple sectors, from postal services to public transport to essential utilities. However, because the draft Bill fails to make provision for statutory user advocacy, children who have experienced or are at risk of online sexual abuse stand to receive weaker systemic protections than users of post offices or passengers on buses and trains.


How the Bill can be strengthened:


The legislation must make provision for a statutory user advocacy body for children, which, in line with the ‘polluter pays’ principle, should be funded through the industry levy.

At present, a range of civil society organisations represent children. However, it should not be taken for granted that civil society and charitable organisations can continue to perform these activities in perpetuity, or to the level and extent necessary to support and, when necessary, challenge the regulator.

If there is an inappropriately scaled, poorly focused or insufficiently resourced civil society response, this is likely to significantly weaken the regulator’s ability and appetite to deliver meaningful outcomes for children.

Tech firms are a well-resourced and powerful voice, and will legitimately seek to exert strong influence when decisions are made about their services. Powerful industry interests are not unique to the tech sector, but the size of and resources available to the largest companies are arguably distinct.

In most other regulated markets, these risks are addressed through strong, independent advocacy models. Without such arrangements in place for online harms, there is a clear risk that:

-          children’s interests will be under-represented relative to those of industry, and unable to compete effectively with its worldview and resources;

-          organisations representing children’s interests will be unable to bring high-quality super-complaints or produce actionable evidence of systemic harm and/or regulatory breaches;

-          the regulator will be tasked with making decisions based on poor-quality or selective data. It seems highly likely that tech companies will seek to skew the evidence base on online harms through a strategy of academic and expert capture that has previously been seen in other markets[7]. Without an effective civil society counterbalance, these initiatives are much more likely to succeed.

Statutory user advocacy arrangements are in place across a range of sectors: for example, Citizens Advice acts as the statutory consumer advocate for energy and postal consumers; Passenger Focus represents the interests of bus and rail passengers; and the Consumer Council for Water represents domestic water users.

We have suggested a proposed statutory framework based upon the legislative parameters underpinning broadly comparable user protection frameworks. In the suggested amendments below, we draw heavily on the statutory functions set out in the Consumers, Estate Agents and Redress Act 2007 (these underpin Citizens Advice’s statutory consumer advocacy functions).


Suggested amendments:



(1)   there is to be a body corporate (the Advocacy Body) to represent the interests of child users of regulated Internet services (see clause 3)

(2)   “Child users” means –             

    1. A person aged 17 years or under who uses or is likely to use regulated internet services
    2. “child users” includes both an existing child user and future child users

(3)   the Advocacy Body may represent the interests of child users on any relevant matter relating to children’s use of regulated services

(4)   “Relevant matters” means –

    1. the interests of child users
    2. the protection and promotion of these interests
    3. any matter connected with those interests

(5)   the Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010

(6)   The Secretary of State may appoint an organisation known to represent children to carry out the functions designated under this Act, or may create an organisation to carry out the designated functions.



(1)   The Advocacy Body may –

    1. provide advice and information to persons within subsection (2) about relevant matters,
    2. make proposals to such persons about relevant matters, and
    3. represent the views and interests of child users on relevant matters to such persons

(2)   Those persons are –

    1. any Minister of the Crown or government department;
    2. any regulatory body established by or under an enactment;
    3. any other international organisation;
    4. any other person whom the Advocacy Body considers might have an interest in the matter in question


(1)   the Advocacy Body may consider and report on any matters relating to –

    1. the safety duties about illegal content (clause 9)
    2. the safety duties for services likely to be accessed by children (clause 10)
    3. matters relating to harms caused to the physical, emotional or moral development of children
    4. any matter which appears to the Advocacy Body to be, or to be related to, an issue relating to the online safety functions of regulated Internet services that may affect child users generally or child users of a particular description



(1)   the Advocacy Body may obtain and keep under review –

    1. information about relevant matters
    2. information about the use and experiences of child users on relevant matters
    3. information about or relating to regulated Internet services as it affects child users
    4. information about harms caused to the physical, emotional or moral development of children

(2)   the Advocacy Body may –

    1. prepare a report in relation to any matter falling within the scope of its functions
    2. publish any report prepared under this section if it appears to the Advocacy Body that its publication would promote the interests of child users


INSERT NEW c113             

(1)   the Advocacy Body may, by notice, require a person within subsection (3) to supply it with such information as is specified or described in the notice, within such reasonable period as is specified

(2)   the information specified or described in a notice under subsection (1) must be information the Advocacy Body requires for the purpose of exercising its functions

(3)   the persons referred to in subsection (1) are:

    1. any regulated Internet service
    2. any person who supports a regulated Internet service available to child users

(4)   where a person within subsection (3) fails to comply with a notice under subsection (1) the Advocacy Body may refer the failure to the relevant regulator for investigation



Explanatory note:

These amendments set out the broad functions of the user advocacy arrangements for children, including the body’s scope, remit, and investigatory and information disclosure powers. These broadly mirror the powers available to other statutory advocacy mechanisms, for example those available to Citizens Advice in its discharge of statutory advocacy functions for energy and postal consumers.


AMEND c29 (5) (d)

Before preparing a code of practice or amendments under this section, Ofcom must consult

(d) the body established under the provisions of c109 and any other persons who appear to Ofcom to represent the interests of children (generally or in respect of online safety matters)

AMEND c50 (2) to(d)

(d) the body established under the provisions of c109 and any other persons who appear to Ofcom to represent the interests of children (generally or in respect of online safety matters)

AMEND c106

(1)   an ‘eligible entity’ is:

    1. the body established under the provisions of c109, or
    2. a body that meets criteria specified in regulations made by the Secretary of State


Explanatory note:

These amendments make provision for Ofcom to consult the statutory advocacy body in the development of its codes of practice, and designate the advocacy body with the powers to raise super-complaints.

Please note that this is an indicative schedule of amendments, and that we have focussed on the most substantive consequential amendments in the draft Bill for the purposes of clarity.


Foundational duty

A foundational duty will provide coherence to a structurally complex regulatory regime, and focus the legislation against its core safety and harm reduction objectives.

The issue:

We strongly support the proposed amendments for a foundational duty prepared by Carnegie UK[8].

The foundational duty will provide much-needed coherence to an unnecessarily complex legislative structure. Crucially, it would help ensure that the framework of secondary legislation, codes and guidance that will come together to form the online safety regime is tightly focused around the Bill’s fundamental safety objectives.

We strongly assert that the proposed amendments reinforce the targeted duties in respect of children and illegal content, and will help to ensure the legislation is future-proofed in the context of medium- to long-term technological and market changes, including the growth of decentralised social networks and the metaverse.

These proposed amendments will also address a range of issues with the scope and architecture of the Bill, as set out in full in Carnegie UK’s note. 


Scope of the child safety duty

The scope of the child safety duty could exclude highly problematic sites, and lead to harms being displaced rather than tackled.

The issue:

We have substantive concerns that the ‘child user condition’ in Clause 26(4) of the draft Bill places a higher threshold than the ICO Children’s Code in respect of whether a service is likely to be accessed by a child, and could therefore offer lower standards of regulatory protection.

The clause specifies that a service is only considered as being ‘likely to be accessed by children’ if there are a significant number of children who use it, or the service is of a kind likely to attract a significant number of users who are children.

Although the definition of ‘significant’ is not adequately set out, this raises the possibility that many smaller or specialist sites could be excluded from this part of the legislation, and that highly problematic services, including Telegram, could potentially fall below the qualifying threshold set.



How the Bill can be strengthened:

Clause 26(4) should be deleted, with the child safety duty applying to any service that is likely to be accessed by a child. Ofcom can signal that it intends to address issues of proportionality and regulatory burden through the discharge of its risk-based enforcement approach.


Suggested amendments:


AMEND CLAUSE 26(5) as follows:

‘For the purposes of this Part, the service is to be treated as ‘likely to be accessed by children’ if the provider’s assessment of the service concludes that it is possible for children to access the service or any part of it.’


24 November 2021

[1] Europol (2020) Internet Organised Crime Threat Assessment. Den Haag: Europol

[2] Crawford, A (2021) Whistleblower: Facebook's Response to Child Abuse ‘inadequate’. BBC News, 28th November 2021

[3] NSPCC (2021) Duty to Protect: an assessment of the draft Online Safety Bill against the NSPCC's Six Tests for Statutory Regulation. London: NSPCC

[4] Legal opinion provided to the NSPCC by Herbert Smith Freehills identifies a significant negative interplay with competition law, which could be effectively mitigated by a specific duty on platforms to collaborate on cross-platform risks. Further information is set out in NSPCC (2021) Duty to Protect: an assessment of the Draft Online Safety Bill against the NSPCC's six tests for protecting children. London: NSPCC

[5] WeProtect Global Alliance (2021) Global Threat Assessment 2021. London: WPGA

[6] ibid

[7] See, for example, Abdalla, M et al (2021) The Grey Hoodie Project: Big Tobacco, Big Tech and the Threat to Academic Integrity. Preprint. Cambridge, MA: Harvard; Toronto, ON: University of Toronto

[8] Carnegie UK (2021) Simplifying and strengthening the draft Online Safety Bill – amendments. Dunfermline: Carnegie UK Trust