Supplementary written evidence submitted by the National Society for the Prevention of Cruelty to Children (NSPCC) (COR0127)

  1. Following previous calls for evidence from the Home Affairs Committee regarding preparedness for Covid-19, and the development of this work towards a focus on online harms, the NSPCC welcomes scrutiny of the risks of technology-facilitated child abuse during the Coronavirus pandemic. This evidence provides background on the current online harms risks to inform the forthcoming evidence session with Baroness Williams (Home Office Minister) and Caroline Dinenage (Minister of State for Digital and Culture), and the inquiry as a whole. In addition to the NSPCC’s previous evidence submission to the Covid-19 preparedness inquiry, this supplementary evidence provides an updated assessment of the online child abuse threat, and of the Government and industry response.

 

The nature, prevalence and scale of online harms during the Covid-19 period

 

  2. In our earlier evidence, the NSPCC set out the potential for a three-fold ‘perfect storm’ that could lead to a spike in online child abuse: platforms facing understandable pressures in sustaining their moderation processes, and being forced to rely on artificial intelligence (AI) that is often used to triage but not to make final decisions on issues such as grooming; children using technology to stay connected with family and friends during the lockdown, with more children likely to be sad, anxious or worried at this time (which may increase their vulnerability to grooming)[1]; and intelligence from Europol and the National Crime Agency pointing to a significantly increased threat[2].

 

  3. Unfortunately, we are now seeing these risks translate into evidence of actual harm. The National Center for Missing and Exploited Children (NCMEC) has released data showing that child endangerment reports rose by 106 per cent in March 2020, from 983,734 to 2,027,520 reports[3]. Last month, the Internet Watch Foundation reported that in the first four weeks of lockdown, industry takedowns of child abuse URLs fell by 89 per cent[4].

 

  4. We are increasingly concerned by the potential for a sharp increase in abuse on livestreaming and video chat sites. In a worrying number of cases, offenders have hijacked public meetings and events on Zoom (‘Zoombombing’) to display child abuse images to both adults and children on calls. In one case in Plymouth last week, 60 children taking part in an online fitness class were shown child sexual abuse images.[5]

 

  5. Video chat sites present particular risks for online grooming: NSPCC research finds that one in ten children aged 7-16 who have used video chat sites to speak to someone they don’t know have received a request to undress, and one in eight children have video-chatted with someone they haven’t met in person.[6]

 

Steps that could be taken to mitigate these concerns

  6. We are concerned that in a number of areas, platforms have been reluctant to provide meaningful information about their response to the Coronavirus, and it has been highly challenging to assess the impact on content moderation. In recent evidence to the Digital, Culture, Media and Sport Committee, Facebook suggested it had suspended outsourced moderation arrangements and had relied on volunteers for the early weeks of the lockdown. Facebook was unable to tell the Committee how many of its moderators have now resumed work.[7]

 

  7. In the absence of current regulatory or legal requirements, we have recommended that the Government intervene to secure the following:

 

- Rolling provision of platform-level data on the volume and type of child abuse referrals, an important indicator of current abuse and of the resilience of tech firms in identifying and disrupting it;

- Arrangements to secure cross-industry sharing of threat assessments and intelligence during this period of heightened risk, drawing on the more developed arrangements for counter-terrorism;

- Agreement to share intelligence on emerging threats and trends with child safety organisations, so we can ensure our messaging to children and parents is suitably informed and risk-aligned.

 

At present, we are not aware that any platform has agreed to these voluntary transparency and cross-industry collaboration measures.

 

  8. Each of these requests is entirely consistent with the voluntary principles for tackling child abuse agreed by the major tech firms and the Five Eyes Governments in March.[8] For example, Principle 11 commits companies to regularly publishing or sharing meaningful data and insights on their efforts to tackle abuse, and Principle 10 commits companies to supporting opportunities to share relevant expertise, helpful practices, data and tools.

 

Adequacy of the Government’s Online Harms proposals

 

  9. While no one could reasonably have foreseen the pandemic, and it inevitably presents challenges that even the most safety-conscious of companies would face, the current crisis is shining a light on existing weaknesses in platform moderation and on poor design choices. Offenders have been able to exploit the lack of safety features on social networks, messaging and livestreaming sites. Children have been exposed to unacceptable harm not only because of the public health crisis, but arguably because of the long-term failure of many platforms to invest in and appropriately prioritise tackling online abuse.

 

  10. The crisis demonstrates that an Online Harms Bill is urgent. The NSPCC has always recognised the need to draft this complex legislation carefully, but as we must prepare for the prospect that Coronavirus may shape children’s online lives for many months or even years to come, it is vital that the Government minimises any slippage in the timetable.

 

  11. We would welcome reassurance from the Government that it intends to maintain the scope and ambition of its proposals. The Online Harms White Paper is a good basis for proceeding with legislation, and as recently as February, Digital Minister Matt Warman told the Commons: ‘We will take our time to get this right, but we will not delay for a second longer. That is why we will legislate in this session.’[9]

 

  12. In recent weeks, it appears the timing may be slipping. In response to a parliamentary question last week, Digital Minister Caroline Dinenage confirmed that the Government would publish its final decisions later this year, but would commit only to legislating when parliamentary time allows.[10] In his recent evidence to the Digital, Culture, Media and Sport Select Committee, the Culture Secretary Oliver Dowden indicated that this was not a ‘can-kicking exercise’ but was unable to commit to legislation being passed during the current parliamentary term.[11]

 

  13. In his evidence to that Committee, the Secretary of State also described some of the proposed enforcement measures, including criminal sanctions and named director responsibility for upholding the Duty of Care, as ‘draconian’.[12] A detailed regulatory proposal prepared by the NSPCC and Herbert Smith Freehills envisages a named director scheme as an integral part of the enforcement powers available to the regulator.[13] It continues to be essential that the regulator has enforcement powers proportionate to the companies in scope, to avoid the risk that it becomes a ‘paper tiger’.

 

  14. We also encourage the Government to enact the Age-Appropriate Design Code as soon as possible. The Code will require platforms to risk assess their sites for grooming and sexual abuse; restrict the use of algorithmic friend recommendations that can be exploited by abusers to contact large numbers of children; and prevent the use of algorithms that recommend harmful suicide and self-harm content to children. Now that the Code has been cleared by the European Commission, the Government should enact it by statutory instrument without delay.

 

May 2020


[1] Forthcoming NSPCC research suggests that children aged 11-17 who display traits of loneliness and extroversion and who use social media frequently were more likely to have sent, received or been asked to send sexual content to an adult on large social networks and gaming platforms.

[2] Europol (2020) Catching the virus: cybercrime, disinformation and the Covid-19 pandemic. The Hague: Europol

[3] Forbes (2020) Child Exploitation Complaints rise 106% to hit 2 million in a month – is Covid-19 to blame? Published April 24th

[4] Figures provided by the Internet Watch Foundation

[5] The Guardian (2020) Zoom hacker streams child sex abuse footage to Plymouth school children. Published May 9th

[6] NSPCC (2018) Livestreaming and Video Chat: a snapshot. London: NSPCC

[7] The Guardian (2020) Tech firms criticised for lack of answers on Covid-19 disinformation. Published April 30th (awaiting publication of full oral evidence transcript)

[8]https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/870622/11_Voluntary_principles_-_one_page_-_Final__002_.pdf

[9] https://hansard.parliament.uk/Commons/2020-02-13/debates/FF107BE3-10AE-44CF-884F-E03F593FE5FB/OnlineHarmfulMaterial#contribution-4BB13239-B518-41AC-AC85-BD17D79B50DF

[10] https://www.parliament.uk/business/publications/written-questions-answers-statements/written-question/Commons/2020-05-01/42080/

[11] Digital, Culture, Media and Sport Committee, Oral evidence: The work of the Department for Digital, Culture, Media and Sport, HC 157

[12] Ibid.

[13] NSPCC (2019) Taming the wild west web: how to regulate social networks and protect children from abuse