CARE Submission | September 2021
Written evidence submitted by CARE (OSB0085)
Introduction to CARE
Summary
4.1. We disagree with the Government’s claim that the most accessed pornography sites will be covered by the Bill since:
4.1.1. The Bill’s focus is on sites with user-generated content. The definition of user-generated content means pornographic websites could amend how they operate to be outside of scope.
4.1.2. Unlike the DEA, pornographic content is not on the face of the Bill, so there is considerable uncertainty about what is within scope and what is not.
4.2. It is not clear whether the wide list of actions that are considered enforceable under the Bill will be effective in preventing harm to children or violence to women caused by pornographic material. Unlike the DEA, there is no requirement for pornographic websites to introduce age verification nor an expectation that websites which contain extreme pornographic material will be blocked by ISPs.
5.1. a core number of online harms, including pornography, should be listed on the face of the Bill.
5.2. pornography should be classed as primary priority content that is harmful to children and this should be on the face of the Bill (clause 45). This should include both legal and illegal content.
5.3. An offence of possessing extreme pornography and an offence under the Obscene Publications Act 1959 should be added to the illegal content covered within the scope of the Bill so that its harm to women is recognised.
5.4. There should be a clear requirement for age verification for pornographic websites on the face of the Bill.
5.5. Commercial pornographic websites should be added to the scope of the Bill, so that this Bill genuinely extends the provisions in the DEA.
5.6. Internet service providers should be expected to block extreme pornography.
CARE’s concern about online pornography
7.1. There is a substantial body of evidence suggesting that exposure to pornography is harmful to children and young people: it can have “a damaging impact on young people’s views of sex or relationships”[2] and foster “beliefs that women are sex objects”.[3]
7.2. Recent evidence published by the Government in January 2021, based on interviews with frontline professionals working with clients aged 16 to over 60 who had either exhibited harmful sexual behaviours towards women or were at risk of doing so, found that for young people pornography is seen as “providing a template for what sex and sexual relationships should look like.” One worker is quoted as saying, “Porn comes up in probably eighty or ninety percent of my cases… what they’ve done is influenced by what they’ve seen. […] For them, the internet is fact.”[4] Another said that “[Pornography is young people’s] main reference point for sex and there’s no conversation about consent”.[5]
7.3. Ofsted’s June 2021 rapid review of sexual abuse and harassment in schools reported that, “Leaders we spoke to also highlighted the problems that easy access to pornography had created and how pornography had set unhealthy expectations of sexual relationships and shaped children and young people’s perceptions of women and girls.”[6] And “There is some evidence that suggests access to technology and the sharing of inappropriate images and videos are also issues in primary schools. For example, in one all-through school, leaders have identified a trend of cases in the primary school that are linked to social media. There is a no-phone policy in this school, so incidents are likely taking place outside school. Incidents cited include viewing pornography, requests to look up pornography websites and viewing inappropriate images on social media. There was an example from another school of children in Years 6 and 7 sending nudes.”[7]
8.1. In 2018, the Women and Equalities Select Committee reported on pornography’s impact on women and girls in public places and concluded, “There is significant research suggesting that there is a relationship between the consumption of pornography and sexist attitudes and sexually aggressive behaviours, including violence.”[8]
8.2. The Government then commissioned frontline research and a literature review into the use of legal pornography and its influence on harmful behaviours and attitudes towards women and girls, which was finally published on 15 January 2021.
8.2.1. The literature review concludes: “… there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”. While the report recognises that pornography is one amongst potential factors associated with these attitudes and behaviours, “it is clear that a relationship does exist and this is especially true for the use of violent pornography.”[9]
8.2.2. The frontline workers’ report on professionals working with clients aged 16 to over 60 who had either exhibited harmful sexual behaviours towards women or were at risk of doing so said, “The view that pornography played a role in their clients’ harmful attitudes and/or behaviours was undisputed.”[10]
8.3. The Government’s 2021 Tackling Violence against Women and Girls Strategy reported that its “Call for Evidence showed a widespread consensus about the harmful role violent pornography can play in violence against women and girls, with most respondents to the open public surveys and many respondents to the nationally representative survey agreeing that an increase in violent pornography has led to more people being asked to agree to violent sex acts (54% nationally representative, 79% Phase 1, 78% Phase 2), and to more people being sexually assaulted (50% nationally representative, 70% Phase 1, 71% Phase 2).”[11]
Government action to reduce access to online pornography
CARE’s concern about the scope of the Online Safety Bill with respect to pornography websites
11.1. The Bill focuses on user-to-user interactions, which means pornography websites will be subject to a duty of care if and only if they host user-generated content. In December 2020, Pornhub disabled access to all user-generated content uploaded by unverified users.[25]
11.2. However, a user-generated service can be exempt from regulation under clause 2(7) if it meets certain criteria set out in Schedule 1. Schedule 1(5) could apply to pornographic websites if they provide “limited functionality services”, meaning that a user cannot post new content but can post comments or reviews on the website’s own content. This exemption is intended to apply to websites like Amazon, where customers post reviews of products, but it is easy to see how it could be extended to pornography websites.
11.3. This means that the Bill would have no regulatory impact on commercial pornographic websites where there is no user content or where the user content is no more than commenting on a video.
11.4. The Government says the Bill “will capture the most visited pornography sites”,[26] but this is unlikely if only those that allow user generated content are within scope. CARE believes that all pornographic websites should fall within the scope of this Bill, as they would have done under the DEA.
12.1. While we recognise that there is pornography on social media, the Government’s claim that the greater concern about pornography rests with social media runs counter to the BBFC research published in January 2020, which stated, “The most popular site for accessing pornography was Pornhub, but other sites such as xHamster, xVideos, or RedTube were also mentioned. It was also very common for respondents to have seen pornography through social media.”[27]
12.2. Moreover, Thurman’s May 2021 academic paper reporting on viewing by 16 and 17 year olds states, “The results show that more (63%) had seen pornography on social media platforms than on pornographic websites (47%), suggesting the UK government was right to target such platforms in its latest proposals. However, pornography was much more frequently viewed on pornographic websites than on social media, showing how important the regulation of such sites remains.”[28] (emphasis added)
13.1. Category 1 will be for about 0.1% of the organisations: “a small number of the largest and highest risk businesses will have additional duties, namely to take action with regard to legal but harmful content and activity accessed by adults.”[30] The current estimate is that only up to 20 of the largest and highest risk services will meet the Category 1 thresholds, “likely to be large social media platforms and potentially some gaming platforms and online adult services” (bold added) and likely to be multinationals.[31] The Government appears to suggest that it will not be specifying a requirement for age verification. It says, “Therefore, whilst legislation is technology neutral, a small number of high risk services which are likely to be accessed by children will be required to know the age of their users and therefore may choose to implement age assurance technologies.”[32] (emphasis added) This differs from the level of protection required for all and not just some pornographic websites under the DEA.
13.2. Category 2 organisations will include those that are expected to address legal but harmful content accessed by children[33], in addition to illegal content, if the service is likely to be accessed by children, but the Government does not provide any evidence about the proportion of Category 2 businesses likely to be accessed by children.[34] The exact thresholds for the different categories will be set out by Regulations (Schedule 4). Again, contrary to the DEA, and in violation of the Conservative manifesto commitment, no pornographic website covered by Category 2 will be required to provide age verification.
CARE’s concern about extreme pornography
16.1. illegal content. The list of illegal content specifies only terrorism offences (Schedule 2) and child sexual exploitation and abuse (CSEA) offences (Schedule 3), plus a list of other offences to be determined by regulations, expected to include hate crime and the sale of illegal drugs and weapons.[37] Positively, the CSEA offences include possession of a prohibited image of a child – an aspect that was missing from the DEA. Negatively, as currently drafted, neither the extreme pornography offence[38] nor material considered obscene is listed anywhere in the Bill, unlike in the DEA.[39] Indeed, under the regulatory powers, they might not even make the list, although extreme pornography was included in the Online Harms White Paper list.[40]
16.2. content harmful to children. Clause 45 sets out that future affirmative regulations will determine “content that is harmful to children”[41]; that is, the Secretary of State will set out the primary priority content and priority content by regulations after consultation with OFCOM (clause 47). In the meantime, it is not clear how these two types of content will differ, as there is no explanation of the objectives of the two categories. Clause 45(3) also sets out a general definition of harmful content, defining it as content where the provider of the service “has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on a child of ordinary sensibilities”. The Impact Assessment states that content harmful to children would include pornography and violent content.[42]
16.3. for the largest social media platforms, content harmful to adults[43] (clause 11), but the Bill does not state what this is intended to cover, other than “priority content that is harmful to adults” (clause 46(2)). In its Response to the White Paper, the Government indicated that priority categories of content harmful to adults would include abuse and content about eating disorders, self-harm or suicide.[44]
19.1. the Bill be amended so that illegal material includes extreme pornography and material that would be considered obscene under the OPA.
19.2. pornography is listed as “primary priority content that is harmful to children” on the face of the Bill.
CARE’s concern about the Duty of Care requirements on those pornography websites that do fall within the Bill
CARE’s concern about Enforcement powers
Conclusion
30.1. include a core number of online harms listed on the face of the Bill, which could be amended through Regulations. Pornography should be within this list;
30.2. pornography should be classed as primary priority content that is harmful to children and this should be on the face of the Bill (clause 45). This should include both legal and illegal content.
30.3. An offence of possessing extreme pornography and an offence under the Obscene Publications Act 1959 should be added to the illegal content covered within the scope of the Bill so that its harm to women is recognised.
30.4. There should be a clear requirement for age verification for pornographic websites on the face of the Bill.
30.5. Commercial pornographic websites should also be added to the scope of the Bill, so that this Bill genuinely extends the provisions in the DEA, rather than abandoning them for a new and different piece of legislation.
30.6. Internet service providers should be expected to block extreme pornography.
ANNEX: COMPARING PART 3 DEA WITH DRAFT ONLINE SAFETY BILL
| | Part 3, DEA 2017 | Draft Online Safety Bill |
|---|---|---|
| Scope related to pornography | Commercial pornography websites – defined in the Online Pornography (Commercial Basis) Regulations 2019.[61] No social media | Commercial pornography websites if they allow users to generate content for other users. Social media |
| Definitions of pornography | Pornography defined: section 15 | No definition of pornography on the face of the Bill. Expected to be some reference in regulations |
| Requirements in relation to pornography | Requirement to secure “that, at any given time, the material is not normally accessible by persons under the age of 18” (section 14) | No exact requirement specified, but online safety objectives in clause 30 that “there are adequate controls over access to, and use of, the service by children, taking into account use of the service by, and impact on, children in different age groups”[62] |
| Information power | Age-verification regulator has power to require information (section 18) | OFCOM has power to require information (clause 70) |
| Power to issue enforcement notice and a fine | If not complying with age verification or information requirements, can impose a fine under section 19 | If not complying with duty of care or information requirements, power to issue a provisional notice of enforcement action (clause 80 and other clauses in Part 4, Chapter 6), which include imposing a fine |
| Size of financial penalty | Section 20 – £250,000 or 5% of qualifying turnover | Clause 85 – £18m or 10% of qualifying worldwide revenue, whichever is greater |
| Extra-territorial powers | None | Clauses 127 and 128 provide extraterritorial powers |
| Action with payment-services providers and ancillary service providers | Section 21 – could give notice of non-compliance but not require action | Clause 91 – must obtain a court order to direct a provider to take action under a service restriction order |
| Defines extreme pornographic material | Section 22 | Not defined on the face of the Bill. May be added as illegal content under regulations in clause 44. Does include possession of a prohibited image of a child – an aspect that was missing from the DEA |
| Action on extreme pornography | Section 23 – require ISPs to block websites | Not clear if content is in scope. If in scope as illegal material, the duty would be to minimise it or, if brought to the provider’s attention, to remove it swiftly. Power to block material would be via a court order – an access restriction order (clause 93) |
| Blocking non-compliant websites | Section 23 – require ISPs to block websites if not preventing access by under-18s | Not clear either if in scope. If in scope as illegal material, the duty would be to minimise it or, if brought to the provider’s attention, to remove it swiftly (clause 9). Power to block material would be via a court order (clause 93) |
| Guidance powers | Section 25 – guidance on methods to be used by providers, to be approved by Parliament | Extensive guidance powers for OFCOM, but the equivalent power would be a code of practice about duties (clause 29) and online safety objectives (clause 30) |
| Exercise of functions by regulator | Section 26 – allows the regulator to focus on providers with a large number of users in the UK, a large number of child users, or a large turnover – i.e. proportionate regulation | No equivalent for the regulator, but the Secretary of State can make a statement of strategic priorities (clause 109) |
| Guidance by Secretary of State to regulator | Section 27 – in relation to the regulator’s functions | The Secretary of State has a power of direction on codes of practice (clause 33), can make a statement of strategic priorities (clause 109) and can give general guidance about OFCOM’s functions (clause 113) |
| Requirements for notices by regulator | Section 28 – how the regulator is to contact organisations inside and outside the UK | Does not appear to be an equivalent |
| Report on effectiveness/review | Section 29 – within 18 months, but not before 12 months, with a consultation on the definitions in the Act | Clause 115 – not before two years and within five years |
22 September 2021
[1] Child Safety Online: Age Verification for Pornography, Feb 2016, page 4
[2] Young People, Sex and Relationships: The New Norms, Institute for Public Policy Research, August 2014, page 4. Study involved 500 18 year olds
[3] Basically…porn is everywhere – A Rapid Evidence Assessment of the effects that access and exposure to pornography have on children and young people, Horvath et al, (2013), pages 7 and 8, Produced for the Children’s Commissioner for England
[4] The relationship between pornography use and harmful sexual behaviours A primary research report prepared for the Government Equalities Office, 20 February 2020, page 9
[5] Ibid, page 19
[6] Review of sexual abuse in schools and colleges - GOV.UK (www.gov.uk)
[7] Review of sexual abuse in schools and colleges - GOV.UK (www.gov.uk)
[8] Women and Equalities Select Committee on Sexual harassment of women and girls in public places, Report published 23 Oct 2018, para 101
[9] The relationship between pornography use and harmful sexual attitudes and behaviours: literature review, 20 Feb 2020, pages 6-9
[10] Primary Research, op. cit., pages 8 and 12
[11] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 35
[12] https://issuu.com/conservativeparty/docs/ge_manifesto_low_res_bdecb3a47a0faf/75 page 35.
[13] Sections 15-17, 21(5) 22, 25, 26(2), 27, 30(1) and 30(2) came into force on 31 July 2017 under the Digital Economy Act 2017 (Commencement No. 1) Regulations 2017
[14] Draft Online Safety Bill - GOV.UK (www.gov.uk)
[15] Impact Assessment, paras 84-85, page 27 and para 123, page 36
[16] https://committees.parliament.uk/call-for-evidence/567/
[17] Impact Assessment, paras 84-85, page 27 and para 123, page 36
[18] House of Commons Hansard, 10 June 2021., column 1162
[19] https://avpassociation.com/uncategorized/an-open-letter-to-the-prime-minister-from-teachers-charities-and-baroness-benjamin/
[20] The system will be funded by fees on service providers. See Impact Assessment, page 159 and Part 3; Chapter 2 of the Bill
[21] Government statement on Online Harms, 15 December 2020, cols 146 and 153
[22] Impact Assessment, page 123
[23] Memorandum from the Department for Digital, Culture, Media and Sport and the Home Office to the Delegated Powers and Regulatory Reform Committee, para 151, page 31
[24] The figure is based on costs associated with child sexual abuse and exploitation, hate crime, illegal sale of drugs, modern slavery, cyber stalking, cyber bullying and intimidation of public figures, so the actual cost is likely to be much larger. Impact Assessment, para 102, page 30 and Table 40, page 80
[25] https://www.theguardian.com/technology/2020/dec/14/pornhub-purge-removes-unverified-videos-investigation-child-abuse; https://www.bbc.com/news/technology-55231181
[26] WPQ 38440, answered 6 September 2021
[27] Young people, Pornography & Age-verification, BBFC, January 2020, page 6. Available from the BBFC on request as the research contains graphic sexual content and pornographic language.
[28] Thurman N and Obster F, The regulation of internet pornography: What a survey of under‐18s tells us about the necessity for and potential efficacy of emerging legislative approaches, Policy and Internet, 4 May 2021, https://onlinelibrary.wiley.com/doi/10.1002/poi3.250
[29] Impact Assessment, para 109, page 32
[30] Impact Assessment, para 115, page 33
[31] Impact Assessment, para 116, page 34
[32] Impact Assessment, para 193, page 53
[33] Impact Assessment, para 161, page 45, para 176, page 48 and para 180, page 50
[34] Impact Assessment, para 180, page 50
[35] Digital Economy Bill, Age Verification Impact Assessment, 2018, page 1
[36] Quote from Q104. See also Q107-Q109, Evidence given to the Public Bill Committee Digital Economy Bill, 11 October 2016
[37] Online Harms White Paper: Full Government Response to the consultation. December 2020, CP 354, para 2.3, page 24
[38] Section 63, Criminal Justice and Immigration Act 2008
[39] See: meaning of extreme pornographic material in section 22 of the DEA.
[40] Online Harms White Paper, April 2019, Table 1: Online Harms in Scope, page 31
[41] Delegated Powers Memorandum, page 31
[42] Online Harms White Paper: Full Government Response to the consultation. December 2020, CP 354, para 2.3, page 24
[43] Impact Assessment, pages 19-20, para 72 and Table 2, page 25 and page 123
[44] Online Harms White Paper: Full Government Response to the consultation, para 2.3, page 24
[45] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 35
[46] Impact Assessment, see Table 15 and footnote 84, page 44
[47] Impact Assessment, page 124
[48] The Minister said, “This is not an indefinite postponement of the measures that we are seeking to introduce; it is an extension of what they will achieve.” House of Commons Hansard, 17 October 2019, col 454
[49] Explanatory Memorandum to The Draft British Board Of Film Classification (BBFC) Guidance On Age Verification Arrangements, para 7.4
[50] Impact Assessment Age verification for pornographic material online, 13 June 2018, page 10
[51] Impact Assessment, page 135
[52] Impact Assessment, page 137
[53] PQ UIN 199, answered 17 May 2021. See also the same statement made by Caroline Dinenage MP on 11 May 2021, Q226 in evidence given to the House of Lords Communications and Digital Committee
[54] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 39
[55] Note that separately, it will be an offence under clause 71 not to comply with a request for information that OFCOM considers it requires under clause 70. The penalty is a fine and/or up to 2 years imprisonment.
[56] Impact Assessment, para 238, page 63
[57] House of Lords, Select Committee on the Constitution, 17 January 2017, HL 96, para 12, page 3
[58] House of Lords, Delegated Powers and Regulatory Reform Committee, 22 December 2016, HL 89, para 31, page 6
[59] Q26 and Q27 in evidence given to the House of Commons Digital, Culture, Media and Sport Committee, 13 May 2021
[60] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 48
[61] https://www.legislation.gov.uk/uksi/2019/23/contents/made
[62] Clause 30(2)(a)(viii)