CARE Submission | September 2021

 

Written evidence submitted by CARE (OSB0085)

 

Introduction to CARE

  1. CARE (Christian Action Research and Education) is a well-established mainstream Christian charity providing resources and helping to bring Christian insight and experience to matters of public policy and practical caring initiatives across the UK. CARE has been working on policy issues related to online safety since the mid-1990s.
  2. Our response to the call for evidence focuses on the Bill’s impact on children’s access to online pornography and on the availability of extreme pornography and its contribution to violence against women.

Summary

  3. CARE advocated for the age verification requirements to protect children from online pornography introduced through Part 3 of the Digital Economy Act 2017 (DEA) and welcomed the power in the Act to block access to extreme pornographic material.

 

  4. While we welcome the inclusion of social media and user-generated content in the Online Safety Bill, we are extremely concerned about the lack of focus in the Bill on the pornographic material covered by the DEA.

4.1.             We disagree with the Government’s claim that the most accessed pornography sites will be covered by the Bill since:

4.1.1.   The Bill’s focus is on sites with user-generated content. The definition of user-generated content means pornographic websites could amend how they operate to be outside of scope.

 

4.1.2. Unlike the DEA, pornographic content is not on the face of the Bill, so there is considerable uncertainty about what is within scope and what is not.

 

4.2. It is not clear whether the wide list of actions that are enforceable under the Bill will be effective in preventing harm to children or violence against women caused by pornographic material. Unlike the DEA, there is no requirement for pornographic websites to introduce age verification, nor an expectation that websites containing extreme pornographic material will be blocked by ISPs.

 

  5. CARE recommends the following changes to the Bill:

5.1. A core number of online harms, including pornography, should be listed on the face of the Bill.

5.2. Pornography should be classed as primary priority content that is harmful to children, and this should be stated on the face of the Bill (clause 45). This should include both legal and illegal content.

5.3. The offence of possessing extreme pornography and offences under the Obscene Publications Act 1959 should be added to the illegal content covered within the scope of the Bill so that the harm this material causes to women is recognised.

5.4. There should be a clear requirement for age verification for pornographic websites on the face of the Bill.

5.5. Commercial pornographic websites should be added to the scope of the Bill, so that this Bill genuinely extends the provisions of the DEA.

5.6. Internet service providers should be expected to block extreme pornography.

 

  6. CARE recommends that the Government implement Part 3 of the Digital Economy Act 2017 with immediate effect to ensure that children and women have some level of protection from online pornography before the Online Safety Bill comes into effect.

 

CARE’s concern about online pornography

  7. CARE is concerned about the impact of online pornography on children because “pornography has never been more easily accessible online, and material that would previously have been considered extreme has become part of mainstream online pornography. When young people access this material it risks normalising behaviour that might be harmful to their future emotional and psychological development.”[1]

7.1. There is a substantial body of evidence suggesting that exposure to pornography is harmful to children and young people: it can have “a damaging impact on young people’s views of sex or relationships”[2] and foster “beliefs that women are sex objects”.[3]

 

7.2. Evidence published by the Government in January 2021, drawn from frontline professionals working with clients aged 16 to over 60 who had either exhibited harmful sexual behaviours towards women or were at risk of doing so, found that for young people pornography is seen as “providing a template for what sex and sexual relationships should look like.” One worker is quoted as saying, “Porn comes up in probably eighty or ninety percent of my cases… what they’ve done is influenced by what they’ve seen. […] For them, the internet is fact.”[4] Another said that “[Pornography is young people’s] main reference point for sex and there’s no conversation about consent”.[5]

 

7.3. OFSTED’s June 2021 rapid review of sexual abuse and harassment in schools reported that, “Leaders we spoke to also highlighted the problems that easy access to pornography had created and how pornography had set unhealthy expectations of sexual relationships and shaped children and young people’s perceptions of women and girls.”[6] It also noted: “There is some evidence that suggests access to technology and the sharing of inappropriate images and videos are also issues in primary schools. For example, in one all-through school, leaders have identified a trend of cases in the primary school that are linked to social media. There is a no-phone policy in this school, so incidents are likely taking place outside school. Incidents cited include viewing pornography, requests to look up pornography websites and viewing inappropriate images on social media. There was an example from another school of children in Years 6 and 7 sending nudes.”[7]

 

  8. We are also concerned about pornography as a contributing factor in violence against women, a concern raised by contributors to the Government’s consultation on Violence Against Women and Girls.

8.1. In 2018, the Women and Equalities Select Committee reported on pornography’s impact on women and girls in public places and concluded, “There is significant research suggesting that there is a relationship between the consumption of pornography and sexist attitudes and sexually aggressive behaviours, including violence.”[8]

8.2. The Government then commissioned frontline research and a literature review into the use of legal pornography and its influence on harmful behaviours and attitudes towards women and girls, which was finally published on 15 January 2021.

8.2.1. The literature review concludes: “… there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women”. While the report recognises that pornography is only one of several potential factors associated with these attitudes and behaviours, “it is clear that a relationship does exist and this is especially true for the use of violent pornography.”[9]

8.2.2. The frontline workers report, of professionals working with clients aged 16 to over 60 who had either exhibited harmful sexual behaviours towards women or were at risk of doing so, said, “The view that pornography played a role in their clients’ harmful attitudes and/or behaviours was undisputed.”[10]

8.3. The Government’s 2021 Tackling Violence against Women and Girls Strategy reported that its “Call for Evidence showed a widespread consensus about the harmful role violent pornography can play in violence against women and girls, with most respondents to the open public surveys and many respondents to the nationally representative survey agreeing that an increase in violent pornography has led to more people being asked to agree to violent sex acts (54% nationally representative, 79% Phase 1, 78% Phase 2), and to more people being sexually assaulted (50% nationally representative, 70% Phase 1, 71% Phase 2).”[11]

 

Government action to reduce access to online pornography

  9. In 2015, the Conservative Government made a manifesto commitment to require age verification checks on all pornographic websites.[12] The age verification proposals were enacted by Parliament through Part 3 of the Digital Economy Act 2017 (DEA). The Act also included the power to require internet service providers (ISPs) to block sites containing extreme pornography (content that is illegal to possess under section 63 of the Criminal Justice and Immigration Act 2008). Nearly two years ago, the Government announced that it would not implement Part 3 in favour of a wider-scoped proposal in the Online Safety Bill.[13] Two years on, pre-legislative scrutiny is only just beginning on the Draft Online Safety Bill.[14] The Bill’s Impact Assessment assumes compliance will be needed from 2024, but that timetable clearly assumed pre-legislative scrutiny would conclude significantly before the Committee’s reporting deadline of 10 December.[15] [16] Moreover, previous experience with the DEA suggests that these processes take longer than expected, and the Government itself recognises that “the timelines are dependent on parliamentary scheduling and capacity”,[17] so implementation could come later than 2024 – that is, five years later than would have been the case under the DEA. Indeed, the Government effectively acknowledged the problem a delay would cause by formally asking the Children’s Commissioner to advise on what steps might be taken to protect children in the interim.[18] A letter sent by Baroness Benjamin and over 60 others, including MPs, Peers, head teachers and NGOs, said that the interim action should include the implementation of Part 3.[19]

 

CARE’s concern about the scope of the Online Safety Bill with respect to pornography websites


  10. The Bill is complex and limited in its impact on pornographic content. OFCOM will regulate tech firms[20] that allow user-generated content (ie social media companies) and require them to demonstrate that they are meeting a duty of care[21] “to prevent user-generated activity and content on their services causing significant physical or psychological harms to individuals”.[22] The Government says it wants to deliver “legal certainty”[23] but so far its actions have had quite the opposite effect. Its proposals are dense and opaque, with no examples of how the regime might work for a particular type of content or service. The requirements are complicated and specified in outline only, with the majority of the detail to come in regulations. As a result, the costs of implementing this regime are uncertain. The Government estimates that over the 10 years from 2023, under the status quo, online harms will result in a societal cost of at least £54 billion based on present values, but this is based only on a small subset of harms[24] and does not include any impact from pornography.

 

  11. The Bill does not deal with pornographic websites as robustly as the Digital Economy Act 2017 (DEA). The Government has criticised the DEA because it does not cover social media sites. CARE recognises this but believes there are fundamental weaknesses in the Online Safety Bill because:

11.1. The Bill focuses on user-to-user interactions, which means pornography websites will only be subject to the duty of care if they have user-generated content. In December 2020, Pornhub disabled access to all user-generated content where the uploader was not verified.[25]

 

11.2. However, a user-generated service can be exempt from regulation under clause 2(7) if it meets certain criteria set out in Schedule 1. Schedule 1(5) could apply to pornographic websites if they are “limited functionality services”, meaning that a user cannot post new content but can post comments or reviews on the website’s own content. This exemption is intended to apply to websites like Amazon, where customers give reviews of products, but it is easy to see how it could be extended to pornography websites.

11.3.   This means that the Bill would have no regulatory impact on commercial pornographic websites where there is no user content or where the user content is no more than commenting on a video. 

 

11.4. The Government says the Bill “will capture the most visited pornography sites”,[26] but this is unlikely if only those that allow user-generated content are within scope. CARE believes that all pornographic websites should fall within the scope of this Bill, as they would have done under the DEA.

 

  12. The Bill assumes that the major problem with pornographic content rests with social media. The Secretary of State said on 13 May 2021, “The model for the Online Safety Bill…was always to deal with the challenges that arise on social media…On the issue of commercial pornography, the biggest risk is kids stumbling across it but there is a greater risk from social media and user-generated content…I believe that the preponderance of commercial pornography sites have user-generated content on them, so most of them will be in scope.”

12.1. While we recognise that there is pornography on social media, the Government’s claim that the greater concern rests with social media runs counter to the BBFC research published in January 2020, which stated, “The most popular site for accessing pornography was Pornhub, but other sites such as xHamster, xVideos, or RedTube were also mentioned. It was also very common for respondents to have seen pornography through social media.”[27]

 

12.2. Moreover, Thurman’s May 2021 academic paper on viewing by 16- and 17-year-olds states, “The results show that more (63%) had seen pornography on social media platforms than on pornographic websites (47%), suggesting the UK government was right to target such platforms in its latest proposals. However, pornography was much more frequently viewed on pornographic websites than on social media, showing how important the regulation of such sites remains.”[28] (emphasis added)

 

  13. Not all pornography websites in scope will be treated the same: they will fall into two different categories, and only a very few will be considered the most at risk – but even that is not certain. The Government is expecting about 24,000 services to be within the Bill’s scope, including business and civil society organisations.[29] Organisations will be classed into three different strata of risk: 3% of organisations are considered high risk, 48% are expected to be medium risk and 49% low risk.
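If the published percentages are applied to the roughly 24,000 in-scope services, this implies in the region of 720 high-risk services (3%), 11,520 medium-risk (48%) and 11,760 low-risk (49%); and 0.1% of 24,000 is about 24 services, broadly consistent with the estimate of only up to 20 Category 1 services in paragraph 13.1 below.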

 

13.1. Category 1 will apply to about 0.1% of organisations: “a small number of the largest and highest risk businesses will have additional duties, namely to take action with regard to legal but harmful content and activity accessed by adults.”[30] The current estimate is that only up to 20 of the largest and highest risk services will meet the Category 1 thresholds, “likely to be large social media platforms and potentially some gaming platforms and online adult services” (emphasis added), and these are likely to be multinationals.[31] The Government appears to suggest that it will not be specifying a requirement for age verification. It says, “Therefore, whilst legislation is technology neutral, a small number of high risk services which are likely to be accessed by children will be required to know the age of their users and therefore may choose to implement age assurance technologies.”[32] (emphasis added) This differs from the level of protection required under the DEA, which covered all and not just some pornographic websites.

13.2. Category 2 organisations will include those expected to address legal but harmful content accessed by children,[33] in addition to illegal content, if the service is likely to be accessed by children, but the Government does not provide any evidence about the proportion of Category 2 businesses likely to be accessed by children.[34] The exact thresholds for the different categories will be set out in Regulations (Schedule 4). Again, contrary to the DEA, and in violation of the Conservative manifesto commitment, no pornographic website covered by Category 2 will be required to provide age verification.

 

  14. The implication of this is that only a very few online adult services are going to be classed as of the highest risk. What that will mean is hard to judge given the vagueness of the requirements on service providers. However, it should be noted that the Impact Assessment for the Digital Economy Bill estimated that one in 10 of those who accessed adult sites in March 2015 were children.[35] It did not state how many organisations were expected to be in scope. However, in 2016 the British Board of Film Classification (BBFC) said that, “There are 1.5 million new pornographic URLs coming on stream every year. However, the way in which people access pornography in this country is quite limited. Some 70% of users go to the 50 most popular websites. With children, that percentage is even greater; the data evidence suggests that they focus on a relatively small number of sites.”[36] The BBFC was expecting to start by regulating these 50 most visited sites and then turn to others. There is no reference in the documents accompanying the Online Safety Bill to the 50 most visited pornography sites.

 

  15. CARE recommends that the Bill be amended to include all the websites that would be in scope of the DEA so there is no removal of protections for women (with respect to extreme pornography) and children (with respect to pornography in general) which are already on the statute book. We must move forwards, not backwards.

 

CARE’s concern about extreme pornography

  16. The Bill is concerned with implementing a duty of care with respect to illegal content, legal content harmful to children and, in some cases, legal content harmful to adults. However, beyond illegal content there is limited indication of what is likely to fall into the different categories, making it impossible to judge what the intentions of the Government and OFCOM might be, and therefore how effective they might be. All online harms will fall into these three categories:

16.1. Illegal content. The list of illegal content specifies terrorism offences (Schedule 2) and child sexual exploitation and abuse (CSEA) offences (Schedule 3) only, plus a list of other offences to be determined by regulations, expected to include hate crime and the sale of illegal drugs and weapons.[37] Positively, the CSEA offences include possession of a prohibited image of a child – an aspect that was missing from the DEA. Negatively, as currently drafted, neither the extreme pornography offence[38] nor material considered obscene is listed anywhere in the Bill, unlike in the DEA.[39] Indeed, under the regulatory powers, they might not even make the list, although extreme pornography was included in the Online Harms White Paper list.[40]

 

16.2. Content harmful to children. Clause 45 sets out that future affirmative regulations will determine “content that is harmful to children”;[41] that is, the Secretary of State will set out the primary priority content and priority content by regulations after consultation with OFCOM (clause 47). In the meantime, how these two types of content will differ is not clear, as there is no explanation of the objectives of the two categories. Clause 45(3) also sets out a general definition of harmful content, defining it as content where the provider of the service “has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on a child of ordinary sensibilities”. The Impact Assessment states that content harmful to children would include pornography and violent content.[42]

 

16.3. Content harmful to adults, which applies to the largest social media platforms[43] (clause 11). The Bill does not state what this is intended to cover, other than “priority content that is harmful to adults” (clause 46(2)). In its Response to the White Paper, the Government indicated that priority categories of content harmful to adults would include abuse and content about eating disorders, self-harm or suicide.[44]

 

  17. The Government’s 2021 Tackling Violence against Women and Girls Strategy states that “Through the new Online Safety Bill, companies will need to take swift and effective action against illegal content targeted at women. They will have to have effective systems in place to minimise priority illegal content and remove all illegal content quickly once they are aware of it. The Government will work with stakeholders and Parliamentarians to identify priority illegal harms which will be specified in secondary legislation and may include those of particular relevance to women, such as ‘revenge porn’, extreme pornography…”[45] (emphasis added)

 

  18. In summary, there is no clarity about what sort of content is likely to be in each category, ie what content will be covered by the duty of care. It is not clear into which categories pornographic content as defined in section 15 of the DEA will fall, that is, material that would be considered R18, material too extreme to be classified by the BBFC, and material that requires an 18 certificate.

 

  19. CARE recommends that:

19.1. the Bill be amended so that illegal material includes extreme pornography and material that would be considered obscene under the Obscene Publications Act 1959 (OPA);

19.2. pornography be listed as “primary priority content that is harmful to children” on the face of the Bill.


 

CARE’s concern about the Duty of Care requirements on those pornography websites that do fall within the Bill

 

  20. CARE is concerned that even those pornography websites which do fall within the Bill’s scope will not be required to have age verification, since the Impact Assessment states at paras 195-6 that, contrary to the 2015 Conservative manifesto commitment, “The proportion of businesses required to employ age assurance controls and the type of controls required are unknown at this stage, this will be set out in future codes of practice.”[46] As it is not clear in which category of content pornographic websites will be placed, there is a complete lack of clarity about the Government’s intentions.

 

  21. Companies will fulfil the duty of care “by putting in place systems and processes that improve user safety on their services. These will include, for example, user tools, content moderation and recommendation procedures.”[47] However, exactly what actions will be required of providers is not clear, because most of the detail will not be set by the legislation. It will be for OFCOM to determine how the duty of care is met through the steps set out in its codes of practice, which will not be published until after the Bill has become law (clause 29) but which will be subject to the negative resolution procedure.

 

  22. The codes of practice must meet statutory online safety objectives. Children of any age must be prevented from encountering “primary priority content” that is harmful to children. As stated above, it is not clear whether that includes pornography, but it is plainly imperative that it does if the Government is to honour its commitment to Parliament, made during the Urgent Question debate on 17 October 2019 about the non-implementation of Part 3, that rather than eroding the DEA protections its new online safety legislation would enhance them.[48] For content accessed by children, clause 10 sets out “A duty to operate a service using proportionate systems and processes designed to…prevent children of any age from encountering, by means of the service, primary priority content that is harmful to children”. This is similar to, but not as clearly defined as, the requirement in section 14 of the DEA, under which a provider would be non-compliant if it made “pornographic material available on the internet…other than in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.”

 

  23. When determining the options for age verification under the DEA, the Government said, “the BBFC will not provide an exhaustive list of approved age-verification solutions and instead confirms the criteria that the BBFC will use to judge, on a case-by-case basis, whether age verification arrangements are compliant with the Act”,[49] so there was some flexibility even then, but nevertheless an expectation that all commercial pornography websites would employ some age verification solution, in line with the Government’s stated policy objective at the time that “Commercial providers of porn should have age verification (‘AV’) controls in place where it is accessed online in the UK.”[50] In contrast, OFCOM will merely need to have regard to “the desirability of promoting the use…of technologies designed to reduce the risk of harm” (clause 56(4) – new subsection (4A)(e)), and there is no requirement for a service provider to use age verification for pornographic content. Instead, the Government continues to say it will champion the “online safety technology market”[51] and ensure that OFCOM promotes online media literacy.[52]

 

  24. Since the Bill was published, the Government has been much clearer about its expectation, saying in a WPQ on 17 May: “we expect companies to use age verification technologies to prevent children from accessing services which pose the highest risk of harm to children, such as online pornography”.[53] In the Tackling Violence Against Women and Girls Strategy published in July, the Government said, “Companies will also be expected to take steps to ensure that children are not able to access services which pose the highest risk of harm such as online pornography… Under the Government’s proposals, to prevent children from accessing content which poses the highest risk of harm to them, we expect companies to use measures that provide them with the highest confidence in the age of a user, for example, age verification. If companies do not use age verification technology, they will be required to demonstrate that their alternative approach delivers the same level of protection for children.” (The bold is the Government’s emphasis.)[54] We welcome these statements, but the expectation of which they speak must be set out clearly in the Bill and not left to chance.

 

  25. CARE recommends that the Bill be amended to require age verification for all pornography websites so that there is certainty for providers and the public.

 

CARE’s concern about enforcement powers

 

  26. It is not clear whether the wide list of actions that are enforceable under the Bill (see clause 82[55]) will be effective in preventing harm to children or violence against women. The DEA relied on the regulator asking ancillary services to withdraw their services, or requiring ISPs to block websites, in order to enforce its provisions. Under this Bill, only in rare situations[56] will a court (rather than the regulator, OFCOM) direct an ancillary service to take action against a service provider, or direct an ISP or other service to block access to a provider.

 

  27. Nor is it clear how proactive OFCOM will be in ensuring websites are implementing the duty of care as set out in the Online Safety Bill.

 

  28. CARE recommends that the Bill should have enforcement provisions at least as robust as the DEA’s powers to block extreme pornographic material.

 

Conclusion

 

  29. This Bill is complex. It comes with three other accompanying documents, totalling 488 pages. Much of the content raises more questions than it answers. Important decisions about what content is going to be within scope will be left to secondary legislation, eg what is considered primary priority harmful and priority harmful material for children. Parliamentarians are, therefore, being asked to vote for a framework without a clear understanding of how it will apply to content-specific concerns. During the passage of the DEA, the Constitution Committee said, “We question whether the House can effectively scrutinise the Bill when its scrutiny is impeded by the absence from the face of the Bill of any detail about the operation of the proposed age-verification regime. Nor is it the case that there will be subsequent opportunities for parliamentary scrutiny of delegated legislation on this matter, since the details of the regime will be set out in due course not by Ministers (or others) exercising regulation-making powers…”[57] The House of Lords Delegated Powers and Regulatory Reform Committee also criticised the vagueness of the Government’s proposals in the DEA, saying, “we consider it objectionable as a matter of principle that a regulator, who is to be clothed with extensive powers to impose fines and take other enforcement action, should itself be able to specify how key concepts…are to be interpreted.”[58] The same principles apply to this Bill. Given these criticisms made at the time of the DEA, it is extremely disappointing that there are so few firm indications of what will be in scope on the face of the Bill, and we urge the Committee to raise this point.

 

  30. CARE’s concerns about how online pornography will be treated within the Bill have been set out in this submission to the Committee. The Secretary of State did say that “If we could find a commensurate way of providing wider protection for children within [the Bill]…There is a strong case for doing that.”[59] We welcome the statement in the Tackling Violence against Women and Girls Strategy that, “The Government recognises the concerns that have been raised about protecting children from online pornography on services which do not currently fall within the scope of the draft Bill. The Government will use the Online Safety Bill’s pre-legislative scrutiny process to explore ways to provide wider protections for children from online pornography, including on sites that do not fall within scope of the draft Bill.” (Government’s emphasis)[60] We urge the Committee to recommend changes to the Bill to:

30.1. include a core number of online harms listed on the face of the Bill, which could be amended through Regulations; pornography should be within this list;

30.2. class pornography as primary priority content that is harmful to children on the face of the Bill (clause 45), covering both legal and illegal content;

30.3. add the offence of possessing extreme pornography and offences under the Obscene Publications Act 1959 to the illegal content covered within the scope of the Bill, so that the harm this material causes to women is recognised;

30.4. include a clear requirement for age verification for pornographic websites on the face of the Bill;

30.5. add commercial pornographic websites to the scope of the Bill, so that the Bill genuinely extends the provisions of the DEA rather than abandoning them for a new and different piece of legislation; and

30.6. make clear that internet service providers will be expected to block extreme pornography.

 

  31. In the meantime, CARE recommends that the Government implement Part 3 of the Digital Economy Act 2017 with immediate effect to ensure that children and women have some level of protection from online pornography. A comparison of the provisions of the DEA and the Online Safety Bill is included in the Annex.

 


ANNEX:  COMPARING PART 3 DEA WITH DRAFT ONLINE SAFETY BILL

 

 

Scope related to pornography

Part 3, DEA 2017: Commercial pornography websites, as defined in the Online Pornography (Commercial Basis) Regulations 2019.[61] No social media.

Draft Online Safety Bill: Commercial pornography websites if they allow users to generate content for other users. Social media.

Definitions of pornography

Part 3, DEA 2017: Pornography defined in section 15.

Draft Online Safety Bill: No definition of pornography on the face of the Bill. Expected to be some reference in regulations.

Requirements in relation to pornography

Part 3, DEA 2017: Requirement to secure “that, at any given time, the material is not normally accessible by persons under the age of 18” (section 14).

Draft Online Safety Bill: No exact requirement specified, but online safety objectives in clause 30 that “there are adequate controls over access to, and use of, the service by children, taking into account use of the service by, and impact on, children in different age groups”.[62]

Information power

Part 3, DEA 2017: Age-verification regulator has power to require information (section 18).

Draft Online Safety Bill: OFCOM has power to require information (clause 70).

Power to issue enforcement notice and a fine

Part 3, DEA 2017: If a provider does not comply with age verification or information requirements, a fine can be imposed under section 19.

Draft Online Safety Bill: If a provider does not comply with duty of care or information requirements, power to issue a provisional notice of enforcement action (clause 80 and other clauses in Part 4, Chapter 6, which include imposing a fine).

Size of financial penalty

Part 3, DEA 2017: Section 20 – £250,000 or 5% of qualifying turnover.

Draft Online Safety Bill: Clause 85 – £18m or 10% of qualifying worldwide revenue, whichever is greater.

Extra-territorial powers

Part 3, DEA 2017: None.

Draft Online Safety Bill: Clauses 127 and 128 provide extraterritorial powers.

Action with payment-services providers and ancillary service providers

Part 3, DEA 2017: Section 21 – could give notice of non-compliance but not require action.

Draft Online Safety Bill: Clause 91 – a court order must be obtained to direct a provider to take action under a service restriction order.

Defines extreme pornographic material

Part 3, DEA 2017: Section 22.

Draft Online Safety Bill: Not defined on the face of the Bill; may be added as illegal content under Regulations in clause 44. The Bill does include possession of a prohibited image of a child – an aspect that was missing from the DEA.

Action on extreme pornography

Part 3, DEA 2017: Section 23 – require ISPs to block websites.

Draft Online Safety Bill: Not clear if content in scope. If in scope as illegal material, the duty would be to minimise it or, if brought to the provider’s attention, to remove it swiftly. Power to block material would be via a court order – an access restriction order (clause 93).

Blocking non-compliant websites

Part 3, DEA 2017: Section 21 – require ISPs to block websites if not preventing access by under-18s.

Draft Online Safety Bill: Again, not clear if in scope. If in scope as illegal material, the duty would be to minimise it or, if brought to the provider’s attention, to remove it swiftly (clause 9). Power to block material would be via a court order (clause 93).

Guidance powers

Part 3, DEA 2017: Section 25 – guidance on methods to be used by providers, to be approved by Parliament.

Draft Online Safety Bill: Extensive guidance powers for OFCOM, but the equivalent power would be codes of practice about duties (clause 29) and online safety objectives (clause 30).

Exercise of functions by regulator

Part 3, DEA 2017: Section 26 – allows the regulator to focus on providers with a large number of users in the UK, a large number of child users or a large turnover – ie proportionate regulation.

Draft Online Safety Bill: No equivalent for the regulator, but the Secretary of State can make a statement of strategic priorities (clause 109).

Guidance by Secretary of State to regulator

Part 3, DEA 2017: Section 27 – in relation to the regulator’s functions.

Draft Online Safety Bill: The Secretary of State has a power of direction on codes of practice (clause 33), can make a statement of strategic priorities (clause 109) and can give general guidance about OFCOM’s functions (clause 113).

Requirements for notices by regulator

Part 3, DEA 2017: Section 28 – how the regulator is to contact organisations inside and outside the UK.

Draft Online Safety Bill: Does not appear to be an equivalent.

Report on effectiveness/review

Part 3, DEA 2017: Section 29 – within 18 months, but not before 12 months, with a consultation on the definitions in the Act.

Draft Online Safety Bill: Clause 115 – not before two years and within five years.

 

22 September 2021


 


[1] Child Safety Online: Age Verification for Pornography, February 2016, page 4

[2] Young People, Sex and Relationships: The New Norms, Institute for Public Policy Research, August 2014, page 4. The study involved 500 18-year-olds

[3] “Basically… porn is everywhere”: A Rapid Evidence Assessment of the effects that access and exposure to pornography have on children and young people, Horvath et al (2013), pages 7 and 8. Produced for the Children’s Commissioner for England

[4] The relationship between pornography use and harmful sexual behaviours: A primary research report prepared for the Government Equalities Office, 20 February 2020, page 9

[5] Ibid, page 19

[6] Review of sexual abuse in schools and colleges - GOV.UK (www.gov.uk)

[7] Review of sexual abuse in schools and colleges - GOV.UK (www.gov.uk)

[8] Women and Equalities Select Committee, Sexual harassment of women and girls in public places, report published 23 October 2018, para 101

[9] The relationship between pornography use and harmful sexual attitudes and behaviours: literature review, 20 February 2020, pages 6-9

[10] Primary Research, Op Cit, pages 8 and 12

[11] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 35

[12] https://issuu.com/conservativeparty/docs/ge_manifesto_low_res_bdecb3a47a0faf/75 page 35.

[13] Sections 15-17, 21(5), 22, 25, 26(2), 27, 30(1) and 30(2) came into force on 31 July 2017 under the Digital Economy Act 2017 (Commencement No. 1) Regulations 2017

[14] Draft Online Safety Bill - GOV.UK (www.gov.uk)

[15] Impact Assessment, paras 84-85, page 27 and para 123, page 36

[16] https://committees.parliament.uk/call-for-evidence/567/

[17] Impact Assessment, paras 84-85, page 27 and para 123, page 36

[18] House of Commons Hansard, 10 June 2021, column 1162

[19] https://avpassociation.com/uncategorized/an-open-letter-to-the-prime-minister-from-teachers-charities-and-baroness-benjamin/

[20] The system will be funded by fees on service providers. See Impact Assessment, page 159 and Part 3, Chapter 2 of the Bill

[21] Government statement on Online Harms, 15 December 2020, cols 146 and 153

[22] Impact Assessment, page 123

[23] Memorandum from the Department for Digital, Culture, Media and Sport and the Home Office to the Delegated Powers and Regulatory Reform Committee, para 151, page 31

[24] The figure is based on costs associated with child sexual abuse and exploitation, hate crime, illegal sale of drugs, modern slavery, cyber stalking, cyber bullying and intimidation of public figures, so the actual cost is likely to be much larger. Impact Assessment, para 102, page 30 and Table 40, page 80

[25] https://www.theguardian.com/technology/2020/dec/14/pornhub-purge-removes-unverified-videos-investigation-child-abuse; https://www.bbc.com/news/technology-55231181

[26] WPQ 38440, answered 6 September 2021

[27] Young people, Pornography & Age-verification, BBFC, January 2020, page 6. Available from the BBFC on request as the research contains graphic sexual content and pornographic language.

[28] Thurman N and Obster F, The regulation of internet pornography: What a survey of under-18s tells us about the necessity for and potential efficacy of emerging legislative approaches, Policy and Internet, 4 May 2021, https://onlinelibrary.wiley.com/doi/10.1002/poi3.250

[29] Impact Assessment, para 109, page 32

[30] Impact Assessment, para 115, page 33

[31] Impact Assessment, para 116, page 34

[32] Impact Assessment, para 193, page 53

[33] Impact Assessment, para 161, page 45, para 176, page 48 and para 180, page 50

[34] Impact Assessment, para 180, page 50

[35] Digital Economy Bill, Age Verification Impact Assessment, 2018, page 1

[36] Quote from Q104. See also Q107-Q109, evidence given to the Public Bill Committee on the Digital Economy Bill, 11 October 2016

[37] Online Harms White Paper: Full Government Response to the consultation, December 2020, CP 354, para 2.3, page 24

[38] Section 63, Criminal Justice and Immigration Act 2008

[39] See the meaning of extreme pornographic material in section 22 of the DEA.

[40] Online Harms White Paper, April 2019, Table 1: Online Harms in Scope, page 31

[41] Delegated Powers Memorandum, page 31

[42] Online Harms White Paper: Full Government Response to the consultation, December 2020, CP 354, para 2.3, page 24

[43] Impact Assessment, pages 19-20, para 72 and Table 2, page 25 and page 123

[44] Online Harms White Paper: Full Government Response to the consultation, para 2.3, page 24

[45] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 35

[46] Impact Assessment, see Table 15 and footnote 84, page 44

[47] Impact Assessment, page 124

[48] The Minister said, “This is not an indefinite postponement of the measures that we are seeking to introduce; it is an extension of what they will achieve.” House of Commons Hansard, 17 October 2019, col 454

[49] Explanatory Memorandum to The Draft British Board of Film Classification (BBFC) Guidance on Age Verification Arrangements, para 7.4

[50] Impact Assessment: Age verification for pornographic material online, 13 June 2018, page 10

[51] Impact Assessment, page 135

[52] Impact Assessment, page 137

[53] PQ UIN 199, answered 17 May 2021. See also the same statement made by Caroline Dinenage MP on 11 May 2021, Q226 in evidence given to the House of Lords Communications and Digital Committee

[54] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 39

[55] Note that separately, it will be an offence under clause 71 not to comply with a request for information that OFCOM considers it requires under clause 70. The penalty is a fine and/or up to 2 years’ imprisonment.

[56] Impact Assessment, para 238, page 63

[57] House of Lords, Select Committee on the Constitution, 17 January 2017, HL 96, para 12, page 3

[58] House of Lords, Delegated Powers and Regulatory Reform Committee, 22 December 2016, HL 89, para 31, page 6

[59] Q26 and Q27 in evidence given to the House of Commons Digital, Culture, Media and Sport Committee, 13 May 2021

[60] Tackling Violence against Women and Girls Strategy, HM Government, July 2021, page 48

[61] https://www.legislation.gov.uk/uksi/2019/23/contents/made

[62] Clause 30(2)(a)(viii)