POSR0010
Written evidence submitted by The Christian Institute
1. The Christian Institute exists for “the furtherance and promotion of the Christian religion”. We are a non-denominational Christian charity operating throughout the UK and elsewhere. We are supported by approximately 60,000 individuals and churches. We have decades of experience defending free speech and religious liberty as well as working to protect children from inappropriate material.
2. The Institute welcomed the passing of the Online Safety Bill as a long-overdue attempt to keep children safe online, particularly from pornographic content. Our concern now is that the Bill may not be enforced effectively, particularly given that similar provisions in Part 3 of the Digital Economy Act 2017 were never brought into force. During the passage of the Bill, the Institute worked in co-operation with other charities and organisations seeking to strengthen enforcement provisions for companies failing to protect children from primary priority content.
3. The Institute also supported the Government’s decision in November 2022 to drop the vague ‘legal but harmful’ provisions in the Bill. We were concerned that these measures would have a detrimental effect upon the freedom of speech for Christians and others whose convictions are unfairly construed as ‘harmful’.
4. In summary, this submission makes eight recommendations:
1. Ofcom must plan to increase transparency under the OSB regime, publishing more detailed regular updates.
2. Ofcom should clearly set out which platforms it expects to fall within the 20 service providers subject to close supervision and what proportion will be pornographic websites.
3. A dedicated taskforce should be set up to deal with age verification for pornographic websites. We recommend that Ofcom designate a co-regulator to assist it in this aspect of its work, possibly the British Board of Film Classification (BBFC), which was the designated regulator under Part 3 of the Digital Economy Act 2017.
4. Ofcom should provide a clear timeframe for platforms to implement age verification after a confirmation decision is issued.
5. Ofcom should outline what assurances of cooperation it has received from payment and ancillary services in implementing the regime. Every effort should be taken to work with fellow regulators overseeing those sectors.
6. Ofcom should ensure that in the recruitment of staff it is considering those who will bring expertise on freedom of expression, including those from free speech advocacy groups.
7. Ofcom should reconsider previous guidance which has provided clearer advice on the limits of categories such as ‘incitement to hatred’. Particular consideration should be given to the wording of the ATVOD guidance from May 2012.
8. Ofcom should ensure robust free speech training is provided for staff involved in content moderation.
Ofcom’s slow action with Video Sharing Platforms
5. The scale of the step up to the full online safety regime must not be underestimated. Ofcom has had powers to regulate UK-based Video Sharing Platforms (VSPs) since November 2020 (although Ofcom indicated that routine enforcement would not happen until after its guidance was published in October 2021, commencing the ‘first year’ of the regime)[1]. As of 9 August 2023, 20 platforms were notified as falling within scope of the regulation. These were grouped into three categories: large VSPs; smaller adult VSPs; and smaller VSPs.[2] By contrast, the National Audit Office’s Preparedness for online safety regulation report suggests that the “number of online services subject to regulation could total more than 100,000 and could be significantly higher”.[3]
6. However, despite there currently being only 20 notified platforms, progress under the VSP regime has been slow. In its first-year review of the regime, in October 2022, Ofcom said that one of its “strategic priorities” for Year 2 (i.e. October 2022 to October 2023) was to “drive forward the implementation of robust age assurance, to protect children from the most harmful online content (including pornography)”.[4] In January 2023 Ofcom launched a four-month enforcement programme, suggesting that at the end of this period of “assessment” it may open formal investigations.[5] But, more than nine months on, Ofcom has extended this period until at least the end of the year.[6] At the end of Year 2, only two formal investigations have been opened, with only one relating to failure to “take appropriate measures to protect under-18s from videos containing pornographic content” (the other, opened prior to the enforcement programme, related to failure to comply with a formal information request and concluded with the only fine issued thus far under the regime).[7]
7. Lack of transparency makes it easy for Ofcom to hide behind constant ‘assessment’. In its first-year review, Ofcom revealed that “Most of the smaller adult VSPs appear to have age assurance measures that involve potential subscribers entering their own date of birth or ticking a box declaring themselves to be over 18 and agreeing to terms and conditions which state that all users must be over 18 years of age”.[8] Yet, in September 2023, Ofcom said it was still continuing its assessment of the age assurance systems of notified smaller adult VSPs (i.e. all notified services except OnlyFans).[9] Is this an assessment of new systems subsequently put in place or a re-evaluation of the same data? We simply do not know, because no substantive update has been published for over a year.
8. Ofcom has argued that it does not want to take an overly punitive approach to regulation, and would rather work with platforms. However, Ofcom appears to be allowing platforms to drag their feet. Ofcom’s initial guidance for VSPs in 2021 was clear that tick-box age assurance was not sufficient.[10] This should have been no surprise to the industry. In 2013, the then Authority for Television on Demand (ATVOD) Chief Executive Pete Johnson stated in the context of on-demand services: “Asking visitors to a website to click an ‘I am 18’ button or enter a date of birth or use a debit card is not sufficient”.[11] Platforms have had ample time to prepare prior to the commencement of the enforcement process.
9. Evidence from other countries shows that swift implementation is possible when pressure is applied. French regulator Arcom gives non-compliant porn sites only 15 days to put age verification in place before requesting blocking orders from the court.[12] Legislation going through the Canadian Parliament gives 20 days.[13] The Age Verification Providers Association has noted that when pushed, “major sites turned it on with ten days’ notice in France”.[14] These timeframes better reflect the technical timescale for a site to implement age verification. Yoti, the age verification provider used by OnlyFans, suggests it can “Add age verification to your website in minutes”.[15]
10. The timeframe of Ofcom’s investigation into Secure Live Media Ltd (SLM) raises concerns that children could be left unprotected for prolonged periods, despite harms being known. It was announced on 16 May 2023 that Ofcom was opening an investigation into SLM for failing to notify Ofcom of its service CamSoda and to implement appropriate age assurance.[16] The outcome of this investigation has now been delayed to at least next year. Although specific circumstances may have complicated this investigation, including attempts to dissolve the company and the resignation of its director Mr Musonda,[17] this does not set a good precedent. The pornography industry has a reputation for similar tricks. Under parent companies such as Aylo (formerly MindGeek), companies have been disbanded while the same content has been allowed to flourish through distribution to other sites. There are serious questions as to how Ofcom would cope with multiple simultaneous investigations, even with an extra 450 staff.[18]
Recommendations
11. Recommendation 1: Ofcom must plan to increase transparency under the OSB regime, publishing more detailed regular updates. The current updates in its enforcement bulletin, which run to only three or four bullet points, do not give confidence that action is being taken.[19] At the very least, regular interim reports should be published, with the level of detail contained in its first-year review, outlining examples of specific measures adopted by particular platforms, even with the caveat that the final analysis awaits a full review. This would enhance accountability.
12. Recommendation 2: Ofcom should clearly set out which platforms it expects to fall within the 20 service providers subject to close supervision and what proportion will be pornographic websites. The National Audit Office report states: “there will be a two-tier regime for monitoring compliance, with only what Ofcom expects to be the largest and riskiest 20 service providers initially subject to detailed, one-to-one supervision, while Ofcom does not expect to engage regularly with the rest.”[20] This would make the immediate focus the same size as the current VSP regime. But it is unclear whether any dedicated pornography website would fall into this category. Once Ofcom has considered Google, YouTube, Facebook, X, Instagram, Reddit and the like, what space will be left for pornographic sites? It is also unclear what it means in practical terms when Ofcom says it will not “engage regularly” with the rest. Will this be once a year, every three years, or more frequently? And will Ofcom be willing to take enforcement action against platforms it has not engaged with regularly for failing to have robust age verification in place?
13. Recommendation 3: A dedicated taskforce should be set up to deal with age verification for pornographic websites. We recommend that Ofcom designate a co-regulator to assist it in this aspect of its work, possibly the British Board of Film Classification (BBFC), which was the designated regulator under Part 3 of the Digital Economy Act 2017. During the passage of the Online Safety Bill, the Minister Lord Parkinson of Whitley Bay stated: “Where appropriate and effective, Section 1(7) of the Communications Act 2003 and Part II of the Deregulation and Contracting Out Act 1994 provide a route for Ofcom to enter into co-regulatory arrangements under the online safety framework.”[21] We believe that the unique conditions of the online pornographic industry justify its own dedicated co-regulator for age verification. There are three reasons for this:
14. Under the unimplemented Part 3 of the Digital Economy Act 2017, the BBFC was designated as a dedicated age verification regulator and published Guidance on Age-Verification Arrangements. Although never brought into force under that regime, the guidance was adopted by Ofcom for its regulation of pornographic content on Video-On-Demand (VOD) platforms.[24] Given its previous experience, the BBFC is the obvious choice for a co-regulator.
15. Recommendation 4: Ofcom should provide a clear timeframe for platforms to implement age verification after a confirmation decision is issued. The Online Safety Bill leaves the timescale of each step of the enforcement process largely to the discretion of Ofcom. This includes the time given to implement the required steps. Clause 134(1)(f) requires Ofcom to “specify a reasonable period within which each of the steps specified in the [confirmation] decision must be taken”.[25] This discretion may be justified to cover the wide range of actions that could be required under the regime as a whole. But Ofcom should be very clear that, for any size of platform, implementing robust age verification does not take long.
16. Far from encouraging a culture change, Ofcom’s slow progress under the VSP regime is likely to encourage inaction. This could change if Ofcom issued a statement that age verification must be implemented within a matter of days or weeks of notification, reflecting the overseas regimes mentioned above. As Lord Curry of Kirkhale, former chair of the Better Regulation Executive, made clear in reference to enforcement of the online safety regime: “It was my experience that regulators that had a reputation for acting quickly and decisively, and being tough, had a much more compliant base as a consequence.”[26]
Ofcom engagement with financial and ancillary services
17. Cultivating a safer internet for children is going to take more than the efforts of a single regulator. Ofcom must engage not only with other regulators, but also with the financial and ancillary services which prop up the tech sector. There is some evidence of Ofcom doing this already in limited ways. The National Audit Office refers to Ofcom’s work with “the Financial Conduct Authority (FCA) on the Bill’s provisions regarding fraudulent advertising”.[27] Obviously, in this instance, there is a clear overlap of regulatory functions. But Ofcom needs to set out how it is doing this more broadly. For example, the FCA also has a role in penalising payment services supporting illegal activities. What procedures are in place for Ofcom to inform the FCA of particular platforms promoting illegal CSEA content, so that payment providers can be held to greater account?
18. Evidence suggests that the most effective regulation comes through co-operation of the supporting sectors. In 2020, Visa and Mastercard announced that they were withdrawing their payment services from Pornhub, following a damning investigation by Nicholas Kristof of the New York Times.[28] As a result, Pornhub deleted more than ten million of its vilest videos, after years of failing to remove CSEA content effectively while hiding behind the excuse that it is impossible to be sure whether a young person in a video is 14 or 18.[29]
19. But pressure must continue to be applied in order to encourage payment services to act. It was only as a result of an adverse court ruling in 2022, which raised questions over whether Visa and Mastercard could be facilitating child pornography, that they announced they were suspending ties with the advertising arm of MindGeek, the parent company of Pornhub.[30] Ofcom currently publicises investigations or enforcement action against platforms on its website and in email bulletins. An effort should be made to ensure that the major payment providers are included in this circulation so that they are fully informed of the situation and can act appropriately. They should not be able to plead ignorance.
20. Gaining the support of financial services and Internet Service Providers (ISPs) was the bedrock of the Digital Economy Act 2017. The Government has argued that the Online Safety Bill is a stronger regime because it empowers the courts to force action more directly. But in reality this system will almost certainly drag on. After a period of time Ofcom may issue a ‘confirmation decision’. More weeks pass while the company refuses to implement the measures, preferring its ultimate day in court. Eventually, Ofcom applies to the courts for ‘service restriction orders’. The court grants the order, but then accedes to the porn site’s request for interim relief pending the outcome of a judicial review. How long will this take? Ancillary service action in addition to court action would provide extra speed and flexibility.
Recommendation
21. Recommendation 5: Ofcom should outline what assurances of cooperation it has received from payment and ancillary services in implementing the regime. Every effort should be taken to work with fellow regulators overseeing those sectors. When the Digital Economy Act was laid before the Lords in 2017, the then Minister Lord Ashton of Hyde informed the House: “I can inform noble Lords that we have had constructive discussions with payment providers and they have indicated that they will act under our regime.”[31] Can Ofcom confirm any actions that payment providers or ancillary services have said they will take on being informed?
Freedom of expression
22. Despite the removal of the ‘legal but harmful’ provisions, Ofcom will still have an important role in ensuring that freedom of expression is protected online. Ofcom will have to assess whether platforms are complying with their duty “to have particular regard to the importance of protecting users’ right to freedom of expression within the law”.[32] Furthermore, Ofcom guidance will effectively set thresholds for the categories of content requiring user-empowerment tools, for example ‘incitement to hatred’ based on a range of protected characteristics.[33] A low or unclear threshold risks posts which merely criticise a particular belief being hidden from a user who believes they are only filtering out genuinely dangerous content.
23. Ofcom has indicated that it has recruited from a “variety of relevant bodies”, including civil society, to assist in operating the regime. It explicitly mentioned the National Society for the Prevention of Cruelty to Children (NSPCC), the Internet Watch Foundation, the National Crime Agency, and online service providers such as Google and Meta, along with other regulators, but no free speech groups were mentioned.[34]
24. In recent years, Ofcom has watered down its guidance for on-demand programming around content likely to incite hatred. In May 2012, Ofcom’s designated regulator for on-demand services, ATVOD, issued guidance explaining what incitement to hatred does and does not cover, including:
‘Hatred’ is a strong word. It is neither the purpose nor the intention of section 368E(1) of the Act to restrict legitimate freedom of speech by prohibiting or restricting discussion, criticism or expressions of antipathy, dislike, ridicule, insult or abuse for groups covered by this requirement. For example it is permissible to express criticism, dislike or ridicule of a religious belief system or its practices or urge its adherents to cease practising or to express views which are sexist, insulting or offensive but which stop short of being likely to incite hatred.[35]
25. Although non-statutory, this guidance drew on section 29J of the Public Order Act 1986, which clarified what was not covered by ‘incitement to hatred’. It also reflected judgments in UK case law, including Redmond-Bate v DPP (1999), which set out the important principle that: “Freedom only to speak inoffensively is not worth having.”[36] However, following the disbanding of ATVOD in 2015, Ofcom issued subsequent guidance which removed this clear statement. Instead it has linked to a page containing various European Court of Human Rights judgments on ‘incitement to hatred’.[37] Expecting companies to interpret these judgments for themselves is more likely to cause confusion than the previous guidance did.
26. There are questions as to whether Ofcom staff have received appropriate training to adjudicate on public order offences. The priority illegal content set out in Schedule 7 includes public order offences such as those under sections 4A and 5 of the Public Order Act 1986. Although the police mostly get it right, they sometimes – despite their training and expertise – overstep the mark in enforcing these laws. This leads to unjustifiable arrests of street preachers, protestors and others. Expecting Ofcom to adjudicate parallel decisions without sufficient training is deeply problematic. Similarly, Schedule 7 mentions sections 29B, 29C and 29E of the Public Order Act, which deal with stirring up hatred on the grounds of religion and sexual orientation. These are criminal offences covering speech, with a seven-year maximum custodial penalty. They are offences so serious and potentially restrictive of free speech that only the Attorney General can authorise a prosecution. The offences also contain several other free speech safeguards. What police-style training have Ofcom staff been given to make appropriate assessments?
Recommendations
27. Recommendation 6: Ofcom should ensure that in the recruitment of staff it is considering those who will bring expertise on freedom of expression, including those from free speech advocacy groups.
28. Recommendation 7: Ofcom should reconsider previous guidance which has provided clearer advice on the limits of categories such as ‘incitement to hatred’. Particular consideration should be given to the wording of the ATVOD guidance from May 2012.
29. Recommendation 8: Ofcom should ensure robust free speech training is provided for staff involved in content moderation.
October 2023
[1] Regulating video-sharing platforms: A guide to the new requirements on VSPs and Ofcom’s approach to regulation, Ofcom, October 2020, page 7, para. 3.9
[2] ‘Notified video-sharing platforms’, Ofcom, 9 August 2023, see https://www.ofcom.org.uk/online-safety/information-for-industry/vsp-regulation/notified-video-sharing-platforms as at 13 October 2023; The VSP Landscape: Understanding the video-sharing platform industry in the UK, Ofcom, October 2022, page 4, figure 2
[3] Preparedness for online safety regulation, National Audit Office, July 2023, page 9, para. 14
[4] Ofcom’s first year of video-sharing platform regulation: What we found, Ofcom, October 2022, page 21, para. 5.7
[5] ‘Enforcement programme into age assurance measures on UK-established, adult video-sharing platforms’, Ofcom, 29 September 2023, see https://www.ofcom.org.uk/about-ofcom/bulletins/enforcement-bulletin/open-cases/cw_01266 as at 13 October 2023
[6] Ibid
[7] ‘Enforcement programme into age assurance measures on UK-established, adult video-sharing platforms’, Ofcom, 29 September 2023, see https://www.ofcom.org.uk/about-ofcom/bulletins/enforcement-bulletin/open-cases/cw_01266 as at 13 October 2023; Ofcom’s first year of video-sharing platform regulation: What we found, Ofcom, October 2022, page 11, para. 3.8; ‘Investigation into Tapnet’s compliance with a statutory information request’, Ofcom, 27 March 2023, see https://www.ofcom.org.uk/about-ofcom/bulletins/enforcement-bulletin/all-closed-cases/cw_01263 as at 13 October 2023
[8] Ofcom’s first year of video-sharing platform regulation: What we found, Ofcom, October 2022, page 106, para. 14.6
[9] ‘Enforcement programme into age assurance measures on UK-established, adult video-sharing platforms’, Ofcom, 29 September 2023, see https://www.ofcom.org.uk/about-ofcom/bulletins/enforcement-bulletin/open-cases/cw_01266 as at 13 October 2023
[10] Video-sharing platform guidance: Guidance for providers on measures to protect users from harmful material, Ofcom, October 2021, page 42, para. 4.119
[11] ‘Pornographer barred from providing video on demand service’, ATVOD, 15 November 2013, see https://web.archive.org/web/20131203053908/http://www.atvod.co.uk/news-consultations/news-consultationsnews/pornographer-barred-from-providing-video-on-demand-service as at 13 October 2023
[12] ‘No blocking for these porn sites in France, but a reprieve of a few months’, Aroged, 7 July 2023, see https://www.aroged.com/2023/07/07/no-blocking-for-these-porn-sites-in-france-but-a-reprieve-of-a-few-months/ as at 13 October 2023
[13] Bill S-210, 18 April 2023, Clauses 8(2)(d) and 9(1), see https://www.parl.ca/DocumentViewer/en/44-1/bill/S-210/third-reading as at 13 October 2023
[14] Submission by Age Verification Providers Association, Ofcom, Ofcom’s proposed plan of work 2023/24, page 3, see https://www.ofcom.org.uk/__data/assets/pdf_file/0026/255905/age-verification-providers-association.pdf as at 13 October 2023
[15] ‘Age Verification’, Yoti, see https://www.yoti.com/business/age-verification/ as at 13 October 2023
[16] ‘Investigation into Secure Live Media Ltd’, Ofcom, 16 May 2023, see https://www.ofcom.org.uk/about-ofcom/bulletins/enforcement-bulletin/open-cases/cw_01272 as at 13 October 2023
[17] ‘Secure Live Media Ltd.’, Companies House, see https://find-and-update.company-information.service.gov.uk/company/10220682/filing-history as at 13 October 2023
[18] National Audit Office, Op Cit, page 9, para. 12
[19] ‘Enforcement programme into age assurance measures on UK-established, adult video-sharing platforms’, Ofcom, 29 September 2023 see https://www.ofcom.org.uk/about-ofcom/bulletins/enforcement-bulletin/open-cases/cw_01266 as at 13 October 2023
[20] National Audit Office, Op Cit, page 35, para. 3.11
[21] House of Lords, Hansard, 16 May 2023, col. 220
[22] ‘Children see pornography as young as seven, new report finds’, British Board of Film Classification, 26 September 2019, see https://www.bbfc.co.uk/about-us/news/children-see-pornography-as-young-as-seven-new-report-finds as at 13 May 2023; ‘A lot of it is actually just abuse’: Young people and pornography, Children’s Commissioner, January 2023, page 5
[23] ‘A lot of it is actually just abuse’: Young people and pornography, Children’s Commissioner, January 2023, page 7
[24] Submission by British Board of Film Classification, Australian House of Representatives, Standing Committee on Social Policy and Legal Affairs inquiry into age-verification for online wagering and online pornography, see https://www.aph.gov.au/DocumentStore.ashx?id=1eb17b19-7894-4f5f-b1c0-40c303cf70c1&subId=673259 as at 13 October 2023
[25] Online Safety Bill, 19 July 2023, Clause 134
[26] House of Lords, Hansard, 2 May 2023, col. 1492
[27] National Audit Office, Op Cit, page 34
[28] The New York Times online, 10 December 2020, see https://www.nytimes.com/2020/12/10/business/visa-mastercard-block-pornhub.html as at 13 October 2023
[29] Spectrum News NY1 online, 15 December 2020, see https://ny1.com/nyc/all-boroughs/entertainment/2020/12/15/pornhub-removes-10-million-videos-in-response-to-allegations-about-content as at 13 October 2023
[30] Reuters, 4 August 2022, see https://www.reuters.com/business/finance/mastercard-visa-suspend-ties-with-ad-arm-pornhub-owner-mindgeek-2022-08-04/ as at 13 October 2023
[31] House of Lords, Hansard, 2 February 2017, col. 1351
[32] Online Safety Bill, 19 July 2023, Clause 22
[33] Online Safety Bill, 19 July 2023, Clause 16
[34] National Audit Office, Op Cit, page 31, para. 2.16
[35] Rules & Guidance: Statutory Rules and Non-Binding Guidance for Providers of On-Demand Programme Services (ODPS), The Authority for Television On Demand, May 2012, page 11
[36] Redmond-Bate v Director of Public Prosecutions [1999] EWHC Admin 733 at para. 20
[37] On-demand programme services (“ODPS”) guidance: Guidance for ODPS providers on measures to protect users from harmful material, Ofcom, December 2021, page 4