Written evidence submitted by CEASE

 

 

 

Evidence from CEASE to the DCMS Sub-Committee on Online Harms and Disinformation’s Inquiry into the Government’s approach to tackling harmful content online

 

Introduction

 

CEASE (the Centre to End All Sexual Exploitation) is a human rights advocacy charity which exposes the underlying cultural and commercial forces behind all forms of sexual exploitation.

 

We welcome this inquiry and are extremely concerned that the ‘once in a generation’ Online Safety Bill will not, as proposed, address the multifaceted harms caused by the commercial online pornography industry, including image-based sexual abuse, child sexual abuse, the normalisation of sexual violence and harmful sexual attitudes and behaviours.

 

Our recommendations:

1. Mandate the use of Age Verification for all pornographic websites, regardless of their size or functionalities. This should be brought in immediately through the powers of the Digital Economy Act 2017 and then further strengthened through the Online Safety Bill.

 

2. Introduce regulation to ensure that pornography websites either remove user generated content (UGC) functionality or implement robust age and consent verification processes to ensure that all those featured in video uploads are consenting adults.

 

3. Identify pornography websites as providers of Category 1 services, introduce relevant Codes of Practice and designate a specific regulator to ensure compliance. Porn sites must be made to stop hosting illegal “extreme” pornography and the “legal but harmful” content prohibited by their own terms of service.

 

Our CEO, Vanessa Morse, would welcome the opportunity to give further information in an oral evidence session to the Committee.

 

How has the shifting focus between ‘online harms’ and ‘online safety’ influenced the development of the new regime and draft Bill?

 

The Bill’s shifted focus seems to represent a step back from comprehensively tackling the more generalised harms of the online world. The term ‘safety’ implies a narrower focus on protecting certain vulnerable groups of users, rather than also addressing the broader impact of online industries on society as a whole. The focus seems to have shifted to reactively addressing discrete problems through specific ‘bolt-on’ tools rather than through broader, more comprehensive and proactive approaches that deliver safety by design (except for services likely to be accessed by children, where more weight is given to considerations of design).

 

This approach is probably sufficient for most online areas, but it is certainly not sufficient to manage the harms of others, such as online commercial pornography.

 

To take the example of image-based sexual abuse: while the Bill’s explanatory notes make it clear that this will be tackled, the Bill itself makes no mention of it, focusing almost exclusively on user safety rather than on the safety of those less directly impacted by online industries.

 

We have been assured that “companies will need to tackle illegal content, including revenge porn and illegal online abuse... making sure it is taken down quickly and using tools to minimise the risk of similar material appearing.” However, “making sure it is taken down quickly” is simply not good enough; for as long as it appears online, it has the potential to be downloaded, reuploaded and disseminated far and wide, devastating the lives of victims. What’s more, the notion of sites using “tools to minimise the risk” of this material appearing in the first place fails to recognise how porn tube sites’ design is intrinsically high-risk. Therefore, mere “tools” will not suffice.

Image-based sexual abuse, along with child sexual abuse material (CSAM) and other forms of illegal content, does not simply appear occasionally and accidentally on porn sites; it is a mainstay of their user-generated content. It does not simply “slip through the net”, since there is no net; or at least none save the long, dry and inaccessible terms of service. There are currently no verification processes to ensure that all user-generated content is legal. Unless the more fundamental issues of business model and design are addressed head on, pornography sites will be able to introduce reactive “tools” which they know will be of limited efficacy. We are concerned that such measures will represent empty, tick-box virtue signalling rather than an effective solution that protects the vulnerable. These issues are detailed and evidenced in our recent report, Expose Big Porn (July 2021).

Whilst no one would deny the urgent need to introduce regulation to address all of these issues, there are problems with the Bill’s “all-encompassing” approach which, because it is necessarily vague, generalised and ambiguous, could easily cause some areas of harm to fall through the cracks. The one-size-fits-all approach could result in some industries succeeding in shrugging off ill-fitting obligations altogether.

 

Is it necessary to have an explicit definition and process for determining harm to children and adults in the Online Safety Bill, and what should it be?

 

It is extremely difficult to give one definitive and explicit definition of harm to adults and children because of the Bill’s breadth. Online harms are both direct and indirect: they impact the individual user, but sometimes also those around the user and, ultimately, the whole of society. What’s more, harms vary according to the age, demographic and background of the individual user.

 

That being said, we believe that the understanding of harm should be sufficiently broad to include the harm of being exposed to ideas and ideologies that undermine our ethical and moral social values. For example, extreme racism, sexual violence, coercion, child sexual abuse and incest should not be tolerated in online pornography since, like any other media, pornography shapes users’ understanding, attitudes and behaviours. Research confirms that ideas in pornography play out in the real world and drive sexual violence and sex inequality.

 

Understanding the harms in each area of the online world will require research. We would recommend that Ofcom takes the time to consult with expert academics, practitioners and organisations in order to gain an accurate picture of how online pornography both drives harmful sexual behaviour and exacerbates vulnerability.

 

Does the draft Bill focus enough on the ways tech companies could be encouraged to consider safety and/or the risk of harm in platform design and the systems and processes that they put in place?

 

We do not believe that the draft Bill focuses enough on the risk of harm in platform design and in the systems and processes that tech companies put in place. This is a huge area of concern with regards to the online commercial pornography industry, whose ‘freemium’ tube-style business model is inherently high-risk and where reactive, bolt-on measures are simply insufficient.

 

The bill is drafted with certain generalised assumptions about the industries that it is regulating. There seems to be a presumption of integrity, and an expectation that industries will have reasonably high levels of corporate social responsibility and will aim to comply with both the letter and the spirit of the law. However, these assumptions are ill-founded in relation to the online pornography industry, which is characterised by a virtual absence of corporate transparency and accountability. It lays down superficial moderation processes on top of an inherently high-risk business model, denies all evidence of criminal negligence, releases statements that amount to exercises in PR spin and is strategically “creative” in the enforcement of its own terms and conditions.

 

We believe it is therefore of foremost importance to ensure that, at least for high-risk industries, measures are comprehensive and robust.

 

We refer you to CEASE’s July 2021 report: ‘Expose Big Porn: uncovering the online commercial pornography industry’ for further information on this.

 

What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?

 

We are extremely concerned that, although the draft Bill references certain online industries and areas of harm (namely social media and “illegal content”[1], “terrorism content”, “CSEA content” and “priority illegal content”), it excludes the online commercial pornography industry. In fact, there is no reference to it at all, in spite of its extraordinary scale and influence, and its unique, elevated risks and harms. If online pornography is not specifically mentioned in the Bill, there is a strong chance that any secondary legislation will simply not be created with it in mind.

 

Regulation of the porn industry is not about curtailing freedom of expression. Rather, it is about recognising the fact that, currently, porn sites are hosting and monetising illegal content with impunity.

 

Are there any contested inclusions, tensions or contradictions in the draft Bill that need to be more carefully considered before the final Bill is put to Parliament?

The draft Online Safety Bill’s explanatory notes include “revenge porn” and “upskirting” as examples of illegal content. However, as mentioned, the Bill itself does not even mention the online commercial pornography industry, much less identify the ways in which it facilitates the mass distribution of such content. The same can be said of child sexual abuse material.

 

We implore the Government to apply joined-up thinking to these issues. There is a real danger that the Bill’s stipulations are vague and open to subjective interpretation, which will give unscrupulous industries the freedom to wriggle out of their obligations.

 

For example, the main way the Bill expects sites to tackle illegal material is by identifying the risks (through a risk assessment) and then taking unspecified steps to mitigate them. However, risk assessments are inherently subjective and give companies scope to determine for themselves what constitutes acceptable risk and sufficient safeguards.

 

The Bill stipulates that companies must consider “how the design and operation of the service (including the business model, governance and other systems and processes) may reduce or increase the risks identified”, but it says nothing about requiring them to actually change these elements or to ensure that their services are “safe by design”.[2]

 

We know that porn platforms have a strong incentive to emphasise the measures they have put in place (which are reactive, partial and insufficient) whilst remaining silent on the more fundamental changes necessary. In short, there’s a very real danger that the mandate for risk assessments will allow companies to effectively continue self-regulating.

 

What are the lessons that the Government should learn when directly comparing the draft Bill to existing and proposed legislation around the world?

 

Globally, governments are beginning to wake up to the fact that online pornography is a new and growing threat, with legislation sorely needed in order to address the industry’s unlawful business practices and to stem the tide of its multiplying harms.

 

Sixteen U.S. states have passed resolutions identifying pornography as a public health issue. In December 2020, the Stop Internet Sexual Exploitation Act was proposed to create overarching safeguarding standards for online pornography sites. In June 2021, a new Canadian bill called for age and consent verification in online pornography. Age verification legislation has been considered in various other countries around the world, including Poland (December 2019), Australia (February 2020), France (June 2020) and South Africa (August 2020).

 

This is only the beginning. The UK has the opportunity to lead the world in making the internet safe and we implore the Government to ensure that the laws introduced by the Online Safety Bill are robust and effective at cleaning up the online commercial pornography industry.


[1] Which, as the accompanying report explains, does include image-based sexual abuse and upskirting, though this is not mentioned in the Bill itself.

[2] Something mentioned in the final response to the Online Harms White Paper. https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response