Written evidence submitted by CEASE (Centre to End All Sexual Exploitation) (OSB0104)

 

 

Written in September 2021 by:

 

Naomi Miles, Head of Policy, CEASE,

Tom Farr, Head of Legal Advocacy, CEASE.

 

Introduction

 

CEASE - the Centre to End All Sexual Exploitation - is a human rights advocacy charity which exposes the underlying cultural and commercial forces behind all forms of sexual exploitation.

 

Note: since CEASE works to combat sexual exploitation, the evidence we present focuses specifically on the implications of the Online Safety Bill for the online commercial pornography industry.

 

We welcome this call for evidence since we are passionate about ensuring that the ‘once in a generation’ Online Safety Bill will address the multifaceted harms caused by the commercial online pornography industry, including image-based sexual abuse, child sexual abuse, the normalisation of sexual violence and harmful sexual attitudes and behaviours.

 

Our Expose Big Porn report (July 2021) provides in-depth analysis of the online harms driven by online pornography and lays out our regulatory recommendations.

 

Our CEO, Vanessa Morse, would welcome the opportunity to give further information in an oral evidence session to the pre-legislative scrutiny Committee.

 

 

Summary of key points

 

       This ‘once in a generation’ Online Safety Bill must be amended to address the multifaceted harms caused by the online commercial pornography industry, including image-based sexual abuse, child sexual abuse, the normalisation of sexual violence and harmful sexual attitudes and behaviours.

       The current draft Bill fails to consider the specific, severe and multifarious harms of the online commercial pornography industry.

       This Bill must stop children from having free and easy access to extreme, hardcore content on pornography sites. We welcome the Bill’s recognition of children’s particular vulnerability to online harms and recommend that the Bill requires age verification processes for all online pornography websites, regardless of their size or functionality.

       Online pornography also harms children by driving child sexual abuse. Police officers and organisations such as the Lucy Faithfull Foundation report increasing instances of adult sex offenders whose interest in child sexual abuse material was sparked by the ‘pseudo’ child pornography they had viewed on mainstream porn sites.

       Most major porn sites function like YouTube, allowing members of the public to post their own videos with no age or consent verification, in a process that is deliberately kept as “zero friction” as possible. This is an inherently high-risk business model that allows the uploading and downloading of illegal content such as child sexual abuse, rape and image-based sexual abuse.

       We recommend that this Bill be amended to introduce regulation requiring pornography websites either to remove video sharing platform functionality or to implement robust age and consent verification processes, so that all those featured in video uploads are consenting adults.

       Currently, mainstream pornography sites host a plethora of ‘legal but harmful’ material that is prohibited in their terms and conditions but not moderated and still freely available, for example ‘pseudo’ child sexual abuse, sexual violence and racism. A 2021 study by researchers at Durham University highlights how one in eight titles shown to first-time users on the first page of such sites depicts sexual activity that constitutes sexual violence (relating to incest, physical aggression, sexual assault, image-based sexual abuse and depictions of coercion and exploitation). This normalises sexual violence towards women, girls, and children, alongside various vulnerable groups such as LGBTQ+ individuals, racial minorities and those with disabilities.

       We recommend that pornography websites are included as providers of Category 1 services, with relevant Codes of Practice introduced and a specific regulator to ensure compliance. Porn sites must be made to stop hosting illegal “extreme” pornography and the “legal but harmful” content prohibited by their own terms of service.

 

 

Full Response:

 

Objectives

 

Will the proposed legislation effectively deliver the policy aim of making the UK the safest place to be online?

Most other countries have yet to introduce robust and effective regulation specifically designed to tackle online harms; the UK thus has the opportunity to be a global leader.

 

The OSB represents an opportunity to do away with the startling double standard between the online and offline worlds, particularly with regards to pornography where currently, children have free and easy access to extreme, hardcore content and unregulated porn sites host illegal content with impunity.

 

Although the Bill’s aspirations are high and its scope is admirably ambitious, the current draft fails to consider the specific, severe and multifarious harms of the online commercial pornography industry.

 

We would welcome the opportunity to assist the government in rectifying this situation.

 

Our July 2021 Expose Big Porn report lays out the issues in more detail and lists the following recommendations for the Bill:

 

1. Mandate the use of Age Verification for all pornographic websites, regardless of their size or functionalities. This should be brought in immediately through the powers of the Digital Economy Act 2017 and then further strengthened through the Online Safety Bill.

 

2. Introduce regulation to ensure that pornography websites either remove video sharing platform functionality or implement robust age and consent verification processes, so that all those featured in video uploads are consenting adults.

 

3. Identify pornography websites as providers of Category 1 services, introduce relevant Codes of Practice and designate a specific regulator to ensure compliance. Porn sites must be made to stop hosting illegal “extreme” pornography and the “legal but harmful” content prohibited by their own terms of service.

 

Will the proposed legislation help to deliver the policy aim of using digital technologies and services to support the UK’s economic growth? Will it support a more inclusive, competitive, and innovative future digital economy?

 

In response, we would first stress that the current lack of regulation of the online pornography industry is extremely costly. The global online commercial pornography industry is not led by responsible corporate actors; free online porn sites were founded on criminal piracy, their business model is inherently high risk, and their safeguarding processes are weak, reactive and ineffective.

 

Recent campaigns and investigations have demonstrated how the industry’s poster child Pornhub’s commercial success has come at the expense of countless victims, disproportionately represented by women, children, and other vulnerable minority groups. The high cost of such harms is borne by our law enforcement, counselling services, NHS, children’s charities, and ultimately by the victims themselves.

 

The pornography industry has exploded in scale and profitability over the past two decades, aided by the predominant culture of cyberlibertarianism, in which values such as ‘freedom’, ‘connection’ and ‘creativity’ trump those such as equality and protection from harm and exploitation. Fast-paced technological advancement has been narrowly focused on profitability, and there has been virtually no innovation in the service of protecting the rights and welfare of the most vulnerable. Safeguarding has been consistently relegated to the margins and the time for change is long overdue.[1]

 

We firmly believe that the UK’s commitment to robust safeguarding will lead to groundbreaking technological innovation that will undoubtedly support our future digital economic growth, since the rest of the world is sure to follow. Already, the UK is a global leader in age verification technology; the sector has a trade body, a code of conduct, and a formal standard for age checks developed by the BSI, and it is regularly asked to advise overseas governments on how best to apply online age verification.

 

Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?

 

We applaud the UK government for its recognition of children’s particular vulnerability to online harms and its focus on measures designed to protect children. We recommend that the Bill requires age verification processes for ALL online pornography websites, regardless of their size or functionality.

 

There appears to be greater awareness and appreciation of the fact that having free and easy access to online pornography is extremely harmful to children and young people. However, there is less awareness around the ways in which online pornography harms children by driving child sexual abuse.

 

Online porn sites host millions of videos depicting sexual activity with children. Petite, young-looking performers are deliberately made to look underage through props such as stuffed toys, lollipops and school uniforms. Although not strictly illegal, this “pseudo child pornography” is nonetheless extremely harmful since it presents children as legitimate objects of sexual desire and normalises adult/child sexual activity, which has serious consequences:

 

1. It fuels the demand for “real” child sexual abuse material. Police officers and organisations such as the Lucy Faithfull Foundation report increasing instances of adult sex offenders whose interest in child sexual abuse material was sparked by the “pseudo” child pornography they had viewed on mainstream porn sites.

 

2. It drives children’s vulnerability to sexual abuse and exploitation. With no age verification processes, children and young people are being exposed to pornography that treats the depiction of child sexual abuse by men such as teachers, step-fathers, and employers as normal and even desirable.

 

3. It allows “real” child sexual abuse material to hide in plain sight on mainstream porn sites. It is virtually impossible to tell the difference between a 15-year-old and an 18-year-old pretending to be younger. This problem is exacerbated by the absence of age and consent checks on user-generated uploads and by the fact that even professional pornography studios are under no legal obligation to keep robust records of performers’ ages.

 

Most online commercial porn sites’ terms of service actually prohibit even the representation of adult performers as children. However, even a cursory glance at these same sites reveals that they are failing to enforce their own rules, presumably because of the popularity and profitability of this material.

 

We therefore recommend that the Online Safety Bill identifies all pornography websites as Category 1 providers in order to ensure that they robustly enforce their own terms and conditions prohibiting such “legal but harmful” content.

 

Does the draft Bill make adequate provisions for people who are more likely to experience harm online or who may be more vulnerable to exploitation?

 

As noted above, it is encouraging that the Bill makes special provisions for children, who are especially vulnerable to experiencing a wide spectrum of online harms.

 

However, it is vital that the Bill considers how the online pornography industry normalises sexual violence towards women, girls, and children (alongside various vulnerable groups such as LGBTQ+ individuals, racial minorities and those with disabilities). As Dr Helen Mott, Research Consultant with Bristol Women’s Commission observes: “A startlingly large number of the freely, readily available and more frequently viewed porn on our mainstream sites is violent, depicts criminal acts and normalises violence in sexual relationships”.

 

A response letter CEASE received from the DCMS (June 7th) implicitly acknowledges the particular vulnerability of women to illegal content such as sexual bullying, harassment, image-based sexual abuse, and extreme and deepfake pornography. This is welcome, and we urge the government to ensure that the final Online Safety Bill specifically names these forms of online sexual abuse, to which women and girls are particularly vulnerable. It is essential that the government ‘joins the dots’ by tying the issue of online safety into its wider agenda of tackling gender-based violence.

 

Regarding image-based sexual abuse:

In the same letter, CEASE was assured that services in scope would be obligated to “have effective systems in place to minimise and remove” such gender-based sexually-abusive material. Whilst this is encouraging, we are concerned that the language used suggests the sufficiency of bolt-on “solutions” when what is needed is a more holistic, “safety by design” approach.

 

Presently, most mainstream porn sites have weak, reactive systems of moderation that do little to mitigate the risk posed by the video-sharing functionality which facilitates the easy, almost instantaneous upload and distribution of virtually any content. There are currently zero checks in place to ensure that user-generated content uploaded onto pornography sites is legal.

 

It is impossible to identify image-based sexual abuse just by looking at it. We therefore urge the Government to introduce regulation to ensure that pornography websites either remove user generated content (UGC) functionality or implement robust age and consent verification processes, so that all those featured in video uploads are consenting adults.
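
 

To illustrate the “safety by design” principle behind this recommendation, the sketch below shows, in simplified Python, what a consent-gated upload pipeline could look like: publication is blocked by default and only proceeds once every featured performer has a verified age and consent record. This is a minimal hypothetical sketch of our own; the names and checks are illustrative assumptions, not a description of any existing platform’s system.

    from dataclasses import dataclass, field

    @dataclass
    class PerformerRecord:
        performer_id: str
        age_verified: bool     # e.g. ID checked by an accredited third party
        consent_on_file: bool  # signed release covering this specific upload

    @dataclass
    class Upload:
        video_id: str
        performers: list = field(default_factory=list)

    def may_publish(upload: Upload) -> bool:
        """Safety by design: block by default; publish only when every
        featured performer passes both the age and the consent check."""
        if not upload.performers:
            return False  # unattributed footage is never published
        return all(p.age_verified and p.consent_on_file
                   for p in upload.performers)

Under such a design, “zero friction” uploading becomes impossible by construction, rather than being policed after the fact.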

 

Regarding illegal “extreme” pornography:

A 2021 study by researchers at Durham University highlights how mainstream pornography sites are currently hosting illegal extreme pornography with impunity. It notes how one in eight titles shown to first-time users on the first page of such sites depicts sexual activity that constitutes sexual violence (relating to incest, physical aggression such as sexual assault, image-based sexual abuse and depictions of coercion and exploitation, but excluding the fetishised violence of BDSM).

 

The study’s authors explain how its findings “raise serious questions about the extent of criminal material easily and freely available on mainstream pornography websites and the efficacy of current regulatory mechanisms”. The ubiquity of material that normalises and promotes sexual violence and female degradation is shaping our cultural norms, attitudes and behaviours. Radical regulatory action is urgently needed: all pornography websites must be classified as Category 1 services.

 

Is the “duty of care” approach in the draft Bill effective?

 

It is perhaps impossible to know for sure whether the “duty of care” approach will be effective. However, whilst it may work for some industries (those which have demonstrated high levels of corporate social responsibility), it is risky to put the onus of determining what constitutes acceptable risk and sufficient safeguards onto the online commercial porn industry, which has a track record of criminality and non-compliance.

 

Risk assessments appear to be an important part of how sites will be expected to tackle illegal material, but since these are inherently subjective, their implementation could amount to little more than an empty box-ticking exercise. Would there be anything to prevent companies from implementing weak, ineffectual safety processes in order to avoid more fundamental and costly (though necessary) reforms? What would stop them from treating the Bill’s requirements as an exercise in PR spin?

 

This is not a hypothetical risk. In response to the shocking revelations of trafficking and child sexual abuse material on its site in 2020, Pornhub responded with flat denial and aggressive defensiveness. It boasted of the careful risk assessments and state-of-the-art safeguarding measures it had already put in place, refusing to acknowledge their evident shortcomings or to run any kind of investigation. It only made significant reforms in response to Visa and Mastercard pulling their services from the site.

 

We can already see a gap between what certain companies say they allow on their sites and what appears on them in reality. The Bill has the opportunity to create robust regulation, but it also runs the risk of creating stipulations that are vague enough to give unscrupulous industries the freedom to wriggle out of their obligations. Without concrete, universal minimum standards, the OSB will effectively give companies the freedom to continue self-regulation which, at least with regards to the online commercial pornography industry, has been proven to be an abject failure.

 

Does the Bill deliver the intention to focus on systems and processes rather than content, and is this an effective approach for moderating content? What role do you see for e.g. safety by design, algorithmic recommendations, minimum standards, default settings?

 

We were disappointed to note the shift from the White Paper, which mentioned “safety by design”, to the draft Bill’s focus on systems and processes. Our experience of the online pornography industry has taught us that porn platforms are incentivised to protect systems and business models that yield them high profits, in spite of their intrinsic high risks.

 

For example, pornography platforms allow users to upload content almost instantaneously, with no verification processes to ensure that this content is legal. The bolt-on, reactive moderation processes these platforms have in place are insufficient, since they cannot objectively identify whether those in the videos are consenting, or even of age. Consequently, mainstream porn sites are contaminated with vast though unknown quantities of so-called “revenge porn”, child sexual abuse material, and other illegal content.

 

Prevention is better than cure. To relegate safeguarding to mere “systems and processes” is to regard safety almost as an afterthought instead of putting it at the very heart of web design.

 

We need to adopt a zero-tolerance approach to illegal content. The digital technology sector has proven its incredible capacity for swift innovation and only lacks the incentive to turn this towards safeguarding. There is no practical reason why we cannot hold online service providers to the highest possible standards and insist on safety-by-design, introducing universal minimum standards and careful rules regarding algorithmic recommendations and default settings. Compared with the “duty of care” approach, this will create a more level playing field and should also facilitate enforcement.

 

Now is not the time for compromise. The virtual absence of external governance or regulation of the online world thus far should not lead us to introduce timid, compromised measures of limited efficacy, but rather regulation that is holistic, robust and future-proof.

 

How does the draft Bill differ to online safety legislation in other countries (e.g. Australia, Canada, Germany, Ireland, and the EU Digital Services Act) and what lessons can be learnt?

 

This response compares the draft Bill with other jurisdictions’ efforts to tackle sexual exploitation and/or the wider commercial sex industry, including the porn industry and the advertising of prostitution “services”.

 

Australia perhaps mirrors the UK’s approach to “online safety” most closely, with the introduction of its Online Safety Act in 2021 (‘OSA 2021’). This Act covers a host of regulated content, including but not limited to “cyber bullying”, “non-consensual intimate image sharing”, and “abhorrent violent conduct”. Most strikingly, there is no mention of the porn industry, prostitution, or the commercial sex industry more generally. References to anything pertaining to “sex” are limited to the non-consensual sharing of intimate images.

 

The hosting of “abhorrent violent conduct” is expressly prohibited in the OSA 2021, and of relevance here is the prohibition of material that depicts “torture”, “rape”, and “kidnapping”. The definitions of each of these can be found in s474.32 (3), (4), and (5) of the Criminal Code. Whilst of course the Australian Government should be applauded for prohibiting such material in and of itself, the lack of links drawn between this material generally and its prevalence in pornographic content specifically is worrying. There is seemingly no awareness or recognition that material that fits the description of torture, rape, and kidnapping is extremely prevalent on mainstream porn sites, and this must be recognised and indeed combatted.

 

Suggesting any recommendations or reforms to the OSA 2021 is far beyond the scope of this submission, but suffice to say the UK Government must bear this in mind, as a similar lacuna is present in the OSB in its current form. For example, whilst “child sexual abuse material” is recognised as a specific form of prohibited online content, there are no links drawn, nor prohibitions outlined, regarding its prevalence on mainstream porn sites. Similarly, the draft OSB broadly recognises “material that is harmful to adults” but does not acknowledge the prevalence of such harmful material on mainstream porn sites. Neither does the draft Bill prohibit material depicting rape, abuse, torture, or kidnapping, all of which are prohibited under the Australian OSA 2021. Such harmful content must not only be prohibited, but its prevalence on mainstream porn sites must be explicitly identified in order to help OFCOM to seek it out and take robust enforcement action against the commercial sex industry.

 

An analogous legislative measure can be found in the (now-repealed) German Zugangserschwerungsgesetz Act implemented by the Bundestag in 2009. This Act established specific requirements for the Federal Criminal Police to maintain records of websites hosting child sexual abuse material (“child pornography”). The two-fold approach required websites to take such material down immediately, and if they failed, then the Internet Service Providers were empowered to block access to such sites indefinitely. Whilst the specifics of this Act are beyond the scope of this submission, an important comparative point can be gleaned.

 

The Bundestag recognised the importance of having a separate statutory framework dealing with material of this nature, and it is CEASE’s position that the UK Government would do well to follow this approach, not just with CSAM, but with the regulation of the commercial sex industry as a whole. We strongly urge the Government to amend the OSB to include a new section dealing specifically with prohibited material on pornographic websites in order to ensure that any regulator moving forward has an appropriately drafted legislative mandate to enforce takedowns and/or levy fines against such websites. Without such a recognition and specifically drafted section, we maintain that there is an increased risk of websites within the commercial sex industry not conforming to their obligations, as well as OFCOM being unaware of exactly how and when their enforcement powers should be imposed.

 

Does the proposed legislation represent a threat to freedom of expression, or are the protections for freedom of expression provided in the draft Bill sufficient?

 

We do not hold that the proposed legislation represents a threat to freedom of expression, which is (rightly) held in high regard. However, we wish to make the government aware of the fact that freedom of speech/expression arguments are regularly deployed by the online commercial pornography industry in order to deflect scrutiny, ward off regulation, and repel criticism.

 

For example, in response to the BBC’s questioning over why it allowed videos with titles such as "teen abused while sleeping", "drunk teen abuse sleeping" and "extreme teen abuse" (similar to the titles of a video depicting the rape of 14-year-old Rose Kalemba), Pornhub said: "We allow all forms of sexual expression that follow our Terms of Use, and while some people may find these fantasies inappropriate, they do appeal to many people around the world and are protected by various freedom of speech laws." (emphasis added: note, these video titles do not follow Pornhub’s Terms of Use, which prohibit the depiction of “non-consensual sexual activity”[2]).

 

The reality is that the porn industry's resistance to regulation has less to do with free speech ideology and more to do with safeguarding profits.

 

Content in Scope

 

The draft Bill specifically includes CSEA and terrorism content and activity as priority illegal content. Are there other types of illegal content that could or should be prioritised in the Bill?

 

Yes: the Online Safety Bill should explicitly include sexual bullying, harassment, image-based sexual abuse, and extreme and deepfake pornography as types of priority illegal content. Such abhorrent content disproportionately impacts women and girls and actively undermines society’s efforts to tackle gender-based sexual violence. The Bill should also prioritise the removal of illegal “extreme” pornography, which mainstream porn sites currently host with impunity and which normalises the sexual violence, coercion, and abuse of women and girls.

 

The reasons for specifically including these additional types of content are laid out at length in this submission.

 

The draft Bill specifically places a duty on providers to protect democratic content, and content of journalistic importance. What is your view of these measures and their likely effectiveness?

 

Currently, the PR machinery of the online commercial pornography industry exploits the notion of democracy in order to defend hosting virtually any content, no matter how harmful.

We thus assert that the Government must take care to balance the democratic right to freedom of expression (i.e. to view virtually any kind of pornography) with women and children’s right to freedom from sexual harm, abuse and exploitation.

 

As Betty McLennon writes in her essay, Pornography and the State: “Those who choose to interpret Freedom of Speech in absolute terms see it as their own individual right to do and say and read and view whatever they like regardless of how their speech and actions may affect others. It is this absolute focus on the individual that is unsustainable due to the fact that it takes little account of the rights of others.”

 

Earlier proposals included content such as misinformation/disinformation that could lead to societal harm in scope of the Bill. These types of content have since been removed. What do you think of this decision?

We cannot comment on the particular issue of misinformation/disinformation, and we do recognise the difficulty of drawing up parameters for the scope of a Bill designed to address so many different forms of online harm.

 

However, with regards to the online pornography industry, it is clear that far from being limited to individual users, many of its harms are felt on a societal level. In this area at least, the Bill must not miss the opportunity to address the ways in which online pornography is having a profound and harmful real-world influence on cultural norms, attitudes and behaviour.

 

Regardless of whether they watch porn, women and children are nonetheless impacted by its harms, sometimes directly as victims of image-based sexual abuse (“revenge porn”) or child sexual abuse material, and sometimes indirectly because of the “association between the use of pornography and harmful sexual attitudes and behaviours towards women” (an assertion backed by “substantial evidence” according to the recent Government Equalities Office report).

 

We are encouraged by the fact that in a letter to CEASE, the DCMS has expressed a commitment to tackle illegal content related to the online sexual abuse of women and girls, clamping down on image-based sexual abuse, child sexual abuse material and “extreme pornography” (which, as the researchers at Durham University note, is by no means “relegated to niche sites, hidden from all but a determined viewer, or only available on the dark web”).

 

We recommend that the Bill identifies pornography websites as providers of Category 1 services, introduces relevant Codes of Practice and designates a specific regulator to ensure compliance. These regulations will help to stem the tide of illegal “extreme” pornography and the “legal but harmful” content prohibited by porn sites’ own terms of service (such as racist content, or content depicting adult/child sexual activity).

 

The Government must not fail to take the opportunity to protect women and girls from the sexual violence, exploitation and abuse driven by the online commercial pornography industry. 

 

Are there any types of content omitted from the scope of the Bill that you consider significant e.g. commercial pornography or the promotion of financial scams? How should they be covered if so?

We are very concerned that the draft Bill omits mention of the commercial pornography industry, in spite of the recent damning evidence of the harms wreaked by its unlawful business practices.

 

The draft Bill implies that all pornography websites of a certain size and functionality will be expected to implement robust age verification processes. Whilst this is welcome, we wish for this to be spelt out explicitly, along with the assurance that such an expectation will be made of ALL websites dedicated to pornography, regardless of their size or functionality. Without this amendment, there is a danger that porn sites will change their business model, splitting off into smaller parts and losing their user-to-user functionality in order to avoid regulation.

 

Secondly, we believe it is vital that the draft Bill gives careful consideration to the unique and elevated harms associated with the online commercial pornography industry. In particular, it should address how pornographic video sharing platforms effectively facilitate the mass distribution of illegal content such as “revenge porn” (image-based sexual abuse) and child sexual abuse material, due to the virtual absence of any checks or verification processes.

 

In correspondence from the DCMS, CEASE was assured that the government is taking urgent action to address the issue of so-called “revenge pornography”, recognising how it “has the potential to have significant psychological effects on victims.” Whilst this is welcome, we implore the government to “join the dots” and hold the porn industry to account for its effective complicity in this crime.

 

In December 2020, the porn giant Pornhub faced serious criminal accusations of facilitating sex trafficking and child sexual exploitation. Recognising the impossibility of managing the risk of its video sharing platform functionality, it took decisive action only when it was backed into a corner by financial and corporate pressure from Visa and Mastercard. Without external regulation, there is no guarantee that this change will remain in place. The website’s owner, MindGeek, has not made the same changes to its other sites, nor have other pornographic websites followed suit.

 

Pornhub exemplifies the failure of the porn industry’s self-regulation and the urgent need for the UK government to introduce reforms to improve its safeguarding procedures and to increase its transparency and accountability. According to a public survey CEASE carried out in June 2021, 80% of respondents said that they would support strict new pornography laws.

 

The Online Safety Bill should introduce regulation to ensure that pornography websites either remove user generated content functionality or implement robust age and consent verification processes to ensure that all those featured in video uploads are consenting adults.

 

What would be a suitable threshold for significant physical or psychological harm, and what would be a suitable way for service providers to determine whether this threshold had been met?

This is an important question: objectivity is paramount. We would consider “significant physical or psychological harm” to be a harm that inhibits an individual’s ability to live his or her “normal” life (including, for example, inhibiting his or her ability to study, to work, or to form healthy relationships). Determining whether this threshold has been met should involve medical diagnoses and/or the opinion of respected professional bodies.

 

It may also be worth obtaining statistics in order to identify broader patterns of harm driven by particular high-risk tech sectors. For example, with regards to the online pornography industry, it would be instructive to scrutinise data and statistics held by relevant agencies and institutions (for example, school and university reports of sexual harassment; sexual offending unit reports of men seeking help for inappropriate viewing or harmful sexual behaviour; sexual health clinic reports of porn-induced erectile dysfunction or sex addiction; reports from the “revenge porn hotline”; and police reports of rape and sexual assault).

 

Are the definitions in the draft Bill suitable for service providers to accurately identify and reduce the presence of legal but harmful content, whilst preserving the presence of legitimate content?

Is the definition of “legal but harmful” content, as content at risk of “having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities”, sufficient to allow service providers to “accurately identify and reduce” this content?

 

In considering this matter, it is vital to recognise that websites will often be commercially incentivised to play down the harms of content they host. For example, the business model of free tube-style porn sites involves hosting the maximum amount of content. Being asked to “accurately identify and reduce the presence of legal but harmful content” thus presents them with a conflict of interest.

 

Leaving the decision over what constitutes “legal but harmful” content to service providers themselves is risky: it is extremely difficult to provide a neat and objective definition of what constitutes “significant adverse physical or psychological impact”, and many sites will be keen to deny that any content on their websites meets the threshold for harm.

 

We would assert that the process of defining “legal but harmful” content needs to be rigorous, robust and objective with input from different expert professional bodies. For example, the regulator could look at the referral rates for particular physical or psychological problems clearly linked to exposure to certain online content.

 

Services in Scope

 

The draft Bill applies to providers of user-to-user services and search services. Will this achieve the Government's policy aims? Should other types of services be included in the scope of the Bill?

 

User-to-user services pose particular risks of harm and it is right that they are front and centre of the OSB. However, because of the type of sensitive and extreme sexual content hosted by pornography websites, we recommend that the government brings all of these into the scope of the Bill, regardless of their size or functionality.

 

Although it is the case that most major pornography sites are currently user-to-user services, only targeting these sites will leave an obvious loophole for the industry to exploit, with websites potentially changing their structure in order to avoid regulation.

 

The draft Bill sets a threshold for services to be designated as 'Category 1' services. What threshold would be suitable for this?

 

We welcome the fact that, in addition to protecting children, high-risk, high-reach platforms designated as Category 1 providers will also be required to address legal but harmful content for adults.

 

The “objective and evidence-based process” used to designate Category 1 services must consider the relative risk of harm through the type of content featured on the website, alongside consideration of the sites’ functionality and reach.

 

Due to their content, design and functionality, commercial pornography websites have a unique and elevated risk of hosting not just illegal content such as image-based sexual abuse, child sexual abuse material and “extreme” pornography, but also “legal but harmful” content such as the depiction of underage sexual activity, violence, incest, racial slurs and torture. Although such content is generally prohibited by porn sites’ own terms of service, they often fail to enforce these effectively.

 

It is imperative that the Online Safety Bill explicitly addresses the harms of the online commercial pornography industry.

 

Are the distinctions between categories of services appropriate, and do they reliably reflect their ability to cause harm?

The draft Bill is vague about the criteria by which different services will be categorised, simply referring to the service’s size, functionalities and “any other factors that the Secretary of State considers relevant.” Presumably, the Secretary of State will be armed with all the knowledge and information necessary to make appropriate judgements.

 

This system of categorising service providers according to risk makes sense, although once again we implore the government to ensure that the online pornography industry is identified as a Category 1 provider and does not slip through the net.

 

Will the regulatory approach in the Bill affect competition between different sizes and types of services?

Certainly with regards to the online pornography industry, there need to be clear and universal minimum standards in order to ensure fairness and to encourage compliance. Websites are less likely to take more radical, costly and effective safeguarding measures unless they have assurance that all competitors will be subjected to the same regulation.

 

It’s notable that the porn site XVideos received a marked boost in web traffic after its major competitor Pornhub made significant safeguarding reforms. Companies should not be allowed to gain a competitive advantage from having fewer scruples and less oversight or accountability. Universal minimum standards are the only way to ensure a level playing field, where no company can profit from unlawful or intrinsically high-risk business practices.

 

What role do algorithms currently play in influencing the presence of certain types of content online and how it is disseminated? What role might they play in reducing the presence of illegal and/or harmful content?

As Dr Elly Hanson explains in her report ‘Pornography and Human Futures’: “Individuals’ data, gathered without informed consent, is fed into machine learning algorithms which assess, categorise and serve up the content that is deemed most likely to nudge specific people to spend – this includes both free content that keeps them on the site (such as images, keywords, videos) and the content of adverts.”

 

Pornography websites’ algorithms do not just feed users more of the same kind of content; they also continually suggest new (and often more extreme, at times even illegal) content in order to maximise engagement. This business model not only increases the risk of users developing compulsive habits of consumption; it also influences users to develop harmful sexual preferences, as Hanson’s report describes: “the industry is left free to target people’s vulnerabilities, manipulating their sexuality towards abusive, unhealthy and bigoted interests. And in the face of all of these state-of-the-art manipulative technologies, children and young people are afforded no extra safety or protection.”
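
 

To make the structural point concrete, the following minimal sketch (in Python, written by us purely for illustration; it does not describe any real site’s system) shows the shape of an engagement-maximising ranker. Nothing in the objective accounts for legality, consent or user welfare, only for whether content holds attention.

    # Hypothetical illustration: rank candidate videos purely by a model's
    # predicted engagement for this user. The objective never "sees" whether
    # content is lawful or consensual, only whether it holds attention.
    def rank_for_user(candidate_ids, predicted_watch_seconds):
        """Return candidate video IDs ordered by predicted watch time,
        highest first. `predicted_watch_seconds` maps ID -> estimate."""
        return sorted(candidate_ids,
                      key=lambda vid: predicted_watch_seconds[vid],
                      reverse=True)

A system optimised this way will keep nudging a user towards whatever material previously held their attention, however extreme, because no term in the objective ever penalises it.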

 

The Online Safety Bill must ensure that pornography websites all have age verification to prevent children’s access; that they are transparent about the ways in which they harvest and use data; that they do not host illegal and harmful content; and that their algorithms are used to promote safeguarding rather than just profits.

 

Over the past few years across the UK, police and practitioners have noted the increasing trend of men ‘crossing the line’, acquiring a sexual interest in children as a result of their heavy porn use, often via the bridge of ‘teen porn’.[3] Pornhub has recently started to work with the Lucy Faithfull Foundation in order to display warning messaging when users type in search terms associated with child sexual abuse material. This is positive, as it demonstrates that Pornhub is aware of its responsibility for curtailing rather than encouraging harmful or illegal sexual interests.[4] Certainly, there is scope for further development in this area.

 

Are there any foreseeable problems that could arise if service providers increased their use of algorithms to fulfil their safety duties? How might the draft Bill address them?

Pornography websites already rely heavily on algorithms in order to identify potentially illegal or violative content. However, former moderators at Pornhub have blown the whistle, reporting that they were incentivised to approve as much content as possible, adopting a loose interpretation of the website’s terms and conditions.

 

There are obvious limitations to relying upon algorithms in order to identify illegal or harmful content; the fact that a video does not have “rape” in the title is, of course, no assurance that it does not depict rape. There are also limitations to human moderation processes, since it is impossible to objectively verify the age of someone in a porn video, or whether the video has been made or shared with consent.
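
 

The title-screening limitation can be illustrated in a few lines. The sketch below (in Python, a hypothetical example of our own; the term list is invented, and the first title echoes one cited earlier in this submission) shows that a keyword filter only ever sees the words an uploader chooses to use, never the content itself.

    # Hypothetical illustration: a naive moderation filter that flags
    # uploads whose titles contain banned terms.
    BANNED_TERMS = {"rape", "abuse"}

    def flag_by_title(title: str) -> bool:
        lowered = title.lower()
        return any(term in lowered for term in BANNED_TERMS)

    flag_by_title("teen abused while sleeping")  # True: "abuse" appears in the title
    flag_by_title("sleeping teen")               # False: the very same video,
                                                 # retitled, sails through, because
                                                 # the content is never examined

An uploader who simply avoids the listed words defeats the filter entirely, which is why such tools cannot substitute for verification at the point of upload.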

 

That is why we strongly recommend that the Government mandates robust age and consent verification procedures, in order to provide firmer assurance that all those featured in pornographic videos are consenting adults.

 

Does the draft Bill give sufficient consideration to the role of user agency in promoting online safety?

There are two issues the Government must be aware of here: first, the balancing of rights inherent under Articles 8 and 10 of the European Convention on Human Rights (the right to family and private life, and the right to freedom of expression respectively); and secondly, the approach and role of service-providers and their intersection with user agency.

 

As to the first point, Articles 8 and 10 of the Convention are qualified by the legal test of any restriction on such rights being necessary in a democratic society, as well as being proportionate. Whilst CEASE does not advocate the wholesale blocking (or “banning”) of pornographic websites for several reasons, we submit that restrictions on content that is empirically harmful, abusive, and exploitative are wholly justified. Such restrictions as those outlined in the OSB (which we maintain can and should be improved) would be proportionate and necessary in a democratic society.

 

The consumption of pornographic content has undeniable links to the proliferation and normalisation of violence against women and girls, as well as being harmful in its own right to users. (See CEASE’s July 2021 Expose Big Porn Report for further detail.) Recognising this harm and restricting access to such material is of course a restriction on rights provided for under the Convention. However, we maintain that the balancing act and subsequent restriction inherent within the aforesaid Articles is justified on the basis that tackling VAWG and sexual exploitation is a “pressing social need”.

 

This position is further supported by the fact that the restrictions within the OSB, as well as those proposed by CEASE, hardly constitute an all-encompassing restriction of these rights. Recognising the harm facilitated and caused by the commercial sex industry and putting in place appropriate safeguards is a necessary step that the Government must take given the spiralling problem of online sexual exploitation and abuse in the digital age.

 

On the second point, service-providers can facilitate user agency by providing clear and accessible Terms of Service that outline explicitly and clearly the prohibition against illegal content. If users wish to go against these terms – for instance, by searching for illegal and/or exploitative content – that is their prerogative, but this must not mean they are free from the consequences of exercising such agency.

 

Is Ofcom suitable for and capable of undertaking the role proposed for it in the draft Bill?

 

Without OFCOM having the necessary training to recognise the specific and particular harms caused and facilitated by the commercial sex industry, the effectiveness of the Bill will be severely compromised. Irrespective of who is assigned to regulate these industries and implement the frameworks within the Bill, specific training and/or recognition must be given to the commercial sex industry as to the inherent and fundamental harms therein.

 

How will Ofcom interact with the police in relation to illegal content, and do the police have the necessary resources (including knowledge and skills) for enforcement online?

 

We submit that OFCOM must have a sufficient system of processes in place for passing illegal content on to the police where it relates to sexual exploitation or sexual violence. Whilst the OSB recognises “illegal content” (which could feasibly extend to cover all manner of illegal pornographic content, such as image-based sexual abuse [IBSA] as well as “rape-themed” content, for example) as well as child sexual abuse material, there is no clear demarcation as to what pornographic content may fall under the broad scope of “illegal content”. Respectfully, this must be considered an error of drafting.

 

Unless OFCOM knows exactly what pornographic content meets (or appears to meet) the threshold of illegal, then it will be impossible to even begin an investigation leading to enforcement and/or reporting to the police. We submit that the OSB should contain a non-exhaustive guidance list of content that regularly appears on mainstream pornography websites and that may also be considered “illegal”. For instance, this could include (but would not be limited to): pornography depicting rape; pornography depicting torture; pornography depicting physical harm and/or abuse; pornography depicting child-impersonation; pornography depicting violence against women and girls; pornography depicting other illegal acts such as prostitution (the exchange of money) and/or coercion.

 

However, having any system for reporting such content to the police is only the first step. The relevant authorities must also be cognisant of the existence of such material, as well as how to deal with it. For example, despite a rise in IBSA in recent years, prosecutions for such offences have remained staggeringly low.

 

Campaigners have pointed to the fact that authorities often lack the relevant training in both recognising such offences, as well as knowing how to pursue proper investigations for such offences. This must be remedied, ideally as part of the Government’s VAWG Strategy.

 

There is also a lacuna regarding the culpability of office-holders of corporate entities that have legal personality. We submit that if a criminal offence pertaining to illegal pornographic content is made out, then the office-holders of such legal entities should be held personally liable, as well as the corporate entity being found liable under any enforcement measures outlined in the OSB (and imposed by OFCOM).

 

Are there systems in place to promote transparency, accountability, and independence of the independent regulator?

 

We applaud the Government for requiring the publishing of (at least) annual transparency reports by the independent regulator (OFCOM), as detailed in Part 4, Chapter 7 of the OSB. There are two aspects to this reporting system that could be improved.

 

First, there should be systems in place for members of the public and/or sector professionals (such as CEASE) to request and access information pertaining to specific and particular regulated services that may have been lodged with the regulator prior to (or following) the publication of any annual report.

 

Secondly, attention should be drawn to s.102(3) and (4) of the OSB, which stipulate what matters may be excluded from publication. We submit that if a regulated service-provider is complicit in the proliferation of illegal content (such as IBSA material on a pornographic website), this should preclude the exclusion from publication of matters which may be “prejudicial to the interests of that body”.

 

Given the sheer prevalence of illegal and harmful content proliferated on mainstream pornography sites, allowing such exclusions would in effect render the transparency reports (for those particular sites) entirely impotent, as the relevant matters would undoubtedly be “prejudicial to the interests of that body”.

 

This again is why we urge the Government to consider dealing with the issues raised by pornographic websites/the wider commercial sex industry within its own section in the OSB. It is fruitless to group together all manner of regulated service-providers under one umbrella of “harmful” or “illegal” content, when websites in different industries will vary wildly in terms of the prevalence and “utility” of hosting such content. By this we mean that pornographic websites actively profit from, and indeed thrive on, the proliferation of illegal content.

 

This must be differentiated from other such industries that may not be focused on “adult content” (e.g. social media sites) but still need to be aware of the presence of such illegal material on their sites.

 

Such service-providers must produce reports that factor in all types of content present on their websites, even if this would be prejudicial to their interests, given the importance of tackling such illegal content (and the Government’s commitment to doing so).

 

How much influence will a) Parliament and b) The Secretary of State have on Ofcom, and is this appropriate?

On the assumption that the OSB will survive successive Parliaments (i.e. it will not be repealed if and when a new Government is formed), it is our position that OFCOM and Parliament/the Secretary of State (‘SOS’) should have a relationship informed by mutuality. As a statutory corporation, OFCOM already has a symbiotic relationship with Parliament, and it is CEASE’s position that this should continue. For example, the Government’s focus on tackling Violence Against Women and Girls (VAWG) runs through to 2024. It should be incumbent upon the Government, and as a consequence OFCOM, to factor this in when delegating and exercising any statutory powers within the OSB that pertain to VAWG.

 

With regards to CEASE’s work specifically, we urge the Government to recognise the links between harmful content online – specifically within the commercial sex industry – and its broader VAWG strategy. This should then be afforded appropriate recognition in OFCOM’s approach to exercising their statutory powers under the OSB. For example, OFCOM’s power to levy a penalty fee against a non-compliant adult-content service provider (i.e. a porn site) will be more effectively exercised if their statutory mandate is already informed by the Government’s VAWG strategy, as opposed to the two working independently and, potentially, inefficiently.

 

In short, OFCOM should be informed by, but not necessarily beholden to, Parliament and the SOS. If OFCOM are not informed by the broader VAWG work being undertaken by Parliament, then there is a danger that they may be unaware of which websites and digital sectors are at greater risk of falling foul of the regulations within the OSB, thus damaging and/or preventing their effective implementation.

 

Are the media literacy duties given to Ofcom in the draft Bill sufficient?

 

It is CEASE’s position that it is misguided to delegate wide-ranging media literacy duties to OFCOM in its role as regulator, notwithstanding that the duties in the draft Bill are seemingly sufficient per se. The most substantial requirement found under s.103 OSB [s.11(2)(a)-(e) Communications Act 2003] lays out the undertakings required of OFCOM regarding educating the public about the type of content they are viewing, and its nature, as well as the impact it may have on viewers.

 

Whilst at CEASE we strongly impress the point that the consumption of pornography/adult content is widely misunderstood and poorly recognised across a number of sectors, it is also our position that this is too large a job for a statutory regulator to undertake alongside the performance of its regulatory duties.

 

This is grounded in our position that the consumption of pornography requires its own guidelines and management, beyond those applied to more generalised “harmful content”. It must be incumbent upon the Government not only to recognise pornography as the “harmful content” that much of it is, but also to recognise that the lack of “media literacy” around this issue is of growing concern to professionals across many sectors, including health, the anti-VAWG sector, and the human rights sector.

 

Therefore, we strongly urge the Government to consider implementing a separate “media literacy” style educational framework that deals specifically with the harms of pornography consumption amongst people of all age-ranges.

 

This of course could – and should – be used by OFCOM when fulfilling their duties. Without this educational framework, however, there is a persistent concern that a) OFCOM may not even know how to promote media literacy about this vastly misunderstood and poorly regulated industry; and b) if that is the case, they would be failing in their duty as regulator, but this would not be recognised by the Government, and would thus run the risk of not being remedied.

 

 

September 2021


[1]https://www.researchgate.net/publication/337114034_'Losing_track_of_morality'_Understanding_online_forces_and_dynamics_conducive_to_child_sexual_exploitation

[2] https://www.pornhub.com/information/terms

[3]https://www.dailymail.co.uk/news/article-8213645/Young-British-men-emerging-new-group-online-paedophiles-says-police-chief.html

[4] Of course, Pornhub still profits from “teen porn” depicting adult-child sexual activity; there are clearly limits to how far it will go to protect children from harm and exploitation.