House of Lords Communications and Digital Committee – Inquiry into freedom of expression online
We are submitting to this consultation because we are deeply concerned with how ‘freedom of expression’ and ‘freedom of speech’ arguments are deployed by the PR machinery of the online commercial pornography industry (OCPI) as justification for hosting (or failing to moderate) content depicting extreme violence, sexual abuse, intrafamilial rape, underage sexual activity, public sexual harassment and so-called ‘revenge porn’.
For too long, the OCPI has acted with impunity, unaccountable to any government or authority, seeking to align its ‘hands-off’ approach with the values of sexual liberty and free expression. It has used free speech ideology in order to repel criticism, deflect scrutiny and ward off regulation. However, there is mounting evidence of its multifaceted and extensive harms and its persistent strategy to prioritise profit over protecting users.
Freedom of speech arguments tend to be weighted in favour of those with power, money and influence. Big porn has a voice that drowns out the rest and we must redress that balance, giving voice to the rights of women and children to be free from sexual violence, abuse and exploitation.
Is freedom of expression under threat online? If so, how does this impact individuals differently, and why? Are there differences between exercising the freedom of expression online versus offline?
When it comes to online pornography, freedom of expression is certainly not under threat. On the contrary, the notion of free expression is misused to justify an absence of corporate social responsibility, lax moderation and the failure to effectively self-regulate.
Pornhub is effectively claiming that there can be no objective way of judging what kind of content is harmful, since it’s all subjective and a matter of personal taste. The only taboo seems to involve questioning somebody else’s choice.
Matching its explosion in growth, the industry has seen an escalation in extreme and hard-core pornographic content: “Rule #34: If you can imagine it, it exists as Internet porn”. Extreme, violent pornographic material and paraphilias which would once have been banned, refused classification or relegated to niche genres are now unexceptional in mainstream pornography, the result of the competitive forces of an unregulated and rapidly-expanding marketplace: “As more and more pornographic images become readily available, it takes much more to scratch one’s sexual itch... that leads to the necessity for extremism.”
Enjoying virtually total freedom from regulation, pornographic platforms’ lack of proactive moderation means that they host and monetise extreme pornographic content depicting child sexual abuse material (CSAM), so-called ‘revenge porn’, spy cam porn and footage of extreme sexual violence, public sexual harassment, rape and sexual assault. It is impossible to objectively identify how much of this illegal or abusive material is ‘simulated’, i.e. choreographed with the participants’ knowledge and consent, and how much is ‘real’.
What other industry could get away with ‘accidentally’ hosting a vast (yet unquantifiable) amount of illegal content?
Even the content that depicts simulated violence, CSAM and abuse is harmful, since it:
● Harms vulnerable individuals (mostly women and children) in its unregulated production
● Normalises, promotes and monetises sexual violence, child sexual abuse and exploitation, non-consensual sexual activity and extreme racism.
● Fuels violence-supporting, sexist, misogynist and racist attitudes
● Drives real-world sexual violence and coercion
● Increases the demand for illegal pornography
● Makes it harder for authorities to identify the real illegal acts on porn sites
Whilst freedom of expression is vital, it’s also essential to recognise its limitations. One person’s right to free expression cannot be at the expense of another person’s right to freedom from prejudice, sexual violence and abuse.
How should good digital citizenship be promoted? How can education help?
We welcome the fact that the UK government has recently included a consideration of pornography in its sex education curriculum. However, simply telling children that pornography is fundamentally harmless, so long as they recognise that it’s not realistic sex, is misleading and utterly inadequate. As with drug education, it’s vital that children are given the full facts about the multifaceted harms of pornography and its potentially serious and life-long impact.
In addition to the shock and trauma children experience as they encounter online porn, research confirms that it can also cause them profound psychological, social, emotional, neurobiological, and sexual harms. As the charity Culture Reframed explains, “Extensive research has shown that porn undermines the social, emotional, cognitive, and physical health of individuals, families, and communities. These studies also demonstrate that porn shapes how we think about gender, sexuality, relationships, intimacy, sexual violence, and gender equality.”
David Austin, Chief Executive of the BBFC, points out that online porn is “affecting the way young people understand healthy relationships, sex, body image and consent.” Watching online porn normalises sexual aggression, risky sexual practices and men’s violent domination over women. It drives real-world harassment and coercion, and many experts suspect that it is behind the sharp rise in peer-on-peer sexual abuse that has been happening in schools across the country. What’s more, by reinforcing sexual objectification and peddling harmful gender stereotypes, online porn can increase sexual harassment, negatively impact girls’ body image and create unhealthy pressures for them to agree to perform sex acts that are painful, risky or humiliating.
Online porn is not only more extreme and hardcore than ever before, but any child stumbling across Pornhub will also find themselves less able to turn away. Using cutting-edge persuasive design techniques and sophisticated algorithms, porn websites ‘mousetrap’ users, turning curious clickers into consumers. Children’s underdeveloped prefrontal cortex makes them particularly susceptible to pornography’s addictive qualities. Anti-porn scholar and activist Dr. Gail Dines believes that this is a tacit part of the porn industry’s strategy: “The younger you get them, the longer you've got them. It's like handing out cigarettes outside the middle school."
Is online user-generated content covered adequately by existing law and, if so, is the law adequately enforced? Should ‘lawful but harmful’ online content also be regulated?
A vital question: online user-generated content is NOT adequately covered by existing law, nor is that law adequately enforced. Lawful but harmful online content should be regulated.
To take the OCPI giant Pornhub for example: its (extensive) Terms of Service state that members must not upload material that’s, “illegal, abusive, hateful, obscene or defamatory.” In other words, Pornhub itself considers it necessary to prohibit ‘lawful but harmful’ material, which includes the depiction or representation of unlawful material.
In particular, its terms of service prohibit (amongst other things):
● the posting of “...any Content that depicts any person under 18 years of age (or older in any other location in which 18 is not the minimum age of majority) whether real or simulated” (emphasis mine).
● the posting of any content “depicting underage sexual activity, non-consensual sexual activity, revenge porn, blackmail, intimidation, snuff, torture, death, violence, incest, racial slurs, or hate speech, (either orally or via the written word).”
The prohibition of lawful but harmful material is important because:
Although porn users generally start out watching material that’s aligned to their ethical values, research suggests increased consumption over time results in an escalation of tastes and preferences. This explains why heavy users have an increased tolerance for and interest in more extreme material.
Pornography serves as a ‘mentor to the masses’, reinforcing traditional gender roles, promoting violence-supportive attitudes and inspiring harmful behaviour. A 2010 meta-analysis suggests there is a significant positive correlation between sexually-violent pornography use and attitudes supporting violence against women. Pornography that simulates abusive and illegal scenarios normalises and popularises sexual violence and undermines our values of gender equality, consent and non-violence. It is a powerful cultural force that is compromising the government's anti-VAWG agenda.
As the recent New York Times investigation points out, Pornhub turned a blind eye to whole swathes of content that violated its Terms of Service. For example, even a cursory glance at Pornhub shows thousands of search terms and keywords related to the representation of underage sex, rape, revenge porn and extreme racism (e.g. ‘young’, ‘tiny girl’, ‘violation’, ‘leaked sex tapes’).
In response to mounting public, legal and political pressure, Pornhub has recently rolled out a series of reforms which, for the first time, recognise the need for more moderation of violative content and keywords. It has also removed its user-generated content (for now).
But Pornhub is only one porn platform among many. There needs to be an industry standard, since we cannot trust the OCPI to self-regulate. Rather than relying on the reactive process of moderating content that’s already being hosted, monetized and distributed, all content should be proactively verified in advance.
Should online platforms be under a legal duty to protect freedom of expression?
When it comes to the online pornography industry, the duty to protect freedom of expression must be mitigated by a duty to protect individuals from trafficking, abuse and exploitation.
The ‘sexual fantasies’ represented in pornography do not exist in a vacuum: they are ‘played out’ in real life, causing untold harm to vulnerable individuals and feeding into real-world violence-supporting attitudes and behaviour. It’s no coincidence that the vast majority of harmful but legal content falls along the faultlines of sex and race inequality. Research shows that the violent, extreme and hard-core pornography normalised by the OCPI supports a wider culture of sexism, misogyny, violence against women and racism. It also influences users’ real life attitudes and behaviour.
What model of legal liability for content is most appropriate for online platforms?
We believe it would be beneficial to ensure that there is a degree of secondary publisher liability for online platforms, overturning the long-standing Section 230 of the Communications Decency Act.
To what extent should users be allowed anonymity online?
The current business model of pornography platforms relies on hosting the maximum amount of user-generated content as each video is monetised. Most platforms therefore make it very easy to upload virtually anything, with no robust identification checks of any kind. Users’ anonymity nurtures a sense of impunity from responsibility or liability for any illegal, unlawful or violative content.
In recognition of this fact, Pornhub has just announced its intention to introduce identity verification for all users wishing to upload content to its site. This is a welcome development, and we urge the UK government to make it a prerequisite for all those wishing to upload content to pornographic platforms. However, it is only a start: there needs to be a system in place to verify the age and consent of any individual featured in user-generated videos.
How can technology be used to help protect the freedom of expression?
It is noteworthy that the ‘free expression’ of pornographic user-generated content tends to centre on the aggressive domination of women and other minority groups by men. As bastions of ‘free expression’, the industry exemplifies the fact that there are important ethical lines that should never be crossed.
Online porn platforms recognise that even with fantasy-fuelling, taboo-shattering pornography, there need to be limits. This is why their comprehensive terms of service carefully express what content and behaviours are permitted. However, some of the lines are being crossed with impunity. Users break the rules and nothing happens.
These companies do more than turn a blind eye to violative material: their algorithms actively promote it to their millions of users, with the incentive of keeping their best consumers engaged on their sites for longer, which is of course good for the bottom line. Those who challenge the platform are lectured on free speech.
For too long, the porn industry has been shielded by free speech ideology, which has proven a surefire way of deflecting scrutiny and warding off regulation. It has acted with impunity, unaccountable to any government and acting as if it were above the reach of the law.
We have a responsibility to redress that balance, to give voice to the unknown number of women and children impacted by a powerful yet unregulated industry.
How do the design and norms of platforms influence the freedom of expression? How can platforms create environments that reduce the propensity for online harms?
The whole design of the OCPI treats the most hard-core, extreme content (which often violates the platforms’ Terms of Service) as it would any other content.
By presenting users with “most viewed” videos in each category and including user comments, it effectively legitimises pseudo-CSEA and representations of illegal content: “Each query and click further enshrines acceptance of this pornography, as views are tallied, “likes” calculated, and increasingly so, comments added for each video”.
This normalises and rationalises users’ interest in such material, weakening any restraint of conscience. This leads to an increased demand for real CSAM and other non-consensual material.
How could the transparency of algorithms used to censor or promote content, and the training and accountability of their creators, be improved? Should regulators play a role?
To the latter question, we would answer: yes! Absolutely.
Research tells us that watching porn is an escalating habit, with heavy users requiring increasingly extreme content in order to maintain arousal over time. This is particularly true of young people. This means that, in order to maximise profits by keeping its most loyal consumers engaged for as long as possible, pornography platforms have a strong incentive to introduce them to more novel, hard-core ‘extreme’ content, some of which violates the site’s Terms of Service (for example, pseudo-child pornography, simulated gang rape, etc.). Some is also illegal, since we know that illegal content slips under the radar of the platforms’ flimsy and reactive moderation processes.
What’s more, promoting “most viewed” videos and including other users’ comments all works to rationalise users’ interest in such material. For example, pornography platforms use algorithms to direct users with an interest in ‘teen porn’ to search tags and keywords related to categories of more explicit pseudo-child pornography. In this way, porn platforms effectively normalise, endorse and extend sexual interest in children. Police investigators are noting an increasing trend for men who have no prior sexual interest in children ‘crossing the line’ from adult to child pornography as a result of heavy porn use, often via the bridge of ‘teen porn’. The same applies to users with an interest in violent, non-consensual or extremely racist pornography.
We call for transparency regarding the algorithms pornography platforms deploy. If we are to prevent such harmful escalation, it is critical that we have a greater understanding of the algorithms that lead users on the path from mainstream pornography to an interest in illegal content.
How can content moderation systems be improved? Are users of online platforms sufficiently able to appeal moderation decisions with which they disagree? What role should regulators play?
Online pornography platforms have a vast amount of content. There were 6.83 million new user-generated videos uploaded to Pornhub last year alone. “To put this in perspective–if you strung all of 2019’s new video content together and started watching them way back in 1850, you’d still be watching them today,” (according to a statement from Pornhub).
How can human moderating teams hope to check each video? Even though it claimed to be able to do so, Pornhub has recently conceded that it needs to do much more to identify and take down illegal and violative content by massively extending its system of moderation.
Welcome as this development is, the fact remains that, just by watching a video, it is impossible to objectively verify the age and consent of its participants – especially since we know that, in instances of trafficking and child sexual abuse, victims are often trained to act as if they are enjoying what is being done to them.
There must be external regulators to assess the effectiveness of pornography platforms’ systems of moderation. But as well as tackling illegal or violative content through reactive methods, it is vital that these sites put in place effective systems of age and consent verification to minimise the risk of such content appearing in the first place.
Anecdotally, the OCPI has been unresponsive to user complaints. However, a complete lack of transparency means that we have no means of assessing the efficacy of its complaints system: how many are made, how swiftly videos are taken down and whether or not they reappear.
Are there examples of successful public policy on freedom of expression online in other countries from which the UK could learn? What scope is there for further international collaboration?
There is definitely scope for international collaboration here since the OCPI is global, and seems to attract particular categories of abuse focused in different parts of the world (e.g. CSAM from Asia, Spy Cam porn from South Korea, etc.). Various politicians, particularly in Canada and the US, have spoken out on this issue and international collaboration may well be fruitful.
12 January 2021
https://www.researchgate.net/publication/38041887_Pornography_and_Attitudes_Supporting_Violence_Against_Women_Revisiting_the_Relationship_in_Nonexperimental_Studies