

Select Committee on Communications and Digital

Corrected oral evidence: Freedom of expression online

Tuesday 2 February 2021

4 pm



Members present: Lord Gilbert of Panteg (The Chair); Baroness Bull; Baroness Buscombe; Viscount Colville of Culross; Baroness Featherstone; Baroness Grender; Lord Griffiths of Burry Port; Lord McInnes of Kilwinning; Baroness Rebuck; Lord Vaizey of Didcot; The Lord Bishop of Worcester.

Evidence Session No. 8              Virtual Proceeding              Questions 72 - 78



I: Louise Perry, Freelance Writer and Campaigner; Dr Fiona Vera-Gray, Assistant Professor, Department of Sociology, Durham University.



This is a corrected transcript of evidence taken in public and webcast on




Examination of witnesses

Louise Perry and Dr Fiona Vera-Gray.

Q72              The Chair: Welcome, Louise Perry and Dr Fiona Vera-Gray. Louise Perry is a freelance writer and campaigner. Dr Vera-Gray is assistant professor at the Department of Sociology at the University of Durham. Thank you for joining us. I think you listened to that last session and you have been following our inquiry into freedom of expression online. We have plenty of questions for you. Why do you not kick off by introducing yourselves in a little more detail and giving us your initial perspective on the issues of freedom of expression, particularly online?

Louise Perry: Thanks so much for having me. I am a journalist and contracted currently at the New Statesman magazine, but I have historically written for a range of outlets. Before I was a journalist, I worked in the women’s sector at a rape crisis centre, working with young people in various capacities, including delivering workshops in schools on sexting, online safety and so on, so I have some frontline experience from way back when.

I also work as the press officer for the We Can’t Consent To This campaign, which is a campaign group that we originally set up to document cases where women in the UK had been killed and their killers had relied on a rough sex defence in court. They claimed that the women had consented to the violence that resulted in their deaths. We have since started also documenting non-homicide cases and looking in a more general sense at sexual violence, the normalisation of violent porn and how that might contribute towards crime.

From my perspective, I have a slightly ambivalent attitude towards this in terms of the gendered issues. In terms of free speech online, I have written quite a lot over the last couple of years on some cases in which there has been a misuse of hate speech legislation in a really heavy-handed approach by some police forces. For instance, I am sure some of you will be aware of the judicial review last year in which a retired police officer took Humberside Police and the College of Policing to court because they had pursued him for political speech online, which was classed as a non-crime hate incident.

The judicial review found in his favour because the tweets he had written were a bit tasteless, a bit crude and not something that, personally, I would ever say online, but were very much within the realms of legitimate political discourse, yet he had had police officers show up at his work. It was felt to be having a very chilling effect on online speech. That is the sort of thing that I am very concerned about. Feminists have often ended up at the sharp end of this in the last couple of years, in discussions over the Gender Recognition Act.

Part of me is concerned about there being currently too much state interference in terms of freedom of expression online. On the other side, and this is particularly linked to my campaigning work, I am very concerned about where we are at with the availability of extreme porn online. I am hoping that, today, I can negotiate a distinction in thinking that we need more state intervention on the issue of porn while protecting speech very carefully. There is a very clear distinction between these two things because porn is not speech. We are not talking about people sharing erotic stories, for instance, online. I have no desire to criminalise that.

We have seen an explosion in access to online porn. Legislation has had real difficulty keeping up and the legislation we do have, particularly criminalising certain types of porn, is not being enforced. We have an amazing availability and mainstreaming of porn that is illegal or ought to be illegal. We know that children are viewing this regularly and that this has become extremely mainstream. Our campaigning work has highlighted that this seems to be seeping out into sexual culture, in that sexual acts that have been normalised by online porn are now becoming increasingly common among younger people. You can see this when you look at the data on the sorts of sexual experiences that young people are having compared to older generations.

It is a long-running discussion, which I am sure Fiona will comment on as well, about the exact extent to which online porn influences real-life sexual behaviour, but our work suggests that it does, at least to some extent, and that we should be concerned about this, particularly when we are thinking about children viewing this material, which they are. I am very supportive of the moves towards having an age verification system. That is really good. We should also be thinking about more structural ways of enforcing existing legislation on extreme porn and rape porn, targeting platforms rather than just targeting possession. I hope we can talk more about that.

The Chair: We certainly want to talk more about that. There are some very nuanced issues in there, which we look forward to exploring.

Dr Fiona Vera-Gray: Thanks, Louise. That was great. I am an assistant professor in sociology at the University of Durham. I work specifically on violence against women and girls. I am a member of the Centre for Research into Violence and Abuse at Durham. Like Louise, I have a history of working in the specialist sexual violence sector. I worked there for about 10 years. I also did training and prevention work with young people in schools on sexual violence, harassment and pornography. More recently, I have been working with the Department for Education on the materials that it is going to give teachers about the newly mandated sex and relationships curriculum, looking specifically at the modules on online media, pornography and harms.

The focus on prevention is something that I would like us to have space to discuss. It is not only about removing harmful content from the internet but about how we prevent it from getting up there in the first place. I was listening to the earlier session, which I found really interesting, and my perspective supports, in many ways, a lot of the things that were being said there.

Like Louise, I believe in the importance of maintaining the internet’s commitment to freedom of expression. It has done a lot to enable a space for minoritised, particularly ethnic, voices. There is greater diversity of gender and sexual expression to be had online. That has translated into really positive changes in society around anti-oppressive practice. We have seen stuff recently around anti-racist practice.

We need to be careful and mindful that, sometimes, the way we think of freedom of expression as being for everyone ignores the reality of what happens for freedom of expression. Freedom of expression for everyone is not really freedom of expression for everyone, because people who are subjected to harassment or violence will censor themselves. They will not take up that right or be able to enjoy that freedom because, if it is not safe for you to express yourself freely, you are not going to do so. You are going to prioritise your safety over the potential for harm.

Overall, that means that, in my view, there is a justification for the regulation of freedom of expression online and that justification can be seen as human rights-enhancing. A lot of the time, the two are put in conflict with each other, where any kind of regulation is seen as removing freedoms and rights. We can think about it in terms of enabling individuals to take up and enjoy their rights and freedoms, both online and offline. A really important point was raised earlier about that distinction. We need to be a bit more complicated than that now.

Sometimes, a form of regulation is human rights-enhancing, in that it enables minoritised people, thinking in the context of this session about women and other individuals who are minoritised on the basis of their gender, to take up the powerful tool that we have in the internet for contributing to this marketplace of ideas. That is my position. I am less ambivalent. There is a place for regulation where that can be seen to support someone’s ability to take up their freedom of expression.

Q73              Baroness Featherstone: Thank you for both of your introductions. Some of this touches on some very difficult issues. Louise, I was going to say that you referred to the TERF war, but perhaps that is not the right way to express it. Personally, I think the war between TERFs and feminists is unnecessary. Going to the heart of it, what should not be protected by the right of freedom of expression online?

Louise Perry: I agree that there have to be some limits on speech, where it incites or facilitates criminal violence. That is widely accepted, and I would also accept it. If one of the outcomes of this inquiry was to strengthen hate speech laws online, I would be a bit concerned about that. Even though I can, in theory, understand why introducing misogynist hate speech online as a criminal offence, for example, would potentially have some advantages, I am very cognisant of the risks. The line between hate speech and legitimate political discourse is very fine and can be crossed very easily. On that front, I take a more liberal stance.

In terms of pornography, I take the opposite stance, in that I think there currently is not enough intervention. The distinction I would draw is partly to do with the fact that pornography is not about the sharing of ideas. It is not about political discourse. It is about sexual stimulation and, more crucially, it involves real people and sexual acts. The 2015 legislation on revenge porn went some way towards acknowledging that there is a very particular harm done to people when sexual images of them are shared without their consent online.

This is an entirely new form of criminal offence that just did not exist before the internet, which we are having to start to reckon with. When I was working with young people, I found that it is now commonplace for this form of harm and this crime to be committed against the young in particular. We are not addressing it anywhere near well enough and treating it with the seriousness that it deserves. I really would like to see legislation strengthened on that point.

We need to enforce, properly, the existing legislation that we have. In our campaigning work, we have been focused on how certain violent sexual acts have become very normalised by porn and have become very commonplace among young people now. They result in a lot of physical and emotional harm, and have been hugely underprosecuted to date. I am thinking particularly about strangulation.

We are hoping that one of the things to come out of the Domestic Abuse Bill, which is currently being discussed, is a new provision on non-fatal strangulation. Strangulation in mainstream pornography is now incredibly common. It used to be a very niche form of sexual practice that would sometimes be featured in porn. It is now on the front page of practically every mainstream porn platform and it is very common for children to be seeing this kind of material.

This suggests the ways in which online porn can generate this kind of increasingly extreme trend, encouraging viewers to see more and more intense and extreme stimulation. We were involved in some surveys looking at people’s experiences of non-consensual forms of violence such as strangulation. Almost half of women between 18 and 24 in this country have now experienced strangulation during sex.

Baroness Featherstone: Shocking.

Louise Perry: I know; it is astonishing. For many of them, it has been multiple times, without consent. This is a very good example of the availability of this kind of material to children who have no sexual experience to date. Their first encounters with sexual stimulation are through this, online, on their own, in their bedrooms or whatever. Sadly, this has not been taken nearly seriously enough in our efforts to regulate it. I really hope that this Bill can be an opportunity to do something.

Dr Fiona Vera-Gray: There is a lot to pick up there. Ultimately, it is quite difficult. It is about us thinking about what principle we can use that future-proofs itself, in a way. It was raised in the previous session, and is really important, that we need to not think, “This can be protected. That should not be protected”. We need to have a principle that can be applied in 20 years’ time when the online world is completely different from how it is even now. We have justification for that principle being about a human rights-enhancing approach.

I was thinking as Louise was talking: I was part of the campaign for the inclusion of non-consensual sexual penetration, the rape pornography amendment to the extreme pornography legislation, which came into force in April 2015. Whenever you get into pornography, you get into that issue about sexual censorship and that kind of thing. The Joint Committee on Human Rights said that introducing that amendment was human rights-enhancing, on the basis that there is that crossover between the online and offline world. While watching pornography showing those kinds of acts does not mean that you will definitely try them, it contributes to a culture where women are aware that they are not seen to be as valuable as men, where violence against them is not seen to be as serious, and where sexual violence in general is undermined and sexualised. It is not necessarily a causal argument, but it is saying that we know that some kinds of expression contribute to a harmful broader cultural environment where we have substantive gender inequality.

There is a principle there that we can use and apply when we are thinking about what should and should not be protected. We need to think about how we create an online space where everybody feels able to contribute in an equal manner given that, in society, we are not starting from the same playing field. It is not an equal playing field on which everyone can access the internet in the same ways.  We know, in society, there is substantive gender and racial inequality. There are lots of inequalities. Those things play out in the online world in the same way that they do in the offline world.

I would encourage the committee to think about how we ensure freedom of expression for some of the most oppressed or marginalised groups, and to begin to think about recommendations from that place, rather than coming from a place where we think that inhibiting anyone’s freedom is taking away human rights. This is a way of increasing the ability of some people to access their rights. It is a long way of saying that I do not think there is an easy way of deciding what should and should not be protected.

Baroness Featherstone: You can do what you like so long as you do not do harm to anyone. This is kind of what you are saying.

Dr Fiona Vera-Gray: It kind of is, but do not just think about harm as directly harming a person. It is about a cultural harm and how your expression contributes to a broader cultural context that enables various forms of oppression to be played out on individuals.

Baroness Featherstone: As a female politician, obviously, one has experienced a fair amount of online abuse of one sort or another. Polls show that women are more likely than men to prioritise safety online over the freedom of expression. Why do you think that is? Why do we feel that safety is more important than freedom?

Dr Fiona Vera-Gray: For me, I can answer that quite easily. Safety is a necessary precondition for us to be able to take up our rights and freedoms. The fact that there is a gender difference there means that, for men, that precondition is met. On a general basis, they feel safe in online and offline spaces, so they are not going to prioritise that. That tells me very clearly that women do not feel safe in online spaces and that, in order to feel free, they need to first feel safe.

Research that I have done in offline public space, when we are allowed in public space, shows that same need to balance freedom against safety. Women talk about restricting their freedom of movement – not going for a run in the dark, for example. It is getting dark now and it is only 5 pm. I have not even finished work yet, but women talk about not going for a run now because they know that they are going to be held responsible for anything that happens to them. Women are positioned in a different way in society from men, being made responsible for keeping ourselves safe and being blamed if we do something that means we are not keeping ourselves safe. We have internalised that message and that means that we do those acts of self-censorship.

Women such as you, in any kind of public position, alongside feminist researchers such as me, have experienced online abuse. I am sure that Louise has experienced it as part of the We Can’t Consent To This campaign. Over time, that sends a really strong message to watch what you are saying, be careful of how you are saying it and maybe not share too much personal information about yourself – for example, not using your own picture as an avatar. Women and girls do a lot of forms of safety work to be able to use online spaces. Very clearly, that tells me that we need to think about safety and freedom as connected, and safety being a precondition for us to be able to take up our right of freedom of expression.

Louise Perry: I agree. There is not necessarily a quantitative difference between the amount of abuse that men and women experience online, on my reading of the data, but there is a qualitative difference. For women, it is much more likely to be sexual, which has a more intimidating effect and is more likely to drive women offline. Polling seems to show that.

Q74              Viscount Colville of Culross: Good afternoon. We have been given quite a lot of research to read through. What you have already said has emphasised this, but that research shows that men receive hate messages much more targeted towards what they think and, for women, it is directed to who they are and they are strongly affected by that. We have also been told that Amnesty International, in its 2020 report, accused Twitter of failing to respect women’s rights online, and failing to respond in a transparent manner to reports of violence and abuse.

Louise, you spoke quite strongly about your concern that, if there was too much control of hate speech online, that would limit our freedom of expression. Should the platforms and social media groups be doing anything to take down or in some way filter gendered harms?

Louise Perry: There are definitely some measures that social media platforms can take and have taken, which can be very effective and sophisticated, drawing from social psychology, in incentivising and disincentivising certain types of behaviour on their platforms. They are very positive. I am personally more familiar with Twitter just because it is the only social media I regularly use, but they have introduced notification filters, limited who can reply to tweets and things such as that, which have the cumulative effect of making the space easier to use.

The difficulty that we come down to, which was engaged with very well in the last session, is that these are private companies. They are motivated by profit, not maximising the well-being of their users. They have their own political biases, so we inevitably see some forms of speech censored more than others, particularly on a platform such as Twitter. The President of the United States being removed from Twitter is the most remarkable example of this. This is clearly not a politically neutral form of internal regulation. It has to be tailor-made for every platform, so I am not in a position, in a technical sense, to go through every single change that should be made.

In terms of future-proofing, as Fiona was quite rightly saying, the legislation compelling these companies to do something, in a bespoke way, to limit the misuse of their platforms seems like a good thing. It would be good to have the default settings for users more oriented towards privacy and safety, rather than people having to seek out ways of making themselves safer online. A lot of people are not tech savvy enough to do that and might not be aware of the measures that they can take. It would be much better for these platforms to be orientated towards the least tech savvy rather than the most, as they currently are, always remembering that the problem here is that these are private companies that are oriented towards profit. They are not in the business of behaving responsibly in general.

Many porn platforms are, of course, social media platforms, Pornhub being the largest and most famous. It is essentially a specialised form of YouTube and is egregious in its reluctance to regulate itself. Recently, we saw some very belated efforts by Pornhub to start cracking down on the misuse of its platform, which it did only in response to Mastercard and Visa withdrawing their support. What is fundamentally motivating these platforms is money and that is a very bad moral metric. Unfortunately, the time has come for there to be more state intervention requiring these companies to behave more ethically.

Viscount Colville of Culross: Fiona, you pointed out that the regulation should happen to online platforms to make sure that there is freedom of expression for minoritised groups. To deal with the fears expressed by Louise about controlling freedom of expression for everybody else, how do you define those gendered online harms? How should the platforms be responding to them?

Dr Fiona Vera-Gray: It is a really good question. Even when I was preparing for this, I thought about how there are so many gendered online harms. It is hard to think of how to approach it. We have looked at the issues of online gendered hate speech and pornography. We have not really touched on the non-consensual distribution of private sexual images, which happens across social media and pornographic platforms. On a lot of the tube pornography sites, which are free to access, users are uploading stolen content taken from other pornography producers. The people in those videos are no longer getting any kind of income generated. That is another form of gendered harm. There is a huge range.

I think about something being a gendered form of harm when it disproportionately impacts a particular group. What you said at the start about it not necessarily being disproportionate in extent was important. We can think about both men and boys receiving online abuse, but it is about the disproportionate impact, some of which is about the ways in which women and girls are situated as being responsible for preventing this kind of material offline and how that is translated into the ways that we manage ourselves online.

It is tricky. How do you balance it? One way may be to start thinking about prevention. Louise’s point is really important. These are massive companies, and they respond to money, but they do have to respond to government law. We have seen this with what is happening in Australia at the moment. It is very interesting that there seems to be a stand-off happening between Google and government. That is something to keep an eye on. Their response has been that stand-off. They do not have that same stand-off when they are told to do something by Visa withdrawing its services from their platform. There is definitely the money question there.

I have been thinking about this with colleagues at the law school in Durham. We have the public sector equality duty, which puts a positive duty on public sector organisations to have due regard to eliminating unlawful discrimination, harassment and victimisation, advancing equality of opportunity between different groups, and fostering good relationships between different groups. Similarly, we need a positive duty on these companies. It is not enough to be removing this material or preventing online hate speech. How are you proactively advancing equality of opportunity for freedom of speech? How are you proactively eliminating unlawful discrimination and fostering good relations?

That starts to get us into some really interesting territory about the ways in which these platforms can be used. Again, it was covered in your previous session. We know that they have a massive amount of data there and that it is being used by private companies to change our behaviours, so that we start buying things, or to sell us particular products. How about platforms use that information to send out more positive, challenging messages about changing gender norms: harmful norms of masculinity and harmful and restrictive norms of femininity? How can we proactively make these companies start to do some preventive work?

That, in some way, gets us to the side of this issue of preventing someone’s freedom of speech to enable someone else’s freedom of speech. It takes a different route in. On the back of what has been said previously, that might provide a different approach. I do not know if it is being done yet. It might be that we could lead on it here in the UK.

Viscount Colville of Culross: That was very helpful. Thank you so much.

Q75              Lord Vaizey of Didcot: Thanks very much for that. That is a terrific idea. I think you were both watching the previous evidence session, so you know that I am going to ask, effectively, the same question about digital citizenship. My thought process has been going all over the place as I have listened. Part of me thinks that the online world has unleashed baser instincts, which society manages to suppress in the way social norms work – not entirely, but broadly speaking. Using my terrible analogy of the pub, people tend not to shout rude comments across the room at people in the way that they do online.

Your point about the platforms getting positive messaging out there is an interesting one. If you take an analogy with Kick It Out, whether it has worked or been half-hearted, the football league has at least stepped up to say that racist chants are unacceptable in football grounds and relentlessly campaigned on it. The recent racist abuse of Marcus Rashford online makes my point that, when some people are out of scope, they revert to their normal type. I am being a bit discursive, but that point was very interesting and I hope it is one that we take up in our report.

It helps me start the question. Can you do anything about digital citizenship in terms of educating adults? I was going to say, “No, you cannot”. You have to go back to general education, which is going on all the time, whether it is Kick It Out or other things, to tell people that this is not an acceptable way to behave. Maybe digital citizenship can happen and the platforms could come up with innovative ways of making those social norms much more prevalent online or campaigning for them. Anyway, I think you have worked out that I am trying to ask you a question or take the point on.

Dr Fiona Vera-Gray: It is interesting and something to think about. We could come at it in a slightly different way, which might open up some opportunities that we do not have yet. I challenge you a little about people not saying things such as that in the pub, because I know from my research about women in public spaces that women have anonymous, quite aggressive things said to them in pubs, on public transport and when walking outside in public. The similarity between what happens there and online, a lot of the time, is the anonymity of the offender. There is something about anonymity and established gender norms that means some men will take the opportunity that anonymity provides them in order to harass.

Lord Vaizey of Didcot: That is a very fair point.

Dr Fiona Vera-Gray: We need to think about how that happens online. My thinking on this idea of digital citizenship is in a similar area to yours. This was brought out in your earlier session. What happens online is just an extension of what is happening offline. There is not that clear distinction any more. We could use the opportunity of the online space, not to change digital citizenship necessarily, but to start changing those broader gender norms that leak out. We have a situation at the moment in this country where 85,000 women a year are raped. If we can somehow connect those two things, because they are connected in women’s experiences, we can use this as an opportunity to start changing that broader picture of violence against women and girls.

I would recommend, and I can tell from the questions that your committee is already thinking about this, that it be evidence-based. I would be careful about saying to the platforms, “This is what we need you to do; can you do it?”, and then them coming up with something really weird, ultimately wrong, possibly quite ineffective and, at worst, harmful. There is a growing literature about how to change people’s attitudes and behaviours around violence against women. It is about stepping back and changing gender norms, not necessarily changing individual actions but trying to change what people think masculinity and femininity are, so taking that step back. I would encourage that any push on platforms to take up something such as this be evidence-based and based in the literature about preventing harms and violence against women and girls.

Lord Vaizey of Didcot: Do you know anyone who is doing this, even as a small platform?

Dr Fiona Vera-Gray: To my knowledge, no one is doing this. The only platform that I thought of that does some good, interesting stuff is Bumble. I do not know if you know of it. It is a dating platform for women. It did stuff about preventing cyberflashing, which is endemic for women on dating platforms, as anyone who has used one will know. Bumble did some really good work using algorithms to identify when a penis image was being shared and to blur it out, until somebody clicked on it and said, “Yes, I want to see it”. There are tech solutions that they can come up with.

I do not know anyone who is doing anything, particularly, on challenging gender norms because there has not been the requirement. I am a bit sceptical that they will do anything unless there is some kind of duty or push from government.

Louise Perry: I am afraid I take a slightly more pessimistic attitude.

Lord Vaizey of Didcot: I do not think Fiona was feeling particularly optimistic. I was just very intrigued by her proposal.

Louise Perry: I would be very disappointed if the only thing that came out of this committee was a recommendation for PSHE instruction on digital etiquette, for schools to take a more didactic approach.

Lord Vaizey of Didcot: I was thinking more about adults, but I agree with you. I am coming to the conclusion that it is probably a waste of time, which is why I seized on Fiona’s idea.

Louise Perry: I do not think it would be a waste of time. I have taught a lot of consent workshops in my life. I do not think they are pointless. Their main function is to educate children about the law and give the schools an official position, meaning that, if children misbehave, they have no plausible deniability. When people are spending several hours, at least, online every day, a one-hour workshop is not quite pointless but close to it.

In terms of the platforms having their own didactic function, the recent withdrawal of Mastercard and Visa services from Pornhub is a really interesting example. This happened because there was an article in the New York Times, which is a prestigious, internationally read publication, so there was some public pressure put on MindGeek as a consequence and, to some degree, this worked.

Lord Vaizey of Didcot: The credit card companies withdrew their services.

Louise Perry: Yes, they did, because of the bad PR. Bad PR has some degree of control here. Thinking about Kick It Out, as you mentioned, fortunately, there has been a huge swing in public opinion against that kind of racism, which means that, when we see it, there is a negative public response. This is a positive cycle, which means that there has been more official action taken. The problem is that porn is quite immune from this. Porn is astonishingly racist, for instance. Almost nowhere online will you see more explicit racism than on entirely mainstream porn platforms that are some of the most visited in the world.

It is so dismaying to see a platform such as Pornhub, for instance, that will say lots of positive political things, donate to political causes and will work quite hard on its public image, simultaneously platforming all this stuff. For some reason, there does not seem to be that feedback loop of public opinion. Either people do not know about that content or, for some reason, it is just considered to be outside the realm of public disapproval. Thinking about a platform such as Twitter, Jack Dorsey cares about what people think of him. He and Twitter have some interest in their own public reputations, which means that they are responsive, to some degree, to bad press. That is basically the only thing that would motivate them to do any of this kind of positive work and that does not apply across the board.

I do not know whether state intervention could oblige them to do that sort of thing. I am afraid I am quite pessimistic about the scope of any of these companies to have any kind of positive input in that way. I think they really just need to be controlled.

Lord Vaizey of Didcot: That is fair enough. It might be that the online harms Bill could look at that in the future.

Q76              Baroness Bull: We are going to be staying on the theme of pornography and pornographic websites with my question. It seems we have talked quite a lot about that. Louise, this is not my question; it is a musing and I will make it brief. That line between legal and illegal porn seems to be incredibly blurry and difficult to navigate. It seems as if the big websites are using freedom of expression as a defence for retaining some of their content. You will, of course, know absolutely about this.

My question is really about the responsibility of the legal websites to verify that there is consent of the participating actors. Are those processes in place? Are they sufficiently robust? Are they doing enough and what would your view be about where those processes might be enhanced?

Louise Perry: No, they are not doing enough. There is a large organisation in America called Traffickinghub, which has done quite a lot of campaigning on this issue. It has tested the verification process by trying to upload things and seeing how robust the system is on sites owned by MindGeek. It is not robust at all. It is extremely easy to hoodwink the verification systems. Content uploaded by unverified users was all deleted in response to the New York Times piece earlier this year, but, unfortunately, there is still a huge amount of very dubious content available, including a huge amount of content that is or ought to be illegal in this country, such as strangulation images and rape porn.

The defence that many of these companies would make, and you are absolutely right that they hide behind freedom of expression, is that this content is simulated. There is a huge amount of rape pornography, described as such, available online, but it is impossible to tell the difference between something that is simulated and something that is real. It is also impossible to tell whether content was uploaded at the time with the consent of the user but the user no longer consents to its continued availability. Unfortunately, once this stuff is available online, it is not going away and it is impossible to expunge it.

There have been various scandals. For instance, there was a porn production company called GirlsDoPorn, which was hosted by Pornhub. It produced a huge number of videos over several years. There has been a scandal and criminal proceedings taken against the producers, because it turned out that they had tricked the women in the films. They had basically told them that they would not be distributing the video publicly, but then they did. It is possible to remove it from the official platform, but it has been shared. This stuff is available absolutely everywhere. Once it is out there, it is very difficult to rein it back in, which is one of the ways in which the internet is different from any medium that we have ever encountered or had to legislate around before.

We also know that there are lots of cases of people who have revenge porn images put up, or who no longer consent to having their images available, reporting them to the platforms and being ignored, or having nothing done for weeks or months at a time. This is distressingly common, particularly for very young women who have this experience and who are sometimes driven to suicide and so on. It is really grim.

In this regard, I would advocate very heavy fines for the platforms that fail to remove that content (with the outcome, I hope, that they would remove it by default as soon as there was any report made, because at the moment they are dragging their heels and they suffer nothing for doing that) and criminal liability for any of the executives who are based in the UK. That has to be the really heavy-handed approach at this stage because they are not going to do it spontaneously.

Baroness Bull: The individual damage is already done at that point, so it is protecting future victims, but it does nothing, of course, for the damage that is already done. You said earlier that there was a difference between porn and erotic stories. I am not quoting you exactly there. Porn is not speech. It is not erotic stories. We could have a whole seminar on that and we do not have the time. Is your argument that porn per se causes so much damage that you would like to see the regulations much tighter? Would you accept that there is a case that sexual expression is a kind of expression?

Louise Perry: I would be delighted if we started enforcing existing legislation. That is where I would leave it. Currently, it is already illegal to have bestiality, necrophilia, anything threatening someone’s life, rape porn, revenge porn and child sexual abuse images. If we could get rid of that, I would be absolutely delighted, so I am trying to be pragmatic.

Dr Fiona Vera-Gray: I do not have that much to add. No, there are no robust processes. To echo Louise’s point, we need to start thinking about consent. At what point are we thinking about establishing the consent? Is it consent to have the video made, uploaded, distributed or to have other people make money off it? Again, we need to think about these tube sites, which are stealing the property of other people and performers who did consent but maybe did not consent to it then going up on Pornhub.

To that point about responsibilisation, currently it is individuals’ responsibility to find this material online, which, a lot of the time, happens from a friend or a colleague saying, “Hey, I saw you on a porn site”. It is up to them to find this material and then report it. The Revenge Porn Helpline, when people report this material to it, has good relationships with some of the companies, which will then take it down. Where individuals report this material, we know directly from victim survivors that, exactly as Louise said, the companies drag their heels, they do not hear back about what happens and then they find, because of the nature of the internet, that somebody has already ripped that, saved it and then uploaded it to another 20 porn sites.

There are no robust processes. To be honest, I do not think there is a real commitment from them to ensure that there are robust processes to verify consent, because that will significantly decrease the amount of material that is going to be available on their platform. There is not that desire from them. Louise has mentioned MindGeek. We talk a lot about Pornhub, but we need to be aware that MindGeek is a large company that owns a number of sites. At least nine of its sites are in the top 50 pornography sites accessed in the UK. The only site around which it has done any kind of proactive work is Pornhub. If it really cared and had an ethical commitment to ensuring that its platforms were free of non-consensual material and child abuse images, it would have applied that across RedTube and all the other sites that it owns and operates. It has not. It has done it just with Pornhub because there has been pressure from card companies.

Pornhub is being set up as this acceptable face of online pornography. It launched a sex education centre in 2017 to try to say, “Look, we are doing proactive sex education work”, for example. We need to have an eye not necessarily on the platform but on the company and all the subsidiary companies that it owns and manages. I would support the view that, no, they do not have robust processes.

Q77              Lord McInnes of Kilwinning: Thank you both. That has been really helpful. I would like to ask about the nature of the state intervention. I guess the answer to my question as to whether government is doing enough to remove illegal porn from the internet would be no. I wanted to explore further what you think the Government should bring forward to remove that illegal porn. From what you have already said, could the Government do more to remove that radicalised sexual imagery, such as strangulation, which may not be entirely illegal, but is leading to radicalisation of the knowledge of sexual acts among young people?

Dr Fiona Vera-Gray: The Government are not doing enough. It is quite hard. As Louise has touched on, the online space creates a legislative challenge. The material is not produced here. It is not being distributed here. It is hosted somewhere else. The only thing you can legislate for is consumers. They have done that for extreme pornography. The Internet Watch Foundation, which I am sure you know of, is funded by industry, not by government. It does amazing work, but it is just on the removal of child sexual abuse images. It is not on any other form of illegal pornography, so the Government do not really do anything to ensure that illegal pornography is removed from the internet. I cannot think of anything that the Government proactively do, with the caveat that it is quite hard to do anything.

There are opportunities. I have not looked at the most recent online harms White Paper, but there is a real opportunity there for the regulator to go further, not just focusing on illegal content. The regulator should have a role in ensuring that the terms and conditions of the website reflect what is on the website. I recently did some research where we looked at 150,000 of the videos on the three top mainstream porn sites. We did not look; we got a computer to look at them for us and tell us what was there. We just looked at the front page and the material that was advertised to a firsttime user. We found tens of thousands of descriptions of images that would violate the terms and conditions of the site. The site says, on the front page, very clearly, “We do not host racist material, material promoting or endorsing sexual violence or material that is classed as revenge pornography”.

This is not even about the content. This is about the titles used to describe it. Platforms could very easily create an algorithm that searches the front page, or the whole database, but definitely does not push those videos to the front page. The algorithm of that site is pushing those videos to the front page. We had the advanced computing institute do it. The searches were anonymised, basically: the site pushed things to the front page with no prior interactions, and it did not know where we were accessing the material from, et cetera.

There needs to be something about ensuring that the terms and conditions of the website accurately represent what is on the website. At the moment, they seem to just be there and I do not know how they are enforcing them or anything. There also needs to be some kind of mechanism of appeal, which is not in the White Paper, the last time I looked, for individuals to approach the regulator directly when online harms have been experienced and the company response has been inadequate. We need someone we can contact and say, “Can you look into this for us?” At the moment, that is not in there. The e-safety commissioner in Australia, which I can talk about in a minute, is something that we might want to think about.

It is important that we do not just focus on illegal content. It is so difficult to police the internet, but, when you look at the terms and conditions of these sites, they know very well the kind of material that should not be on there, because it is there. They are saying that this material should not be on here. We need a mechanism to hold them to account for what they are saying should not be on their own platforms.

Louise Perry: I absolutely agree with that. We have existing legislation around possession of a lot of extreme porn, which is, effectively, not enforced and, frankly, could not be enforced. It is beyond the police’s capacity to pursue this. When you end up with any prosecutions, it is typically because the police have been looking at the computers of a suspect for some other reason. Particularly post austerity, the police are not even keeping up with child sexual abuse images, which they are supposed to be proactively pursuing. It is just not possible.

The only way of really contending with this is to target the platforms and hit them where it hurts. I would suggest very large fines, which I believe other European countries are considering, linked to how long content has been up. You might say that, for instance, for every hour after a report is made that it remains up, some sort of fine is levied. Similarly, having personal criminal liability for any UK-based directors is really important. Particularly thinking about companies such as MindGeek, their directors are extremely private. It is not like with Facebook, Twitter or whatever, where the CEOs are household names. There is a lot of deliberate obscurity, I think, which the state is in a position, potentially, to challenge by bringing criminal proceedings against directors who have knowingly enabled their platforms to facilitate criminal activity.

I briefly mentioned the slippage between legal and illegal porn. I was surprised when I first started writing about this by how much stuff is already illegal. It is already the case, for instance, that most strangulation material is probably illegal, as depicting acts that threaten a person’s life. That has been illegal since 2008, including simulated material. The extent to which this stuff is publicly available online is absolutely astonishing.

Q78              Baroness Buscombe: Thank you both very much. This has been incredibly informative. You have given us some pretty stark examples and, in a sense, you are, overall, raising awareness of a really difficult issue. One of my questions was going to be: what would you like the outcome of this inquiry to be? You have both given us quite a lot to think about in that area. I am not sure we need to say more because, even in your opening comments, you were talking about serious issues such as anonymity, prevention, enforcement, the fine lines where you might agree with each other or not, and the stark reality that what I used to know of from a legal standpoint as sexual asphyxiation has now become commonplace strangulation. I have to say, from a legal standpoint, we always looked at it as an issue of men.

Do you know of any examples across the world where other countries may be handling some of this stuff better than we are? Do you have any examples that we could look at and think about where gendered harms are being addressed in a more robust way?

Dr Fiona Vera-Gray: Similar to the previous witness, at the moment, everyone is in the same situation. We are all starting to grapple with this. We have let it all happen, and then we have realised that there are some people profiting off harming society and we need to do something about it. There is some interesting work happening in Australia with the e-safety commissioner, which is already established. At the moment, they are consulting on an online safety Bill, which has some interesting things. I wrote them down so I would remember.

They are thinking about establishing in law a set of basic online safety expectations for the digital industry with mandatory reporting requirements. It requires sections of the technology industry to create new and strengthened industry codes that meet the Government’s expectations to keep users safe, and provides new and strengthened powers for the e-safety commissioner, requiring the digital industry to do more to make its platforms safe. They are really getting the point I was making that we need to think about safety as a precondition to freedom. They are doing some work around creating a safe internet and connecting that to the issue of freedom of expression.

They are consulting on it at the moment. I will leave it up to you to have a look at it. I am not going to say that it is 100% great. I just know that it is happening. There is some work happening in Iceland. I am sure you are aware that, for a long time, the Nordic countries have been doing work on pornography and gender inequality. They have led the way globally in recognising that this issue is about violence against women and human rights, not necessarily about censorship.

They are currently looking at some ways to think about what they can do in terms of online pornography. A while ago now, they were going to ban online pornography. Louise, I do not know if you remember that. They could not do it. I cannot remember what they did, but they have made a really positive statement about their intention or the ways in which they see pornography as being linked to substantive gender inequality in the offline world. That is a really interesting and important principle and position to take. It requires further investigation. It would be great if the UK could look to what they are doing there and think about how we can embed that here.

Louise Perry: I do not have anything to add. Fiona said it all extremely well.

The Chair: Thank you very much indeed. Thank you to members of the committee. It has been a very interesting session, with some completely different perspectives for us, which are going to be extremely valuable. Keep following our inquiry and, if you have any further thoughts, we would very much like to hear them. Louise Perry and Dr Vera-Gray, thank you very much for all your time this afternoon.