Women and Equalities Committee
Oral evidence: Non-consensual intimate image abuse, HC 664
Wednesday 8 May 2024
Ordered by the House of Commons to be published on 8 May 2024.
Members present: Caroline Nokes (Chair); Elliot Colburn; Kate Osborne; Kirsten Oswald; Bell Ribeiro-Addy.
Questions 1 - 67
Witnesses
I: Georgia Harrison, Campaigner, Broadcaster and TV Personality.
II: David Wright, CEO, SWGfL, and Director, The UK Safer Internet Centre (UKSIC); Keily Blair, Chief Executive Officer, OnlyFans.
Witness: Georgia Harrison.
Q1 Chair: Good afternoon and welcome to this afternoon’s meeting of the Women and Equalities Committee, and our inquiry into non-consensual intimate image abuse. Can I thank you, Georgia Harrison, for coming to give evidence to us this afternoon? The Committee members will ask you questions in turn, but to start could you just tell us in your own words about what has led you to be a campaigner on this issue, and what happened in 2020 that has brought you to this place?
Georgia Harrison: For anyone who is not aware, back in 2020 an ex-partner of mine filmed us having sex without me knowing. So he had cameras around his garden and inside his property, and I was not warned of that prior to the sexual intercourse. I was not told until after that actually 20 minutes of the footage had been recorded. I was then assured on that day that it would go no further than those four walls, that the footage would be deleted and that was the last anyone would know about it.
However, further on in that evening I saw him send that to someone else via WhatsApp. I then made him retract it, and I explained to him explicitly what would happen if that footage was ever to get into someone else’s hands. I asked if he was aware of what revenge porn is and explained that if it ever did go anywhere I would then go on to call the police. My only regret, in hindsight, was not calling the police there and then, but I was hoping that it would never ever go any further than it did that day.
Six months down the line I was then sent a screenshot of the footage in question from a fan of a TV show we have both been on. That fan was located in America, so as soon as I saw that image I immediately knew that somehow the footage that I had been told would never ever go anywhere had been spread globally online. I just was not aware of how or where it had come from, so obviously I then immediately asked this fan, “Where have you seen this?” I was then told it was on the person in question’s verified OnlyFans account, Stephen Bear’s.
But the person that I was speaking with then deleted their account, so I was not sure how to track it down, or how far it had gone. My first thing was, “If I can track down where it has come from, maybe it is not too late to stop too many people from seeing it.” So I then went on to my Instagram and just said, “Has anyone seen a video of me in said person’s garden? If so, can you please send it to the email address that is on my account.” This was actually my mother’s email address as she was my agent at the time.
With that, we had about 100 different people messaging in with evidence of this video. Not just from his OnlyFans account at this point, it had been spread to various different platforms—Pornhub, XXBrits—there were multiple different platforms that were hosting this video, and it had already been hosted on the OnlyFans account for, I think, at least a month at that point. So unfortunately, by the time we were made aware that this was circulating on the web, millions of people had seen it and it then just completely and utterly blew up. It went viral to a point I could not explain and my phone was just going off every two seconds. The link was readily available to the public, and people were getting it sent around WhatsApp groups. I was obviously absolutely horrified that an individual had done this to me, but what most horrified me was that these platforms were hosting this video, which was unconsented.
Some of them did well but many of them did not, and for many that I tried to reach out to I would get an automated response saying, “We will get back to you within four to six days.” Speaking earlier I used a metaphor of when something like this is happening it is really like a house fire, the quicker you can put it out the quicker you can stop it. Unfortunately in four to six days your house has burnt down and it is just too late, everyone knows about this video: your family, your workplace, your peers. However, if you can get through to someone in the first 24 hours, you then have time to stop this going any further and potentially not ruining your life. But for me it went as far as you can imagine, to the point where it was global.
Q2 Chair: Your first recourse was actually to social media, not to the police. Why was that?
Georgia Harrison: Because I knew it was on some sort of platform already, and I had such a big platform, I just think I assumed I could maybe find it before even calling the police.
Q3 Chair: And when you did go to the police, what sort of response did you get?
Georgia Harrison: I had a very good response from the police; I know not everyone has the same experience, but immediately it was just, “We’re going to take your statement, we’re going to do it from start to finish and do everything we can to stop this from happening.” But their immediate response was that they needed to arrest the man in question, however, he was not in the UK at the time, he was in Dubai. So I think they went round to his house to look for evidence of the cameras and stuff like that, but it was quite a long time before we could actually get him into custody, because he was not in the country.
Q4 Chair: You said that you were first made aware of it by somebody in the US. Is the global nature of the internet one of the big challenges, that legislation in one jurisdiction may not cover others?
Georgia Harrison: I think so; it was quite shocking that it took a month for me to even be aware that this footage was circulating online. One of the biggest challenges in general is I feel like these big social media companies who are hosting pornographic images and videos do not have any way of us getting through to them to report it when there is something unconsented there. And it should not be robots that you are getting through to when it is this important of a situation. It should not be that hard to be able to get through to someone and just say, “Look I’m not saying delete it, I’m not saying take this person’s account down, I’m just saying can we pause it and then review it in a few weeks?” That is the biggest issue that we are dealing with at the moment.
Q5 Chair: That could give a breathing space for a victim, if it could just be instantly paused?
Georgia Harrison: Yes.
Q6 Chair: The technology would exist to do that, which would then give people a chance to consider it more carefully?
Georgia Harrison: Yes, because then they could go back and review it, and if whoever is in it has not consented then obviously it would never go back up. But if actually, after review, both people have consented, I do not feel there is an issue with that, and then the footage is still there to get put back on the platform. But if something gets reported to the level where you say, “This is me in a sexually explicit act and I have not given permission for it to be on your platform,” it should be paused that day, not in four to six days.
Q7 Chair: Finally, from your experience of the criminal justice system, do you feel that it worked for you, or what would you change?
Georgia Harrison: I feel it worked very well for me. It would be good if there was legal support for victims during the process, which is something a lot of MPs have been discussing. Because a lot of young women do not have anyone to go to, to give them advice as they are going through this process, and it would be really nice if victims did have help.
Q8 Chair: Where do you think the advice should come from, should police forces have a system in place so that there is support for the victim?
Georgia Harrison: Yes, having a system in place, or what is being proposed is there would be funding from the Government to certain lawyers who could help support victims throughout the process, just to help them understand what their rights are, basically, and to help them through it. So many victims give up during the process because it is just too much for them, or they are not sure if they are doing or saying the right thing. Sometimes they feel like they are getting victim blamed slightly. If they just had someone to lean on during the process they would not feel so much like they have to withdraw, they would have an inner strength in them, and they would have advice on what to do and what to say along the timeline of events.
Q9 Chair: The timeline can be really long, so did you ever feel like giving up?
Georgia Harrison: Yes, there were definitely times when I felt like giving up. First, knowing that I then could not speak throughout that timeline until I reached the court process, there were so many times I thought, “Shall I just give up, so I can just, like, speak my truth to the public?” But I knew I had to push on with what I was doing no matter what, although there were so many times I did think, “Have I made the right decision?”
Q10 Chair: You waived your anonymity, at what personal cost to yourself?
Georgia Harrison: I feel as though my anonymity was removed as soon as the video went up anyway, because everyone I knew friendship-wise, colleague-wise, family-wise—which is your main concern when something like this happens—were already made aware of the video. So my anonymity did not really exist.
But obviously when I did take the choice to step up and speak about it publicly, yes, I just felt I had been at such an injustice. Not just by the man that done it to me, but by the platforms that used me to make money on their behalf, that literally used this unconsented footage to make money off of me. I just felt like it was something that I did not ever want to happen in the future, and I honestly felt like my voice did not matter as an individual.
It was not until people realised that potentially I had a platform that they made the necessary moves to take the footage down. But so many women I have spoken to just get ignored by these people, and I just felt like I needed to stand up and make some form of a change to help support others in the future.
Q11 Kate Osborne: Georgia, I would just like to echo what the Chair said, and thank you personally for coming in today, for spending so much time talking to the MPs about your case, the wider issues and the urgent legislation change that we need around this. We are all so impressed by your resilience and your bravery in sharing your story in order to help so many others. Can I start by asking you how the case impacted you personally, both emotionally and physically?
Georgia Harrison: It impacted me in every way you could imagine. So I always sort of compare it to grief: you have to actually grieve a former version of yourself, you feel like you lose your dignity and a lot of pride, there is so much shame involved in it. For the first few days I was really just going through waves of complete sorrow and shock.
It got to the point where I was so emotionally affected by what happened to me that I ended up being physically ill as well, to the point where I was in hospital for, like, five days over Christmas, a little after the incident, because the stress took such an effect on my body that I ended up having a cyst burst and I got an infection. It was literally just like my body deteriorated with my emotions.
This obviously also had a huge knock-on effect on my career, because I went from just a normal girl who works in television to someone who is now effectively in the porn industry, even though that was something I never ever wanted to be involved in; I always wanted to be a presenter.
For about two years so many brands retracted from working with me. It got to the point where I had to move out of my home and move back in with my mother for two years. “Love you, mum”, but it was not the plan at 27. So financially it was a massive hit to me, but emotionally it just took away a lot of my innocence and changed the way that I value myself as a person. I know now, whenever I am dating, meeting new people or going into any sort of new work opportunity, I am known as the person who has this sex tape which I never ever filmed and never consented to be out there.
So it changed a lot for me, and definitely gives me a lot of fear that one day, if I do have a family, which is something that was always my intention, are my children going to be able to stumble across this footage because I am not protected in the right way by the Government for that not to be an issue right now?
Q12 Kate Osborne: Thank you, and if I could just touch a bit more on your professional career. I know that shamefully some brands dropped you as their ambassador. Was there any attempt by them to acknowledge that you had been a victim of a crime?
Georgia Harrison: No. If you looked at my earnings before it happened and then after, the drop was just completely drastic. But the majority of them obviously did not want to admit that the reason that they did not want to work with me was because of this issue, especially at that time, prior to my name being cleared.
Things definitely turned around once I had the court case and I got the guilty verdict, but prior to that people just wanted to wash their hands of me. They were not sure if I had some form of involvement in the video. Either way, there were plenty of people who had the same following as me or the same thing to offer, and I just became someone that was a red flag. So yes, many of them just sort of acted like they did not want to rebook me, but a few people said to my agent, sort of off the books, that obviously a lot of the reason that they done it was because of the video.
I will never get my connection back with the majority of those brands, but I am building new connections with brands that really stand up for women, that believe in tackling violence against women and girls. They are the sort of brands I want to be working with now, but for about two years I could not get any work whatsoever because of it.
Q13 Kate Osborne: Post the case, did any of those brands come back to you in any way? If not to apologise, at least to acknowledge that you had been a victim and that maybe they had not acted in the best way?
Georgia Harrison: Not really. I do not think any of them really wanted to admit that the reason they stopped working with me was because of that, so to come back to me now and apologise would be taking accountability. Some of them I do work with again now, and I can understand that potentially maybe I was just a flight risk at the time. But some of them I will never work with again, just because I am very much aware that they dropped me in my time of need.
Kate Osborne: Yes, that is understandable.
Georgia Harrison: Yes.
Q14 Kate Osborne: When we were talking earlier you were telling me about how many young girls or women approach you—I think you said all the time or certainly very often—to tell you about themselves being in a similar situation. Can you tell me a bit more about how many people have contacted you directly since you spoke out about your experience? How do you deal with someone who is in a similar situation, and what do you say to them when they are looking for advice?
Georgia Harrison: All together I would say thousands. When it was at the peak of what happened, I was getting just a complete influx that I could not even keep up with. But to this date, bearing in mind we are about three years later, I still would say I get about five to six messages a day from either a victim or a victim’s family. They are usually just asking for help, detailing what they have been through. It does tend to be around image-based sexual abuse, but I feel like it is now more a broader scope, where it can be domestic abuse, people who have been sexually assaulted or rape victims. I usually just do my best to direct them to whatever charity I feel can help them the most.
When it comes to image-based sexual abuse, I will always give them, like, some form of advice, a little guidance and reassurance that, “Hopefully things can get better for you and will.” But I will then pass them over to the Revenge Porn Helpline, who can then help them from there.
I dread to imagine how many people they get sent through via me, let alone in general, as when you Google for help in image-based sexual abuse, that charity is the first thing that comes up. But I can definitely assure you that the statistics will in no way truly reflect just how much of an issue this is in society, because so many women do not come forward, they just do their best to live with it if they can.
Q15 Kate Osborne: I am shocked by the fact that you say thousands, and so many others who are dealing with this must be out there. You did brilliantly campaigning for, and managing to get, the requirement to prove intention to cause distress in intimate image abuse cases to be repealed. The loopholes in existing law mean that thousands of websites containing intimate image abuse material cannot be removed from the internet, once again resulting in the victimisation of so many. In your view, what more needs to be done to support victims of non-consensual intimate image abuse?
Georgia Harrison: As someone who has experienced the court process, to actually prove in a courtroom that in fact you were a victim of image-based sexual abuse, and that whoever targeted you had done this without your consent, is obviously a really hard journey to go through. It is one thing to see them go to prison, and it is a brilliant thing, do not get me wrong, but it almost feels like a kick in the teeth that you can go through that entire process and then at the end of it the Government say “Oh, but by the way we’re not going to make this imagery or these videos illegal.” Like, they are still legal to be shared online, but you have actually proven you did not give your consent to have them filmed, and they are all of a sexually explicit nature.
Quite frankly, the only way to make things make sense is that at the end of a successful court case such footage should then be deemed illegal, and then it would be so much easier for us to get that removed from these platforms. It would not just be easier for the charities trying to get it removed, it would be easier for the platforms that have to remove it. They are in a situation where, if the footage is not proven to be illegal, do they even have the right to remove it?
Q16 Kate Osborne: As a legislator it seems clear to me what we need to do to put these protections in place. What else can the platforms do, do you think?
Georgia Harrison: The best thing that the platforms could do is to have some form of a 24/7 phone line where you could get through and speak to a human about what you are going through. When you look at like any other sort of service, Vodafone, BT, if I have something wrong with my wi-fi I can get someone on the phone any time of day. So how is it possible that these platforms that are making billions of pounds a year—not even millions, billions—cannot have some sort of a compliance phone line where you can call them up and say, “Look, this is an unconsented video, it’s so important that you do something about this,” and someone goes “Okay, I hear what you’re saying, we’re on it”? It would make such a difference to victims out there, and they would not feel unheard; there is nothing worse than feeling like you are drowning and no one is available to answer the phone.
Kate Osborne: Thank you so much.
Q17 Chair: You just used a really interesting phrase, “Prior to my name being cleared.” Did you feel like the one on trial?
Georgia Harrison: At times you do, there is always some element of that. When I was actually having to give evidence, I just remember thinking, “Oh, why did I put myself in that situation?” But I could never have seen that situation coming. But when I was on the stand I definitely felt at times like I was being accused of putting myself in that scenario, and in a way that maybe I deserved what happened to happen to me.
Q18 Bell Ribeiro-Addy: Thank you so much, Georgia, for speaking. It is so important that you have taken the time to do this and that you are campaigning on these issues. I want to go back to some things that you have already said. You were talking about some sort of mechanism for proof of consent before it being published on the various different social media platforms: some sort of a pause. Do you think it would be too much for a social media company, once somebody uploads something like that, to pause it immediately and then require explicit evidence of consent before moving forward with publishing it more widely?
Georgia Harrison: I am not too sure about how all these platforms work, because it is not something that I work in myself. But what I have been made aware of is that some platforms are implementing an algorithm where basically, if you were to own the account obviously you have to give some sort of passport or identification to say this is your account, so you can then upload videos of you. But then they have an algorithm which can detect if there is another human being in the videos. So a little like when we are on our photos, you can type in mum, and all your pictures of your mum will come up. It knows that that is your mum and it knows if there is someone else next to her, so if there is another individual in the videos this algorithm can then say, “This person has not consented to being in this footage therefore it will not make it to the platform.” However, all they have to do is upload their identification as well, alongside the account owner, and sign something saying, “I give permission to be in this video.” Something like that would prevent so much unconsented footage making it to these platforms. The technology is out there, so why are they not putting it on their platforms?
Q19 Bell Ribeiro-Addy: It sounds like this particular process is something you have spoken about before. Have you ever spoken to a social media company about this, and have they responded about why this would or would not work?
Georgia Harrison: No, it is just something I have been made aware of that other platforms are using, so I do not understand why they are not all doing it. For instance, as someone who does TikTok, if you were to upload a video on TikTok, and it is in, like, your underwear or something, it gets screened and it will not usually make it to the platform. I am sure certain things get through, but they have the technology in place to make sure that certain things are not being uploaded to their platforms.
I am sure if you are running a platform that does facilitate pornography it is different, but they do have means of making sure that basically everyone in there has consented, so why are they not doing that?
Q20 Bell Ribeiro-Addy: That makes a lot of sense. I want to take you back to your court case as well, because you talked about getting a guilty verdict, and you also talked about your loss of earnings, issues of reputation, and so on. I just want to be clear as to what the court case actually gave you once they gave the guilty verdict, so that when other people go through this process or have taken it that far, we know what kind of compensation they might receive, or what they are allowed to do. So the guilty verdict was against your ex-partner?
Georgia Harrison: Yes.
Q21 Bell Ribeiro-Addy: And following on from that, because you could prove that he uploaded these things without your consent, was he then required to pay you any sort of compensation?
Georgia Harrison: He was required to pay back the money that he earned off the video to the state. I do not want to say an exact number, I think it was around £27,000. So basically, whatever he earned off it did not go to me, it went back to the state. But I was then awarded £5,000 in compensation, which was more for the trauma that I had to experience as a result of what he had done, rather than me actually getting awarded the money that he made off it. However, to date I have not received a penny off him, but that is technically what should be happening.
Q22 Bell Ribeiro-Addy: And there were no moves to, let us say, reward you or compensate you for the loss of earnings? I suppose everything that would have contributed to you not being able to book certain jobs afterwards, for instance?
Georgia Harrison: That was something that I had to pursue via a civil lawyer. I did win that; however I have also not received any money from it. But I was very lucky to have a lawyer that was happy to do it no-win-no-fee, whereas obviously a lot of victims would not be able to do that. They would never be able to afford anyone to look after them in a civil courtroom, because the costs really do rack up. So technically no, there is not really much compensation for a victim if they are to follow this through in a court case.
Q23 Bell Ribeiro-Addy: Okay, thank you very much. You also talked about victim blaming, and I just want to go into that in more detail. We hear evidence on a range of different issues, and a lot of the time women are blamed for the situations that they find themselves in. Obviously you felt like you were blamed at some point, and you would have spoken to a lot of women who experience the same thing. Would you be able to tell us a little about that?
Georgia Harrison: Yes, for the other victims I speak to, so many of them go through these issues. I feel that, from his defence side, it was very much like they were trying to sort of spin the narrative that I was in love with him and I had somehow orchestrated this whole issue to get back at him. That was very much what they were trying to go with when I was on the stand, so there was definitely an element of victim blaming and trying to spin it on me. But that is something I always knew I was going to walk into, and I just had to stay strong throughout and just keep speaking my truth.
I have always said before, I am a very confident individual and I have spoken in public before, but I still found the whole process excruciating. So for a victim that is a lot shyer or slightly more timid, it is sometimes near impossible to get through taking that stand. I really think it is important that people are safeguarded against victim blaming. It is nice that at this point they are starting to give victims the option to do it via a Zoom link, because maybe that will give them the space to be able to deliver their story properly and with truth, without feeling too emotional. It is a hard scenario and yes, I did have to deal with a lot of victim blaming throughout.
Q24 Bell Ribeiro-Addy: I know in your particular situation that obviously you did not consent to being filmed. On other occasions we know that women sometimes may send an image under the expectation that it is not going to be shared or may record themselves again under the expectation that it would be something that was kept within their partnership. It has been said that women undergo a lot of victim blaming for things such as that. Has that been your experience from people you have spoken to?
Georgia Harrison: Yes, and it really does break my heart. So many women feel like they have done something wrong because at the time they consented to the image or the video being taken. But if you have an agreement that it has been made in privacy between you two, that is how it should be kept. If it is being shared without your permission, it is a law break and you have every single right to pursue justice.
Some women do not even want to come forward; they feel like they almost deserved it because they gave permission, which is obscene and they should always take steps to come forward. But I have definitely spoken to some victims who, in their experience with the police, feel that they were judged because they allowed the video or picture to be taken. There should be no element of judgment for that because that is not breaking the law. The only thing that is breaking the law is sharing it without consent, and it is definitely important that police are aware of that. So if women come to them, they should not be like, “Well, did you let them take it?” That should not be a problem and you should not be judged for that; it is not an issue.
Bell Ribeiro-Addy: Thank you very much, Georgia.
Q25 Kirsten Oswald: Maybe I can take you a little further down that road because it is really the important issue, in terms of the way that the police and other authorities and parts of officialdom deal with this notion of victim blaming that you have spoken very eloquently about. Do you think that there is a need for more awareness raising, more training on these points that you have made about where things are illegal and where they are absolutely not illegal?
Georgia Harrison: Yes, definitely. I have always been very honest that I had a great experience with the police; I felt they handled my case very well. But when I speak to other victims, a lot of them feel that they are not taken seriously. A lot of people just do not understand the emotional trauma and damage that something like this actually does to an individual’s life. It would be really good if the police force went through certain training to just bring awareness to how life-changing this sort of crime really is, and definitely not blaming the victim whatsoever.
Q26 Kirsten Oswald: Do you think that might give people more confidence to come forward as well? I have heard what you have said today about the very large number of people who have come and spoken to you. I am sure that there are many people who have gone to no one and are sitting at home very worried about this and what it might mean. Do you think that, and other things that you might be able to suggest, would make women feel more empowered to come forward in these situations, because the support would be there?
Georgia Harrison: Yes, anyone dealing with someone who is reporting this sort of issue really needs to be aware of just how much shame surrounds this sort of crime. It is just a natural emotion that comes when something like this happens, and at times you almost blame yourself.
It is really important that the victim can come forward almost immediately after the crime. It is very important that police are aware that when they are coming to them, they are still going to be feeling such an array of emotions: shame, blame, all these things. So it is important that they do not say anything that can trigger them at that point, because that is where they will then feel like they need to retract. So if someone did have training on how to handle this emotion, what to say and what not to, I think more people would proceed forward with their claim.
Q27 Kirsten Oswald: Thank you for that. You sounded very expert when you were talking earlier about all the things that you had had to do. You were looking for evidence, if you like, of what had happened. You talked about the different platforms and so on. You have essentially had to become a bit of an expert in this field, which is something that you would not have anticipated having to do.
You talked about the impact on your job, which is not something that is unique to you; the job issue is going to be very worrying for any woman finding herself in this situation. It strikes me a lot of women will have to go digging themselves to try to find the answers in terms of what to do next and what might happen. I just wonder if you are able to share any further thoughts on that, because it is a big concern for people who find themselves in this situation.
Georgia Harrison: Yes, I was not lucky that for a long time my career was paused, but I am lucky that the career I am in is slightly forgiving, and I have managed to get that back on track. But if you were, say, a politician, a teacher, there are so many careers where actually, after something like this happens to you, there would be no going back. It is so important to be aware of that because that is how devastating this is to your life.
I always tell any victims going through it, my only point of contact is the Revenge Porn Helpline or any of the charities out there. Not only can they give you support in terms of therapy and advice, but they can try to track this imagery down and get it back before your employer or your family find out about it. But the only help for these victims right now is the RP Helpline; it is literally all there is, and it is not a big team.
Kirsten Oswald: Thank you so much for all that.
Chair: Elliot, did you have any questions?
Elliot Colburn: No, that was very thorough. Thank you, Chair.
Q28 Chair: Thank you. Georgia, just one final question and it is around content. How big a difference would it make if the content was deemed to be illegal once there was a conviction?
Georgia Harrison: It would make every single bit of difference. Like, at the moment there are almost 30,000 images and pictures that the RP Helpline says have already gone through the court case. They have won, so they are technically illegal, but they are still legal right now.
The changes of the Online Safety Bill mean that platforms actually do have to take accountability now. So the right legislation is already there in place for if it was illegal, to literally just take it down. We just need someone to say, “This content is now illegal,” and by next week it would have changed tens of thousands of women’s lives, who right now are all sitting there waiting for someone to say it is illegal just so all that can go down, and they can sleep at night.
For me, it means I would have a future knowing full well that no one is ever going to be able to find or see that video of me again, as they should not, but until this gets made illegal that is not going to happen.
Chair: Thank you very much. If there is anything that you wish to add in writing after this session, please do. On behalf of all the Committee, can I thank you for being brave enough to come and speak to us today and speaking out on behalf of probably hundreds of thousands of women?
Witnesses: David Wright and Keily Blair.
Chair: Can I thank our panellists for this second panel in our inquiry into intimate image abuse? We have Keily Blair, chief executive officer of OnlyFans, and David Wright, chief executive of SWGfL, and director of the UK Safer Internet Centre. Committee members will ask you questions in turn. If at any point you have not had a question addressed to you and you wish to come in then please indicate and I will bring you in at an appropriate moment.
Q29 Kirsten Oswald: David, could you start by explaining for the Committee how the StopNCII.org hashtag tool actually works?
David Wright: Indeed. First, thank you for the invitation to come and talk to you. As you say, I am CEO of the charity SWGfL, and we operate the Revenge Porn Helpline that we heard about so eloquently from Georgia. I want to pause for a moment just to pay tribute to Georgia’s contribution and for the courage that she needs to come forward on behalf of, as you say, Chair, thousands and thousands of others as well. We have operated the Revenge Porn Helpline since 2015. We support adults who are victims of non-consensual intimate image abuse, and it was the first of its kind in the world. I will perhaps talk later about some of the changes that we have seen.
As you have said, we created StopNCII.org which is a platform that supports any adult who has been threatened with having their intimate images posted online. It was launched in December 2021, and the way it works is that, say you are being threatened—any adult globally—you visit the website on your device; you create what is called a hash, a digital fingerprint that uniquely identifies the image or video that you have on your device. It is important to say the victim, the individual, never shares their image. It never leaves their device. It is the hash code, a digital fingerprint, that is then added on to the StopNCII dataset, and we then distribute that to participating platforms as a signal to enable them to prevent that image or video from being posted on their platform.
Today, there are 10 platforms that take the hashes—Facebook, Instagram, TikTok, Bumble, Reddit, OnlyFans, Snapchat, Niantic, Threads and Pornhub—to enable them to prevent anybody else, anywhere in the world from posting that image. So, it discharges the threat, and we heard Georgia describe that experience. It is massively powerful technology. We started in December 2021 and, as of last week, StopNCII.org is now protecting nearly 700,000 images and videos across 278,000 different cases. It has been enacted over 15,000 times over that period to prevent that content from being posted which, on behalf of the charity, I personally take as a huge success on behalf of those individuals because otherwise those images would have been posted online.
Q30 Kirsten Oswald: Thank you for that. I know that StopNCII.org merged with a pilot run by Meta in 2021. Can you tell me why you decided to partner with Meta?
David Wright: Yes, it is a reincarnation of a Meta project from 2018 called Not Without My Consent, where they essentially created the idea. There were some subtle differences. It was just Facebook at the time, and you could upload your image to Facebook; they would then create a hash and prevent anyone else from uploading it. That was unique to Facebook and some of the media coverage was not particularly conducive. It happened six months after the Cambridge Analytica story, and you need a lot of trust to upload your previously unshared intimate image to a platform. StopNCII.org then worked with Meta over a two-year period and the development of the platform is testimony and a tribute to Meta for supporting the development of it to where we are today.
Q31 Kirsten Oswald: What happens when a new industry partner signs up to accept the hashes?
David Wright: As a charity, we are expending a lot of effort at the moment trying to engage more and more. We have 10 platforms—including OnlyFans—that I have talked about, but we need thousands. The more platforms that we can get taking the hashes, the more harm and distress we can immediately prevent people from experiencing.
Q32 Kirsten Oswald: Thank you for that. Keily, can you tell us what made OnlyFans decide to partner this way and to start receiving these hashes?
Keily Blair: Absolutely, and thank you to the Committee for inviting me here today. The work you are doing here is incredibly important, and I just wanted to also thank Georgia. Listening to her story, it is hard not to be horrified by her experience, so I am pleased that we are all here to try to help prevent further victims. That is very much the reason why, actually, at OnlyFans, we decided to partner with StopNCII.
For us, the issue of consent is so important to being able to exist as a platform. We are an inclusive platform for adults over 18, some of whom choose to share explicit images, and so the idea of express written consent for anybody being able to share those images is incredibly important. We chose to partner with StopNCII, frankly, because the technology is groundbreaking in terms of what it is capable of achieving, as long as we get platform-wide adoption. Ten platforms should only be the beginning of this project, and the more platforms that are involved, the more people like Georgia we can all help to protect, and we can stop that proliferation across other platforms. One of the things about OnlyFans is, because of the paywall, things stay behind the paywall a lot of the time. That is not the case in the open internet. The proliferation that Georgia spoke about that, yes, one platform may well take it down, but what happens about the next platform or the next platform or the next platform? That will be solved by a legislative change in the way that you spoke about earlier, but also the adoption of world-class technology like StopNCII has in place.
Q33 Kirsten Oswald: Can you tell me how many hashes OnlyFans has acted on so far?
Keily Blair: In terms of positive matches, there have been 65 positive matches on OnlyFans out of the 15,000 that David mentioned. That is 65 people who otherwise would have ended up in Georgia’s position, who we were able to prevent from having their images uploaded. That is just in relation to positive matches on the platform. We also receive additional reporting from the Revenge Porn Helpline outside the StopNCII platform. One of the issues that Georgia raised was about the fact that a lot of platforms need to wait for somebody to say, “This image is illegal.” For us, we do not wait. All we need is somebody to tell us, “This is an image of me,” and we will take it down if they do not consent to it being up, whether it is explicit or non-explicit. Consent is a must-have on our platform.
Q34 Kirsten Oswald: Thank you for that. David, you have spoken about the numbers of platforms that you are dealing with. You are aware that we recently wrote to 16 tech companies and social media platforms to ask why they do not currently partner with StopNCII. What do you make of the responses that were received?
David Wright: Again, I very much thank the Committee for doing that. First, it has actually sparked a number of conversations and a number of engagements, so that is testimony to the engagement that you have effected. I note that the responses from a number of platforms would suggest they are not in a position to use StopNCII hashes now. A couple already have technologies to identify and prevent nudity or sexually explicit content. For us, non-consensual intimate image abuse is a very precise term, and it does not have to include nudity. Intimacy is much more of a term that we would recognise and we would work with. In many countries in the world, in many cultures, in many religions, content does not have to be sexually explicit to have the same catastrophic effect on victims’ lives. For us, having sexually explicit filters is not the same as working with StopNCII. Again, we point that out to many platforms, and it acts as a signal that would then supplement whenever they are operating and receiving images or videos on their platforms.
Q35 Kirsten Oswald: Google and Microsoft, for instance, have not implemented the hashing technology on their platforms. Is that something you think that they could do?
David Wright: In the spirit of keeping answers short, yes.
Q36 Kirsten Oswald: Keily, did OnlyFans have any issues or difficulties in joining this initiative?
Keily Blair: Again, I am going to try to be short. In terms of the timeline from when we signed contracts to when we were able to implement, we started the implementation in January. We finished the first phase of the implementation the same month. It took approximately 80 hours of tech and engineering time to be able to fully integrate phase 1. Phase 2 was implementing the feedback loop, which is enabling us to actually provide feedback to StopNCII about hash matches and things like that. That took a little more time; that was more complex; and that was fully implemented by the end of March. In terms of monthly maintenance hours, I would say, typically, between four and five hours a month of dev and tech time, a little legal advice from time to time comes into things when there is contracting to do, but it is not an overly onerous process.
Q37 Kirsten Oswald: Why do you think that some platforms are reluctant to start receiving hashes from StopNCII?
Keily Blair: It is very hard for me to comment on other platforms and what they may be experiencing. All I can say is that, for us, it was absolutely worth the time and I would have spent double the amount of time to be able to implement it because of the effectiveness of the technology in protecting victims.
Q38 Kirsten Oswald: Thank you. David, do you want to add anything to that?
David Wright: Yes. Some of the answers to that particular question may be illuminated in some of the responses. Again, I am going to reiterate that perhaps there is confusion between sexually explicit content and intimate image abuse. We are just trying to find some clarity around the distinction between those two.
Q39 Chair: Keily, can I just take you back and ask you to be really explicit in explaining this to us? Consent is a must-have; that is the phrase you used. Can I just clarify: if somebody uploads content to your site that there is more than one person in, do you require consent from all people in that image or video before it ever goes live?
Keily Blair: Essentially, we do now as we have evolved our processes over time. In November 2022 we changed our processes to require that consent to be expressly given to OnlyFans before the content goes live. Previously, all creators had the obligation to obtain that consent, and for that consent to be explicit and written, but it was not required to be provided to us for us to verify before the content went live. So, there has been a change.
Q40 Chair: And now it is?
Keily Blair: Now it is, yes.
Q41 Chair: Thank you for that. Can you explain the phrase that you used that your content is, “Behind a paywall”? I get that. And so, effectively, it is trapped behind a paywall. If I am a bad actor, how do I get it out from behind the paywall and put it on a different site, and how easy is that?
Keily Blair: It might be helpful if I explain. It seems like you know very well how the platform works, but some people may not understand the difference between creators and fans on the platform. To open an account on OnlyFans to create and monetise content, you have to go through quite a strict onboarding process. You provide your name, your date of birth, government issued ID, a selfie, social security number, tax details, and bank account details. Ultimately, we know who all the creators are on our platform, but the content belongs to the creator. If an individual chooses to share content on our platform, they may also choose to share it on other platforms simultaneously.
The content on our platform is behind a paywall. You cannot record it if the creator has turned on DRM, for example, which is an open-source technology that Netflix and other websites use. For example, if you have ever tried to screenshot a Netflix thing, essentially, you just get a black box. We have that technology in place to prevent people from taking material off the platform, but bad actors tend to try to use multiple platforms to get things out there. For us, it is enabling people, when they report things to us, for it to be taken down very swiftly. Georgia referred, obviously, to a 24/7 helpline. We have a 24/7 response line that deals with any high priority reports that, for example, include intimate image abuse, so things will be dealt with and responded to extremely quickly.
Q42 Chair: David, some of the other platforms—let us name names because I have privilege here and I can, Pinterest and Match—have argued that their existing systems are already sufficiently robust. Could you just outline how they are not?
David Wright: From recollection, the Pinterest response talks about it already having filters for sexually explicit content. Again, I am going to make the point that content does not have to be explicit for it to be intimate. Intimate image content can have the same catastrophic impact as sexually explicit, so that is where I would return to.
Match, in its response, said that you are not able to upload content to its platform. There are other aspects in terms of where we have seen some particular abuse that emanates from those sorts of platforms. Again, it is about trying to prevent that sort of content and, as a minimum, signposting people if they are a victim of NCII to where they can get help immediately. As Keily said, a 24-hour response is what is needed.
Q43 Chair: Keily, you said quickly. How quickly?
Keily Blair: In Georgia’s case, for example, the content was taken down 24 hours and 18 minutes after she reported it to the platform. We aim to action it as soon as possible. As soon as somebody makes a report, it is taken down immediately rather than, “Can we get proof, can we get this, can we get that?” We take people at their word. If they say, “I don’t consent to this image being up there,” we remove the content. It will also trigger a full review of the account in question to see if there are any other concerns regarding the account.
Q44 Chair: If I am an 18-year-old girl with a non-consensual intimate image, 24 hours might seem like a lifetime.
Keily Blair: Yes.
Q45 Chair: If I am somebody with no profile, an 18-year-old student, do I get a 24-hour service?
Keily Blair: Yes, it is the same for everybody
David Wright: Can I just add more numbers here? StopNCII.org works instantly; it is a technology interface. It does not require the reporting because it is instant, immediate and continuous support. From the Revenge Porn Helpline perspective, in the last four years we have reported 114 images to OnlyFans and the content was immediately taken down and then reviewed. That is uncommon in our experience. Normally, we make a report and then content is reviewed in terms of in a queue and then action is taken. We would like to see all platforms take the content down, then review.
Chair: Take down first, then review. Thank you.
Q46 Kate Osborne: Thank you both for coming today. David, can you explain how some non-consensual intimate image content can remain on the internet and be accessible in the UK even when it has been proven to be non-consensual?
David Wright: Indeed. We have seen advances from the Online Safety Act 2023. First, we welcome the Online Safety Act and the reduction in terms of the bar to some form of offence. For example, you no longer have to prove intent, so we very much welcome that. The offence is all about sharing and the intent to share; it is not about the content. So, yes, platforms have an obligation to remove content. I am going to give you an example: we supported the National Crime Agency over an 18-month period up until 2021. One particular perpetrator was extorting women and girls for images. We set about trying to find the victims who were women; the Internet Watch Foundation tried to find the victims who were girls. We found 200 victims of this one perpetrator, and we reported over 160,000 images that he had extorted of these women. We had 147,000 removed. There is a residual 15,000 images online that we are unable to take down.
With the Internet Watch Foundation, we approached internet service providers to see if we could block access, as we routinely do with other illegal content. The response was, “No, the content is not illegal. We are not allowed to be blocking access to legal content.” That is why we are here. I note in the Sunday Express last week that there was an article ahead of today’s committee. A spokesperson for the Government said the Online Safety Act, “Will require sites to block access to websites hosting illegal non-consensual intimate images if ordered by a court via Ofcom powers.” When they say sites, I presume they mean internet service providers, but it is not particularly clear. We are perplexed by this statement because, first and foremost, the Online Safety Act does not deem non-consensual intimate images as illegal. So, we are rather confused. Yes, Ofcom will have more powers, so they will have service restriction orders that they can impose to do with payment gateways. If those fail, they will be able to serve service restriction orders to block access to content. I would suggest this is wholly inadequate. We have already heard a lot can happen in 24 hours, but this is going to be months. That is how content can stay online.
Ninety per cent of the content the Revenge Porn Helpline reports gets taken down. We have reported over 330,000 images; we have had 300,000 removed. There are 30,000 images online that we know are NCII, 15,000 of which—including Georgia’s content—remain online, typically in countries where they have no interest. They may be hosting this content specifically to generate traffic from different countries, typically Russia or Latin America, where we have no control and they are not going to respond to us. Other regulators around the world only have a 90% take-down rate too. We cannot expect 100% of platforms to remove content. We need other mechanisms to be able to block access to this content to stop the re-victimisation that Georgia powerfully talked about.
Q47 Kate Osborne: From what you are saying, it would make a difference if adult non-consensual intimate images were classed as illegal content in the same way as, say, child sexual abuse material is. Would there be any unintended consequences if NCII content was treated in the same way as child sexual abuse material?
David Wright: That would be enormous progress to be able to get to that point. Again, you heard from Georgia about the re-victimisation where content is posted online and exists in countries beyond our jurisdiction. Applying the same sorts of approach and legislation would serve exactly that particular purpose. We have many years’ experience of managing, restricting and taking down child sexual abuse content with the Internet Watch Foundation and the National Crime Agency, we just need to mirror that.
Q48 Kate Osborne: Keily, how would OnlyFans deal with child sexual abuse material if it were found to be hosted on your platform? Are you able to proactively take it down, or is it reactive?
Keily Blair: As I mentioned earlier, we have a very strict identity and age verification process for creators before they are able to join the platform, and there are a couple of ways in which we deal with that material. We start off by making the environment hostile and difficult for people to be able to share that material in the first place. In the event that people do choose to share that material, even though we are a UK-based platform, we choose to voluntarily report to NCMEC—the National Center for Missing & Exploited Children—based in the US. That is a global clearinghouse for CSAM material. If we identify anybody who has attempted to share CSAM on our platform, not even successfully but attempted, and been blocked from sharing it because, again, we take the NCMEC hash—much like we do StopNCII’s hash as well—we report those images to NCMEC.
To give you a sense of context in terms of numbers, in 2023 we reported 347 attempts or postings of CSAM material on OnlyFans. By contrast, one of the other social media platforms posted 30 million reports to NCMEC. So, it gives you a sense of creating a hostile environment in the beginning, making sure that you know the age and identity of the people who are posting information because that drives accountability. If you have accountability and you can say, “I can see who posted that, I can provide that information to NCMEC,” then the reports become actionable and things can change.
You asked about unintended consequences. One of the difficulties about CSAM versus NCII is the classification of it in terms of legality. For my sins, I am a lawyer by background, so I go back to that as being where I think about things too. NCII content is not illegal per se in this country, and it is definitely not illegal per se in lots of jurisdictions. While platforms like us choose to take consent very seriously and will take it down, others choose to hide behind legal definitions to keep content up. That could be one of the challenges in terms of implementing a system, but it is not an insurmountable challenge either, so it is worthwhile doing.
Q49 Kate Osborne: Thank you. You mentioned before about consent with regards to your platform. If consent is given but then withdrawn, is that dealt with in the same way?
Keily Blair: Yes, if somebody contacts us and says, “I previously consented to my image being on the platform, I withdraw that consent,” we will immediately take that image down, no questions asked.
Q50 Elliot Colburn: Keily, what conversations have you had with Ofcom regarding the new Online Safety Act regulations, specifically when we are talking here about NCII?
Keily Blair: We are already regulated by Ofcom under the video-sharing platforms regime, and we have an ongoing supervisory relationship with them. We have not had a great deal of interaction with them so far about NCII in particular. As you have seen from the most recently published code of conduct, the focus has very much been on other issues rather than about NCII. We would be very happy to talk to Ofcom further about the steps that we take—they know the steps that we take to prevent NCII—but also to advocate for greater powers. The recourse of having to seek a court order to take something down is beyond most people in terms of expense, time, effectiveness, all those other things, so I would echo the comments that David made around the inadequacy of that as a provision.
Q51 Elliot Colburn: David, to pick up on that point: regarding Ofcom’s guidance on how platforms should deal with NCII, is it too reliant on voluntary action? Could it be strengthened? Could it go further?
David Wright: Like everybody, we spent a lot of time responding to Ofcom’s consultation. Clearly, the bulk of the legal harms consultation was around child abuse content, as Keily said, which is to be expected. NCII appears, but as a secondary, perhaps a third particular content. In terms of strengthening that position, we have seen a huge growth in the number of cases that we are managing through the Revenge Porn Helpline. To give you an idea, in 2019 we managed 1,600 cases; that doubled in 2020, we think fuelled by covid, to 3,200, then to 4,400 in 2021, 8,900 in 2022 and then last year—we only published this data yesterday—it was just under 19,000 cases. So, we have seen a tenfold increase in four years.
A number of instances have happened. Not least, in March last year we saw an increase which we do not know, but we can only attribute to Georgia’s case. Again, I want to pay tribute here to Georgia for giving courage to others in that situation to step forward and to reach out for help, and which contributed to the dramatic rise that we continue to see. Do the codes go far enough? No. No, they do not, but we are anticipating that that will change once the child sex abuse material, as well as terrorist content, is implemented; usually then NCII will follow. But I will just add, as time goes by more victims are feeling harm.
Q52 Elliot Colburn: Thank you. Keily, just to finish up on the Online Safety Act, are you confident that OnlyFans has already moved to meet the new requirements that are coming in under that Act, or do you still think you have areas to go to update policies to be in compliance?
Keily Blair: I am confident that we have the necessary measures in place to meet our obligations under that Act. That will always continue to evolve. Harms change, the threats shift. It is important that we as platforms do not remain static, and we continue to look at what we can do that is better, find new technology that is out there, work with charities, work with Government. But also platforms do not need to wait for legislation to do that, and that is probably one of the things I would say is really important. Also, highlighting the incredible work that the Revenge Porn Helpline, StopNCII, and NCMEC do; there is so much more the platforms can voluntarily choose to do without waiting to be dragged there.
Elliot Colburn: We are about to be disturbed any minute, but I will take my chances and ask the next one. You have already mentioned the changes in your verification procedures in 2022 and what that means, the difference between a creator and a fan. There we go.
Chair: I have to suspend the meeting.
Sitting suspended for a Division in the House.
On resuming--
Chair: Welcome to the resumption of our session this afternoon on intimate image abuse. We will go straight back to where we left off. Elliot?
Q53 Elliot Colburn: Thank you. Keily, I will repeat from where we left off. You explained earlier in the session about the 2022 changes to your verification procedures. Have you noticed any tangible impact that that has had on the number of complaints of NCII that you have received as an organisation?
Keily Blair: When we look at complaints about NCII, they come in various different forms. From our perspective, the best way to look at complaints is to actually think about the complaints we receive from law enforcement. We already action any user complaints, but sometimes law enforcement requires further investigation. For example, since January 2023 there have been 22 UK-based law enforcement inquiries about NCII in general. We investigated all those and actually only confirmed three cases were actually NCII images that had appeared on the platform. There is a measurable drop in terms of actual confirmed cases but, as I mentioned earlier, all we need is somebody to say to us, “I don’t consent to this being on the platform,” we will not wait for law enforcement to take action or for someone to be prosecuted. So, yes, we have seen a tangible drop in terms of cases.
Q54 Elliot Colburn: Given that they are, I grant you, small numbers, had any of those law enforcement numbers attempted to contact the platform before they went to law enforcement, or did they all begin their journey through contacting a law enforcement agency? Are you aware of that data?
Keily Blair: I will need to come back to you on each of those three specific cases because some are ongoing.
Elliot Colburn: Thank you.
Keily Blair: In most cases, as soon as the user contacts us, as long as we are able to identify the content and the creator in question, we are able to take it down. What happens sometimes is we get incomplete information. Someone may say, “I think there’s an image of me,” and so we would need to go back to them and say, “Can you tell us who the creator is? Can you tell us anything about it that would help us to identify it?” and then we are able to action it and take it down.
Q55 Elliot Colburn: Thank you. I would like to take you to the Reuters report that more than a dozen cases of NCII were reported to the police in the United States and filed last year. Of course, not everyone will go to the police and that is reflected in the numbers that you said about UK-based law enforcement, but how many other cases are you aware of, and are the law enforcement numbers just the tip of the iceberg in your view?
Keily Blair: The Reuters report was over an eight-year period, and the confirmed cases they came to OnlyFans with were actually over a 59-month period from January 2019 to November 2023. Of the number of cases they mentioned where they had seen OnlyFans being listed in the report, actually Reuters only approached us for six cases. In one of those cases there was no OnlyFans account, there was no creator account, so to the point I mentioned earlier about actually being able to identify whether there was a creator on the platform. In the other five cases we had already taken down the content previously. So, yes, it was NCII material, yes, it appeared on OnlyFans, but we had identified it and taken it down, which is exactly what you should do.
Q56 Elliot Colburn: In those five cases, how quickly were you able to do that? Was that another 24-hour situation, or did that take longer?
Keily Blair: I would need to come back to you with the details for some of those cases because it is different for each case but, as David noted earlier, it is as soon as we hear it.
Q57 Elliot Colburn: Did you want to comment on that, David?
David Wright: Yes, I would just like to make a contribution, particularly about the Reuters report. I happened to give an interview to the journalist as well. There were all sorts of different questions connected to that as part of the investigation, and so I spent an hour explaining about StopNCII and the work that has gone on with the different platforms, including OnlyFans. I also suggested that readers of that article could well be victims, and it could well draw victims, and it would be really important to include a link to StopNCII or any available support if victims were affected by that report. Given the time that I contributed, it was really disappointing that none of that was included in the particular report. Like I say, it is not directly related to the question but it is relevant.
Elliot Colburn: It is helpful context.
David Wright: It is the context, yes.
Q58 Elliot Colburn: Dodgy journalism; who would have thought it? Never mind. That may well actually come into the second part of my question, but tell me if I am wrong. As part of that report, it was suggested that people inferred that it was still quite commonplace that people were managing to circumnavigate your consent checks. What would your take be on that? How robust do you feel they are? If there are issues that are coming up in terms of getting round those consent checks, have you been able to identify how that has occurred and then put mechanisms in place to stop that from happening? Or is it not as big an issue as the report might have suggested?
Keily Blair: First, we will always strive to end up with zero cases of NCII on our platform. The number we want to be saying is zero, so any one case is a serious case to us. We investigate any report to understand whether somebody has circumvented our controls to be able to upload that content and, for example, make changes when necessary. That was part of the driver behind changing the consent policy to being proactive rather than reactive. It is, again, recognising that a better way to protect people is to expect their consent to be provided to OnlyFans in advance of material being posted on the platform. We take every report very seriously; we look for opportunities to action it.
You asked if we think our controls are robust. With all user-generated content platforms, every single social media platform globally, there is a risk of NCII being shared. What we have seen is that creating an environment whereby you know the age and identity of the people who are sharing information, you are able to attribute content to an individual, and you are able to provide actionable information to law enforcement, or to help victims when they report it. Those are the steps that platforms can take to make it even more difficult for people.
There will always be bad actors. Our job is to make it as hard as possible for them to act badly and to support victims. One of the things that we do, as David noted, is actually when somebody reports to us that they believe they have been a victim of NCII on our platform or on any other platform, we will point them to victim support resources. We will direct them to StopNCII or to the Revenge Porn Helpline because we recognise, as Georgia mentioned earlier, some of the challenges being faced by law enforcement to genuinely support victims in this area.
Q59 Elliot Colburn: Thank you very much. A final question from me, Keily, is regarding Ofcom’s investigation into concerns that children under the age of 18 have managed to get subscriptions to OnlyFans and obviously accessed some of the sexually explicit content that is on there. That was put down to a coding error. How long did it take OnlyFans to identify the error which enabled that to happen, and what mechanisms were put in place to prevent a reoccurrence?
Keily Blair: In terms of the Ofcom investigation, that is obviously ongoing and I do not want to prejudge the outcome of that, but I want to provide some context for you in response to your question. The investigation is regarding whether the measures that we had in place were sufficient to prevent under-18s from accessing restricted content, including potentially explicit content. In terms of the measures that we had in place at the time, at all times we had in place a requirement for people to enter their full name, payment card, and to complete an age assurance process. That age assurance process was always voluntarily set above 18 at all times during the relevant period.
The coding error was in relation to 20 versus 23 in terms of the age it was set at. At all times we had measures in place to try to prevent under-18s from accessing the platform. We are going to continue to work with Ofcom to help them to get to the bottom of whether they are comfortable, whether 20 is an adequate solution. We believe that it is. In terms of the time it took to identify it, as soon as we identified it in early January, we went to our tech team to confirm that there had been a misconfiguration at our end in relation to the age assurance software coding it at 20, let us be clear, never under 18. Once we confirmed that, we picked up the phone to Ofcom, and we said, “Hey guys, we need to tell you something,” and then we followed up in writing with them afterwards. We will continue to work with Ofcom to give them the confidence in the age assurance measures that we have.
Elliot Colburn: Thank you both very much for your answers. Thank you, Chair.
Q60 Chair: Keily, would it give you a further layer of assurance if you required, instead of a debit card, a credit card that you cannot get unless you are over 18?
Keily Blair: It is a really great question because it is one that we have grappled with as well. Minors can get credit cards under their parents’ names, so it would not necessarily prevent the issue. Having age assurance in place in the UK certainly helps in terms of that additional level of requirement. Also, having a paywall in place in the first place definitely assists, so anything that we can do to close the gap is good
Q61 Chair: Thank you. David, how is the Revenge Porn Helpline funded?
David Wright: We have some contributions from the Home Office, and the Scottish Government provides some contribution. Other than that, it is funded from our charitable reserves.
Q62 Chair: Are any of the platforms helping to fund it?
David Wright: At StopNCII we had a philanthropic donation that was made to help us support and establish that. On an ongoing basis we are anticipating that platforms will contribute into the operational costs of that, but that is a different question to the Revenge Porn Helpline. No, there are no contributions to the Revenge Porn Helpline.
Q63 Chair: When you say you anticipate that the platforms are going to help contribute to StopNCII.org, how confident are you in that anticipation?
David Wright: Without funding it will not exist, so I am confident about that.
Q64 Kirsten Oswald: Staying with you, David, and taking you back to some of the things that you were outlining for us earlier about the different kinds of images, not always the obvious: to what extent do you think non-consensual images, which are not sexual but might be culturally insensitive, are a concern? How well do you think that kind of concern is able to be dealt with at the moment?
David Wright: It comes back exactly to the point that intimate images are not necessarily sexually explicit. During building StopNCII we spent a huge amount of time determining the language and the terminology, and we arrived at non-consensual intimate image abuse. Globally, StopNCII.org currently works with 94 NGOs around the world. Clearly, the Revenge Porn Helpline supports adult victims in the UK, but we have limited capacity so we partner with these 94 NGOs. There are very few; I think there are five around the world like the Revenge Porn Helpline. Some of the other 89 around the world often do other things; they may well support children. It is predominantly those NGOs that report to us about exactly these sorts of issues.
So, yes, we see it in the UK, but if I think about RATI in India—the NGO that we direct and we cross refer with in terms of StopNCII—they promote StopNCII. Where a victim arrives at StopNCII.org we will direct them back to their national support opportunity—I was going to say service—depending on whatever it is. For example, countries like India and many others where exactly this is an issue, where, based on culture, based on religion, just merely being photographed or taken in an image with your arm around somebody has catastrophic implications for them. That is why we refer to non-consensual intimate image abuse, not non-consensual sexual image abuse.
Q65 Kirsten Oswald: Thank you, that is very helpful. Keily, is there anything you want to add to that?
Keily Blair: The point that David makes is a thoroughly excellent one and shows the lack of understanding by some of the other platforms that he referred to earlier where they are simply talking about preventing nudity. It is a lack of understanding about intimacy in different social contexts and that is why, for us, the key issue is consent. The people featured in the image have to consent to being in the image. If somebody raises to us a picture of them entirely fully clothed but in a compromising position culturally or, for example, maybe not wearing a hijab or something along those lines, we would absolutely take that image down. It does not need to be sexual in nature for us to act.
David Wright: Can I just add as well, it is not directly related to this but the comments about the police prompted me. I hear what Georgia said about Essex Police and that is wonderful; it is great that she had that experience. That is not generally our experience in terms of the victims that we support from a Revenge Porn Helpline perspective. We spend a lot of time coaching victims who have contacted the police and reminding them that, “Yes, you are a victim of a crime. You need to go back and you need to report this. This is how you do it.” It is frustrating that we have to expend time doing that. Like I say, it is not directly related to that, but it was a prompt.
Kirsten Oswald: It is a point that was probably very well made at this point in the proceedings.
David Wright: Thank you.
Kirsten Oswald: Thank you for that, and thank you both.
Chair: Elliot, on that point?
Q66 Elliot Colburn: Yes, on that point, thank you. Sorry to drag it on, but the Victims and Prisoners Bill which is going through Westminster at the moment: have you had any opportunity to review the mechanisms within that and whether or not that would help complement the work of the Online Safety Act to try to help victims navigate the process? We have had a lot of debate around IDVAs and ISVAs, the independent advisers. It does not sound to me, though, that they would be any use or apply necessarily in cases of NCII. Have you done any work with the victims Bill to ascertain whether it would actually help in those instances where victims are struggling to navigate the process?
David Wright: I have awareness that the victims Bill is going through. It would be great to extend a conversation about it if there are some specific aspects that we could latch on to or that we could contribute into there. That would be a great conversation to have.
Q67 Chair: David, you will know as well as I do that we invited Meta to come today. It refused to come. You will know that we have written to a number of other platforms that have not yet graced us with a response. What message do you have to those platforms that are choosing not to engage, and is there anything more this Committee can do to help them get to a more constructive relationship with you?
David Wright: A wonderful question. I would encourage all platforms to engage with StopNCII. We have heard a great example here, as well as some implication in terms of engineering and legal lift in terms of integration. I would suggest it is not a big thing. We have specifically gone about doing that to make it as easy as we possibly can for platforms. We are merely mirroring existing mechanisms. We are using existing technologies; we are just applying it in this particular way. Typically, platforms will already have the understanding of the technical capabilities that they need. We are trying to make it as easy as we can.
It is about engagement, and the letters that were written have been immensely helpful, but it seems that there needs to be more pressure applied. Ofcom may well be able to help too in terms of encouraging platforms to take the hash list in recognition of the issues. This afternoon, we have heard the impact of that, and we hear that impact every day with the Revenge Porn Helpline. Sadly, some people find themselves in a catastrophe as a result of NCII. It is through them and for them that platforms should do the right thing and engage with this particular process.
Chair: Thank you very much. I think Keily made the point earlier that we should not have to wait for legislation, that the platforms could be engaging now, could they not? Can I thank you both for being here this afternoon? It has been incredibly helpful. If there is anything you wish to add in writing, then please do so.