Women and Equalities Committee
Oral evidence: Tackling non-consensual intimate image abuse, HC 336
Wednesday 20 November 2024
Ordered by the House of Commons to be published on 20 November 2024.
Members present: Sarah Owen (Chair); Alex Brewer; David Burton-Sampson; Kirith Entwistle; Natalie Fleet; Catherine Fookes; Rachel Taylor.
Questions 76-154
Witnesses
I: Clare McGlynn, Professor of Law, Durham University; Sam Millar, Assistant Chief Constable and Strategic Programme Director of the VAWG Taskforce, National Police Chiefs’ Council; and Lorna Woods, Professor of Internet Law, University of Essex.
II: Gisela Carr, Deputy Director of the Interpersonal Abuse Unit, Home Office; Alex Davies-Jones, Parliamentary Under-Secretary of State, Ministry of Justice; Jess Phillips, Parliamentary Under-Secretary of State (Minister for Safeguarding and Violence Against Women and Girls), Home Office; and Laura Weight, Interim Director for Victims, Vulnerabilities & Criminal Law Policy Directorate, Ministry of Justice.
Witnesses: Professor Clare McGlynn, Sam Millar and Professor Lorna Woods.
Chair: Good afternoon and welcome to the Women and Equalities Committee. Today we are looking at non-consensual intimate image abuse. I am delighted that in our first panel of experts we will hear from Professor Clare McGlynn KC, from Durham Law School at Durham University; Professor Lorna Woods OBE, Professor of Internet Law at the School of Law, University of Essex; and Assistant Chief Constable Samantha Millar, Strategic Programme Director at the National Police Chiefs’ Council for the Violence Against Women and Girls Taskforce. Thank you for your time, and welcome. David, will you start the questions?
David Burton-Sampson: My first question will be to you, Assistant Chief Constable—do you mind if I call you Sam?
Sam Millar: Please.
Q76 David Burton-Sampson: Sam, StopNCII told us that of the victims they work with who report to the police, four times as many report a negative experience, rather than a positive one. Why do you think that is the case?
Sam Millar: Your first question gets right to the heart of it, certainly for policing. Even yesterday, a victim said to me that she is in a conversation with 450 victims of deepfake imagery, but only two of them had had a positive experience of policing. That is deeply worrying for those victims. Let us be honest: for women—these are all women; there could be a male victim, but predominantly these are women victims—the last thing they probably want to do is to turn up to a police station and to have that sort of evidence unpicked and looked at. The fact that they turn up to that poor service means that our system—this is not individual police officers, but the system—is failing to put in place the knowledge, the mindset, the specialist understanding or a response to legislation, which is pretty difficult for a generic police officer to keep in their mind. Something about that rings a big alarm bell for policing on how we change that.
I have many thoughts about the infrastructure that is not in place—the way we operate holistically as a capability—to be able to get a response that I would want any victim to walk in and get. We have a journey to go on. We do not have an infrastructure set up to provide a service that meets the needs of this legislation. That is what we need to build and I am very clear about that.
I feel that I am in a very lucky position, because my mantra at the moment—as Strategic Programme Director for VAWG—is that in policing we need to bring a capability together that understands the offences we are talking about. They are deeply complex to provide a service to, and that is where we are heading. Next year, we are hoping to bring a number of programmes together where we have learnt a lot of lessons about the complexity of dealing with victims of RASSO, rape and sexual offending. The Online Safety Act 2023, and the legislation it brought, is a whole new challenge for policing. I sit here and I think that it is utterly devastating that 450 women walked into police stations and did not get the service that they needed. My job, with my team, has to be to build an infrastructure.
I reached into a very normal force in the build-up to coming to this Committee and asked, “So, DI, talk me through what it would be like for a victim to walk into your police station today?” We deal with these crimes inherently in the same way we deal with many crimes, and that cannot be the right model. There is something to do nationally—how do we get that capability?—and with how that filters down to local delivery, because that is where all the victims will have walked into a police station and received a pretty poor service. In fact, the victim from last night walked into seven police stations, so nobody has to tell me that the service is not right.
Q77 David Burton-Sampson: Another quite concerning thing that we have heard during this inquiry is that some victims are asked by the police to leave the content online to aid investigation and prosecution. That is obviously not great. What is being done to educate police officers on why it is the worst possible outcome for a victim?
Sam Millar: I completely agree. There is a real lack of understanding of the legislation: “What do we have the power to do?” But if you go back to the beginning, how does an offence like this get reported? How easy and accessible is it for a victim to talk to police? If you go to the College of Policing, we have guidance about the offences, but very little guidance at the moment about how you undertake a good investigation: “What does good look like?” That is why you need a national framework.
A new operating model needs to set out what a good investigation looks like, because it cannot be right that officers are failing to understand that keeping an image online is just constantly retraumatising—it is giving access. We keep things online, but perpetrators are in prison, and guess what? They come out of prison, and we hand devices back. So there is nothing about an investigation that is right.
Policing has to understand the legislation. It has to build the guidance. It needs to then cascade the training. It needs to understand how that is working locally and being delivered. We do not have the infrastructure set up to support those victims, and that is a very inadequate place to be. You are right: our investigations are not consistent. There are plenty of occasions when we do take the content down and then the investigation starts, but it is that inconsistency of practice that we have to put right. I go back to the point that it is obvious to me, having worked on the taskforce for three years, that we do not have an infrastructure built and set up to deal with online harm.
Q78 David Burton-Sampson: It sounds as though you know what needs to be done to fix this. Is anything currently being done—
Sam Millar: Yes.
David Burton-Sampson: And what is being done to improve training and guidance?
Sam Millar: There is activity. We from the taskforce have seen this from the very beginning. The threat and risk assessment that we have now done for two years tells us that online harm is a thematic priority area that we should be considering. We have allocated resource. We have worked with Ofcom to understand how to influence the guidance. We have worked with our contact management capability to understand how we make reporting more accessible.
If I can use an example for comparison, I also oversee the policing response to spiking. A year ago, we launched an online reporting tool, which takes a victim through a very bespoke capability whereby you can report what is happening—you do not have to walk into any police station. We just make it more accessible, and that has to make it more straightforward for a victim. How do we not have that capability here? We are in the discovery phase with our contact management team to understand how we bring that in. I think that within six months we will understand what that looks like, and I am hoping that we can offer a similar sort of service to the one that we can offer to a victim of a spiking offence.
We are doing the basic work, but it has to be invested in. I am making a very big bid to Ministers about investing in relation to VAWG and investing in this co-ordination. Let’s learn the lessons from other offence types and get the infrastructure in place here.
Q79 Chair: May I follow up on the point about national guidance, guidelines and an infrastructure, Sam? As technology continues to grow and develop, how are you going to ensure—is there a way of ensuring—that the infrastructure keeps up with the changes in technology?
Sam Millar: That is a very interesting question. I will go straight to the comparator. The epidemic nature of VAWG and the offending types that we see against women, predominantly, are not invested in to the level that they need to be across policing. Again, we are making the argument that you need to take this as seriously as organised criminality and counter-terrorism; you need to invest in it and you need to help policing to build the capability.
Our plans at the moment take all the learning that we have had for three years around a new operating model for rape and sexual offending, the activity around VAWG and all our research capability that currently underpins a lot of our activity in the college, bring that together and allow it to have the strength, leverage and status of other crime types. Then we need to invest and put it on the same scale as counter-terrorism. That is the level we need to invest in if we are really going to take this on and deal with the volume that you are talking about.
Everything in our threat assessment says that this will only get bigger. If it gets bigger, it gets more harmful. If you had sat with Jodie last night, you would realise the devastating impact that that crime has had on her life. I would tell anybody who says to me that online crime is not as serious as burglary or robbery that they need to meet Jodie. The devastation is huge, and we have therefore got to put infrastructure in place.
Professor McGlynn: It is good to hear that acknowledgment of the challenges that face the police. I would like to add, from working on image-based abuse for so long, that it often falls through the gaps of different elements. We can take a lot of learning from Project Soteria, where the police are improving their investigations. I hear from organisations such as #NotYourPorn about the problems women have reporting in this area.
We have also got the policing response to non-contact sexual offences. I have been involved in drafting some of the guidance with the College of Policing. Some of the offences come under that guidance and training, but not all of them do. There is therefore a risk that we will have a national operating model for rape and serious sexual offences, guidance and training for some non-contact sexual offences and some effort around image-based abuse offences. We must make sure that, in relation to image-based abuse offences, we have the same sort of engagement as we had with Project Soteria, with academics and police building that knowledge and the survivor framework. I just want to make sure that we do not always fall between the cracks.
Q80 Catherine Fookes: I thank the witnesses for coming to us today. This is a question for Sam, following up on what David asked. We heard from witnesses yesterday in a private session about how traumatising their experience of reporting to the police was, I am afraid. For example, one person said that she gave her account of what happened for three hours to an officer. She then had to give it again a few days later to a different officer—another three hours of going through the experience, even though she had already told someone what had happened.
In another case, someone gave evidence for a recording for four hours. The police lost the DVD, which was corrupted, and she had to do it all over again. Another comment was that when both those people went to police, there was no compassion or care at the door and no immediate understanding that they were victims. They felt that having a single police liaison officer to support victims through the process was one thing that would help. Can you talk about that? It would mean having at least one police officer at every station being ready to take statements from victims who have experienced this type of crime. I would like to hear your comments on that, please.
Sam Millar: Probably one of the most challenging areas is how we get a capability across policing that is available 24/7 and that we can keep professionally alive to the trauma of whoever walks through the door. Police officers have to deal with such a range. The basic curriculum has to be trauma informed, so that we send police officers out from day one with the right capability and knowledge. How do we keep that continuous personal development going? We are looking at the curriculum for a probationer police officer. We are constantly trying to influence the College of Policing on training.
That is why the coming together of several programmes hosted by the College of Policing is critical, so that we can be at the heart of where standard setting and guidance are created. There is massive recognition of the fact that getting that trauma-informed approach at the front end of policing, when a victim walks into a police station, is our challenge. That applies across a range of VAWG offending. I agree that the ambition is there; it is a question of how we do it. Invariably, the officers who first meet the victims are not specialists, so that first triage is quite generic.
That is our challenge: how to get them to have the knowledge of the legislation, the mindset and the ability to be able to handle a victim in such difficult circumstances when they have walked in and have to tell that story. That is our challenge.
Q81 Chair: Can I push back a bit? I do not think that people should necessarily need training to be polite and listen to somebody. This is about not just how the process was, but how people felt when they walked into a police station, which goes to the heart of the culture of the workplace and how the police treat people who come into it. I understand that the only way to do that with current police officers is through training, but surely we should have a police force that takes seriously anybody who walks into a police station and says, “I’ve been a victim of a crime.”
Sam Millar: Yes, I completely agree. Would I want every police officer to be like that? Of course I would. Is our training driven to achieve and deliver that? Of course it is. It is about frontline supervisors and ensuring that our police officers are fit to do the job.
That is why, three years down the line in a taskforce that was brought about because of the utter devastation of women’s confidence to trust police to take them seriously, we have been on that journey. I would say that we have made progress. I have now done 31 years’ service, and it is a massively different place, but that does not mean that we can stop the search.
Having spent three years hearing the testimonies about what a poor service the frontline often is, I am absolutely clear that we must learn from what we have seen for three years and build on that, and influence that culture. That is why some of our workstreams are about influencing the culture and the language we use. There is a lot of criticism around victim-blaming language, and we will change that by changing our guidance, our toolkits, our behaviours and our leaders.
This is not just a single workstream. We are driven to move the culture on. I am a woman—I am a representative of females in policing—so I want it to change, and I would want to think that anybody I know could walk in and at least have a polite, respectful service.
Q82 Catherine Fookes: The message we are getting is that they need one single person who then takes them through the process—who tells them when their court date is, who supports them and who is their police liaison officer. Is that on the horizon? Will victims of this crime get that?
Sam Millar: That goes back to the fact that the response from policing at the moment is quite reactive. I cannot look at you, Catherine, and tell you that that is available in every station, because I do not believe it is, but that is why you need an operating model to say, “What does a service against this crime type look like?” It is absolutely obvious—isn’t it?—that having a point of contact in a station for any police officer to go to for help, support and guidance and to support victims is what you would want. But you will not get that if you do not set out your stall to have an operating model for it. At the moment, we are reactive.
Q83 Rachel Taylor: To move the subject on a bit—sorry, Sam, this one is also for you—what is the process for taking down child sexual abuse material? Could that be applied to NCII content?
Sam Millar: I am not a technological expert in this field. There is a process, as with any of the police’s mandated powers. If we had the powers to do it, we have the liaison officers who will link in with the tech companies, and there will probably be an authority to go through to take it down. I do not particularly worry about that.
What I worry about is what power the legislative framework gives us to do it. That is my focus at the moment. I know, as a senior officer, that I carry plenty of authorities and powers. Once you have a power, you can act on it. We do not struggle to use powers.
Taking down means you have to have a lawful requirement to be able to do so, and for me, having got the overview of this crime type, one of the single biggest issues we have is that the actual image is not unlawful. If you do not have a power—a bit like with telephones—you cannot seize them or do anything with them. We have to act within the lawful requirements.
Q84 Rachel Taylor: What you are saying is, if those powers were there, the same process could be used to take down those images.
Sam Millar: Absolutely; that is not the issue.
Q85 Rachel Taylor: Do you think the law should be changed to class NCII content as illegal in the same way as child sexual abuse material and terrorist content?
Professor McGlynn: I would go about the question in a different way. What you are getting at is how we manage to get this material removed from platforms, and particularly from those that are recalcitrant and refusing to do it. If some of that material was treated in the same way as child sexual abuse material, we could put in place a process, like with child sexual abuse material, to get some of those websites blocked, for example.
In terms of the overarching concern I would have, it would be great to put in place a process like that—you go through a criminal case and get a conviction, and some of that material would then go through that process so that you can get it removed and blocked from those websites—but if that is all we do, in a sense we still have an awful lot of non-consensual intimate imagery out there that is difficult to remove, because not everyone goes to report to the police and not everyone gets a successful conviction.
I think it would be preferable to have a system whereby anybody could get a court order to get material removed or to get the internet service providers to block particular websites. In other words, that would be accessible to everyone who had non-consensual intimate imagery of them shared without their consent, and not only to those at the end of a criminal conviction process.
Professor Woods: I would like to add a couple of points. There are some civil courses of action that a victim of NCII could use. I came up with six on Monday, although I do not know if there are more: data protection rules, defamation, misuse of private information, copyright—I have forgotten some of the others. There is also—I can never remember this one—a tortious intent to hurt. Those would then allow somebody to use the tools that come through the High Court, and there could be the possibility of injunctions or of an order to remove and destroy material.
The problem with that is that it is going to be fact-specific; not everybody would fit within the confidentiality bucket, and not all the tools would be available for data protection. It also puts the onus on the victim, and going to the High Court is not cheap.
For the avoidance of doubt, I would like to say that although the Online Safety Act 2023 contains an obligation for regulated user-to-user services to have an effective takedown system, it does not give users an individual right to say, “Take this particular item of content down.”
The other point is that it applies only to things falling within the definition of illegal content, and there are two issues with that. One is the scope of the offences in the first place and whether they are adequate. The second is how we understand illegal content in the Act, and I think there are three ways of understanding that. One is the broadest way: content of a type likely to be NCII, which would fit with systems and processes and with dealing with content in bulk. So if you are thinking, “Let's have a hash database, and we can search that,” it might be that sort of tool.
You could also understand it as content that is related to a particular criminal offence, and then all subsequent reposting—when it goes viral—is the same sort of content. Or you could see it very, very narrowly as content that is related to a specific criminal offence. You could have an instance where content is posted once, with the relevant mental elements, so you get an offence, but then it is shared broadly by people who just are not thinking or do not really care. In that instance, it would not be a criminal offence. Potentially, we have a problem where a particular image is flitting in and out of the regulatory regime.
Q86 Rachel Taylor: You have talked about a number of civil processes, all of which appear to be costly and lengthy. How long does it take, for example, to get an image taken down? To come back to my initial question, do you think that NCIIs should be classed as illegal in the same way as child sexual abuse material?
Professor McGlynn: They would have to be classed as illegal after a process. I guess that is what I am saying. One process is: if you go through a criminal case, you get a conviction and the court could make an order that that material is illegal. You would still need a process to enable you to get the ISPs to block it, and that process does not really exist at the moment. Just declaring the images illegal would not actually do that; we need the process in place to treat NCIIs like CSAM.
I was suggesting what is happening in some other jurisdictions. For example, in some of the Canadian provinces, they put in place swift and easy online processes, like with what we used to call the small claims court where we go online if we have a claim against a plumber or something. They have put in place systems like that, so that people could go online to get a very easy, straightforward and inexpensive court order to get material removed, the perpetrator to delete it, etc.
That is what I would put in place, so that we avoid the existing civil law system, for which you need at least £10,000, the time, the lawyers and what have you. It has been done in other countries and, ideally, that is what I would recommend here.
Q87 Rachel Taylor: What steps, if any, do you think that we could take to block access to content that is hosted overseas?
Professor Woods: The Online Safety Act has some provision relating to content that falls within the regime. Ofcom has a range of enforcement powers, which start with information notices to tell a company that it has got it wrong and is not doing the right thing, and would it care to improve its performance. That then moves to the possibility of fines, before moving to a section of powers called business disruption measures. The first one of those is aimed at companies that provide services to the company hosting the material—ad providers, payment providers and things like that. That allows Ofcom to go to court and to tell those providers not to continue to provide the service, so we can disrupt the finances.
The final possibility is a blocking order, but that is very much seen as a last resort. If you think of a big general platform, the potential for inadvertent blocking of content that is perfectly benign or legal is high. In a way, it would be easier to use such measures to tackle collector porn sites or something like that, rather than something like Facebook or Instagram with such a range of content on it. The Online Safety Act only applies to services regulated by the Online Safety Act, so it is not capable of being used for, say, BT or something like that.
Professor McGlynn: What you have described is the very long-winded, lengthy and bureaucratic process that is exceptional. For example, we have to notify the Secretary of State in some of these contexts. It is not designed for trying to get material removed from the internet. A different type of process is needed. We might get something removed from one website, but the next day it could appear on the next website. You do not want to have to go back to ground zero and start another lengthy process again. It needs a bespoke process, as for CSAM.
Professor Woods: That blocking process is not designed for individual items of content. It is about when a company has failed in its duties more generally. Although you could use it, it is not ideal.
Q88 Rachel Taylor: Moving on to my last question, if NCII content were made illegal like CSAM, could someone be prosecuted for possessing it on their phone, say, despite not being aware that it was produced non-consensually?
Professor McGlynn: It would depend how legislation around that was framed. It would all depend on the drafting. It could be possible. One could draft up a criminal offence such that, in certain circumstances, possessing it constituted a criminal offence. You could do that. Of course, if elements of it were a form of extreme pornography, it is already a criminal offence to possess extreme pornography. So some would be covered, but it would depend on the drafting of the criminal offence.
Q89 Chair: Clare, is it possible for the Committee to write to you with a couple of follow-up questions on the idea of the small claims court? Because I think we will have questions around time and the severity of that and whether that is something that victims and survivors want.
Professor McGlynn: Yes. It is definitely something that has been supported.
Q90 Kirith Entwistle: I want to talk a bit more about the Online Safety Act. Under the Online Safety Act at the moment, are service providers obligated to remove all instances of NCII that have been uploaded? You have touched on this already.
Professor Woods: Would you like a very short answer?
Kirith Entwistle: Yes.
Professor Woods: No. Would you like me to explain that?
Kirith Entwistle: Yes, please.
Professor Woods: Part of the reason is the nature of the duty. It is designed to operate at a general level. As long as they have got a reasonable system in place for dealing with the problem in general terms, the fact that there might be some cases that slip through the net would probably mean that they are not in breach of their duty. It is making best efforts, reasonable steps. It is not looking for perfection.
The other reason is partly the definition of NCII as a crime and how broadly that is drafted, particularly around mental elements. I touched on this when I said that I think Ofcom is saying that, in each case where an image is posted or shared, you have to prove that a criminal offence is behind it, rather than saying this is content that has triggered the criminal law and that we will get all instances. So it seems possible to get the first posting. You may be able to get it to come down, but you might not be able to make it stay down.
Professor McGlynn: That is a choice that Ofcom has made as to how to interpret the legislation. That is not in the Online Safety Act itself. These are choices that Ofcom is making about how to, in effect, limit what it is going to do.
Q91 Kirith Entwistle: That is useful to know. You have touched on this already, but I want to explore a little bit around consent. We have talked a little bit about content that is uploaded that the user might not be aware was obtained without consent. I guess I want to know if the Online Safety Act includes images that were uploaded where the user might not be aware whether consent was given.
Professor McGlynn: Do you mean in terms of a criminal offence or in terms of platform liability?
Q92 Kirith Entwistle: I guess I mean in terms of platform liability.
Professor Woods: It comes back to the scope of the criminal law.
Professor McGlynn: The criminal offence.
Professor Woods: Yes. It is what the criminal law says about whether to satisfy the definition you need either an intention to harm, which I think is not—I am saying that if you have got a high mental element, then consent might or might not come into it. If you are making it a consent-based offence, then the question of consent will become important, and you then have the question of how a platform infers consent. Is it looking at each individual case or does it get general rules about types of content and types of context? Can we look at the account that is spreading these sorts of things? Have they got a pattern? It is a question of specificity, but Clare can talk more about the offence.
Chair: Before Clare comes in, we have 20 minutes left and about seven questions to get through, so can we please keep the answers as concise as possible? If there is any further detail that you want to add, perhaps you can write to us after the session.
Q93 Kirith Entwistle: What difference does the September announcement of NCII being made a priority offence make, in terms of the powers under the Online Safety Act?
Professor McGlynn: It just updated the legislation. The Online Safety Act had already provided that the non-consensual sharing offence was a priority offence; what this did was just update it, so that it included altered images and deepfakes. The priority offence itself was in the original Online Safety Act; this just means that it now covers altered images, such as sexually explicit deepfakes.
Kirith Entwistle: So it has slightly expanded it.
Professor McGlynn: Yes—it has tidied up the legislation so that it covers deepfakes.
Q94 Kirith Entwistle: Could you tell us a little bit about how your proposed stay-down amendment might work?
Professor Woods: That goes to the question of how broadly or narrowly we understand illegal content—whether we are looking for a criminal offence in each case or whether we want to look at types of content. I wondered whether it was possible to amend the Act to clarify that if an image was found to be illegal content, then it should stay as illegal content, even if it is shared on by people who did not know that. This is to tackle the problem of an image being in the regime and then out of the regime. It is saying that if it is the sort of thing that the regime should tackle, then it should cross that threshold and stay there.
Kirith Entwistle: That’s great. Thank you.
Q95 Alex Brewer: Clare, we have talked a little about this already, but I want to ask about civil law legislation. Is there a requirement for that for NCII, to improve victims’ access to redress and support, and if so, what would that look like?
Professor McGlynn: It is about providing a comprehensive, holistic approach. It is about providing comprehensive criminal law remedies, but also adding the civil law context, because not everyone wants to report to the police. But also, as I said, there are additional orders that you would want, maybe to get the perpetrator to delete or remove material, or to get the platform to delete material.
You could also have a civil law statutory right, like the one in the Protection from Harassment Act 1997 for stalking—there is a civil offence there and a criminal offence. You could have the same for intimate image abuse, so that, again, you could bring a case and get some compensation, for example. It is about providing survivors with a range of options—as wide as possible—that might suit their particular needs.
Q96 Alex Brewer: In written evidence, you described Ofcom’s powers as “inadequate to respond to the need for thousands of images across many websites to be removed”. Can you explain why and how those inadequacies might be addressed?
Professor McGlynn: In relation to Ofcom, I would make two general points. The first relates to, for example, blocking recalcitrant websites that will not remove NCII, as we have gone into. The powers given to Ofcom in the Online Safety Act are not designed to deal with those individual items, so they simply do not work.
The second thing I would say about Ofcom is that it also does not have individual powers to deal with things, by comparison with, say, the Australian eSafety Commissioner, who can bring cases on behalf of individuals and can contact platforms on behalf of individuals. They have much greater powers to interact with individuals and take action on their behalf. I think in this space that would be really useful.
You could have, for example, an eSafety commissioner that was the body responsible for perhaps blocking those websites that are not taking the material down. I wouldn’t necessarily add those powers to Ofcom, because it has so many powers already, but if, for example, one established an eSafety commissioner or an online safety commission, that body could have such powers to deal with individuals as well.
Q97 David Burton-Sampson: Sam, I have another question for you. We are aware of cases in which people who have been convicted of intimate image abuse have had their devices returned to them by the police with the NCII content still on them. Do you know why that is happening? Do we need new powers to stop that from happening—to ensure that the contents are destroyed?
Sam Millar: Briefly, yes. It is happening because we don’t have the powers to stop that property getting back into the hands of the perpetrator. We have to operate within the realms of the law. That is why I would say anything that pushes the opportunity to make NCII illegal will help. So that power is required. Police will seize, retain and destroy if they are given the powers to do it.
Q98 Chair: What role do confiscation orders play, and is there a role for them?
Sam Millar: Confiscation orders require quite a protracted process, and the legislation on which they are based comes from other remits. So there is something about due process. I am a big believer in, “If you have got a piece of legislation, use it.” So I would always use legislation that can pull in. But a confiscation order is more of a financial power and process. Is that really where we have got to grapple for a piece of legislation to do something as simple as keeping a device with images on it from getting into the hands of the perpetrator?
Let’s enhance the legislation and get the right powers, rather than seeking to pull across. Invariably it is probably going to get complex and get challenged, and then we will have stated cases, and it just makes it more problematic. For me, if this legislation is not doing what it needs to in the basic format of protecting the victim, like retaining a phone or a device because it has images on it, we need the powers.
Q99 Chair: Moving on to culturally intimate images, Sam, I will start with you. I guess I know what the answer might be, but would you elaborate, please? Are the police sufficiently trained and equipped to support victims of NCII where the image is culturally intimate?
Sam Millar: I think this is a massive challenge for us. I will look to my colleagues, please, to put me right if I miss it, but I just don’t think the Online Safety Act gave us the opportunity; it has not gone far enough to be sensitive to that. This is a generic piece of legislation that we are trying to apply across some really difficult, complex areas. That is not a good starting point.
We have no guidance, and that is a fundamental gap for us. So, we have got to generate it; we have got to work on it and go back to this. We have to invest in this area because we cannot leave these gaps. We have to know what the response is to a victim, and this is all about the victim and the harm, and the policing response has to meet that. And if the guidance isn’t there, I will struggle to be able to justify how a police officer is able to do it. Policing has to get itself fit for purpose and operate in a culturally sensitive way, so it is absolutely a requirement. So yes, it is inadequate.
Q100 Chair: Clare, do you think the Online Safety Act is strong enough in this regard?
Professor McGlynn: It is, I guess, based in the criminal law. In the criminal law around intimate image abuse, the definitions of what constitutes intimate image abuse are based on sexual, nude, partially nude and toileting. So the criminal law does not cover that range of culturally intimate images that I think you are talking about. This issue has been raised many times over the years. When the Law Commission looked at it a couple of years ago, it said the law would be too vague to extend what constitutes an intimate image to a range of the sorts of images you are talking about, but the issue is raised by a range of organisations working with individuals.
It is more complicated to draft such a provision because exactly what would be intimate varies in different contexts, but there are different ways of doing it. The now Minister put forward an amendment to the Criminal Justice Bill in April to deal with some of these issues. There is also an example in Australia, where there is a civil law provision around culturally intimate images, so my colleagues and I have said for a number of years that we could at least adopt what has already happened in the Australian context. It only applies in one context relating to the expected attire of an individual, so it does not cover the range, but it would be somewhere to start in terms of dealing with those images.
Professor Woods: As a corollary, if it does not fall within criminal law, the regulatory regime will not bite. The only possible way you might catch it within the regulatory regime is if you are talking about content that is in some way harmful for children, and I am not even sure about that.
Q101 Chair: We are going to move on to synthetic NCII, or deepfakes. Clare, what legislative steps do you think are needed to tackle the proliferation of deepfake imagery and nudification?
Professor McGlynn: We have had the recent change whereby it is now a criminal offence to share sexually explicit deepfakes without consent. The next step needed is to criminalise their creation. The previous Government announced last April that it was going to introduce such legislation, and the current Government have said they plan to do so. I hope that legislation is comprehensive because the previous version had, for example, a limited definition of “intimate image” such that, for example, if you produced a sexually explicit deepfake and it had pixellation over the nipples, it would not have constituted an intimate image. That obviously needs to change—we just need the same definition of “intimate image” as the one we use currently.
The previous Government proposal would have also limited it to certain motivations that would need to be proven, meaning it would only have been a criminal offence to create it if you could prove certain motivations. The risk there is that you would easily have had defences saying, “It was my artistic expression,” and/or “I was doing it for humour,” so I am hoping we will have a comprehensive approach. There is a version of that in the House of Lords at the moment in Baroness Owen’s private Member’s Bill to criminalise creation—a comprehensive approach has been adopted there.
It must also include solicitation of sexually explicit deepfakes, which, as you are aware, happened in Jodie’s case. We know that is how perpetrators work online: in large communities, they are taking, trading and sharing, as well as asking others to make these images. We therefore need to make sure it covers solicitation as well.
Q102 Chair: Do you think that the platforms that host synthetic NCII should be held accountable?
Professor McGlynn: Yes, absolutely—and there is then a role for, for example, Ofcom, which could be taking greater action against them. But, of course, the platforms that themselves host the material should be held accountable. We absolutely need to hold accountable the platforms that, for example, are facilitating the search. I know you spoke to Microsoft and Google, but for example—even just the other night—when you type “deepfake” into Bing, it autocompletes to “deepfake porn”. Although if you type “deepfake porn” into Google it does not now bring it up, if you put “deepfake porn” into Google with the images, it brings it all up. The changes they make are half-hearted, and they need to be properly comprehensive to try to reduce the rise of this material.
Q103 Natalie Fleet: This is a question for Sam: do police officers receive any specialist training on responding to victims of synthetic or deepfake NCII, and if not, would you support such training being offered?
Sam Millar: The fact that a victim can walk into a police station and want to report it means that our frontline capability must be trained in it. It is about prioritising it. This is why NCII has to be enhanced, and come alongside CSAM, so it can fight for a place in officers’ frontline training. There is a place for it. It is needed. They will have to interface, and we have talked about what this might look like. Of course, officers need the knowledge, and we will push this as a requirement as we develop the policing capability that needs to wrap itself around this.
Chair: Thank you very much. We will be writing to Clare about those follow-up questions. Thank you all for sharing your time and experience today. I am very grateful.
Witnesses: Gisela Carr, Alex Davies-Jones, Jess Phillips and Laura Weight.
Chair: Welcome to the Women and Equalities Select Committee. We have in front of us Jess Phillips, the Minister for Safeguarding and VAWG at the Home Office, and Alex Davies-Jones, the Parliamentary Under-Secretary of State at the Ministry of Justice. Thank you very much for coming. I am going to hand straight over to Rachel.
Rachel Taylor: I thank the Ministers for attending. Minister Phillips, would you prefer I call you Jess?
Jess Phillips: Jess.
Q104 Rachel Taylor: Thank you, Jess. That is much easier for me as well. You were here for the whole of the previous session, so some of the questions may appear repetitive. Apologies for this.
Victims of NCII who report to the police are four times as likely to report a negative experience as a positive one. Why do you think this is the case, and what are you doing to improve the police response?
Jess Phillips: I wish I could say I was surprised by this particular statistic. I am afraid to say you would find similar read-outs across the entire violence against women and girls space.
I would say it is probably more acute in this particular area, for the sole reason of the newness of the internet—although it has been around for quite some time. It is not something that officers are necessarily—it’s not your classic crime. As someone who has been a victim of this crime myself, a not-great response is, to be honest, not a surprise.
Sam gave a very good account of exactly what needs to be done and the infrastructure needed for this to happen: it is about frameworks around violence against women and girls training more broadly, it is about standards in response, and it is about what we monitor from police forces—how they are performing in a variety of cases of violence against women and girls.
We have been in government for 16 weeks. All of that is part of the Home Office’s work: looking at exactly what those standards and monitoring systems are, to improve the experience of victims who come forward about violence against women and girls and, thereafter, the charging. It was in the manifesto that we would improve police training; that will be the case, and that will include bespoke training on various different things.
I was here for the previous sitting and, to the point about having specialist officers, in my view you must have what I will be striving for: universal levels of knowledge in all police officers at the same time as having specialist, properly resourced police officers.
Q105 Rachel Taylor: To pre-empt the next question about images that remain online, the worst possible outcome for victims is content that remains online because the police have asked for it to be kept there to aid their investigation.
Jess Phillips: Absolutely.
Rachel Taylor: Because, each second that it is there, more people see it and more harm is created. Do you think that police officers are being adequately trained on the specific need for material to be removed; or is there something the Department could do straightaway in that area?
Jess Phillips: Clearly not, as the Committee has taken evidence that that has happened. Specific guidance has been created—I think in May 2024—about what is meant to happen in these cases, but guidance is only as good as the people who know it exists. I believe that the College of Policing has created an e-learning package specifically on this subject, but I cannot at the moment say to the 43 different police forces, “You have to do this by 20 June next year”—I literally do not have those powers.
Our Department must come up with a regime that monitors what is happening, but I think Sam covered it pretty well: there needs to be more training, it needs to reach everybody and the guidance needs to be clearer. That is the responsibility of both the NPCC and the College, who we work with. I do not think Sam would mind my saying that not a day goes by when I do not see Sam. This is the second meeting I have been in with her today. We work incredibly closely, to the same aim, but it will take time.
Rachel Taylor: Thank you. I know this is a difficult area for you, so I appreciate your evidence.
Q106 Chair: Minister, you mentioned improvements. How are you measuring and monitoring those improvements? What standards are you putting in place and when are we likely to see some improvements?
Jess Phillips: That is the million-dollar question. I wish I could say it will happen overnight.
Chair: But is there an expected—
Jess Phillips: Yes. The Home Secretary has announced a regime of improvements to police standards. We are expecting massive improvements in police culture and attitudes towards victims of these offences, specifically to eradicate victim blaming. Last month it was announced that measures such as—for the first time—proper vetting will be in place, and that those standards will be put on a statutory footing. We are expecting to legislate for that in this parliamentary Session, soon. In this place, things never happen when you thought they were going to happen. I have never known a Bill that came forward on the day it was meant to—some of you might not have experienced that yet, but it happens.
The truth is: it is going to develop. We will create the violence against women and girls strategy, which will definitely include this and which will be published in springtime—though that is another thing that changes in Government. But I cannot sit here and say to you, “The thing I’m going to monitor is this,” because it might turn out that I need to monitor something completely different—how many cases come forward and how the Ofcom regime works.
We are all waiting to find out what exactly the regulator’s standards will be—whether it works or interacts with criminal law in the way you have just heard. We will not know those things, in lots of cases, until the middle of next year. How you monitor what is happening in this space is unknown, but it will be included in the violence against women and girls work and the standards for policing work.
Alex Davies-Jones: Thank you for your question, David. Apologies that I could not make the earlier session; I had to be in the Chamber on the Front Bench.
I thank the Committee for all the work they are doing on this vital area. On your point, we are aware of cases where this has happened. It is a difficult area; the courts do have the power under certain sections of the Sentencing Act to seize these devices used for the purposes of or for facilitation of the commission of the criminal offence. That can be done. The problem we have is around the data and statistics on how often this is happening—it is really tricky to capture or obtain. Therefore, the current Sentencing Council is reviewing this, and keeping it under review, to see if more work needs to be done in this area. I am sure the views of the Committee will be taken into consideration, too.
Q108 David Burton-Sampson: Do you agree that, at the very least, the law should dictate that those images should be removed from the device before it is returned in every situation?
Alex Davies-Jones: These images should not be there. That is very clear. Yes.
Q109 Chair: But that currently is not the case. How will we ensure that these images will be removed?
Alex Davies-Jones: It is about working with the judiciary to ensure that they are operating within the sentencing guidelines they currently have. We need to respect that the judiciary are independent and that they are following the guidelines they have been given. The Sentencing Council and we, as the Department, are looking at this and we will work with the judiciary to make sure that the views expressed here are fed back.
Q110 Chair: Moving on to technology and hashing: we have heard from Revenge Porn Helpline about the valuable role that hashing technology plays in preventing harmful content from being uploaded. Should Ofcom make accepting hashing technology a requirement for major platforms operating in the UK? My second point is on technology. We heard from survivors yesterday who talked about the lack of facial technology and the outdated technology used by the police. Is there any plan to improve that technology?
Jess Phillips: Specifically on hashing, from our perspective there is absolutely no reason why the same hashing could not be used—I am also the Minister who leads on child sexual abuse imagery, where the use of it is clearly far more widespread and part of Ofcom’s requirements. I see no good reason why the same technology could not be used, but it will be for Ofcom to determine, following the consultation on the guidance on violence against women and girls, which I believe is coming in February 2025—of course, my officials and I take a huge amount of time to be part of that, especially with our interest in child sexual abuse imagery prevention.
I have to say I see no reason why it could not be used. The technology that has been created by the Revenge Porn Helpline is nowhere near as widely used as you would think; I think it is 10 companies currently agreeing to use it. I suppose I could say—while I am here on a public platform—the rest could crack on.
Chair: That is what we have heard from other witnesses. Is there anything you want to add on hashing, Alex?
Alex Davies-Jones: Just that I totally agree.
Q111 Alex Brewer: We have repeatedly heard that NCII content itself needs to be made illegal in the same way as child sexual abuse imagery. Microsoft has told us that that would unlock faster action. Will you commit to making NCII content illegal?
Alex Davies-Jones: We have committed to making an offence of creating a deepfake—making that illegal—and we will be legislating for that in this Session. There are already a number of offences in the realm of this: in terms of what is illegal, there are offences on voyeurism, threat to share, sharing and so on. We are currently looking at whether there are gaps and whether they need to be closed, and we are looking at that quite carefully.
Q112 Alex Brewer: Can you expand a bit on that context? What we have heard very strongly is that this is a gap—I think that that has come from both victims and industry. How will you be looking at that, and what is the timeframe for looking at those gaps and when they will be filled?
Alex Davies-Jones: We are looking at that very closely. The gap related to deepfakes specifically is something we are committed to dealing with in this Session. If a non-consensual intimate image is uploaded, that is illegal; we have made that illegal now. It is priority content, and platforms will have to check for it and remove it. Ofcom’s codes of practice will be published next month, and the Act will be implemented next year. All those offences do exist in terms of making that image illegal, and therefore it should have to be removed by the platforms. That image is illegal. There are gaps related to deepfakes, which we are legislating for.
Q113 Chair: In previous panels, Google told us that it just downlists it because it is not illegal and so it does not treat it the same as CSAM. Can you see how there might be a difference in how compliant companies and platforms would be if this content were made illegal—point-blank?
Alex Davies-Jones: It is illegal. We need to be clear here that non-consensual intimate imagery that is uploaded is illegal. Gaps currently exist around deepfake creation, which we are legislating for in this Session. Part of the problem is that we have registered it now as priority content, for which Ofcom will have to produce its codes of practice next month, and then the platforms will be compelled to take action against this material. That is what we are all waiting for, and I appreciate it is frustrating and slow, but they are an independent regulator. They have to go through the process of producing their codes of practice, implementing the Act, and so on. And then the platforms will have to act—as if it were CSAM—to remove this content.
Q114 Chair: So if a company such as Google were not taking this material off their search engines and were just downlisting it, they would still be promoting illegal content?
Alex Davies-Jones: Yeah, Ofcom would be able to take action against them, and the powers that they have would be to fine them £18 million or 10% of their global revenue.
Jess Phillips: Alex and I fantasise about what we might spend some of that £18 million on.
Chair: Perhaps funding the Revenge Porn Helpline?
Q115 Catherine Fookes: I just want to clarify something. As I understand it, at the moment it is the sharing of it that is illegal, not the creation of it. That is the loophole—
Alex Davies-Jones: That we are closing.
Catherine Fookes: So, we need to make any content illegal in the Online Safety Act. Is that what you are saying we are doing?
Alex Davies-Jones: No, that’s not the loophole. The loophole is around the creation of the image, and that is the loophole that we are closing. Currently, sharing that image without consent is illegal.
Q116 Catherine Fookes: But making it is not.
Alex Davies-Jones: Yes, and that’s what we are closing.
Q117 Chair: And we know that a large amount of this content is created and hosted overseas. Should that be blocked as well?
Alex Davies-Jones: If we make the creation of that deepfake a criminal offence in this country, then yes.
Q118 Catherine Fookes: Thank you, Ministers, for coming before the Committee and for all the great work you have both done over the years on the topic of violence against women. This question is for you, Alex. What difference does the September announcement of non-consensual intimate image abuse being a priority offence make?
Alex Davies-Jones: It means, as we have heard, that the platforms will have to actively make sure that this content is not proliferating on their platforms and take action against it to remove it. When they are made aware of it, they have to take action to remove this imagery. Otherwise, they are falling foul of the Online Safety Act and therefore are liable for Ofcom’s powers to take action.
Q119 Catherine Fookes: So they will have to take it down. Will you consider a stay-down amendment to the Online Safety Act so that, once an item of content is removed, all further cases of its being uploaded will also automatically need to be removed, without further moderation?
Alex Davies-Jones: That is something DSIT is looking at. We are awaiting the codes of practice to be published by Ofcom next month. I appreciate that it is frustrating that we have not got them now, but we are waiting to see what is in the codes and in any implementation of the Act—that could be something that is in them. Organisations and the Department have fed into what they want to see in the codes of practice from Ofcom. We will have to wait and see what is in them next month.
Jess Phillips: You could make some strong recommendations.
Q120 Catherine Fookes: Yes, we could—we should as a Committee; I am sure we will. Will the Government consider implementing new civil law legislation for a statutory regime on NCII to improve victims’ access to redress and support?
Alex Davies-Jones: There are already a wide range of civil actions that can be taken against those who are perpetrating intimate image abuse, including actions for defamation and harassment. Victims and survivors are able to get that redress directly from the perpetrator.
Q121 Catherine Fookes: We have heard of one group of victims being offered £100 each. Is that sufficient redress?
Alex Davies-Jones: I cannot comment on individual cases. The whole point is that they are done by the judiciary—if under civil law, it is a civil case, but if through another means, it is independent. What I can say is that we need to make victims and survivors aware of their rights under the victims code. That is something that is squarely within my remit and that this Government are taking seriously—promoting the victims code, refreshing it and consulting on it early next year. It is on all of us to ensure that victims and survivors know their rights under the code and know that they are eligible to get compensation.
Q122 Alex Brewer: Alex, a question for you again. Professors McGlynn and Woods have labelled Ofcom “wholly inadequate” when it comes to dealing with individual victims of NCII. The Revenge Porn Helpline agrees, unsurprisingly. Should there be a separate body specifically for enforcing the removal of NCII?
Alex Davies-Jones: I am aware of the evidence that you have heard and of people’s feedback about Ofcom. Ofcom is in a unique position here, where it has yet to publish its codes of practice. A lot of this has yet to be implemented. There are a lot of ifs, buts and maybes on how this Act will work. My lovely colleague in the Home Office sat next to me likes to use the phrase, “We have to suck it and see”—I said it so you didn’t have to—around a lot of this. Again, that is incredibly frustrating.
We have to wait, but I would add that we are not just sitting back and waiting—the legislation is but one aspect of what we are doing, and a whole stream of work is going on across Government on tackling violence against women and girls, on tech abuse and on intimate image abuse more widely, and on how we can support victims and survivors.
It will not be just legislation that fixes this, but on Ofcom, we have to wait and see in terms of the codes of practice under the existing legislation. If that is not strong enough or acting as Government intend, we will not hesitate to take action.
Jess Phillips: For lots of us, the frustration that you will hear from us is partly because we want this to happen quickly and well, and for the regime to be very bold. To give Alex some credit, she spent years of her life fighting for stuff to go into the Online Safety Bill. It took a long time to get to where we got to, however satisfactory we may think it is or is not, but we are now on the precipice of finding out whether these fears are real and what needs changing. The Government would never rule out further changes, but at the moment it is a timing issue—let us wait and see what we get from the years and years of work that this building put into that particular Act. Then we can see what might need changing.
Q123 Alex Brewer: Given that this is a crime that is happening now and increasing with alarming frequency, how long are we going to wait to see whether these changes embed in the way that we want them to?
Alex Davies-Jones: It is a very good question, which we are having close conversations with our colleagues in DSIT about. As I said in my previous answer, we are not just waiting until the legislation is implemented to take action. A whole ream of work that underpins all this is going on to tackle violence against women and girls and to support victims and survivors. Just in the MoJ, we are consulting on the victims code early next year to see whether it is fit for purpose.
I know that there is a gap in how we support victims of online and tech abuse, so I encourage the Committee to feed into that consultation when it goes live next year. We will legislate to expand the powers of the Victims’ Commissioner. We are closing the legislative gaps, but reams of work are also going on with the Department for Education, the Department of Health and Social Care, the DWP—across Government—to support victims and survivors in this area.
Jess Phillips: If we thought that a regulator and/or the Online Safety Act were an absolute panacea, we would all just sit back, but so much work has to be done. The policing response to this has absolutely nothing to do with the Online Safety Act, and we are looking at ways in which we can improve policing all violence against women and girls crimes.
On tech abuse and intimate images, in all my years of working with victims of exploitation, I have rarely known a case of exploitation that did not start with an intimate image. We are looking at how we improve the policing response across the board and ensure that intimate image abuse is not just seen in the same light as when back in the day people said, “Just a domestic.” It has been a bit, “Oh, it’s just kids sending naked pictures around.” Stamping that out, whether in policing, the courts or victims’ services, is going on now.
Chair: Thank you. Just because they were mentioned, I should say that representatives of DSIT were invited to come today, but we will write to them with any follow-up questions.
Q124 David Burton-Sampson: Alex, this is another question for you. We talked briefly about compensation. Just to expand a little on what Catherine mentioned, one of the survivors that we heard from told us that the courts had awarded her and 14 victims of NCII abuse £100 each from their abuser. I know you cannot comment on individual cases, but it would be interesting to see whether you thought that was sufficient. As far as we are aware, the criminal injuries compensation scheme does not apply to victims of online abuse. Will you commit to including that type of abuse in the scope of the scheme?
Alex Davies-Jones: I cannot comment on the first point. It is for the courts to determine awards, but I would argue that no amount of money is going to fix this. Money is not the only answer in terms of support available to victims and survivors. The Home Office funds the helpline. The Ministry of Justice funds several support services across the country for victims and survivors of these crimes. A range of support is available to victims and survivors.
The point about CICA is not true. I am aware of victims and survivors who have had redress through the scheme. The scheme is not set up for specific offences, but for what has happened to someone as the result of a violent crime. One of the awards can be made as the result of disabling mental injury. The independent body will award on a case-by-case basis, and I repeat that I am aware of victims and survivors who have had redress through the scheme.
Part of the problem is that the scheme is massively oversubscribed, as you can imagine. Its core purpose is to provide compensation to eligible victims who are injured as a direct result of violent crime. As you are all aware, we are in a very difficult financial environment and I wish that I had a pot of money that would provide compensation to all victims of crime, but that is not the case. That is why the scheme was set up with its core aims and purpose. But I am aware of victims and survivors who are eligible to claim compensation. If victims and survivors do not know that, we need to make them aware of the victims code, which explicitly sets out what they are eligible for. We need increased awareness of it so that they can claim.
Q125 David Burton-Sampson: Thank you very much for that response. It is good to get some clarity. You both know this from your personal experiences, but we met some victims recently, and the mental impact of the abuse on them is almost like non-physical rape. I know that that is a harsh way to put it. I think we need to keep that in mind: it might not be physical violence, but it absolutely is mental violence that these people undergo.
Alex Davies-Jones: It is a disabling mental injury, which is what the scheme covers.
David Burton-Sampson: I agree 100%—thank you.
Q126 Kirith Entwistle: I want to come back to this issue. We have heard from survivors who have said that they would greatly benefit from things such as counselling, therapy or support, and £100 compensation would not even really cover two sessions. That is just something I wanted to put on record: there is an explicit need for that as a core form of support for survivors of this sort of abuse.
Alex Davies-Jones: I totally agree, and that is why I made the point that financial redress is just one element of the victim support available to them. The MoJ funds a ream of support services for tackling violence against women and girls and supporting victims and survivors across the country, including therapeutic support, with grants going directly to police and crime commissioners so they can fund whatever services they see fit in their area. The Victims and Prisoners Act 2024, once it is implemented, will introduce a duty to collaborate. That will go even further, so that police and crime commissioners can commission services directly in their area to support their communities, so people can get that support.
Q127 Rachel Taylor: If someone wakes up to find their intimate images are available on a porn website, what would you advise them to do?
Alex Davies-Jones: Oh gosh. I would advise them to phone the police to report the crime, and hopefully they would get—at least once we have got to grips with it—a really good response from the police and they would know how to take action to deal with it. There is also the Revenge Porn Helpline, which I hate the name of but which the Home Office funds, to get them support and help them to get that material taken down. They have a circa 90% success rate, I think, in getting that material taken down. That is absolutely what I would advise them to do.
Q128 Rachel Taylor: But if it is on a non-compliant website and the Revenge Porn Helpline has not been able to help them to get that taken down, what would you suggest they do?
Alex Davies-Jones: If it is on a non-compliant website, sadly, it is horrendous. I would ask them to seek out victim support and go to somebody to get that support. The problem we have is that until the Online Safety Act is implemented, there is no way of getting that material taken down.
Q129 Chair: The Revenge Porn Helpline has been mentioned already, and its case load in the last four years has increased tenfold from 1,600 to 19,000 cases. Its funding is up in March—I think you know what question I am going to ask—so will the Government commit to ensuring that its funding is renewed?
Jess Phillips: First and foremost, I just want to say what amazing work they do. They have helped me personally to have some of the images of me taken down, so I feel completely and utterly indebted to them—although that will not form part of my decision making; I just declare that interest. The Budget-setting process that has just gone on in this building with the Chancellor comes back to us in the Home Office, and we are now undertaking the line-by-line work on what needs to be funded. All I would say is that the Government have declared violence against women and girls a national emergency, and it is one of the missions for Government.
Funnily enough, while the Home Office currently funds the Revenge Porn Helpline and Alex’s Department, the Ministry of Justice, funds victim support that goes out in local communities, we very much feel that in a proper violence against women and girls strategy, we need to ensure that every bit of commissioning across the system takes account of violence against women and girls, including health commissioning and education commissioning. It should not just be pots of money that get given out from the Home Office and the Ministry of Justice; it has to be a wholesale Government effort, and that is what we are working towards. Unfortunately, I cannot say at the moment exactly what our budget allowances will be for next year.
Q130 Catherine Fookes: One of the earlier witnesses talked about an online safety commissioner. Do you both think we should have one of them? She used the example of Australia.
Jess Phillips: Australia, yes—so that victims can get in touch with them.
Alex Davies-Jones: We have the Victims’ Commissioner and the Domestic Abuse Commissioner. There are a number of commissioners to whom victims and survivors can already go to get support, and we would not want to dilute their roles, voices and core purpose. We have already committed to expanding the role of the Victims’ Commissioner in primary legislation this Session. If appropriate, and if we feel that there is a gap, then of course we would not hesitate to act, but we will also raise this with the existing commissioners to see what more we can do to support victims and survivors of this abuse.
Jess Phillips: On any guidance that Ofcom produces about violence against women and girls, the statute states that they have to speak to both the Victims’ Commissioner and the Domestic Abuse Commissioner, but making those commissioners ensure that they have teeth over any process that they are part of is constantly an issue on our agenda.
Q131 Catherine Fookes: There is such a huge amount of these images, and online abuse is such a huge area, that it may be that that is what is needed.
Jess Phillips: Yes, a specific one. Again, we have to go back to suck it and see—you are right. The trouble with all violence against women and girls crimes is volume. When you are trying to tackle really serious, high-volume crimes, even if I said, “Let’s have a person for that,” is one enough? We have to see how the regime that has been set out goes, but as Alex and I said, and as part of the violence against women and girls strategy and mission, we will keep a watching brief on whether it is working and propose changes if it does not.
Q132 Chair: We know the Chancellor set out very strict measures in the Budget on how we spend taxpayers’ money. Will the Government consider ringfencing a proportion of the fines levied against tech companies by Ofcom to fund the provision of support services for victims of online abuse?
Alex Davies-Jones: My understanding is that if the fines are issued by Ofcom—once the Act is implemented next year—that money will come back to the Government.
Chair: Okay.
Jess Phillips: I suppose it is for us to then factor that in.
Chair: So it will come back to the Government, and it is yet to be determined how that would be spent or allocated.
Q133 Natalie Fleet: Alex, you recently said that you were considering what further legislative measures were needed to ban sexual deepfakes and nudification apps. What measures are you considering?
Alex Davies-Jones: A range of them, in terms of: how do we create the most effective law? What is going to work in this field? Nothing is off the table in our current considerations of how best to proceed in creating these new offences and taking action in that space. I am keen to hear the recommendations from this Committee.
Q134 Natalie Fleet: Should victims of synthetic NCII be able to claim damages from the relevant deepfake creator platforms and host websites?
Alex Davies-Jones: We have already talked about redress in terms of civil action that can be taken and so on, directly from perpetrators. I know that one of the things that has been considered is: should victims and survivors be able to take redress from the platforms directly? Again, that should potentially be considered.
Q135 Natalie Fleet: Should they also be able to claim damages from search engines that produce the results for those websites and platforms?
Alex Davies-Jones: This is something we debated really heavily during the creation of the Online Safety Act 2023. I was the Opposition shadow Minister leading for the Labour party on the Bill, and we went around the houses on how one accesses compensation and how one gets that redress. We currently have the Ofcom range of powers that we have inherited from the previous Government; they are producing their codes of practice, and they have the powers to issue the fines. I think we have to wait and see. But as we have said, nothing is off the table in terms of what could be looked at.
Q136 Natalie Fleet: The Government committed to strengthening police training on violence against women and girls. Can you provide more detail on what this will mean in practice for deepfake NCII specifically?
Jess Phillips: When it comes to exactly what it needs to be for deepfake and NCII, the Government do not write the police training. It is usually written in concert with experts, and in fact I think they are working with some of the experts that you have had in front of you today. The College of Policing works with people, such as those who run the Revenge Porn Helpline and others, to create the content that might be needed.
The Government do not invent the police training; the role of the Government is not only to oversee that process and monitor whether we think it is good enough, giving advice and guidance, but to monitor whether police forces are undertaking that training. A constant bugbear of mine was that not every force in the country was trained in domestic abuse, which is unacceptable and not something that the new Government will tolerate. Exactly what the training regime will be in the future, and what will be in it, is up to the College of Policing, but the Government will be keeping an incredibly close eye and using whatever power we have to insist that that is everywhere.
Q137 Natalie Fleet: Microsoft has asked that funding be made available for police training and judicial education specific to deepfake NCII harm. Will the Government make that funding available?
Alex Davies-Jones: Microsoft has asked for funding!
Jess Phillips: The announcements about funding, including police funding settlements, will be coming out in due course. The regime in the past with regard to police funding for specific training—I go back to the example of domestic abuse—is that the Home Office has funded forces where they have not done it. I have to say, I think that is unacceptable.
We will not change the culture of violence against women and girls in any of the institutions that we want to be better—whether that is when you first come forward to the police about it, or in our courts, or when you go to the doctor and talk to them about your mental health needs because of what has happened to you—and we will not make it better by central Government Departments giving out pots of money for schemes that we wish to see happen. We might have to do that to get those things to happen, but those things have to become core.
The volume of this crime, the pain that you have seen from the victims you have spoken to and the data you have been given by experts tell me that this should become standard day-to-day police work. Therefore, it is very likely that the Home Office will work to help develop the training and will fund specific pilots to see if things work—but, in a push-back to Microsoft, they are very welcome to fund that should they want to. It is the responsibility of police forces around the country to ensure that their police officers are trained properly and have that training updated appropriately.
Q138 Kirith Entwistle: Alex, you mentioned gaps in the protection for victims in current legislation around intimate image abuse. Do those gaps include non-sexual, but culturally sensitive images? For example, women being pictured without headscarves.
Alex Davies-Jones: This is a complex area, and I know it has been discussed in connection with previous legislation, previous Bills that were brought forward in this House. We need to be very clear that what counts as intimate for one person can be very different for someone else. It can also be different between different communities and groups. The Law Commission has already looked at this and concluded that it would be impossible to craft a definition that suits everyone and that therefore it could result in overcriminalisation. However, where such images are uploaded, they could be caught under existing offences, such as blackmail or harassment. But we are keeping this area of law under review and will not hesitate to act if there are gaps.
Q139 Kirith Entwistle: How do the Government intend to legislate against that type of abuse? You have outlined some of the challenges, but is there any way to legislate against it outside existing parameters?
Alex Davies-Jones: The Law Commission have already looked at this. They concluded that it would not be possible to do so. That does not necessarily mean we cannot, but we respect the Law Commission and their independence in looking at this quite closely. It is also important to note that these non-consensual images do not necessarily need to be sexual in nature. The images currently covered by offences include things like breastfeeding or toileting, for want of a better word, but it is not black and white, and therefore it is really difficult to differentiate between what is intimate for one person and what is intimate for another.
Laura Weight: To add to that, the Committee asked a few questions—I know the previous panel discussed this—about the difficulty of the content being, in and of itself, illegal. The comparison with child sexual abuse images and with terrorism material has been run a few times; I think the difference is that the offences—the priority offences for the purposes of the OSA—have that consent element. We have heard here how difficult that is. The images themselves—unless it is clear that an image has been created without consent—are difficult for content moderation. That is why we will be reliant on the Ofcom guidance and looking at how that is assessed. To do that in other circumstances, for the types of images that you are talking about, those same considerations would be there, and it would be potentially even more difficult for that content moderation to happen.
Q140 Catherine Fookes: Moving on to the subject of sextortion—I am not sure who is most appropriate to ask, but perhaps Jess—what are the Government going to do to tackle the threat of sextortion by those operating overseas?
Jess Phillips: I have the exact data on this, which I will just find for you.
Catherine Fookes: While you are looking it up: this is very topical, because—
Jess Phillips: I have a specific thing specifically on what we are doing abroad, in answer to your question. But carry on, as you were.
Catherine Fookes: This has partly been brought to light by a campaign called Fearless, which has been launched recently, which I am sure you know—
Jess Phillips: Yes, it was on the news heavily this week. It is funny because, I have to say, for somebody who has spent a lot of my life and career working in and around this space, this particular new threat that is really specifically targeting boys and young men is absolutely harrowing. Most of it is happening under the auspices of, basically, the social media sites that would be covered by the Online Safety Act. However, it is fundamentally a crime of both sexual abuse and blackmail.
The cases that have been in the news in the past week involve a terrible case of Nigerian gangs. The National Crime Agency, which sits under the auspices of the Home Office, works incredibly closely with partners across the world to ensure that we are looking at detection. There have been examples. One of the most depressing parts of my new job is that, every day, I read the NCA outline of examples of people who have been found to have done these crimes and been criminalised for them. But yes, there is a global taskforce on child sex abuse, in which Britain takes a very firm role.
Q141 Natalie Fleet: What are the challenges facing police with regards to bringing to justice criminal gangs overseas committing sextortion?
Jess Phillips: The challenges are as they are for all types of this issue. It is the same with CSAM and human trafficking. We will not do it unless we do it globally—it has to be done in concert with other countries around the world. Most of the CSAM material, for example, and online terrorism, which comes under most of this data, is found by other partners. The vast majority of CSAM that comes into our country for local law enforcement comes from the USA. These international systems are set up, but of course it is very, very difficult—not impossible—for us to take action in other countries. And that does happen.
Q142 Chair: Just to follow up on Natalie’s question, as sextortion becomes more of an issue, will you be using the existing systems that you have in place with CSAM, for example, or do you envisage that there will need to be a separate one?
Jess Phillips: It would entirely depend on the age of the person, because of the CSAM legislation. In lots of the cases, the Revenge Porn Helpline is seemingly the best repository for data on exactly what this is looking like. I think 34% of their caseload is now sextortion. They were the ones who first alerted me to it. Where people are under 18, this is across the board an incredibly complicated area of law. If somebody is putting their images online, they cannot, as a child, consent to that. That is an easy line to draw in the law.
It is a relatively new phenomenon that is growing. We are spending a lot of time thinking about exactly what needs to happen in this particular space, but at the moment it would still fall along the lines of 18 and under.
Q143 Catherine Fookes: The Molly Rose Foundation said that Ofcom’s approach to sextortion is disappointing and will do little. Does the Online Safety Act extend the necessary powers to Ofcom to force social media companies to act?
Jess Phillips: Again, it does, only on the basis of the age of the victim of this crime. However, I go back to my original point about the hashing. I see no reason why, for adult sexual abuse images or intimate image abuse, hashing technology could not be used. I would very much encourage the family in the terrible case of Murray to feed into the consultation, even though it is a violence against women and girls consultation that will be done by Ofcom. Sextortion cases will form part of that consultation, and I encourage them to take part in that.
Q144 Chair: Do you think that the cliff edge between a child and an adult is far too violent a drop, particularly not taking into consideration vulnerable young adults as well?
Jess Phillips: It’s far too violent a drop in every single area where it happens, whether it is in care work or housing. There are so many areas in society where the cliff edge is 18, and 16 in the case of some offences that we deal with. It is complex. The law has to draw a line somewhere. I suppose it is from the point of view of protecting children and making everything as robust as it possibly can be for children. If the law is a brutal tool, it is for those who provide services, like Governments, to make sure that there is a softened blow—but at the moment I see too many examples where that is not the case.
Q145 David Burton-Sampson: When we met some victims, we became very aware of the deep mental impact that this has had on them. They alluded to the fact that there had been no therapeutic support for them as victims and survivors of this crime. Are you aware of any therapeutic support? If not, do you think that more should be done to support victims?
Alex Davies-Jones: We have already talked about what both Departments fund in terms of victim support. The majority of it is done through the MoJ. We have a rape and sexual abuse support fund, which provides £26 million to over 60 specialist support organisations that offer tailored support programmes to all victims, including victims of non-consensual intimate image abuse. It is there to help them cope with the disabling mental injuries that this crime causes and the severe mental impact it has. As I said, we fund police and crime commissioners directly from the MoJ so that they can provide services locally and whatever tailored support is needed, so that is there.
Part of the problem is the awareness of the victims code. Victims and survivors need to be aware of exactly what they are entitled to, so we must do a whole suite of work to promote the code and consult on whether it is fit for purpose. We are doing all those things.
Q146 Catherine Fookes: It is clear from today’s session and to those of us who have worked in this space that violence against women and girls is an epidemic. I was really pleased to hear the representative from the police earlier say that it deserves the same attention as terrorism. We must absolutely change our culture to deal with it. One of the areas we have not talked about much today is schools, what we can do around education and how we stamp this out from the beginning in our schools. What are you doing in that regard, please?
Jess Phillips: For decades, we have sought to do very little other than basically tidy up a mess of crimes already perpetrated, but preventing them has become one of the major pillars of the work that this Government will seek to do. About eight Committees of the House of Commons, including a forebear of this Committee, which I sat on, and the Education Committee, said that we needed compulsory healthy relationship education, and that came into law in 2016. However, pretty much since that legislation passed, progress has relatively stalled in that space, and there is a whole history of things that have gone wrong. The Department that Alex and I work closest with in our mission is the Department for Education.
What works to actually prevent this? We need to design and put into schools programmes that not only deal with all the different sorts of violence against women and girls and safeguarding, but contain long-lasting, preventive measures for the future. At the moment, I feel a bit like we just do something because it is better than doing nothing, and we do not know exactly what works. A huge part of our mission—Alex and I co-chair the violence against women and girls mission board—will be finding what that is and spreading it about like muck.
Q147 Chair: You have talked about changing the culture, and that is a huge piece of education, but we have heard time and again that one of the cultures that has failed victims and survivors is the culture of the police, which lacks infrastructure regarding NCII. The survivors we spoke to referred to a culture of misogyny. They were really blunt and very clear that they felt that the police just did not care about it, and even intimidated them at some stages. One survivor even told us that the police referred to her as a prostitute. We can train individual police officers, but what can we do to tackle the culture within the police with regard to NCII and VAWG?
Jess Phillips: We will achieve nothing in the criminal justice system unless we do what you have just outlined. Incidents of police-perpetrated abuse and misogyny—a number of cases have come to light in public—are undoubtedly shocking, so we have to do everything possible. We are obviously in the middle of a two-part inquiry—the Angiolini inquiry. That has started, and in fact in the case of Sarah Everard, part of the story involves an intimate image. That is why I say you cannot separate the two.
We have committed to ensuring that the first part of the Angiolini review gets put in place in full, but we are waiting for the second part of the review, which is not just about police cultures and everything, but specifically about how police deal with the seriousness of this crime to stop it escalating into, as in the case of Sarah Everard, a much more serious sexual crime that ended her life.
For the first time, there will be proper standards and vetting. That comes out of the Angiolini review, and also from what the Home Secretary has set out about what will have to be the standards. Misogyny found in police officers will not be tolerated, as part of a vetting regime that does not just vet you when you go in. What has to be created is a regime with constant updates.
There is another issue about violence against women and girls within police forces; that is about, for want of a better word, the “sexiness” of it within police forces. It is very appealing to do public order policing or firearms policing. We have got to make this something that police officers feel is at the very core. It is not just about getting rid of the misogynistic bad apples. This is about making violence against women and girls something so core to policing that being good at it is the reason that you get promoted.
Alex Davies-Jones: I want to make a quick point on that as well—a political point, if I may, around leadership. We no longer, thankfully, have a Home Secretary who makes spiking jokes or a Prime Minister who makes sexist comments. We have a Home Secretary and a Prime Minister who have made tackling violence against women and girls a political priority. That does really count for something in terms of the direction of this country, the leadership and the political will to tackle this head-on. I think that is really important.
Q148 Chair: I have a couple of points of clarification before we close the session. At the first panel, Sam Millar was very clear that it was not within the police’s powers to remove or take a device, or remove images off a device, from somebody who had been convicted. Perpetrators could be given back their devices, and quite often are given back their devices, and the police do not have the power to remove those images. Is that your reading of where we stand at the moment, or does clarity need to be given on the guidelines?
Alex Davies-Jones: I made some previous comments around the courts managing to obtain the devices and getting the content removed. The Sentencing Council is looking at this to see whether the guidance needs to be clearer. It is currently within the courts’ powers to do this. They can do it, and we are—well, the Sentencing Council is—looking to see whether that guidance needs to be clearer.
Q149 Chair: Okay. Because in the instance of CSAM, that just wouldn’t be the case, would it?
Alex Davies-Jones: Yes.
Q150 Chair: I want to come back to the fact that this is an international issue and we will not be able to solve this problem without dealing with international actors. If someone in Moscow uploaded the video of Georgia Harrison, for example, then ISPs in this country should be blocking that. Is that correct?
Jess Phillips: An example of where this has been successful was on some of the nudification apps. They had already stopped UK users being able to use them. They had blocked UK users because of UK law.
Alex Davies-Jones: Or the threat to create UK law.
Jess Phillips: The threat—we didn’t even need to do it. The more we talk about this, the better. This is really important. Making tech companies be better and do better and carve out things for us, whether they are in California or Moscow, comes from even the threat from a Committee like yourselves. It is really important that we keep talking about it.
Yes. The answer within the regime is that it is still illegal content if it can be seen here—if the user is here.
Q151 Chair: The unfortunate thing is that the companies are not currently willing to do that. Microsoft and Google gave evidence and seemed very much of the belief that it is about context. Because this is not illegal in the same way—a child cannot give consent—they were erring on the side of caution. If it were made illegal in the same way child sexual abuse material is, they would not have that wriggle room.
Jess Phillips: All I would say back to them is that it is illegal. Intimate image abuse is illegal, so I would push back to them. We might not morally have got to the point where we find it as repugnant as child abuse—although I have to say that sometimes I sit here and people make out that everything is absolutely rosy with regard to tech companies and child abuse, and I would just like to put a line there that says that is not necessarily the case—but that is a cultural thing that will have to come in as part of the new regime, because it is illegal. Terrorism, child abuse and sharing of intimate images have the same level in the Act.
Laura Weight: Again, it is the non-consensual intimate image sharing that is the offence; that is what is being added as a priority offence—sharing those images. My understanding of the way that interacts with the Act is that if it would be illegal if it was happening in this country, then even if it is happening outside the country—if the image is being shared from outside the UK—that would still be caught by the provisions of the OSA, in respect of that priority offence.
Q152 Chair: According to the information and evidence that we heard from Microsoft and Google, they are not currently treating NCII as illegal.
Laura Weight: It is not currently a priority offence for the Act; that will be why. As the Ministers have said, non-consensual sharing of the images is illegal within the criminal law. The addition of that to the schedule for the OSA will then bite through the Ofcom—
Alex Davies-Jones: The Ofcom guidance and implementation of the Act come next year.
Q153 Chair: So you are expecting to see that change. Will you expect to see that change from Microsoft, Google and other platforms?
Alex Davies-Jones: indicated assent.
Jess Phillips: Tell you what: I expect to see it change—
Alex Davies-Jones: Now.
Jess Phillips: Today.
Alex Davies-Jones: They don’t have to wait.
Jess Phillips: Crack on, once again—I would like to go back to my previous comment that they could, absolutely, crack on today. Wouldn’t that be lovely?
Q154 Chair: And if they were not to comply with that following the OSA and those changes next year, they could be open to fines.
Alex Davies-Jones: Yes, under the range of powers that Ofcom has to enforce the Act.
Chair: Thank you very much. Does anybody else have any other questions? No. That brings proceedings to a close. Thank you very much.