Supplementary written evidence submitted by David Wright CBE, CEO SWGfL [IIA0008]
Dear Sarah,
We are writing to provide detailed follow-up evidence for the Women and Equalities Committee following our oral testimony on 6 November 2024, alongside subsequent evidence from Professors Clare McGlynn and Lorna Woods and the contributions made by Ministers Jess Phillips and Alex Davies-Jones.
We welcome the Online Safety Act’s (OSA) provisions to criminalise the sharing of non-consensual intimate images (NCII) and to extend protections to synthetically generated sexual content, or “deepfakes”. These are commendable steps forward. However, we strongly challenge the Ministers’ claims that the Act will make NCII content itself illegal and, by extension, solve the problem of access to such content on non-compliant platforms. We believe this assertion is inaccurate and risks creating a false sense of security for victims, policymakers, and the public.
Challenging Ministerial Claims
In her evidence to the Committee, Minister Jess Phillips stated:
“This law will mean that the material itself is illegal, and therefore there will be very clear lines about its removal” (Q112).
Similarly, Minister Alex Davies-Jones asserted:
“Under the Online Safety Act, platforms must treat NCII content in the same way as they treat child sexual abuse material (CSAM). This ensures swift action to remove such content and prevent it from being accessible online” (Q113).
These statements are misleading for the following reasons:
“The criminal legislation does not make the content (the actual images/videos) illegal in and of itself. The sharing of such material is criminalised, but the images remain lawful unless specifically designated as illegal content through platform obligations.” (Additional Evidence, 22 November 2024, para 3)
The OSA places the onus on platforms to assess whether specific NCII content meets the threshold of “illegal content”. This case-by-case judgement lacks the clarity and uniformity required to guarantee swift removal.
“NCII is not treated with the same urgency or systemic response as CSAM. Platforms are only required to act when they determine there are reasonable grounds to infer a criminal offence has been committed”.
This results in inconsistent enforcement and leaves victims exposed to further, ongoing harm.
Why Additional Mechanisms Are Essential
The Online Safety Act cannot adequately address the circulation of NCII content without complementary mechanisms, particularly ISP blocking. As we noted in our own evidence, platforms that refuse to comply with takedown requests—or operate outside UK jurisdiction—continue to host significant volumes of NCII content.
Professor McGlynn and Professor Woods highlighted the inadequacy of current enforcement mechanisms, stating:
“Even if Ofcom were to issue a business disruption order against a non-compliant platform, this process is lengthy, bureaucratic, and ineffective for dealing with the scale and dynamic nature of NCII content. ISP blocking is essential to address this gap”.
Ofcom’s enforcement and business disruption powers are aimed at systemic failures by platforms to comply; they are not intended to target individual pieces of content that cause persistent harm.
Recommendations
To close the significant gaps in the current framework, we propose the following:
A Survivor’s Voice
A survivor who has followed the work of the Women and Equalities Committee has also provided a statement in response to the Ministers’ evidence. We include it here, with their anonymity protected, to demonstrate the impact of the Ministers’ statements:
“The ministers' reticence to understand that the OSA does not go far enough to protect survivors from the harms of NCII is deeply disappointing. Survivors like myself are saddened by the dismissive response to the recommendations made by advocates and experts such as Sophie Mortimer of the Revenge Porn Helpline and Professors McGlynn and Woods that work on the ground in this space daily and see the real impact that the current gaps in legislation continue to have on people like me. Such experts would not waste their time continually asking for these protections to be implemented if they were not vital lifesaving measures.
If NCII content were considered illegal material to create and possess on a par with the treatment of CSAM and Terrorism it would go a long way to enable disenfranchised and vulnerable survivors like myself to rejoin society and live freely, knowing that actions such as ISP blocking could be taken to prevent the spread of content online. I have been a client of the Revenge Porn Helpline for more than five years. Despite their tireless ongoing work to remove my content, it is still not safe for me to live, work or maintain an online presence like any other young woman in this country due to the lingering content on non-compliant sites. If it were possible for my content to be blocked, it would unburden me from the continual torment I suffer at the thought of the thousands of people deriving pleasure from my convicted abuser's brutal and sadistic handiwork. Knowing that at least in my home country, nobody can view my worst moments online any longer would allow me to safely start to take back control of my life.”
Conclusion
The Online Safety Act is a step forward in prosecuting NCII offenders, but its claims of addressing the imagery itself are overstated. Without mechanisms like ISP blocking and proactive hashing, victims will continue to suffer from the ongoing circulation of their images.
We urge the Committee to challenge the Ministers’ assertions and advocate for these additional measures, ensuring that NCII content is treated with the same urgency and seriousness as CSAM.
Yours sincerely,
December 2024
SWGfL, Belvedere House, Woodwater Park, Pynes Hill, Exeter, EX2 5WS
Tel. No. 0845 6013203 Fax 01392 366494 Registered in England and Wales No. 5589479
Charity No. (England and Wales – 1120354, Scotland SC051351)
VAT Reg. No. 880 8618 88 Email enquiries@swgfl.org.uk Website www.swgfl.org.uk