Further written evidence submitted by Professor Clare McGlynn KC (Hon), Durham Law School, Durham University [IIA005]

 

This additional evidence was prepared following the evidence session before the Committee on 20 November 2024. It provides further information to the Committee on the following issues:

 

- ‘Illegal’ content and removing NCII from non-compliant websites

- Proposals to criminalise the creation of sexually explicit deepfakes

- The law on non-consensual taking of intimate images requires reform

- The inadequacies of the current civil law to tackle image-based abuse and options for reform

- Police failings in investigating image-based abuse and lessons for future reforms

- Culturally intimate images and possible steps forward

- A new regulatory body such as an Online Safety Commission

 

‘Illegal’ content and removing NCII from non-compliant websites

 

  1. There is considerable confusion and obfuscation about this issue. My earlier evidence sets out in detail the powers in the Online Safety Act.[1]

 

Summary

  1. While the act of sharing intimate images without consent is prohibited conduct (the behaviour is ‘illegal’), the criminal legislation does not make the content (the actual images/videos) illegal in and of itself.
  2. CSAM is different because it is an offence to possess CSAM (and to create it).
  3. NCII is only ‘illegal’ content for the purposes of platform obligations under the Online Safety Act.
  4. The obligations under the Online Safety Act, as set out by Ofcom in its guidance, make clear that the treatment of CSAM is different from that of other content such as NCII. They are not treated the same.
  5. Under the Online Safety Act, platforms and search services are only required to take proportionate steps to prevent users encountering ‘illegal’ content and to remove it where identified.
  6. These obligations only apply where services have made an illegal content judgement in relation to each item of content (according to Ofcom guidance). This is wholly different to CSAM.

 

What is the key issue?

 

  1. The key issue raised by the Revenge Porn Helpline is that there are websites that refuse to remove some NCII. These are websites outside of the UK that have no interest in, or intention of, ever complying with a Helpline request. There are thousands of these images circulating online, causing immense harm to many individuals.

 

 

Clarifying what changed with the Online Safety Act

 

Criminal law changes

  1. The Act changed the criminal law on non-consensual sharing of intimate images to make this a consent-based offence and to include AI/synthetic/deepfake images. This provision was an amendment to the Sexual Offences Act 2003.
  2. This is a criminal offence and therefore makes particular behaviour unlawful. It does not make the content unlawful.
  3. For example, if an individual takes an intimate image without consent and then shares it without consent, they have committed the criminal offence of non-consensual sharing of intimate images (and possibly an offence of taking). But the image itself is not ‘illegal’.
  4. An image such as this would only become illegal if it were a criminal offence to possess such material, as is the case with the extreme pornography laws and CSAM offences.

 

Regulatory changes in the Online Safety Act and obligations on platforms

  1. The Act introduced various regulatory requirements on platforms. It made the non-consensual sharing offence a priority offence meaning that the platforms have particular obligations in relation to that offence (and higher than in relation to non-priority offences).
  2. This means that the platforms which are subject to the Act have to take proportionate steps to prevent users encountering priority illegal content and have to remove it swiftly when they are aware of it.
  3. Platforms have to consider each piece of content and decide if there are ‘reasonable grounds to infer’ that a criminal offence has been committed. If so, their obligations come into play such as removing material.
  4. If the platform determines that there are reasonable grounds to infer that a criminal offence has been committed, the content is ‘illegal content’ for the purposes of the Act.
  5. The content does not become illegal in and of itself (as CSAM is) for all or any purposes. It is only ‘illegal content’ for the purposes of the Online Safety Act obligations. This means that it is illegal in terms of the platform obligations under the Act, such as taking proportionate measures to remove the content.
  6. Therefore, when the Minister, in her evidence (qu 112), refers to the image being illegal, she means illegal in the context of the obligations under the Online Safety Act. Similarly, when referring to the material being illegal at qu 113, this is referring to it being ‘illegal content’ for the purposes of the Online Safety Act.

 

Online Safety Act and obligations regarding CSAM

  1. The obligations set out by Ofcom under the Online Safety Act in relation to CSAM are different to other forms of content.
  2. CSAM and NCII are not treated the same.
  3. Under the Online Safety Act, user-to-user services (eg social media) must prevent user exposure to such content and minimise the duration of any such material being present.

Search services guidance regarding CSAM

  1. Search services must employ systems to reduce the risk of encountering CSAM.
  2. The real differences between CSAM and other content that is illegal (for the purposes of the Act) can be seen, for example, in Ofcom’s draft Code of Practice on Illegal Harms. It provides that search engines should ensure that CSAM URLs are deindexed based on a regularly updated list produced by an expert body (ie the IWF lists).[2] There is detailed provision on ensuring that the body identifying the CSAM is authorised to do so, and that the lists and information are kept up to date.
  3. There is no such provision in relation to material such as NCII.
  4. Nonetheless, this shows the sort of measures and processes that are required to ensure that search engines de-index such content and that other platforms take similarly strong action.

User-to-user guidance and CSAM

  1. Similarly, the Ofcom guidance sets out the detailed obligations of user-to-user services (eg social media) in relation to CSAM in Annex 8 of its Illegal Content Guidance such as hash matching and detecting CSAM.[3]
  2. The guidance in relation to other forms of content such as NCII is different and does not include the same processes as for CSAM.
  3. In relation to NCII, Ofcom’s guidance on those specific offences applies. It states that services have to make an illegal content judgement in relation to each item of content, decide whether there are reasonable grounds to infer that a criminal offence has taken place, and then take proportionate steps to reduce the likelihood of individuals encountering such material and to remove it where identified.
  4. It is technically possible that two different service providers could take different views on whether or not an item of content meets the threshold of ‘reasonable grounds to infer’ that a criminal offence has taken place in relation to that content. A victim could theoretically be faced with two platforms making different decisions. This would not happen in relation to CSAM.

 

Google’s obligations and (incorrect) comparisons to CSAM

  1. At qu 113, the Chair reported that Google had stated that it downlisted NCII, rather than removed it, and the Minister said that the Online Safety Act means that platforms will have to treat NCII in the same way as CSAM.
  2. This is not the case based on the current draft Ofcom guidance. See above.
  3. To clarify, under the Act, Google, as a search engine, has obligations to use ‘proportionate systems and processes designed to minimise the risk of individuals encountering search content’ that is priority illegal content and other illegal content of which they have been made aware (section 27).
  4. It is not yet clear exactly what will constitute a proportionate response to minimising encountering NCII.
  5. However, the material is not of the same nature as CSAM where Google will remove all such material both proactively and in response to reports. See above that the obligations regarding CSAM are set out in the detailed Guidance from Ofcom.
  6. It appears that Google is suggesting that if there were a process by which specific items of content were classed as illegal, it would then take steps to treat that content similarly to CSAM. That requires a process, outlined above (and as detailed in the Guidance regarding CSAM), that designates specific content as illegal for the purposes of ensuring it is removed by platforms.

 

What does this mean for non-compliant websites?

  1. If they are subject to the Online Safety Act, they have an obligation to prevent users encountering NCII and to remove it where they have reasonable grounds to infer that a criminal offence has been committed in the sharing of the content.
  2. If the platform is subject to the Act, and there are circumstances where there is a conviction relating to the non-consensual distribution of content, there would be clear grounds for saying that the obligation on the platform to act under the Online Safety Act is engaged.
  3. So, non-compliant platforms should take steps to remove material, if they are within the scope of the OSA.
  4. But if they refuse to act, we are left considering what the enforcement options are.

 

Ofcom powers under the Online Safety Act were not designed for these cases

  1. Ofcom powers are wholly inadequate for this purpose.
  2. The Act and powers of Ofcom were not designed with this problem of individual pieces of content in mind. The Act focuses on general systems and processes.
  3. The powers of Ofcom, such as business disruption orders, were designed as exceptional (hence notification to the Secretary of State) and are at the end of a lengthy, bureaucratic process.
  4. These powers cannot deal with the ongoing issue of thousands of images across multiple websites. This is not a one-off situation that could be settled, even if this lengthy process resulted in a business disruption order requiring ISPs to block non-compliant websites.
  5. Even if, one day, a business disruption order was made, the next day the content might be uploaded to a different non-compliant website, and the whole long (and expensive) process would have to start again.
  6. Even if this process was effective, would Ofcom be able to prioritise such cases? And on an on-going basis?

 

Do we need to make NCII ‘illegal’?

  1. As noted above, even following a criminal conviction, the content itself is not illegal. But there is no straightforward way to make such content ‘illegal’. The comparison is with CSAM, where possession is an offence. That is more straightforward because what constitutes CSAM is clear (ie it involves under-18s).
  2. This is not the case with NCII, which requires a determination of non-consent that is not apparent from the face of the content.
  3. The sharing of the material – the behaviour – is illegal. But not the content itself.
  4. Therefore, there is no way to make it ‘illegal’ other than through a process by which the content is designated as ‘illegal’.
  5. This might be through a court order at the time of a conviction.
  6. Or it might be through a civil process (such as that referred to elsewhere in this evidence submission) with a court determination that the content is illegal and further orders are made to remove/delete the material.
  7. Then, there needs to be a process by which the designated illegal material is ordered to be removed from websites. This could be via the civil process identified already. That order could be made against a non-compliant platform.
  8. If that platform refused to comply, a court order could be made to ISPs to block the website.
  9. Note that ISPs are not part of the Online Safety Act at present.
  10. Alternatively, a different administrative process could be put in place: following the court order stating that ISPs should block the content, a body such as the Revenge Porn Helpline (co-operating with the Internet Watch Foundation) would be notified that the material has been designated illegal and that the website has refused to remove it, so that ISPs must block it. The IWF would then add the website to its list of material to be blocked.

 

Proposals to criminalise the creation of sexually explicit deepfakes

  1. The Ministers said that they are currently considering proposals to criminalise the creation of sexually explicit deepfakes.
  2. Any such provision must be comprehensive if it is to be effective in terms of redress for victims (no loopholes or easy defences).[4]
  3. The law should be consent-based, focused on the harms to victims, rather than the motives of perpetrators. If there is a requirement to prove specific motives, such as sexual gratification or intention to cause harm, not all forms of creation will be covered.
  4. If motives are required to be proven, it will be relatively straightforward for a perpetrator to claim that the material was created for artistic purposes or for humour. We know from experience of the non-consensual sharing offence that these defences are easily made and difficult to challenge, and they inhibit further police investigations. Often, even where it appears that someone shared an image, or created a deepfake, to cause distress to another, there is no actual evidence of that motive and so prosecutions are not taken forward.
  5. The message sent by the criminal law should be that it is wrong to make a sexually explicit deepfake without consent; not that it is only wrong if it can be proven that it was done for certain specific motives.
  6. Further, we know that motive thresholds are difficult to prove and inhibit police investigations (which is why they were removed from the non-consensual sharing offence). A consent-based approach aligns with the current non-consensual sharing offence and is a more appropriate basis for education and campaigns to change attitudes.
  7. The definition of intimate image should be the same as in the other intimate image abuse offences. The previous Government’s proposal was more limited and would have excluded images such as where nipples are pixelated or have emojis over them.
  8. Solicitation should be clearly covered. While some cases of solicitation might be covered by the current law, this is unlikely, particularly if the other party is outside the jurisdiction. Including this specifically will also make this element of the law better known and therefore a better deterrent.

 

The law on non-consensual taking of intimate images requires reform

 

  1. The Criminal Justice Bill under the previous government also included provisions to introduce a consent-based offence prohibiting the taking of intimate images without consent. This fell with the General Election.
  2. The provision implemented one of the Law Commission recommendations (which followed recommendations from previous research and civil society) and would make the law much clearer and more straightforward.

 

Inadequacy of current civil law and options for reform

 

  1. Recommendations to strengthen the civil law have been made by experts in the field for many years, including myself and colleagues,[5] and also form part of the campaign by the End Violence Against Women Coalition, #NotYourPorn and GlamourUK to tackle image-based abuse.[6]

 

Why are civil orders/options required?

 

Victims’ rights to justice and redress

  1. Civil remedies recognise victim-survivors’ desire for avenues to support and redress beyond the criminal law.
  2. They give victims the ability to take fast, effective, and at times pre-emptive action to have images removed and limit further distribution, with minimal additional stress to victims.
  3. They address the borderless nature of online distribution channels by targeting both content hosts and individuals who share images without consent.

Failings of the criminal law and criminal justice system

  1. The criminal law is largely failing victims and there is little chance of that changing in the near future, even if there are some improvements in training and guidance.
  2. It is not clear why all the women and girls now being victimised should not be able to seek justice and redress now, rather than being told they have to wait until the whole criminal justice process has been transformed before they secure some measure of justice.
  3. Understandably, not all victims want to report to the police. They deserve additional justice and redress options. 
  4. The current law does not provide the needed orders and options for victims of these specific crimes.

Requirement for distinct powers relating to image based abuse

  1. For non-consensual sharing, victims want the images removed as soon as possible. Current laws are inadequate for this purpose.
  2. Image-based abuse is different from many other physical sexual offences. This is why a holistic, comprehensive approach is needed, recognising the distinct online and tech-facilitated nature of image-based abuse.

Need to strengthen civil law

  1. The current civil law options are confusing, complex and largely inaccessible due to the cost and need for specialist legal advice.

 

What are the current civil law options for image-based abuse?

 

  1. There are a range of civil law options currently available for victims of image based sexual abuse, as mentioned by Ministers in their evidence to the Committee on 20 November 2024, such as misuse of private information, defamation and similar (qu 120). There are also other rights under copyright law and GDPR.
  2. However, these actions are inaccessible to ordinary people.

 

What are the problems with the current civil law options?

 

Piecemeal laws that do not cover all circumstances of image based abuse

  1. While there are a range of civil actions that an individual could bring, as well as rights under GDPR rules, these are piecemeal. They do not cover all cases of image based abuse; that is not what they were designed for.

Confusing and complex law that requires specialist and expensive legal advice

  1. Tort law in this area can be confusing, especially working out which torts, or other civil claims such as copyright and GDPR, are applicable in each particular case.
  2. Specialist legal advice is required to understand the law; and even amongst civil lawyers, there is a lack of knowledge of image-based abuse specifically. Finding a lawyer with this specific expertise is not straightforward.

Civil laws largely unknown by victims and the public

  1. As the laws are piecemeal and confusing, there is little general knowledge about these possibilities among victims, support organisations and lawyers themselves.
  2. More could be done to raise awareness, but due to the complexity of the law, that would itself be a challenge. Also, due to the inaccessibility of current remedies (the need for specialist legal advice and large funds to pay court and legal fees), there is not a strong case for mounting such a campaign.

Expensive and therefore inaccessible for most people

  1. The current civil law is simply not open to ordinary people. There is no legal aid. The cost of legal action is prohibitive in light of court fees (potentially in the region of £10k or more) as well as legal fees.

Largely impossible for a victim to take action, even if supported

  1. Because the civil law options are confusing, expensive and largely unknown, it is simply not an option for most victims.

 

What could be done?

  1. Follow the practice of the UK’s Protection from Harassment Act 1997 in introducing a statutory civil wrong (to sit alongside criminal offences).
  2. Follow best practice in other jurisdictions, such as British Columbia, Canada, by introducing a comprehensive approach covering civil and criminal laws.[7]

 

Enact a statutory civil law regime

  1. Introduce statutory civil wrongs relating to image-based abuse.
  2. Detail the civil law orders that an individual could apply for, such as orders requiring the removal or deletion of material and orders against non-compliant platforms and ISPs (as outlined above).

 

Introduce an easy to access, online, swift court process

  1. This has been done in other countries.[8]
  2. There is no reason why we cannot do this if we want to make a difference in this area and offer justice to victims.
  3. We already have a range of online court and tribunal processes for different issues. If we take this issue seriously, and want to give victims options, we need to take these steps.

 

Police failings and training

 

  1. The policing response needs to ensure that image-based abuse does not fall through the gaps of current training and guidance.

 

  1. Many of the failings of the police, identified by organisations such as #NotYourPorn and the Revenge Porn Helpline, are similar to those identified in Project Soteria, which has worked to transform police investigations of rape and other serious sexual offences.

 

  1. Project Soteria and the new National Operating Model were developed through collaboration between academics, the police and those with experience of working with and supporting survivors. It is this combination that led to the changes and improvements.

 

  1. Image-based abuse is distinct and a similar approach is required to ensure that the experiences of both working with and reporting to the police are integrated into any new guidance and training.

 

  1. There is already guidance and training being developed as part of the policing response to the Angiolini Review. I was part of the research team developing this training and guidance with the College of Policing.[9] The guidance and training on non-contact sexual offences includes some forms of image-based abuse, such as voyeurism and the taking and sharing of intimate images.

 

Culturally intimate images and possible steps forward

 

  1. The Minister rightly stated that the Law Commission determined that changes to the current definition of intimate image to include a range of culturally intimate images would be complex to make, and so it did not recommend a change in the law.[10]

 

  1. However, there is clear evidence in the community of the harm being experienced every day by women and girls.

 

  1. We could take incremental steps to begin to bring some measure of justice and support to the women who are being victimised right now. My colleagues and I made these recommendations to the Law Commission inquiry.[11]

 

Recommend adapting the Australian civil law regime

 

  1. Australia currently provides protection in these circumstances as part of its civil law regime, with section 44B of the Enhancing Online Safety Act 2015 prohibiting “posting an intimate image”, which is defined in section 9B to include images where: because of the person's religious or cultural background, the person consistently wears particular attire of religious or cultural significance whenever the person is in public; and the material depicts, or appears to depict, the person (a) without that attire and (b) in circumstances in which an ordinary reasonable person would reasonably expect to be afforded privacy.

 

  1. There is an exception if the defendant did not know that the person consistently wears that attire whenever they are in public.

 

  1. My colleagues and I have recommended to the Law Commission adopting the Australian provisions for the criminal context in England & Wales (replacing ‘consistently’ with ‘commonly’ or ‘usually’) so as to include images which have the potential to cause significant harm.

 

Advantages of this approach

  1. It is specific to the cultural or religious context and so does not rely on extending the scope of the law on ‘private’ or intimate images more generally.
  2. It is therefore more likely to gain support, as it avoids fears of a ‘slippery slope’ or of over-criminalisation that may come with a broader definition.

Disadvantages of this approach

  1. This approach separates out some images of black and minoritised cultural groups from a more general approach to ‘intimate images’.
  2. The proposed law would also require various thresholds to be satisfied (which makes prosecutions and investigations more challenging), including (a) consistently wearing specific attire, (b) in public (which can be challenging to define, particularly in quasi-public settings such as large parties or schools) and (c) a reasonable expectation of privacy.

 

New Regulatory Body such as an Online Safety Commission

 

  1. I would like to clarify the recommendations regarding an Online Safety Commission (qu 130). For many years, other countries have tackled the issue of online harms, and image-based abuse in particular, with regulatory bodies that are expert in and focused on online safety.

 

  1. This recommendation has, therefore, been made in relation to image-based abuse for many years, and before the advent of the Online Safety Bill.[12] Such dedicated and expert Commissions have proven to be a far more effective means of tackling image-based abuse than current UK processes.

 

  1. This recommendation also forms part of the campaign by the End Violence Against Women Coalition, #NotYourPorn and GlamourUK to tackle image-based abuse.[13]

 

Online Safety Commission, not one commissioner

  1. The recommendation is for an Online Safety Commission similar to those established in other countries, such as Ireland and Australia (the eSafety Commission).
  2. This is different from the roles of the Victims’ Commissioner, Domestic Abuse Commissioner and similar.

 

The Online Safety Commission would:

  1. Take over the online safety functions of Ofcom.
  2. Also be granted additional powers to respond to individuals and deal with individual content (such as the powers of the eSafety Commission).

 

Why is an Online Safety Commission needed?

  1. Powers are required to support individuals and to act in relation to content (not just systems and processes); the Commission could also be the body that manages matters such as dealing with non-compliant websites (given the right powers).
  2. A body is needed that is expert in the field and is able and willing to prioritise these harms.

 

 

22 November 2024

 

 

 

 

 

 

 



 


[1] https://committees.parliament.uk/writtenevidence/130477/pdf/

 

[2] Ofcom, Consultation: Protecting people from Illegal Harms Online (9 November 2023), Annex 8, recommendation 4G on deindexing CSAM. See discussion in McGlynn et al (2024). Pornography, the Online Safety Act 2023 and the need for further reform. Journal of Media Law, 1–29. https://doi.org/10.1080/17577632.2024.2357421

[3] https://www.ofcom.org.uk/siteassets/resources/documents/consultations/category-1-10-weeks/270826-consultation-protecting-people-from-illegal-content-online/associated-documents/annex-7-draft-illegal-content-codes-of-practice-for-user-to-user-services/?v=330405

[4] For more detail on the harms of creating sexually explicit deepfakes, and the justifications for criminal sanctions, see my evidence to the Australian inquiry into deepfake sexual abuse: file:///C:/Users/clare/Downloads/Submission%206.pdf

[5] Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017) 37 Oxford Journal of Legal Studies 534; Clare McGlynn et al, Shattering Lives and Myths: a report on image-based sexual abuse (2019); Clare McGlynn and Erika Rackley, Policy Submission to Law Commission Consultation on Intimate Image Abuse (2021).

[6] https://www.endviolenceagainstwomen.org.uk/wp-content/uploads/2024/06/Image-Abuse-Bill-Campaign-Policy-Asks-.pdf

[7] https://www.bclaws.gov.bc.ca/civix/document/id/complete/statreg/23011

[8] https://www.cbc.ca/news/canada/british-columbia/intimate-images-act-b-c-civil-rights-tribunal-how-to-remove-images-1.7096179

[9] https://assets.college.police.uk/s3fs-public/2024-08/Evidence-review-sexual-exposure-contact-sexual-offending.pdf

[10] https://cloud-platform-e218f50a4812967ba1215eaecede923f.s3.amazonaws.com/uploads/sites/30/2022/07/Intimate-image-abuse-final-report.pdf

[11] See discussion in the Law Commission report and my evidence: https://www.claremcglynn.com/_files/ugd/e87dab_b3a67112fc76434dba953514053c8152.pdf

[12] See report Clare McGlynn et al, Shattering Lives and Myths: a report on image-based sexual abuse (2019).

[13] https://www.endviolenceagainstwomen.org.uk/wp-content/uploads/2024/06/Image-Abuse-Bill-Campaign-Policy-Asks-.pdf