Written evidence from Professor Clare McGlynn, Durham University [IIA0014]
Justifying a consent-based offence of creating sexually explicit deepfakes
Professor Clare McGlynn KC (Hon), Durham University, January 2025
Key points
- Inconsistencies, confusions and wrong priorities
- Why criminalise the voyeuristic taking of an image – including where the victim is not aware and there is no onward distribution of images – but not the same in digital form?
- Why criminalise possession of a bestiality image, or representation of rape, but not the creation of a sexually explicit deepfake made with a real image of a real woman and without her consent?
- Why criminalise taking an image of a woman nude on a bed without her consent, but not an image of her originally clothed but made nude using AI?
- Why reform the law on taking and sharing to be consent-based because the previous law based on motives was ‘unsatisfactory’, but now re-introduce those same problematic definitions?
- Why prioritise men’s rights to create non-consensual sexually explicit deepfakes, over women’s rights to control their sexual lives and integrity and right to live free from fear of distribution of intimate imagery?
- Problems with Government’s proposed motive-based offence:
- Inconsistency and confusion will result if the creation offence is based on motive thresholds, reversing progress made to end ‘patchwork’ of offences
- Law Commission review of intimate image abuse offences concluded that motive thresholds were ‘unsatisfactory’ as they failed to cover all contexts of abuse and were difficult to evidence.
- Distress/sexual gratification fails to capture the full range of reasons for the non-consensual creation of sexually explicit deepfakes, creating loopholes in the law.
- Evidencing motives is a known problem which led to reform of the law on distribution
- Red herring of concerns with criminalising children in light of CPS guidance
- Creation offence enhances human rights protections
- This is a legitimate restriction on Article 10 rights to freedom of expression in order to protect Article 8 rights to privacy, sexual integrity and reputation of victims, and to prevent the disclosure of information received in confidence. Sexually explicit deepfakes are known as ‘digital forgeries’ in the US, emphasising their false nature.
- Women’s freedom of speech is infringed by online abuse and the ever-present threat of deepfake sexual abuse.
- Laws on possessing extreme porn have been declared ‘human rights enhancing’ by Parliament’s Joint Committee on Human Rights and the case regarding sexually explicit deepfakes is stronger.
- Justifying a comprehensive, straightforward, consent-based creation offence:
- Digital voyeurism: creating sexually explicit deepfakes is a new form of voyeurism and should be similarly criminalised
- England will be falling behind best practice as other jurisdictions have introduced comprehensive creation offences
- Significant harms of deepfake creation including it being experienced as a sexual assault, violation, an implicit threat
- Not ‘just’ a sexual fantasy but creation of a digital file that can easily be shared by accident, malice, hacking.
- Definition of intimate image in proposed offence introduces new inconsistencies across the law and is likely to exclude nudes with emojis or black strips over nipples, rendering the law open to ridicule.
Law Commission concluded motive thresholds were ‘unsatisfactory’
- Problems with proving specific intent/motives of perpetrators: The original ‘revenge porn’ provision in section 33 of the Criminal Justice and Courts Act 2015 made it an offence to distribute an intimate image without the consent of the person in the image and where the perpetrator acted with intent to cause distress. The requirement to prove this specific motive gave rise to many problems:
- Distress requirement failed to cover the range of motivations and purposes: Images are distributed without consent for many reasons including sexual gratification, humour, financial gain, to gain status (kudos/group bonding), and for no discernible reason. Therefore, there were many gaps in the law particularly where men were trading and sharing images in groups (‘collector culture’) and where shared for financial gain.
- Challenge of evidencing intention to cause distress: Even if the intimate images were thought to be distributed to cause distress, it was very difficult to prove this was the case. That is, there was simply no evidence to support such a claim/assumption. Or, the evidence was deemed insufficient/inconclusive.
- Additional hurdle/threshold meant police/prosecutors did not pursue cases: There was considerable evidence that the additional hurdle of proving motives led the police to drop many cases.
- Law Commission concluded motive requirements were ‘unsatisfactory’
- Accordingly, when the Law Commission reported on intimate image abuse in 2022, it recommended a new legal framework that created a ‘base’ offence based only on non-consent, removing the motive requirements.
- The Law Commission explained the problems with the motive threshold:
The conduct causes harm to the victim, regardless of the motivation, or the lack of it.
We have heard consistently that this makes the current offences too limiting.
First … Intimate image abuse is perpetuated for many reasons, and sometimes for no clear reason at all. Intimate images may be taken for a joke, shared in exchange for different images in return, for financial gain, to gain social status, or simply because someone felt like it. The conduct causes harm to the victim, regardless of the motivation, or lack of it.
Secondly, it means that prosecutions may fail, or not even be started, if it is difficult to provide evidence of a specific intent. Even if a perpetrator admits to the act of non-consensual taking or sharing, if it cannot be proved that they did so to cause distress or obtain sexual gratification, they will not be prosecuted.
This is unsatisfactory.
- Reform in Online Safety Act 2023: Following the Law Commission recommendations, the Online Safety Act 2023 reformed the law, creating a new ‘base’ offence in section 66B of the Sexual Offences Act 2003 of non-consensual distribution of intimate images, with no requirement to prove specific motives. There are additional offences (with higher penalties) where the prosecution prove intention to cause distress or sexual gratification.
- Labour Government plans for consent-based taking offence without motive threshold: The Government has already announced its plan to introduce a consent-based taking offence (not requiring proof of motives). This is also based on Law Commission recommendations that the existing law was overly complex and failed to criminalise all harmful actions. There will also be more serious offences.
Problems with Government’s proposed creation offence
- While it is very welcome that the Government is acting swiftly to introduce a creation offence, there are a number of concerns with the current amendment, which introduces inconsistencies with the current approach in two ways:
- Requires proof of specific motives of intention to cause distress, alarm or humiliation, or sexual gratification; and
- More limited definition of ‘intimate image’.
Inconsistency and confusion in the law if creation offence based on motive thresholds
- One of the key drivers behind the Law Commission’s recommendations was to simplify the law and end the ‘patchwork’ of confusing, piecemeal provisions which had many gaps. It concluded that the ‘current legal framework comprises a patchwork of offences and does not appropriately or effectively deal with intimate image abuse’ (para 2.79).
- The law failed to understand the infinite variety of ways in which image abuse is perpetrated. Victims, police and prosecutors found the law confusing and overly complex resulting in few prosecutions.
- If the Government introduces its proposed creation offence, it will be inconsistent with the laws on taking and sharing, introducing new complexities into the law.
Distress/sexual gratification does not capture the range of reasons for non-consensual creation of sexually explicit deepfakes
- Humour/banter: many nudify apps and deepfake porn websites are marketed as humorous and for entertainment. This provides an easy justification for perpetrators.
- Group status/prowess ‘collector culture’: The group-based, collector culture of deepfake abuse must be understood and is not captured by these motives. Men may produce sexually explicit deepfakes in anticipation of future sharing to communities of users valorising deepfake sexual abuse. They do so not to cause distress to a specific victim (whom they may not know, and/or whom they hope will never find out). They are part of communities bonding over their ability to produce this material (which is also why criminalising solicitation is vital).
- Community contribution and art: Many deepfake communities and deepfake apps/websites characterise their activities as artistic, again providing an easily identifiable justification (see research here).
- Financial gain: Many men provide services of creating deepfakes (also why criminalising solicitation is so vital). If they distribute these images, an offence has been committed. But, we should also target the creation, to try to reduce the instances of non-consensual distribution and to stop the harm at source.
Problems with evidencing motives
- Even if police consider that distress/sexual gratification motives may be at play, they will be challenging to evidence. The police and prosecutors cannot simply assume that someone did this to alarm someone else or for sexual gratification; there must be identifiable evidence to support the prosecution.
- We know from previous experience that such evidence often simply does not exist.
- ‘joke to humiliate’: The Government has stated that its proposed offence will cover where someone creates a deepfake as a ‘joke to humiliate’ someone. How will this be evidenced? Is every ‘joke’ an example of humiliation? If it is to humiliate, is it even the case that this is a joke?
Red herring of concerns with criminalising children
- The Government has stated that it is concerned with criminalising children. This fails to recognise that there is detailed and well-known police and CPS guidance that protects against unnecessary prosecutions of children.
- The Law Commission rejected this argument about young people when justifying a consent-based intimate image abuse offence. The Law Commission stated:
We have carefully considered the arguments and conclude that it is appropriate for the intimate image offences to continue to apply to perpetrators and victims of all ages. The criminal justice system, in particular the youth justice system, is designed to respond to the risks associated with children being criminalised.
Cases involving children are only prosecuted where there is a public interest in doing so (suitably robust prosecutorial guidance would help ensure this).
Freedom of expression and Article 8 rights to privacy
Creators’ Article 10 right to freedom of expression may legitimately be limited
- The right to freedom of expression may be restricted where the action is lawful, necessary and proportionate in order to (a) prevent disorder or crime, (b) protect the rights and reputations of other people and (c) prevent the disclosure of information received in confidence.
- Creators’ rights can justifiably be restricted as they adversely impact the victim’s Article 8 rights to privacy and reputation by creating false and misleading images/representations, often using material obtained in confidence.
- Defamation analogy and creation of ‘digital forgeries’: freedom of expression does not extend to protecting false statements, as defamation laws demonstrate. Producing sexually explicit deepfakes creates false statements that risk damaging reputations and causing significant harm. Draft US federal legislation on deepfakes refers to this material as ‘digital forgeries’, clearly conveying its false nature.
- Privacy and data protection: freedom of expression does not protect against unauthorised use of private data, including our images. The non-consensual creation of deepfakes uses personal data (images and videos) without agreement.
- Pornography is low-level speech: Pornography does not attract the same level of protection as other forms of speech, such as political speech. Obscenity and pornography are low-level speech that attracts reduced protection, and the countervailing interests – women’s privacy and autonomy – are significant.
- Victim’s Article 8 rights to privacy: Creating sexually explicit deepfakes violates a victim’s right to privacy and sexual integrity, and relates to a particularly intimate aspect of a person’s private life: their sexual life. The European Court has particularly emphasised the significance of protecting an individual’s sexual life and choices regarding how they present themselves sexually. It has also emphasised the significance of intrusions through imagery.
Criminal laws on extreme pornography are ‘human rights enhancing’
- In 2014, Parliament’s Joint Committee on Human Rights justified criminalising the possession of extreme pornography on the basis that it is ‘human rights enhancing’, protecting the rights of women and girls to live free from violence and abuse. In particular, the Committee noted that:
- We consider that the cultural harm of extreme pornography, as set out in the evidence provided to us by the Government and others, provides a strong justification for legislative action, and for the proportionate restriction of individual rights to private life (Article 8 ECHR) and freely to receive and impart information (Article 10 ECHR).
- These justifications apply even more strongly to the creation of sexually explicit deepfakes which involve false depictions of another person that are created without consent.
Deepfake sexual abuse infringes women’s rights to freedom of expression
- The UN Special Rapporteur on Freedom of Expression and others have affirmed that online gender-based violence is “proliferating with the aim of intimidating and silencing women”. There are many UN and other international statements recognising online abuse and deepfake sexual abuse as silencing women online and inhibiting their freedom of expression. Legislative action is justified to protect women’s rights to freedom of expression online.
Justifying criminalising creation of sexually explicit deepfakes
Digital voyeurism: creating sexually explicit deepfakes is a new form of voyeurism and should be similarly criminalised
- It must be remembered that the current law (and Government’s proposals) criminalise voyeurism, namely taking an intimate image of someone without their consent. This applies regardless of whether the victim is aware of the conduct, and even where there is no onward distribution.
- The production of the voyeuristic image is a breach of privacy, and the violating nature of the conduct is sufficiently harmful to be criminalised.
- Creating sexually explicit deepfakes is the new, modern version of voyeurism – digital voyeurism. The perpetrator creates the image, and the harm is inherent in the non-consensual violation of privacy and dignity, even if the image is not further shared and regardless of the motives.
England will be out of step with other jurisdictions that have introduced comprehensive criminalisation offences
- A creation offence has already been introduced in Texas, the Netherlands, and the Australian state of Victoria. None of these offences is limited to only certain motivations or contexts. If English law is so limited, we will not be at the forefront of developments in this area, but will instead be introducing a more constrained, limited offence. Denmark has also announced plans to introduce a creation offence, and Korea is introducing laws criminalising possession. More detail on the different jurisdictions is available in a briefing here and in an academic article here.
Significant harms of deepfake creation
- Non-consensual creation violates a person’s sexual autonomy and dignity. The creator decides how the person appears, talks, and acts in sexually explicit ways, deeply affecting an individual’s personal integrity and dignity.
- Victims experience the creation of sexually explicit deepfakes as a sexual assault
- Creation of sexually explicit deepfakes is experienced as a threat: the image is the threat. A digital image/video can instantly and easily be shared whether accidentally, maliciously, or through hacking.
- Threat of being deepfaked pervades the lives of women and girls: The easy creation and prevalence of deepfake image-based abuse material is experienced as an ever-present threat pervading women’s lives, with young women in particular expressing a palpable sense of despair at the lack of control over their identities and sexual autonomy.
- Pervasiveness of deepfake image-based abuse is a form of cultural harm to the community normalising and legitimising non-consent
- Deepfake image-based sexual abuse is part of a wider pattern of offending: for example, the man convicted of plotting to murder and rape Holly Willoughby was found to have sexually explicit deepfakes of her in his possession when planning his offending.
Not ‘just’ a sexual fantasy but creation of a digital file
- Creating a sexually explicit deepfake is entirely different from a drawing, idea in one’s head or where someone cuts out images and sticks them on nude bodies (this would not be a realistic image within the legislation).
- It is creating a digital file that could be shared online at any moment, deliberately or through malicious means such as hacking. The production of deepfake sexual abuse material creates a clear risk of harm. It is experienced as a threat, even if no direct threat is made to an individual.
- In any event, it is not clear why we should privilege men’s rights to sexual fantasy over the rights of women and girls to sexual integrity, autonomy and choice. This is non-consensual conduct of a sexual nature. Neither the porn performer nor the woman whose image is imposed into the porn have consented to their images, identities and sexualities being used in this way.
Inconsistent intimate image definition open to ridicule
- The proposed definition of intimate image includes images showing ‘exposed breasts and genitals’.
- The Law Commission determined that ‘exposed’ did not sufficiently clearly cover images where emojis or black strips were placed over nipples (see for example para 3.146).
- This is precisely why the definition of intimate image was changed in the Online Safety Act, which now includes at section 66D(6)(c) an explanation that an intimate image includes these cases: ‘where all or part of the person’s genitals, buttocks or breasts would be exposed but for the fact that they are obscured, provided that the area obscured is similar to or smaller than an area that would typically be covered by underwear worn to cover a person’s genitals, buttocks or breasts (as the case may be).’
- This is a real issue: such images are being produced. Excluding them from the law opens it up to ridicule.
- Having two separate definitions of intimate image revives the confusing nature of the law, going against the grain of recent reforms.
February 2025