Written evidence submitted by #NotYourPorn, Chayn, Dr Fiona Vera Gray, End Violence Against Women Coalition (EVAW), Faith & VAWG Coalition, Glitch, Imkaan, Professor Clare McGlynn, Rape Crisis England & Wales, Refuge, Welsh Women’s Aid, Women & Girls Network (WGN), Women’s Aid Federation England

 

 

 

Joint VAWG Sector Principles for the Online Safety Bill

 

  1. Introduction

 

The online world is a critical arena in which abuse of women and girls is very real, is increasing and needs specific naming and commitments. The Online Safety Bill in its current form omits reference to Online VAWG, despite the government’s commitment to applying a VAWG analysis to online and offline offending in the recently published Tackling VAWG strategy. The legislation must ensure that tech companies are held accountable for enabling and facilitating online harms perpetrated on their platforms. While we must not lose sight of the individual perpetrators of these harms, this legislation is an opportunity to require tech companies, and the regulator, to interrogate the relationship between perpetrators of Online VAWG and the platforms they use, and to create a system of accountability and safety. This contrasts with the present position, where the onus lies with individual users to keep themselves safe by changing their behaviour and/or coming offline. This legislation should be the first step in committing to a future where women and girls are able to navigate the online world to learn, work, communicate and grow, free from the threat of Online VAWG.

 

As organisations and individuals that are experts in Violence Against Women and Girls, our joint principles call for the introduction of an Online Safety Bill that:

 

        Recognises and names Online VAWG

        Has an intersectional understanding of Online VAWG and its impacts

        Supports the liberty and freedom of expression of women and girls

        Creates a consent-based culture

        Provides for specialist support

        Requires tech companies to be transparent and proactive in preventing and tackling Online VAWG - including meaningful sanctions for perpetrators

        Mandates an effective and robust regulator

        Requires safety by design

        Is future proofed

 

  2. Key Recommendations

 

The following are key recommendations that our organisations believe are necessary to ensure the Online Safety Bill tackles online VAWG in all its forms:

       Recognition of online VAWG as a specific harm in the Bill, with an accompanying Code of Practice developed in consultation with the VAWG sector to set clear expectations for how online VAWG cases are investigated and clear, consistent online VAWG reporting standards for platforms. The definition of online VAWG in the Bill must recognise the intersecting ways abuse can affect different women and girls.

       Meaningful engagement by Ofcom in their policy development with the specialist ‘by and for’ led sector in recognition of the wider impact of online harms on women who experience the material and lived reality of structural inequality and discrimination. This recognition must also be reflected in the definition of “an adult of ordinary sensibilities” which should be considered within a trauma informed, intersectional and gendered framework.

       Inclusion of all forms of image-based sexual abuse as harmful within the Bill, with commercial porn websites specifically named in the Bill and subject to a higher level of scrutiny by a regulator that is empowered to issue take down notices. Criminalisation of image-based sexual abuse offences must not include a motivation requirement, and anonymity must be automatically granted to all victims.

       10% of the revenue raised from the Digital Services Tax ring-fenced to fund specialist VAWG sector efforts to effectively address online VAWG, with 50% ring-fenced for specialist ‘by and for’ led services for Black and minoritised women and girls. Alternatively, 5% of any fines levied by Ofcom to be directed to funding specialist VAWG sector support services, and for 50% of this amount to be specifically ring-fenced for specialist ‘by and for’ led services supporting Black and minoritised women and girls.

       Transparency reporting to include a separate VAWG category and a requirement for tech companies to be more transparent about their content moderation and to allow trusted research institutions and civil society organisations to access anonymised and disaggregated data about content removals, complaints, appeals processes and sanctions imposed.

       A robust, effective and proactive regulator with the power and resource to order the take down of image-based sexual abuse and harmful content and provide an effective challenge to cross-industry tech companies.

       A new high-level principle requiring a company to take into account, and to address and reasonably mitigate against, potential harms, alongside ethical frameworks and online harm analysis when designing software and devices, ensuring that the default setting is the safest for all potential users to navigate and the least likely to nudge and encourage users into participating in forms of online hate or abuse. Any mechanisms and settings for managing and reporting content must be accessible and appropriate for all disabled people.

       A commitment to ‘future-proofing’ in the area of online harms, including online VAWG regulation, to ensure that the ever-growing use of AI and other ways in which online harms will be perpetrated in future are within scope of the emerging policy and the regulator’s powers.

 

  3. Explicit Reference to Online VAWG and recognition of harms

“Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self-confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.” - Ellesha, survivor of image-based sexual abuse

The Bill should include explicit recognition of Violence Against Women and Girls (‘VAWG’) in all its online forms. Online VAWG refers to acts of violence or abuse that we know disproportionately affect women and girls. It is a wide and ever-growing set of behaviours: perpetrators use the online world to perpetrate harm, and the online space is also a context in which new forms of VAWG have been created.

 

Online VAWG includes but is not limited to online stalking; online harassment, including sexual harassment; grooming for sexual purposes; online threats and abuse, including rape threats; domestic abuse perpetrated online, also known as tech abuse; doxxing; and image-based abuse, which includes child sexual abuse and sexual exploitation.

 

Image-based abuse includes:

        Image-based sexual abuse[1]: all forms of taking, making and sharing nude or sexual images without consent, including threats to share and altered images

        Intimate image abuse[2]: the taking, making or sharing of nude, sexual or other intimate images without consent, including threats to share and altered images such as deepfakes. This latter definition incorporates images such as those taken of women and girls without their expected religious or cultural attire.

 

All these forms of abuse should be recognised as related to one another because they have common drivers: women’s and girls’ persistent inequality, and other inequalities which intersect with this. These include but are not limited to the particular misogynistic racism sometimes referred to as misogynoir[3] targeted at Black women online, who research shows receive the most abuse from strangers on social media[4]. Image-based abuse against women and girls, and other forms of online VAWG, takes place in the context of gendered and social norms that can reinforce harmful stereotypes and gender inequality. The ‘Her Net Her Rights’ report found that women are 27 times more likely to be harassed online than men.[5] Amnesty International states that 1 in 5 women in the UK have been subject to online harassment or abuse. Refuge’s report The Naked Threat evidenced that 1 in 7 young women have experienced threats to share their intimate images or videos[6].

 

Online VAWG should be understood as part of the wider continuum of Violence Against Women and Girls, with the online space providing a further context for abuse that is also perpetrated offline. Research by Women’s Aid found that 85% of women who experienced online abuse from a partner or ex-partner said that it was part of the pattern of abuse they also experienced offline. Refuge’s internal statistics also show that the most common issues reported to their tech abuse team relate to online security and stalking.[7] Essentially, the online world is a new context for persistent and enduring forms of VAWG. We see this in the way that public sexual harassment has extended into the online space.

 

The online world has also created new forms of VAWG, an example being image-based abuse. Research from Refuge found that 83% of women who had experienced threats to share their intimate images from a current or former partner experienced other forms of abuse, including over a quarter who experienced sexual abuse. The latest data on image-based sexual abuse showed that 82% of prosecutions were flagged as domestic abuse-related[8], further emphasising how online abuse operates as part of the continuum of VAWG.

 

“I was in counselling from May 2020 – February 2021. The police investigation ended in December 2020. Throughout those 10 months, I battled with depression, fear and anxiety. I am still dealing with the aftermath of my trauma and experience some PTSD symptoms.” - Georgie, survivor of image-based sexual abuse

 

The harms of online VAWG are severe, myriad and cumulative. The impacts include mental, physical and psychological trauma. The impacts also relate to a loss of opportunities and access: to education, work, communities, support and information, both online and offline. The effects can be long lasting and wide-ranging, and have been described as “shattering lives”.[9]

 

VAWG is an established, and important, lens by which the disproportionate harm directed towards women and girls can be understood, measured and tackled. The Online Safety Bill should align with the Government’s Home Office led Tackling Violence Against Women and Girls Strategy and Action Plan in recognising the specific ways in which women and girls are targeted, alongside other inequalities for example relating to ethnicity, age, sexual orientation and/or gender identity, disability, immigration status and more. The Bill should also align with the Home Office led Tackling CSA Strategy, with prevalence studies in England & Wales suggesting that some 15% of girls experience some form of sexual abuse before the age of 16.[10]

 

The UK government is a signatory to a number of international treaties and conventions which identify VAWG and commit to tackling it. International and domestic law and policy around prevention of violence against women and girls (e.g. the Home Office Tackling VAWG strategy and the Istanbul Convention) recognise that policy should be joined-up and that tackling VAWG necessitates a commitment from all policy areas.

 

“We didn’t feel supported by the police but we also didn’t feel like the law supported the police to get a prosecution either.” – Ruby, survivor of image-based abuse

 

The current systems are failing women and girls, in terms of the responses from social media platforms as well as criminal justice system approaches. Refuge’s specialist tech abuse team report that platforms take weeks to reply to requests and reports of online VAWG, and often fail to respond at all. Where they do respond, they frequently do not understand violence against women and girls and say the content does not breach their community standards. In the few cases where action is taken, it is usually limited to removing content rather than more effective measures such as removing perpetrators from sites. Women supported by Refuge also say that they are advised by police to come offline as the ‘solution’ to the online abuse they are experiencing.

 

‘My ex would post horrible things, threatening things like “Tell [NAME] I’m coming for her.” Sending me loads of private messages. He hacked into every single social media account I had and then changed my passwords. He would contact me through LinkedIn and my PayPal account with messages, hundreds of messages. If (my employer) posts anything on Facebook, he will comment on there.

 

I reported to Facebook, and they just come back with ‘you can block this person’s account.’ A lot of the time you go to report things and they (online platforms) don’t really do much. I was frustrated that there wasn’t any action. I ended up deleting my Instagram account and my LinkedIn account, I don’t use Snapchat anymore.

 

I was in a really dark place, him constantly posting stuff - I had really bad anxiety. I’d have panic attacks and it was constant worry of what he’s going to post next. Is it going to impact my job? What’s coming next?’ - Refuge Service User

 

A regulatory approach which specifically identifies VAWG as a named online harm within the scope of the Bill is needed.

 

3.1 Recommendations

 

        Inclusion of Online VAWG as a specific harm in the Bill

        Ofcom to prepare a code of practice for providers of regulated services describing recommended steps for the purposes of preventing and prohibiting Online VAWG. This Code of Practice to outline clear expectations on the responses of tech companies to online user reports, complaints and requests as well as to civil and criminal investigations in VAWG cases, including how evidence is handled, stored and shared.

        For the Online VAWG Code of Practice to be developed in consultation with the VAWG sector. Online platforms frequently do not understand VAWG when it is reported to them so it is vital the code of practice builds in an understanding of VAWG.

        For the Code of Practice to include universal standards around reporting processes for social media sites i.e. ensuring that any processes are clearly communicated, straightforward, accessible to all users, responsive and fit for purpose.

 

  4. Intersectionality

 

Research shows that online abuse disproportionately impacts women of colour, disabled and LGBT+ people. A 2017 report by the LGBT organisation Stonewall found that 10% of LGBT people had experienced homophobic, biphobic or transphobic abuse or behaviour online in the month prior to the survey[11]. In 2020 Galop identified that 8 in 10 respondents had experienced anti-LGBT+ hate crime and hate speech in the last 5 years[12] and found that nearly 60% of the trans people they surveyed had experienced transphobia online.[13] Amnesty International has also demonstrated that BME women politicians are subject to massively disproportionate levels of abuse online. 2019 research by the disability charity Leonard Cheshire shows that reports of online disability hate crime increased by 33% from 2017 to 2018. A recent Petitions Committee inquiry also found a high prevalence of online hate crime against disabled people.[14]

 

The 2020 Glitch and EVAW report ‘The Ripple Effect’ found that Black and minoritised women and non-binary people were more likely to report suffering increased online abuse during COVID-19, with 38% saying that the context of the pandemic had led to increased online abuse[15].

 

In the Ripple Effect survey, gender was the most frequently cited reason for online abuse: 48% of respondents reported suffering gender-based abuse; 21% reported abuse related to their gender identity or sexual orientation, followed by 18% for their ethnic background, 10% for their religion and 7% for a disability.

 

It is clear that there needs to be a gendered lens when considering online harms, but there also needs to be an intersectional analysis which recognises the multiple characteristics which can be used by abusers to direct and cause harm, for example in racialised misogyny and misogynoir.

 

The way in which harm is understood must come from a trauma informed, intersectional approach that recognises that context is crucial (for example when considering forms of image based abuse and the ways it can be perpetrated within specific marginalised communities) but that does not then fall into cultural relativism and stereotypes. This balanced and considered approach requires meaningful engagement with the specialist ‘by and for’ led sector.

 

We have concerns that the terrorism agenda within the Online Safety Bill will unintentionally create an additional level of scrutiny and monitoring on specific minoritised communities, as we have seen with the wider Prevent agenda. These are communities who already experience discrimination, over-policing and structural racism, both online and offline.

 

4.1 Recommendations

 

        A definition of Online VAWG in the legislation and Code of Practice that recognises that the online space is both a context for violence against women and girls and a source of new forms of it, and which is intersectional, recognising the multiple intersecting characteristics of women subjected to online VAWG and the harms it creates.

 

        For the recognition of harm to explicitly acknowledge the intersecting ways in which abuse can be experienced by women and girls. For Ofcom to recognise that this can compound trauma, particularly for women and girls who have previous experience of VAWG. And therefore the definition of “an adult of ordinary sensibilities” to be considered within a trauma informed, intersectional and gendered framework. This is in recognition of the fact that there is a specific gendered experience of certain harms, in that some behaviours are experienced as harmful by women in a way that men may not recognise as such.

 

        Meaningful engagement by Ofcom in their policy development with the specialist ‘by and for’ led sector in recognition of the wider impact of online harms on women and girls who experience the material and lived reality of structural inequality and discrimination.

 

        To improve user redress and advocacy, funding should be given to specialist VAWG sector organisations supporting survivors of online VAWG and working on prevention, with 50% of this funding going to ‘by and for’ led Black and minoritised specialist organisations. Funding can be provided by ring-fencing 10% of the Digital Services Tax which, according to the Office for National Statistics, raised £29 million in the first month of operation alone, or alternatively by ring-fencing 5% of the fines levied by Ofcom.

 

 

  5. Women & Girls’ Freedom of Expression

 

“The online world is scarier now... [I] have become very protective of my online presence. I create social media content for my job and work as an actress which means I have an extensive online presence. I am scared that one day, my ex or one of his “friends” will find a way to make my career choices impossible. That they will seek to humiliate me by sharing images of me in my virtual place of work. This is not a paranoid fear – this is a tangible possibility.” - Georgie, survivor of image-based sexual abuse

 

Women, particularly those from minoritised and marginalised communities, are being silenced by Online VAWG and are unable to freely use and enjoy the internet as a result. The Bill has been subject to criticism that it constrains freedom of expression; however, this fails to recognise that women and girls already remove themselves from online spaces, refrain from expressing their views and have to exercise a degree of “safety work”[16] that inhibits and curtails their experiences and free expression. The suggestion that the right to free expression and the rights of women and girls are in direct conflict is a false dichotomy that fails to take into account how online abuse limits the free expression of women and girls and results in them retreating from online and offline spaces. Rather than regulation being seen as in opposition to freedom of expression online and offline, such action can and should be justified as a human rights-enhancing approach to ensuring equality of access for marginalised groups, specifically, in this context, women and girls. This understanding was most recently affirmed in the House of Lords report on freedom of expression online[17], which says that they “recognise that receiving serious abuse can leave people less free to express themselves”.

 

Research published by Girlguiding[18] in 2019 showed that 33% of girls and young women aged 11-21 had received abusive comments on social media, and that 43% of girls admitted to holding back their opinions on social media for fear of being criticised, a chilling effect that threatens the diversity of voices online.

 

Too often, the ‘solutions’ currently offered by social media companies force women to do the ‘safety’ work. They rely on the individuals who have experienced the abuse to make reports and complaints, and instead of responding to abuse or enforcing meaningful sanctions, they offer suggestions such as taking breaks from being online. This advice is echoed by the police: Refuge reports that women are advised by police to come offline as a response to the abuse they are receiving. It is women who are having to change their behaviour, rather than platforms proactively mitigating risk and responding to perpetrators. This is especially damaging given that we live more and more of our lives online and rely on online spaces for everyday tasks like banking and food shopping, a reliance that is only set to increase.

 

The dangerous and pervasive nature of the ‘incel’ movement also illustrates the need for action. Tragic incidents of violence perpetrated by incels, such as the recent attack in Plymouth, highlight the link between violence online and offline. It therefore cannot be acceptable for women to be advised simply to change their online behaviour. Furthermore, the grooming of young men and boys into this dangerous ideology, driven by social media algorithms[19] leading them to encounter extremist content, highlights the important role platforms have in mitigating risk. Women and girls must be free to participate in the online and offline world without fear of such violence.

 

This draft Bill does not do enough to guarantee the liberty and freedom of expression of women and girls. Establishing online VAWG within the definition will ensure that this specific form of harm, and the way it negatively affects all of our society, is named and addressed, working towards an online world where women and girls are free to operate without the fear, silencing and reduced opportunities that Online VAWG creates, not just for specific individuals but at a cultural and societal level.

 

5.1 Recommendation

 

        That the Bill takes a rights-based approach that centres the right to access online spaces free from harassment, abuse and threats

 

 

  6. Right to live in a consent-based culture

 

“It seems that the deeply ingrained societal shame and taboo around sex and our bodies allows some people to believe there is an automatic ‘green light’ to use any sexual content of an individual as a tool to attempt to ruin their life or exploit them in other ways. When I saw the images and recordings of myself, I also saw thousands of other people too. I am not the first, and I will not be the last until these views and regulations change. As time goes by the anger settles, although sadly the feeling of violation never fades.” - Madison, survivor of image-based sexual abuse

 

“I reported the situation to the police and went in search of more evidence. What I found showed a number of websites dedicated to non-consensual pornographic content and online forums designed to cater for the anonymous sharing of explicit content. It showed a trail of evidence that suggested that my situation was not a one-time act but a prolonged behaviour – my ex-partner has, in all likelihood, been engaging in non-consensual image sharing for a number of years without my knowledge.” -  Georgie, survivor of image-based sexual abuse

 

6.1 The Commercialised Porn Industry

 

A failing of the Bill is the lack of specific reference and consideration given to the commercialised porn industry, which is hugely powerful and well resourced, and a significant contributor to both online and offline VAWG. For example, MindGeek runs a global online porn monopoly of over 100 pornographic websites, including Pornhub, which collect more data than Netflix. In 2019 there were over 42 billion visits to Pornhub, an average of 115 million visits per day.[20] Globally, estimates of the porn industry’s revenue run as high as $97 billion[21]. In 2020 the website OnlyFans handled payments worth £1.7bn.[22]

 

The porn industry has very little oversight and yet monetises huge amounts of content, a significant proportion of which is user generated. It is inherently harmful in the ways it promotes gendered and racialised sexual norms that contribute to a cultural context where sexism and racism are persistent. It normalises, minimises and eroticises the absence of consent by hosting huge volumes of videos of rape, abuse and other non-consensual acts.

Porn is one of the fastest-growing industries, and the absence of regulation means that it has been able to set the context when it comes to the content it hosts, the safety of its users and those who generate content, and the accountability of its systems and reporting mechanisms. It is a context where what is commercially appropriate is prioritised over what is safe, and where decisions are based on profitability.

Recent research by Dr Fiona Vera Gray and Professor Clare McGlynn[23] found that:

        1 in 8 titles shown to first-time viewers of the most popular pornography websites in the UK describes sexual activity that constitutes sexual violence (whilst also being in contravention of the sites’ own, essentially meaningless, terms and conditions).[24]

        Videos featuring descriptions of sexual activity between family members were common, particularly sexual activity between immediate family members, for example: ‘Brother f**ks his sister in her sleep’; ‘When Mom’s Mad, Dad Goes To His Daughter’; ‘Daddy keeps f**king daughter till she likes it’.

        They found many titles describing coercive and exploitative sexual activity such as ‘Boyfriend forced gf for sex’ and ‘She Woke Up Being F**ked’.

        Titles identifying material as constituting image-based sexual abuse were also evident, focused largely on videos created without consent, particularly voyeurism videos using hidden or ‘spy’ cameras and upskirting, such as: ‘F**ks Sleeping Mom Hidden Camera’; ‘Beach Spy Changing Room Two Girls’; ‘Pharmacy Store Bathroom Hidden cam’; ‘Upskirted While Putting Groceries In The Car’.

        These videos also commonly feature racist terms and demeaning references to Black and minoritised women and men in their descriptions.

Vera Gray et al. highlight how pornography is a powerful cultural influence, a significant part of the “cultural scaffolding” that shapes our understanding of sexuality and the boundary between sex and sexual violence.[25] This contributes to a societal failure to take violence against women seriously and means that acts of sexual violence which are also predicated on an absence of consent are perhaps less likely to be recognised as such, and may in fact be encouraged to be seen as “sexy”. Upskirting, a form of image-based sexual abuse, is a prime example: the minimising of this behaviour, alongside the common occurrence of such imagery in pornography and even in tabloid media, creates a cumulative impression that the abuse is trivial, if not harmless.

There is a lack of research showing a direct relationship between the consumption of mainstream pornography and the viewing of child sexual abuse videos and images, though it is widely cited that some sex offenders progress to viewing child sexual abuse material along a continuum from legal pornography.[26] There is a need to situate sexual offending within the wider societal context and acknowledge that it often begins with activities that are legal, normalised, and even encouraged by society. In this sense, policy-makers might consider that sexual violence and abuse is not an aberration from societal norms, but is entirely consistent with the hyper-sexualised and violent culture produced and normalised online and in the media.

 

In 2019 a record number of videos were uploaded to Pornhub, with over 6.83 million new videos. Pornhub states that every upload is moderated by a human, which is impossible given the volume uploaded. Pornhub and others also state that they have the permission of everyone in the video. There is currently no way of holding them to account for what they say they are doing, and no consequence for them not doing it, rendering their terms and conditions meaningless when compared to their content.

“Pornhub were very reluctant to assist with the police investigation. The officer received one very vague reply when he asked for the details of the account and the videos that had been uploaded. All other correspondence was ignored, although Pornhub dispute having received these emails as they were sent to ‘an old email address’ despite the fact that they had responded on that email address to the first contact.” - Ellesha, survivor of image-based sexual abuse

Other international jurisdictions have taken steps to ensure mainstream porn websites are properly regulated so that they do not host videos that have been shared without the consent of an individual depicted. The Canadian government’s proposals for forthcoming harmful online content legislation[27] seek to address image-based sexual abuse and hold mainstream porn websites responsible for hosting non-consensually shared images. The proposals would apply to “online communication service providers”, a new category to be defined in the legislation that the government has said would target major platforms like Facebook, Twitter, TikTok and Pornhub. The new bill would mean platforms would have to remove illegal content, including intimate images shared non-consensually, within 24 hours of it being flagged.[28]

Germany has also criminalised violating someone’s “intimate privacy” by taking pictures and the dissemination of intimate images to a third party without consent.[29]

 

6.2 Image-based sexual abuse

As Clare McGlynn and Erika Rackley[30] point out, rape porn and image-based abuse, as well as harming the individual ‘victim’ in a deeply gendered way, also cause ‘cultural harm’, in that they ‘may help to sustain a culture - a set of attitudes that are not universal but which extend beyond those immediately involved as perpetrators or victim-survivors of image-based sexual abuse—in which sexual consent is regularly ignored’.

“When the police interviewed my ex-partner, he admitted to sharing the images without consent but claimed his intentions were not malicious toward me. That meant that the so-called “Revenge Porn Law” did not fulfil the criteria, nor did any other legislation for malicious communications or harassment under criminal law. For that reason, he walked away without any consequences.” - Georgie, survivor of image-based sexual abuse

Any legislation that prohibits or criminalises image-based sexual abuse (such as the Law Commission recommendation that cyberflashing should be an offence in the Bill) must not include a requirement for a specific motivation behind the act in order for it to constitute an offence. Motivations for image-based sexual abuse can be varied and overlapping: men may wish not only to cause distress, alarm or humiliation but also to ‘be funny’ or to boost status among their friends. Furthermore, motivation is ultimately hard to prove in court.

The Law Commission’s proposals regarding cyberflashing would therefore result in very few prosecutions, further deepening the lack of trust survivors may have in the criminal justice system as a result of dismally low sexual offences prosecution rates. Instead, legislation introducing new image-based abuse offences should recognise image-based abuse as a harmful intrusion and encroachment on women’s personal space, grounded in a lack of consent, rather than focus on the motivation of perpetrators.[31]

6.3 Recommendations:

 

 

        For the definition of “indirect” harm in s 46(7) not to rely on causation, particularly given how difficult causation is to prove. Instead, the definition should recognise that harm is created by content that endorses, excuses or encourages VAWG.

 

        All forms of image-based sexual abuse, including the taking and sharing of, and threats to share, intimate images, to be included in the definition of what constitutes harmful content in the Bill

 

        Anonymity to be automatically granted to all victims of image-based abuse

 

        For commercial porn websites to be specifically named in the legislation as a “regulated service”, being providers of online services where users can generate and share content or search content, and thus within the remit of the regulator, Ofcom.

 

        For the legislation to recognise pornography that depicts, endorses or encourages attitudes or behaviours underpinning VAWG as content that is harmful to adults and children.

 

       For a new criminal offence to cover false representations of consent to sharing intimate images on public websites and platforms

 

        For the regulator to be empowered to issue take down notices in respect of image-based sexual abuse and other forms of online VAWG in response to individual instances.

 

        The criminalisation of image-based sexual abuse that does not require a specific motivation to be proved.

 

  7. Right to Support

 

“When we reached out to the Revenge Porn Helpline, they were amazing. They reported the images directly to the site owner and because of their hard work, some of the images have been removed from the platform, although it has taken a while for this to happen.” - Ruby, survivor of image-based sexual abuse

 

Any preventative approach must include provision of support, and that means establishing a requirement for it, as well as funding and resourcing. We know that specialist organisations – the ones which women are most likely to turn to when they have experienced offline and online abuse – are severely underfunded and unable to respond to the huge levels of demand they face. The increasing dependency on the online world during the Covid pandemic has provided ample breeding ground for perpetrators to abuse women online. Between April 2020 and May 2021, Refuge saw on average a 97% increase in the number of complex tech abuse cases requiring specialist tech support compared to the first three months of 2020, demonstrating how increasingly critical specialist support for survivors is. The structure of funding and commissioning models means that these specialist organisations are hard pushed to deliver their core work, and therefore unable to grow at the pace needed to respond to these additional forms of harm.

 

There is a high incidence of Online VAWG, with 1 in 7 young women having experienced threats to share their intimate images or videos[32], and the likelihood is that the prevalence is only set to grow. Financial provision for support for victims of online harms must be provided in order that they are able to receive independent, specialist and trauma informed support and advocacy from organisations that are experts in responding to VAWG and working with online harms.

 

7.1 Recommendations

 

A funding package for victims of online VAWG to be launched alongside the Bill. The funding for this to be raised via:

 

        10% of the revenue raised from the Digital Services Tax ring-fenced to fund specialist VAWG sector efforts to effectively address online VAWG, with 50% ring-fenced for specialist ‘by and for’ led services for Black and minoritised women and girls.

 

        Alternatively, 5% of any fines levied by Ofcom to be directed to funding specialist VAWG sector support services, and for 50% of this amount to be specifically ring-fenced for specialist ‘by and for’ led services supporting Black and minoritised women and girls.

 

  8. Right to Be Informed

 

In order to prevent and regulate online harms it is vital the government, regulator and the public understand the extent of the issue and are able to identify patterns of abuse. This is only possible if data collected is disaggregated in terms of different forms of harms and protected characteristics. This enables targeted approaches and, where there are links with other policy areas such as criminal justice, it allows policy makers to have joined-up data-sets.

 

8.1 Recommendations

 

        Transparency reporting should include a separate VAWG category which could include abuse such as ‘rape threats’, image-based abuse etc. Companies should show how they are taking steps to remove users who commit ‘online VAWG’.

 

        Tech companies to provide greater transparency about their content moderation efforts, including by allowing trusted research institutions and civil society organisations to access anonymised and disaggregated data about content removals and complaints submitted to the platforms, covering the type of action taken, the time it takes to review reported content and increased transparency around appeals processes.

 

        Tech companies to seek feedback from users on how they felt a complaint was handled and to report on this too - in essence user satisfaction.

 

        Tech companies need to be transparent about their investment in and resourcing of content moderation, and need to invest more resources in human content moderation. This should also include access to trauma-informed support for human moderators given the risk of vicarious trauma and/or retraumatisation.

 

        Tech companies should provide more transparency about their policies related to dehumanising language based on gender, ethnicity and other protected categories.

 

        All of the above should be accessible and easy to find, and the regulator should compile these reports, with analysis and potentially a rating system, on its website.

 

  9. An Effective and Robust Regulator

 

A regulator with the teeth and independence to hold tech companies meaningfully to account is essential to addressing and preventing online harms, and to moving us away from voluntary, piecemeal responses from individual tech companies. We have significant concerns about Ofcom’s ability to effectively enforce regulation in its current form. Given its other regulatory functions and the size, range and power of the tech companies it will be taking on, Ofcom needs significant resourcing and enforcement powers.

 

83% of respondents to the Glitch & EVAW Ripple Effect survey who reported one or several incidents of online abuse during COVID-19 felt their complaint(s) had not been properly addressed. This proportion increased to 94% for Black and minoritised women and non-binary people.

 

In the paper ‘Shattering Lives and Myths – a report on image-based sexual abuse’[33], Clare McGlynn and colleagues found that for many victims the first imperative is for material to be taken down, but that this is difficult, cumbersome and time consuming. The Bill provides no recourse for this, despite it being such a priority for victims. The regulator should be given the power to order images to be taken down in response to individual reports.

 

“A year later, the content is still online against my wishes. Attempts to get it removed using DMCA takedown notices work for some time, for them to only be reuploaded. Filing for DMCA takedown notices is an issue in itself. It is dangerous to file these notices yourself as they require a name, address and contact details which are then shared with the person who uploaded the content and is then sometimes used for blackmail or doxing.” - Madison, survivor of image-based sexual abuse

 

“There is nothing to stop him from doing it again and the websites he frequented are still in existence. Their list of victims grows daily and the potential for re-traumatisation is staggering where there is no guarantee of removal.” - Georgie, survivor of image-based sexual abuse

 

We may wish to learn from the example set by other countries in response to online VAWG. In Australia, the Office of the eSafety Commissioner has a team that is able to remove non-consensual material from the internet at the request of victims.[34]

 

Additionally, South Korea’s Advocacy Centre for Online Sexual Abuse, which is funded by the Ministry for Gender Equality, provides comprehensive support for survivors, including counselling and legal advice, as well as having the power and dedicated resource to take down intimate images which have been shared without consent. A regulator with the power to properly address online VAWG by taking harmful content down, and sufficient Government-funded support for victims, would ensure the UK is in line with international online harms legislation.[35]

 

9.1 Recommendations

 

        For an independent regulator to have adequate resource, independence and enforcement powers to provide a meaningful challenge to tech companies

        For an independent regulator that is resourced and committed to developing a specialist knowledge base relating to Online VAWG, with engagement and, where appropriate, scrutiny from third sector specialists in Online VAWG

        For the regulator to be proactive in enforcement of regulation of tech companies - identifying trends, closely monitoring the efforts of tech companies around moderation, take downs and support signposting.

        For the regulator to have the power and resource to order the take down of image-based sexual abuse and harmful content

        For the regulator to have a clear system of accountability and scrutiny

 

 

 

  10. Safety By Design

“Platform liability is very important for so many reasons, not just to stop the images appearing on the sites in the first place. Two of the victims in our group have been blackmailed over Instagram by troll accounts over the images. They’ve pressured the members of our group for money and threatened showing the images to their families. The platforms need to have a responsibility for this and making sure it doesn’t happen.” - Ruby, survivor of image-based sexual abuse

The Bill makes inadequate provision for the ways tech companies should be encouraged to acknowledge and respond to the fact that their products can facilitate, and even encourage, harm. There is not enough focus on platform design and on the systems and processes that are developed.

 

It is clear that platforms such as social media sites and dating apps, which are very profitable due to their reach and user numbers, are ideal spaces for adults and young people who seek to ‘groom’, deceive and abuse through anonymised usernames and private chat that is invisible to others. Similarly, other platforms provide ideal ‘conducive contexts’ for individuals and groups to ‘pile on’ and harass and shame vulnerable individuals with impunity. When the potential for serious harm and abuse is clear at the design stage, it should be acknowledged by the company and mitigated against. Co-design and training from the specialist VAWG sector to help the tech industry identify potential opportunities for abuse should be seen as best practice.

 

While platforms are introducing measures such as allowing users to filter out key words or accounts they don’t want to see, responsibility cannot rest solely with the user; it must rest primarily with the software designer and the tech company: those financially benefiting from the product and the data collected by it.

 

10.1 Recommendations

 

        The Bill should include a new high-level principle applying at the tech commissioning and design stage: a company should be required to take into account, and to address and reasonably mitigate against, potential harms.

 

        Ethical frameworks and online harm analysis are needed when designing software and devices, so that the default setting is the software/device at its safest for all potential users to navigate, and the least likely to nudge and encourage users into participating in forms of online hate or abuse, or being exposed to them.

 

        Any mechanisms and settings for managing and reporting content must be accessible and appropriate for all disabled people.

 

        The regulator should make enforcement of this principle a high priority.

 

 

  11. Future Proofing: fake porn and more

 

This legislation is a way to explicitly state that Online VAWG will not be tolerated - establishing a precedent that will enable future work in tackling and preventing its harms. There must be an ongoing duty to interrogate the impacts of the Bill and the extent of the success of the Regulator. As part of future proofing we must treat this legislation as an initial step, following which considerable work must be done to build on it.

 

Within the scope of the Bill, part of future-proofing is ensuring that emerging technology does not provide a loophole for abuse. Fakeporn[36] is a serious problem, and victims suffer significant harms when this material is made without their consent. While attempting to address some current online harms, the Bill is not forward looking enough to respond to the ways in which AI and other advancements can and will be weaponised to harass and abuse women and others.

 

Tech is increasingly being used to perpetrate domestic abuse, and this trend is only likely to continue given the increasing move to digital. Research from Refuge found that 1 in 14 adults in England and Wales have experienced threats to share intimate images, a figure that rises to 1 in 7 among young women[37], suggesting this form of abuse is increasingly common among younger people and could grow in prevalence. As the pandemic resulted in greater time spent online, tech abuse also grew: between April 2020 and May 2021 Refuge saw on average a 97% increase in the number of complex tech abuse cases requiring specialist tech support compared to the first three months of 2020.

 

Some strides have been made in Scotland and Australia which we believe the Bill could learn from. The Abusive Behaviour and Sexual Harm (Scotland) Act 2016, which makes image-based sexual abuse a sexual offence, explicitly includes “whether or not the image has been altered in any way”.[38] The inclusion of fake porn/deepfake images or video recordings in this definition of the image-based sexual abuse offence thus ensures that the law is future proofed.

 

Another example is Australia where certain jurisdictions, such as South Australia, New South Wales, and the Australian Capital Territory (ACT), explicitly define an “intimate” or “invasive” image as including images that have been altered or manipulated within their intimate image abuse offences.[39]

 

11.1 Recommendations

 

        A commitment to ‘future-proofing’ in the area of online harms, including online VAWG regulation, to ensure that the ever-growing use of AI (such as deepfakes) and other ways in which online harms will be perpetrated in future are within scope of the emerging policy and the regulator’s powers.

 

       To include fakeporn/deepfake images within the scope of any legislation that criminalises image-based sexual abuse

 

  12. Survivor Testimonies

 

Annex A: Testimony of Survivor Ellesha May Garner

Annex B: Testimony of Survivor Georgie Matthews

Annex C: Testimony of Survivor Madison

Annex D: Testimony of Survivor Ruby

 

  13. Organisations in support of this briefing

 

Organisations and experts in Violence Against Women and Girls who have contributed to and support this briefing:

 

#NotYourPorn

Chayn

Dr Fiona Vera Gray

End Violence Against Women Coalition

Faith & VAWG Coalition

Glitch

Imkaan

Professor Clare McGlynn

Rape Crisis England & Wales

Refuge

Welsh Women’s Aid

Women & Girls Network (WGN)

Women’s Aid Federation of England

 

We are grateful to the survivors who shared their experiences.

 

Submitted 08.09.2021

 

 


Annex A

 

Testimony

 

Back in 2018 I was made aware that two videos of me had been posted onto the porn website Pornhub.

To say it was a shock would be an understatement; I hadn’t taken or consented to taking sexually explicit videos, so I just couldn’t understand where these videos had come from. After watching the videos for the first time I realised that my mentally and emotionally abusive ex-boyfriend had filmed us without my knowledge or consent. Whilst we’d had sex on a couple of occasions, he had decided to commit voyeurism and completely disrespect my right to privacy.

After our breakup and me expressing the fact that I didn’t want to speak to him again due to the behaviour he displayed during our relationship, he decided to post these videos on to his Pornhub account for thousands, potentially millions of strangers to watch.

After watching the videos I was in a state of disbelief, anguish and anger. My self confidence and mental health were in tatters and I just didn’t know where to go from here. I remembered vaguely hearing about this type of thing being illegal and realised that I needed to phone the police and report him. The lady I spoke to on the phone was very helpful and asked me to attend the station in order to write up a statement on what happened.

When I first visited the police station I had immediate concerns as to how this case was going to be handled. Whilst the officers were empathetic and comforting, it was quickly obvious that they hadn’t dealt with something like this before and they weren’t sure of the procedure when it came to beginning to investigate what had happened. After going away to consult with colleagues, they advised that they wouldn’t be able to access the website as it was blocked on their servers and that he was also wanted by another police force for committing a very similar crime against another girl. Due to the fact that they weren’t able to access the website, it was left to me to gather all of the evidence for them. I had to go through the indignity of watching and recording the videos over and over and looking through the rest of his account to check for other illegal material.

Following over 6 months of investigation (2 months of which were spent waiting for him to be arrested and questioned) and a lot of toing and froing between the two police forces, which resulted in no physical evidence such as his phone and other devices being retrieved or searched, the case was dropped as they had insufficient evidence and due to the fact that they couldn’t prove that my ex had done this to ‘deliberately cause me distress’.

Unfortunately due to how the law is written, you have to outwardly prove that the perpetrator has committed the crime in order to cause the victim distress. This is something that is incredibly difficult, near on impossible, to prove and also makes no sense as there is no other reason to do this to someone other than to manipulate, disrespect and upset them.

 

My case was noticed by the cyber crime team who apologised on behalf of the force for the previous failings in the investigation and advised that they had reopened the case and were reinvestigating.

 

Unfortunately it ended with the same outcome with the CPS essentially providing a defence for the perpetrator by theorising on who could have posted these and why they were posted, all despite the fact that they were linked directly with his account online, his email address and his IP address. Also, he had done this multiple times to multiple different women but still the CPS were questioning my morals and my believability. I have never experienced victim blaming like it and for the first time during this whole process, I understood why people chose not to report things like this to the police. It was devastating, I would say almost as devastating as having this happen to you in the first place.

 

Pornhub was also not held accountable at any point for allowing someone with an extensive record of abusing women to create and post onto the site with no checks on whether what he was posting was consensual or not. The names of the videos outwardly showed a lack of respect for the person in them and also mentioned the fact that it was his ex-girlfriend; why would you ever give consent to an ex-partner to upload sexually explicit videos of you online after a break up? Surely that should have been something picked up when vetting the videos posted onto the site? Why were they not flagged and removed as potentially illegal?

 

Pornhub were also very reluctant to assist with the police investigation. The officer received one very vague reply when he asked for the details of the account and the videos that had been uploaded. All other correspondence was ignored, although Pornhub dispute having received these emails as they were sent to ‘an old email address’ despite the fact that they had responded on that email address to the first contact.

 

As far as I am concerned Pornhub have no moral standing when it comes to the crimes committed on their site. They at this time openly invited anyone and everyone to share videos without ever checking who was behind posting these videos, if consent from both parties had been received and whether or not the content was illegal. Then when something illegal had been found on their site, they washed their hands of any responsibility and failed to assist in catching a prolific women abuser.

 

Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.

 

This crime is happening more and more, and unless something is urgently done to rectify the huge flaws within the system, how perpetrators and websites are held accountable, and how prevalent victim blaming is when reporting this crime, plenty more people will have to suffer the same injustices and distress as I have.


 


Annex B

 

Name: Georgie Matthews Date: 20/08/2021

 

I am a victim-survivor of intimate image abuse. I found out in May 2020 that an ex-partner had been sharing intimate images of me online without my consent.

I found out about the image sharing via Facebook. I was contacted by a stranger using an alias who, although they appeared to be acting as a good Samaritan, had gone to great lengths to conceal their own identity, and despite extensive efforts from the police, the identity of this person is still unknown.

Following this communication, I reported the situation to the police and went in search of more evidence. What I found showed a number of websites dedicated to non-consensual pornographic content and online forums designed to cater for the anonymous sharing of explicit content. It showed a trail of evidence that suggested that my situation was not a one-time act but a prolonged behaviour – my ex-partner has, in all likelihood, been engaging in non-consensual image sharing for a number of years without my knowledge.

I sought the help of charities and trusted friends to identify where the images were and, where possible, to have them taken down. The police could do nothing in this regard, other than strongly advise my ex-partner to have them removed where he had the power to do so.

I have no idea to this day just how much content is still out there on the internet and feel certain that no matter what I might do in the present, this will loom over my future.

When the police interviewed my ex-partner, he admitted to sharing the images without consent but claimed his intentions were not malicious toward me. That meant that the so-called “Revenge Porn Law” did not fulfil the criteria, nor did any other legislation for malicious communications or harassment under criminal law. For that reason, he walked away without any consequences.

There is nothing to stop him from doing it again and the websites he frequented are still in existence. Their list of victims grows daily and the potential for re-traumatisation is staggering where there is no guarantee of removal.

 

In the months immediately following my discovery, the police investigation continued, and I sought counselling. I was in counselling from May 2020 to February 2021. The police investigation ended in December 2020. Throughout those 10 months, I battled with depression, fear and anxiety. I am still dealing with the aftermath of my trauma and experience some PTSD symptoms.

The online world is scarier now. Whenever I see a message from a stranger in my inbox, or someone I haven’t spoken with in years, my first thought is that they have found something. I was followed on Instagram by a copy-cat account that I believe to be connected with my image abuse and as such have become very protective of my online presence. I create social media content for my job and work as an actress which means I have an extensive online presence. I am scared that one day, my ex or one of his “friends” will find a way to make my career choices impossible. That they will seek to humiliate me by sharing images of me in my virtual place of work. This is not a paranoid fear – this is a tangible possibility.

 

As a single person, romance continues to be a source of anxiety. Trusting new potential partners fills me with fear and I do not know when I will feel comfortable enough to let another person into my bed. Until I can overcome that fear, I will remain single. I feel that loneliness intensely – but even more than that, I feel angry that my sense of freedom has been corrupted by things I have no control over. Despite my best efforts, I cannot trick my brain into forgetting the trauma.

 

In March 2021, I decided to start speaking out. I wanted to express how widespread the issue was and how badly current legislation failed. I shared my story publicly online and within a week, I was speaking on BBC TV and radio and connecting with activists all over the country and the world. The extent of the problem became incredibly obvious. This issue is global.

As soon as I started looking, it was like a fog rising on a landscape that had always been right in front of me. News stories concerning cases like mine were (and are) everywhere. Friends, colleagues and strangers came to me with their own stories of abuse and I felt both overwhelmed and unsurprised.

The culture of image abuse is rampant across all sectors of society, with the most vulnerable of course being the most likely to suffer. Image abuse is common in cases of domestic violence and is an indicator of stalking and obsessive behaviour. Underage pornography and sex trafficking go hand in hand. Homophobia, transphobia, misogyny and racism are the columns on which image abuse stands. Image abuse is fuelled by hate. The Online Safety Bill must recognise that where online abuse is found, so too are hate and oppression. We must look to regulate and punish the companies that profit from and invest in hate.

 

I strongly believe in freedom of expression and of sexuality. I also believe in the power of compassion. I want this Bill to uphold those beliefs. The Online Safety Bill must recognise the difference between freedom of expression and oppression.

When someone wishes to express their sexuality publicly – and safely – they should not be prevented from doing so. Whether it be to earn a living or for personal satisfaction, that should be allowed to continue – it is their choice.

But when someone wishes to exploit someone else’s sexuality, take something that is not theirs and share it without consent, that is oppression.

There are thousands of websites and forums that exist expressly for the purpose of exploiting the sexuality of the oppressed and the vulnerable. These online havens for abuse are run by very clever people who enjoy loopholes – both virtual and legislative – that allow them to maintain their abhorrent practices.

The Online Safety Bill should take pains to ensure that the people employed to regulate online services have the necessary training and resources to challenge exploitation and hate wherever they are found. When the police were investigating my case, it was painfully obvious that there were not enough resources – both human and monetary – to address my case thoroughly or efficiently. I have no doubt that other evidence might have been located if the cyber-crime team were better funded.


Annex C

 

My Testimony – Madison

 

I am a woman in my mid-twenties. I have been doing online sex work for two years, primarily as a “camgirl” and creating content. This line of work was completely my choice, and something I felt, and still feel, liberated by. I started it after I left my degree in Nursing, having been diagnosed with a condition which made it hard for me to withstand the physical nature of that career.

 

I have always felt assured, open and curious about the whole notion of sex and everything that comes with it. Because of this, I decided that online sex work would be well suited to me. I could work in the comfort of my own home, with the hours and flexibility I needed to suit my life.

 

Before starting, I spent two months researching everything I needed to know about online safety and how to work as professionally as possible in this role. I made sure to let immediate family and close friends know I had made this choice, to be transparent, as I was aware of the possibility of being ‘outed’. Once I was confident in my knowledge, I began working online and quite quickly grew to love it. I soon came to realise, though, that it can have life-altering consequences, which you can never be fully prepared for.

 

My IBSA story has many aspects and continues to develop as time goes by, but it begins with me being recorded without my knowledge or consent, and the recordings subsequently being uploaded to numerous tube sites on multiple occasions. Some of the uploads included my partner, too. The perpetrator of this initial act of recording and uploading is unknown to me. On discovering the uploads, I contacted the platforms I had been working on when recorded for advice on how to stop this from happening and get the content removed, to no avail.

 

In early 2020, I was notified that the images and recording links had got into the hands of people in my hometown in Wales, and that they were “going viral”. I am unsure exactly how this happened, as none of my personal details have ever been connected to my online persona. I believe it was probably pure chance that someone recognised my face on a tube site and took the opportunity to use it against me.

 

In a short amount of time, these images and recordings had been viewed thousands of times. The links were sent to my friends, family, acquaintances and beyond. I then discovered they had got into the hands of an old boyfriend of mine in Wales, who found my current partner’s ex-girlfriend on social media and sent them to her, to make it “go viral” where I now live in England. As a result, these images and recordings travelled from my hometown to where I now live, despite there being over 300 miles between the two locations. At the time, the uploads had been shared or viewed approximately 8,000 times.

 

As a consequence, I lost modelling contracts and was asked to cease all involvement with a local bridal business where I helped out frequently, despite the owners of the business having been fully aware of my online job from the beginning and having no prior issue with it. Their explanation for asking me to leave was their embarrassment at people talking about and sharing the content of me. I felt extremely ostracised.

 

I contacted the Revenge Porn Helpline for advice, as I was unsure of my rights; it was not a typical instance of revenge porn, and at the time I was not aware of the term Image Based Sexual Abuse. I was advised they could not help me because the work had been my choice, and was told to speak to the police.

I contacted the police, as advised. They told me they could do nothing about the distribution of the “porn”, or the platforms it was on, but that they could go ahead with a harassment case against the ex-girlfriend of my partner, as we had evidence of her involvement in sharing the content. I felt I had hit a dead end, but I could not just let this go as it was taking over my life. I agreed and gave an interview to the police.

 

A week later, while I was asleep, I received a phone call at 11:45pm from a police officer who was aware of my case despite having had no previous contact with me. This officer said that I “should probably drop the case”. Bewildered and startled at receiving this important phone call late at night, I agreed, but asked to have it kept on record in case it needed to be revisited. For context, my partner’s ex-girlfriend is a prison nurse, working in a local prison with the inmates. The police officer was aware of her job, and I believe he wanted to spare her any trouble with her own career and life. I still ask myself why I, and my life and career, were not taken as seriously.

This experience with the police left me at a loss. I had contacted them in the hope that they would help me with the sites that were facilitating this, as that is where the problem stemmed from. Instead, they were only half-interested in one individual, rather than seeing the bigger picture.

A year later, the content is still online against my wishes. Attempts to get it removed using DMCA takedown notices work for a time, only for the content to be re-uploaded. Filing DMCA takedown notices is an issue in itself: it is dangerous to file them yourself, as they require a name, address and contact details, which are then shared with the person who uploaded the content and are sometimes used for blackmail or doxing.

I lost over a stone in weight at the time and suffered severe anxiety from the whole ordeal for months. I still think about it daily. I have been stopped from doing things I love and in turn faced blatant ostracisation and violation. I am incredibly grateful that my family and close friends are very supportive of me, as many other people are often shunned by their own family and friends when IBSA occurs. I have had to spend a lot of time working on myself and my mindset to overcome what has happened.

 

Sadly, though, it is still prevalent over a year later. My partner runs a business, and this has also been affected by these images and recordings being shared. Recently we have learned that another business is using this information to ‘out’ my partner and myself to clients. This has resulted in a significant financial loss. It is a prime example of how out of control IBSA can get when those affected are continually left to be targeted, as the stolen content just hangs around on the internet for anyone to access and use at their own discretion.

The hardest part in all this was probably being told that it was my own fault because I chose to do this, so what did I expect? It is quite a juxtaposition to experience IBSA as an online sex worker. Yes, I chose this line of work. However, my right to choose is irrelevant when control and consent are taken away so easily by the perpetrators, and by the platforms facilitating and profiting from stolen content, images and recordings.

 

It seems that the deeply ingrained societal shame and taboo around sex and our bodies allow some people to believe there is an automatic ‘green light’ to use any sexual content of an individual as a tool to attempt to ruin their life or exploit them in other ways. When I saw the images and recordings of myself, I also saw thousands of other people too. I am not the first, and I will not be the last until these views and regulations change. As time goes by the anger settles, although sadly the feeling of violation never fades.


Annex D

 

My experience of image based sexual abuse (IBSA) involved being told by someone I knew on social media, in a kind way, that my images were on an anonymous image board platform.

 

I received that message when I was out on a walk with my family, and it was a horrible message to receive. You feel like your world is falling apart around you. Because I didn’t have any signal where I was, I raced home and followed the link to the website. I found images of me, taken when I was 17 years old, from a girls’ holiday. I wasn’t completely nude in the images but they were provocative – in one of the photos all I had on was bikini bottoms, and in another I was covering myself with a small towel which wasn’t really covering everything.

 

Immediately, I reported it to the police and then waited for a follow-up call. Because of the community where I’m from, the news (and the link to the images) travelled round very quickly. I soon became aware that a lot of people I know were impacted by this website, as it categorised images by area and our area is a close-knit community. This was horrible, and it induced paranoia in me. A few other people got in touch to tell me about the website which, of course, I already knew about; this was stressful because it felt like everyone was finding out about it. A couple of my friends’ sisters and my friends’ friends were on the website and felt very distressed.

 

Following my experience, I found out that the police had received 30 reports of similar experiences to mine in one day. I then decided to set up a WhatsApp support group so that people could gate-keep who knew, so victims weren’t being harassed by multiple messages telling them they were on the website, and so we could share any updates and information we had from the police. From here on in this testimony, I will talk about the experience we had as a group. There were 17 of us at its peak, with roughly 15 members in the initial stages, all victims of IBSA. Everything we have done since the group formed has been a collective decision. All of the group members were women, and a lot of the victims struggled with the emotional trauma of being an IBSA victim.

 

Initially the police took my case seriously, but I did need to prompt them. Some of the desk officers knew people on the website, and that seemed to move the investigations along more quickly because, as I said before, the community is really close and everyone knows everyone. However, the follow-up call I got came just over 24 hours later, at 11 o’clock at night – I am a teacher, so it wasn’t the best time to receive a call – and the response from police was apathetic and slightly defeatist. I was told something to the effect of “with these anonymous image board sites, we don’t get prosecutions and there isn’t a lot we can do”. This wasn’t a reassuring response for victims. Some of the women in the WhatsApp group didn’t even get a follow-up call. Of those that did, they felt the officer’s attitude towards them was negative and victim blaming; three of those victims filed complaints about the officer. Following this, we didn’t hear a huge amount back from the police. We tried to keep each other updated when we did. Some of us were sent victim support information by the police, which we shared with the group, but some people in the group weren’t. Some people in the group were given crime reference numbers – I was, because I was 17 in the images – but others weren’t. It was completely inconsistent and as a group we didn’t feel we were being taken seriously.

As soon as the officers found out that I was 17 in the images, I was told that they could no longer look at the website because of protection issues relating to images of those underage. This was a huge spanner in the works: everyone else who had reported the non-consensual sharing of intimate images or filed an incident report had their investigation cut short because of details like the images of me having been taken when I was a minor. This felt like a catch-22 – you report it, but then someone is under 18 so the police can’t look at the website. The police said there was nothing they could do to remove the images from the website. However, when we reached out to the Revenge Porn Helpline (RPHelpline), they were amazing. The RPHelpline reported the images directly to the site owner and, because of their hard work, some of the images have been removed from the platform, although it has taken a while for this to happen. The police need to put more emphasis on the platforms and the sites, more so than on the individual.

 

Our concerns grew as we heard nothing. Members of our group were becoming more distressed, so collectively we wrote a letter to the chief constable and were able to have a meeting with the superintendent a few weeks later. We aired some of our concerns about the victim pathway and how we felt we had been let down. Ultimately, the case was closed locally and transferred to the regional cyber crime unit, and we haven’t heard anything since.

 

The fact that the local case was closed with no follow-up investigation – none of us were interviewed after filing our original reports – just shows that IBSA is not a high priority for regional police. This is of course due to budget constraints, but also to training limitations. As a group, we didn’t feel there was a focus on individual perpetrators: from talking amongst ourselves in the WhatsApp group, we found a link between three of the women, as they had an ex-partner in common who had been cautioned about this kind of behaviour when he was 16. It is therefore clear that the police didn’t follow up on these lines of inquiry. Additionally, the local police officers palmed the case off to the regional team and acted as if it wasn’t their problem. The problem for us as a group was that the photos on the website were accompanied by a great deal of really personal information: our names, the secondary schools we had been to, where we currently worked, family relations – all sorts of information that can only have come from local knowledge; it is not information you can get off social media or anywhere else on the internet. This had a huge impact: we felt our safety was threatened, particularly as this image board categorises photos by area – there is a tag for every area of the UK and beyond. It feels like perpetrators are all around you, and the experience made me nervous and paranoid, and I am a confident person. We didn’t feel supported by the police, but we also didn’t feel that the law supported the police to get a prosecution either.

 

The law does not support victims. The focus on the individual perpetrator needs to be tighter, the motive element removed, and there needs to be liability for platforms. This issue is so widespread. As I said earlier, I am a secondary school teacher of sixth form students, and we see IBSA happening all the time. As much as you can try to cover it in PSHE workshops or on enrichment days, there is a lot of emphasis (in schools at least) on the fact that if you share images you are contributing to child pornography, and there is no narrative about what happens when you are an adult. Students go out into the world thinking that IBSA is somehow okay once you become an adult, and there needs to be more emphasis on that. Furthermore, one of the first memories I have of nudes being shared was in secondary school, 12 or 13 years ago, when a friend of mine sent a nude to her boyfriend, and he and his friends put it as their Bebo profile pictures. Even all those years ago there should have been a duty on platforms to clamp down on that behaviour; the image wasn’t taken down for ages. Around the same time, another one of my friends had her intimate images shared around all the schools in our area.

 

This has been happening for much longer than we think and the law needs to support the police in investigating platforms and also in being able to prosecute individuals.

 

Platform liability is very important for so many reasons, not just to stop the images appearing on the sites in the first place. Two of the victims in our group have been blackmailed over Instagram by troll accounts over the images. They’ve been pressured for money and threatened with the images being shown to their families. The platforms need to take responsibility for this and make sure it doesn’t happen. Not only does someone feel at their most vulnerable and exploited when a private image of theirs is taken and shared, it is worse still when someone is actively targeting them over images shared without their consent. As for the sites themselves, most of these sites exist to facilitate IBSA – I personally can’t think of a legitimate reason why an image board site (such as the one my images were shared on) exists to share images anonymously unless you are trying to humiliate, distress or exploit someone.

 

I hope that you recognise how serious the need is for legislative change in this area.

 

 


[1] Defined by Professor Clare McGlynn and Erika Rackley here and used in this briefing: https://claremcglynn.files.wordpress.com/2021/05/mcglynnrackley-stakeholder-briefing-5-may-2021-final-1.pdf

[2] Ibid

[3] Term coined by Moya Bailey and Trudy

[4] https://decoders.amnesty.org/projects/troll-patrol/findings

[5] Her Net Her Rights – Mapping the state of online violence against women and girls in Europe

[6] Refuge Report, The Naked Threat, 2020 https://www.refuge.org.uk/wp-content/uploads/2020/07/The-Naked-Threat-Report.pdf

[7] Statistics for January 2020 - March 2021

[8] https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/domesticabuseandthecriminaljusticesystemenglandandwales/november2020

[9] https://claremcglynn.files.wordpress.com/2019/10/shattering-lives-and-myths-revised-aug-2019.pdf

[10] https://www.csacentre.org.uk/documents/scale-nature-review-evidence-0621/

[11] https://www.stonewall.org.uk/resources/lgbt-britain-hate-crime-2017

[12] https://galop.org.uk/resource/online-hate-crime-report-2020/

[13] Galop, Trans Hate Crime Report 2020: https://galop.org.uk/wp-content/uploads/2021/06/Trans-Hate-Crime-Report-2020.pdf

[14] https://old.parliament.uk/business/committees/committees-a-z/commons-select/petitions-committee/inquiries/parliament-2017/online-abuse-17-19/

[15] The Ripple Effect Report https://glitchcharity.co.uk/wp-content/uploads/2021/04/Glitch-The-Ripple-Effect-Report-COVID-19-online-abuse.pdf

[16] Vera-Gray, Fiona & Kelly, Liz (2020). Contested gendered space: public sexual harassment and women’s safety work. International Journal of Comparative and Applied Criminal Justice. https://www.tandfonline.com/doi/full/10.1080/01924036.2020.1732435

[17] https://committees.parliament.uk/publications/6878/documents/72529/default/

[18] https://www.girlguiding.org.uk/globalassets/docs-and-resources/research-and-campaigns/girls-attitudes-survey-2019.pdf

[19] https://www.theguardian.com/commentisfree/2021/aug/17/incel-movement-extremism-internet-community-misogyny

[20] https://www.pornhub.com/insights/2019-year-in-review

[21] https://cease.org.uk/wp-content/uploads/2021/07/210607_CEASE_Expose_Big_Porn_Report.pdf

[22] https://www.bbc.co.uk/news/uk-57269939

[23] F. Vera-Gray, C. McGlynn, I. Kureshi, K. Butterby, Sexual violence as a sexual script in mainstream online pornography, The British Journal of Criminology, 2021; https://doi.org/10.1093/bjc/azab035

[24] https://www.huffingtonpost.co.uk/entry/porn-website-tcs_uk_5d132febe4b09125ca466358

[25] F. Vera-Gray, C. McGlynn, I. Kureshi, K. Butterby, Sexual violence as a sexual script in mainstream online pornography, The British Journal of Criminology, 2021; https://doi.org/10.1093/bjc/azab035

[26] Pornland by Gail Dines and C4’s Married to a Paedophile, for example, both feature interviews with sex offenders who claim that their porn consumption spiralled out of control, resulting in them accessing child sexual abuse.

[27] https://www.canada.ca/en/canadian-heritage/campaigns/harmful-online-content/technical-paper.html

[28] https://nationalpost.com/news/politics/federal-online-harms-bill-would-allow-secret-hearings-and-raises-charter-concerns-critics

[29] https://www.internetlab.org.br/en/inequalities-and-identities/how-do-countries-fight-the-non-consensual-dissemination-of-intimate-images/

[30] McGlynn, Clare & Rackley, Erika (2017). Image-Based Sexual Abuse. Oxford Journal of Legal Studies 37(3): 534-561

[31] https://www.independent.co.uk/voices/cyberflashing-new-law-online-safety-bill-b1888633.html

[32] https://www.refuge.org.uk/wp-content/uploads/2020/07/The-Naked-Threat-Report.pdf

[33] https://claremcglynn.files.wordpress.com/2019/10/shattering-lives-and-myths-revised-aug-2019.pdf

[34] https://claremcglynn.com/imagebasedsexualabuse/korean-lessons-on-supporting-victims-of-image-based-sexual-abuse/

[35] Ibid.

[36] https://claremcglynn.com/imagebasedsexualabuse/fake-porn-deepfakes-and-need-to-reform-criminal-law

[37] https://www.refuge.org.uk/wp-content/uploads/2020/07/The-Naked-Threat-Report.pdf

[38] https://www.legislation.gov.uk/asp/2016/22/part/1/crossheading/disclosure-of-an-intimate-photograph-or-film/enacted

[39] https://www.lifehacker.com.au/2019/11/ai-generated-revenge-porn-is-our-new-unfortunate-reality/