Written evidence submitted by Refuge (OSB0084)

 

 

September 2021

 

About Refuge

 

Refuge is the largest specialist provider of gender-based violence services in the country, supporting over 7,000 women and children on any given day. Refuge opened the world’s first refuge in 1971 in Chiswick and, 50 years later, provides a national network of 48 refuges, community outreach services and child support services, and acts as an independent advocate for those experiencing domestic, sexual and other gender-based violence. We also run specialist services for survivors of modern slavery, ‘honour’-based violence, tech abuse and female genital mutilation. Refuge runs the National Domestic Abuse Helpline, which receives hundreds of calls and contacts a day across the Helpline and its associated platforms.

 

Summary

 

Refuge welcomes the opportunity to submit written evidence. Domestic abuse affects millions of women and children every year, with more than one in four women (27.6%) aged 16-74 experiencing domestic abuse at some point in their lives. The use of technology to facilitate domestic abuse, also known as tech abuse, is becoming increasingly prevalent. Technology can equip perpetrators with further tools to coerce, control and abuse women. The Online Safety Bill is a landmark opportunity to transform the response to online violence against women and girls (VAWG), of which tech abuse is one type. Tech abuse can take many forms and occur across a range of platforms and devices.

 

This submission sets out Refuge’s views on existing provisions in the draft Online Safety Bill and how they can be strengthened to better address online VAWG, as well as recommendations for additional policy and legislative measures that must be included in order for the Bill to truly transform the response to online VAWG. We have responded to questions where we have expertise and insight.

Refuge has developed a specialist tech abuse service to support survivors and to ensure that all of Refuge’s services and staff continue to adapt to and learn the ways in which perpetrators use technology to inflict abuse. The tech abuse team comprises specially trained staff and was formed in 2018 in response to the growing threat and devastating impact of tech abuse on survivors, including children. Thousands of survivors experiencing domestic abuse and other forms of gender-based violence have reported tech abuse to Refuge staff and have subsequently received specialist support and tech safety planning. Refuge works to empower women to use the internet without having to censor themselves because of the abuse perpetrated against them. For example, we have developed and disseminated a range of information and tools to help women stay online, including a new tech safety website (refugetechsafety.org).

Our dedicated tech abuse team has seen a rise in the number of complex tech abuse cases, with an average increase of 97% in cases requiring specialist tech support between April 2020 and May 2021 compared with the first three months of 2020. The two most common issues reported to the team are online security and stalking, and social media.[2] Market research recently commissioned by Refuge, which will be published in full later in the Autumn, has revealed:

36% of women report experiencing at least one behaviour suggestive of online abuse or harassment;

This figure rises to 62% among young women (aged 18-34).[3]

This suggests that online VAWG is becoming increasingly common among younger people, and that it may further grow in prevalence if proactive steps are not taken to address this threat.

Online VAWG has a devastating impact. Survivors supported by Refuge tell us of the chilling effects of tech abuse on their mental health and physical safety, for example where the abuse has involved location-tracking. The insidious nature of tech abuse and the ease with which perpetrators can constantly harass, abuse and monitor survivors causes fear, stress and uncertainty. Over half (53%) of women supported by Refuge’s tech abuse team between July 2020 and March 2021 said the tech abuse had left them feeling unsafe online, and 19% had their location compromised as a result.[4] One survivor told us:

“I was in a really dark place, him constantly posting stuff - I had really bad anxiety. I’d have panic attacks and it was constant worry of what he’s going to post next.”

The impact of tech abuse is further compounded by the substandard response to survivors by online platforms and the police. Women reporting their experiences of abuse are often met with a lack of understanding of the seriousness of these online harms. Tech companies and the police are failing to take sufficient action to prevent and address online VAWG, leaving survivors with limited options for protection and justice. Women and girls are being let down, and as a result, many find they have little choice but to remove themselves from online spaces, therefore becoming increasingly isolated from friends, family and online public life and debate.

45% of UK adults responding to a Refuge-commissioned survey said that social media and other online platforms were generally not accountable to their users.[5] 

The Online Safety Bill is a crucial opportunity to improve safeguards for women and girls experiencing online VAWG. Yet in its current form, the Bill omits reference to online VAWG and is unlikely to result in much-needed improvements to protections for survivors. This is despite the government’s commitments to tackling online and offline offending in the Tackling Violence Against Women and Girls Strategy, and as part of the UK’s Presidency of the G7.[6] Refuge strongly recommends that online VAWG is explicitly recognised as a specific online harm on the face of the Bill, and that the regulator is required to produce a corresponding VAWG code of practice as a matter of urgency, to ensure technology companies and the regulator prioritise the prevention and tackling of these harms. 

As the largest specialist provider of support services for survivors of domestic abuse and other forms of violence against women and girls, Refuge is in a singular position to represent the views and experiences of survivors. Our specialist tech abuse service is unique in this country; as well as providing support to survivors, we also work with global experts to campaign for change and raise awareness of tech abuse. All our positions are developed in collaboration with survivors and our frontline staff. We will soon be publishing a report which will include detailed policy recommendations relating to the Bill and associated codes of practice, further findings from our market research, and in-depth survivor engagement work. We would be pleased to share the report with the Committee later in the Autumn. At this stage, Refuge urges the Joint Committee on the Draft Online Safety Bill to recommend the following:

 

We also want to take this opportunity to highlight to the Committee the risks that other poorly regulated technologies pose to survivors of domestic abuse. Refuge’s tech team has supported women who have been abused by perpetrators using a broad range of increasingly inexpensive devices, including internet-connected devices such as ‘smart’ alarms and doorbells. The government has a number of upcoming opportunities to better protect women from these harms, such as the Product Security and Telecommunications Infrastructure Bill which aims to regulate the security of smart products. We would encourage the government to consider how this piece of legislation could be used to increase the safety of smart products for women and girls.

 

Objectives

  1. Will the proposed legislation effectively deliver the policy aim of making the UK the safest place to be online?

Inclusion of VAWG

The draft Bill will not deliver on the policy aim of making the UK the safest place to be online, due to the omission of online VAWG. Without a specific focus on online VAWG and a corresponding code of practice, the Bill will likely result in little or no change for women and girls experiencing the life-threatening and life-limiting consequences of online VAWG,  and online platforms will be unlikely to prioritise the prevention and tackling of these harms. Whilst the Online Harms White Paper made limited references to online VAWG, such as the inclusion of ‘coercive behaviour’ as a harm, the draft Bill makes no reference to women, domestic abuse or VAWG. In its current form, the Bill would in all probability result in the regulator prioritising online harms that are specifically cited in the Bill, such as child sexual abuse and exploitation and terrorism, when developing codes of practice and monitoring compliance with the new regulatory framework. Survivors of tech abuse and other online VAWG would have to show that instances of tech abuse meet the thresholds for illegal content or content that is harmful to adults or children, and be left to navigate reporting processes which were not developed with their safety and experiences in mind. This would fundamentally risk women’s safety, as well as shutting women and girls out of online spaces, impeding their freedom of expression and isolating them from online public debate.

The inclusion of VAWG on the face of the Bill and the development of an online VAWG code of practice would help deliver on national and international policy objectives. As part of the UK’s Presidency of the G7, the government committed to improving online safety and reducing harmful online content and activity, and to taking an approach which supports those who are especially impacted by online harms, specifically citing women and the proliferation of gender-based violence.[7] [8] Furthermore, the recently published Tackling VAWG Strategy commits the government to ensuring the safety of women and girls and to tackling both online and offline offending as a priority.

Despite this, the Tackling VAWG Strategy limits itself to outlining how the Bill will better protect children, giving no detail on how it will be used to safeguard women – despite the strategy’s focus on tackling violence against women. The Bill represents an opportunity for the government to prove its commitment to prioritising the safety of women and girls by making online VAWG a focus of the legislation.

In Refuge’s experience, the understanding of domestic abuse and other forms of VAWG remains very low amongst social media and tech companies, and survivors struggle to persuade them that content is harmful or illegal. Frequently, survivors reporting tech abuse to online platforms find that their reports go unanswered for many weeks, or that platforms fail to take into account the history and context of domestic abuse and coercive control and so deem the reported content not to be in breach of community standards, as the following examples illustrate.

Survivors have been sent images of their road sign or front door by the perpetrator after they have fled to a safe location. This causes severe fear and anxiety, with women feeling physically unsafe because the perpetrator knows their location.

Another form of tech abuse involves a perpetrator sharing, or threatening to share, images of a survivor pictured without the hijab, which to the victim and her community may be considered intimate images. A photo of a door or of someone’s hair is not usually considered harmful, and it is likely such images would not be assessed as harmful content by a platform. However, when viewed through the lens of domestic abuse, these images are clearly sent with the intent to cause harm and distress, and are damaging to the survivor’s mental health and physical safety.

These communications are clearly very harmful to the survivor, both psychologically and physically, but would likely not be understood as such, or seen as in breach of community standards, by online platforms viewing the images at face value and without reference to the context of domestic abuse and VAWG. In addition, many forms of tech abuse are illegal, yet the criminal justice response to tech abuse is poor and often reliant on outdated legislation, as outlined below and in further detail in section 7. It is therefore vital that online VAWG is explicitly recognised on the face of the Bill as an online harm, and that a dedicated online VAWG code of practice is drawn up by the regulator to ensure platforms take appropriate steps to support survivors.

The scale and pervasiveness of tech abuse and online VAWG is such that urgent legislative action is needed. More than one in four women will experience domestic abuse at some point in their lifetime, and the police receive a domestic abuse-related call every 30 seconds.[9] [10] In Refuge’s experience of supporting over 7,000 women and children every day, technology is increasingly being used to coerce, control and abuse survivors, including children.

The prevalence of tech abuse and online VAWG is likely only to increase without legislative action, given the availability and affordability of tech platforms and devices and the general public’s ever-growing reliance on technology in daily life. This shift has become all the more apparent since COVID-19 – 92% of respondents to the Glitch and EVAW survey reported using the internet more during the pandemic, and Ofcom found that UK internet use reached record levels in April 2020.[13] Without drastic action to hold perpetrators to account and stem the tide of online VAWG, tech abuse and online VAWG are likely to continue to proliferate.

Refuge therefore recommends the following:

 

Response to tech abuse by online platforms and the criminal justice system

As outlined above, there are significant shortcomings in the response of online platforms and the criminal justice system to online VAWG, which are jeopardising survivors’ safety and effectively allowing perpetrators to continue their abuse with impunity. In our experience of supporting survivors who report tech abuse-related offences, the police frequently fail to investigate these online crimes and charge perpetrators. The current criminal offences concerning harmful online communications are not fit for purpose, as they often do not cover many of the forms of communication used by perpetrators of domestic abuse. They are also vaguely defined and often poorly understood by police officers. In some instances, the police have advised survivors to come offline as a ‘solution’ to the abuse they are experiencing. This not only reduces survivors’ ability to take part in online debate, connect with family and friends, and perform day-to-day tasks such as online banking and food shopping, but also risks escalating the abuse as the perpetrator shifts to “in-person” forms of abuse when unable to contact the survivor online. The response to online VAWG cannot be for victims to change their behaviour – the focus of government and the criminal justice system in this area must be on holding perpetrators to account. The following survivor stories detail what survivors of tech abuse frequently experience:

“When I was pregnant I was getting threats about my child, and that people would kick my child out of me. A lot of (the messages) were fake accounts – 43, so it was 43 accounts. I reported it to Snapchat; well I haven’t heard anything back to be honest. I reported three times. I also reported it to the police because it became too much. Their advice was to get rid of social media” – Refuge client, on her experience of reporting threats to the police and Snapchat.

“Everything was in his laptop, my Instagram, my Facebook. He checked basically everything on my mobile and my bank account. The police just told me you can delete him at first and I told the police, I don’t want to delete him, I just want to prove that he is (in the account) – he has control even now and he has no right to control me. After that I just deleted him, and the next week the police called me and told me we can help you with Instagram (but) it was too late I already deleted him” – Refuge client, on reporting the perpetrator’s access to her accounts.

The protections available to survivors through online platforms are potentially even more limited than redress via the criminal justice system. Survivors reporting content via reporting and content moderation processes often wait many weeks or even months to receive acknowledgement of their report. Refuge’s tech abuse team has ‘trusted flagger’ status with many major social media sites, which should in theory provide a channel to human moderation and guarantee a faster response to reports. In practice, however, the team still waits upwards of four to six weeks for responses from some platforms. As referenced previously, tech companies typically do not have a good understanding of domestic abuse and coercive control, and therefore fail to comprehend the contextual element of reports of tech abuse made on their platforms or to take appropriate action. Platforms often use checkbox-only functions for users to describe why content is harmful, yet domestic abuse and VAWG are rarely given as options.

A common experience for survivors is to receive hundreds of abusive messages and pieces of content from perpetrators, often across multiple platforms. Because of the limitations of reporting processes, survivors are forced to report individual pieces of content one at a time and to each platform in turn. In addition, perpetrators frequently create multiple new online accounts to harass and abuse survivors – Refuge has supported women who have been abused by dozens or even hundreds of fake accounts, all of which the survivor suspects are operated by the perpetrator, who will often pose as a stranger. Perpetrators frequently use fake names, or even the victim-survivor’s name, when creating these accounts, and some social media platforms appear unwilling to take action in these cases because the perpetrator has not used his own name. Where platforms do respond and take action following a report of tech abuse, this is usually limited to the removal of individual pieces of content, rather than more effective measures such as the removal or banning of the perpetrator from the site. Actions which survivors can take themselves are also often limited to blocking the perpetrator. Blocking can elevate the survivor’s risk of harm and provides limited protection when the perpetrator is easily able to create new fake accounts. Additionally, any public, abusive posts the perpetrator makes remain visible to everyone else, including the survivor’s children, family members and new partners. Urgent changes are therefore required to reporting processes in order to better protect women and effectively deliver on the aim of making the UK the safest place to be online.

The following survivor stories demonstrate a typical response from social media platforms.

“I reported to Facebook, and they just came back with you can block this person’s account. They don’t really do much. No physical support whatsoever. (I was) just frustrated that there wasn’t any action, that they wouldn’t do anything” – Refuge client.

“I reported to Facebook saying this is not appropriate and they came back to me saying there is nothing inappropriate there. So from there I used Facebook to the minimum. Zero personal information in my Facebook, I feel restricted about that. I used to be a very positive outgoing person, now I feel like a person who wants to be very invisible. I don’t want to share anything. It really makes me emotional too, that’s I would say trauma. On top of that they (the perpetrator) can get away with that – it’s incredible” – Refuge client.

Refuge therefore recommends a dedicated code of practice be developed by the regulator, in consultation with the specialist VAWG sector, which clearly sets out standards for platforms to adhere to in preventing and addressing online VAWG.

Refuge recommends the following:

  2. Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?

The focus in the Bill on children’s online safety and on preventing and minimising child sexual abuse and exploitation is to be welcomed. However, the Bill requires further consideration if it is to result in improvements to safeguards for children who are victims of domestic abuse. As per the Domestic Abuse Act 2021, children are recognised as victims of domestic abuse in their own right. 97% of children living with domestic abuse are exposed to that abuse, and 62% of these children are also directly harmed. Children exposed to domestic abuse are not simply witnesses but also victims of abuse: 52% of children exposed to domestic abuse had behavioural problems, 39% had difficulties adjusting at school, and 60% felt responsible or to blame for negative events.[14] It is also common for perpetrators to try to contact children after they and the non-abusive parent have fled, for example through the child’s devices, gaming consoles and platforms, and to seek to use this contact to determine the location of the survivor-parent and child. Such contact, understood within the context of domestic abuse, is clearly harmful to both the child and the survivor-parent, and threatens their physical safety. The harms of domestic abuse and other forms of VAWG must be understood and reflected in the Bill in order to strengthen protections for child victims of domestic abuse.

Children are also harmed by domestic abuse that is directed towards the survivor-parent, as recognised in the Domestic Abuse Act. Protecting the non-abusive parent is a necessary component of protecting their children, including from tech abuse, as the non-abusive parent is often a critical safety factor for children. For example, a survivor supported by Refuge experienced her ex-partner hacking into her eBay account. He then purchased items for himself, reducing the survivor’s ability to financially support her children. The best way of ensuring children’s safety is to ensure the survivor-parent is able to live free from fear of abuse and to raise their children. The inclusion of VAWG in the Online Safety Bill would therefore also improve children’s online and offline safety.

“He was checking my Facebook, who my friends were, reading the conversations I had with my friends in Messenger. Another issue has been sharing images of my son on Facebook. Anyone in the world could see his (the perpetrator’s) profile, his account (was) public. He planted the picture on this public space. I reported to Facebook and they came back to me saying there is nothing inappropriate there. He found out my address through WhatsApp, he knows where I live. I cannot share information in WhatsApp – because I’m worried that he will show up” – Refuge client, whose former partner monitored her use of social media and posted images of her child online.

Refuge recommends the following:

  3. Does the draft Bill make adequate provisions for people who are more likely to experience harm online or who may be more vulnerable to exploitation?

The Bill does not make sufficient provision for victims of domestic abuse and other forms of violence against women and girls. Domestic abuse is a deeply gendered crime which has a disproportionate impact on women. Official statistics consistently demonstrate that the vast majority of domestic abuse victims are women and the vast majority of perpetrators are men. 27.6% of women aged 16-74 in England and Wales will experience domestic abuse at some point in their lives compared to 13.8% of men. 92% of defendants in domestic abuse-related prosecutions were men in the year 2019/20, and 77% of victims were women.[15] [16] Some women are also more likely to experience domestic abuse due to their race, ethnicity, sexuality and/or other identities. For example, 14.7% of disabled women experienced domestic abuse in the year ending March 2020, compared to 6% of non-disabled women.[17] Whilst official statistics do not adequately provide a full picture of gender disparity, as they do not account sufficiently for repeat victimisation or capture coercive and controlling behaviour, they plainly illustrate the gendered nature of domestic abuse. Successive surveys have also shown the gendered experiences of online abuse, which supports the centring of online VAWG in the Online Safety Bill.

The Bill allows for content to be read by service providers in “reference to that particular adult, taking into account any of the adult’s characteristics” when interpreting content as harmful to adults, thus introducing an element of subjectivity into the test of an “adult of ordinary sensibilities.” This is a welcome acknowledgment that some women, such as those from Black and minoritised communities, are impacted disproportionately by violence against women and girls and that they may experience different forms of online harms. It is a positive step that providers are encouraged to account for the subjectivity of harm and different experiences of online abuse, but this provision should go further. The Bill should also require providers to take into account online VAWG and the relationship between victim-survivor and perpetrator in contextual readings of harm. Ultimately, Refuge argues that online VAWG should be given a much more central role in the Bill if it is to make adequate provision for women and girls. Online VAWG should be explicitly named on the face of the Bill and a dedicated VAWG code of practice developed by the regulator, in consultation with the specialist VAWG sector.

Support for survivors

In addition, a funding package should be launched alongside the Bill to ensure survivors of online VAWG have access to specialist, holistic support services. As Refuge’s frontline staff have to undertake a significant amount of advocacy work to support survivors of tech abuse to seek redress from the police and/or online platforms, adequate resourcing of these services is vital. Refuge is unique in having a dedicated tech abuse team, which has provided holistic support to thousands of women, resulting in improved outcomes for survivors. A substantial level of specialist knowledge and time is required to support clients and to elicit responses from social media companies. The team has also developed ‘trusted flagger’ status with many social media platforms – relationships which take time and resources to build – as well as links with global experts in online VAWG. Specialist VAWG organisations are severely underfunded and face an insecure funding landscape and historic funding cuts. Financial provision for specialist, independent support for survivors must therefore be guaranteed in the Bill. This could be achieved by directing 5% of any fines levied by the regulator to funding specialist VAWG sector support services, with 50% of this amount specifically ring-fenced for specialist ‘by and for’ led services supporting Black and minoritised women and girls, as suggested by the Joint VAWG Sector submission to the Committee.

Product developers should also routinely produce guidance for users setting out the steps to take if their products are used to harm or abuse them. Refuge recently launched a new tech safety website (refugetechsafety.org), which provides resources, step-by-step guides and an interactive chatbot to help women secure their devices and online accounts and use technology safely. We also work with experts across the world to ensure our work and our holistic support to survivors are informed by the latest technological developments. With their often vast resources, tech companies should be able to develop similar guidance for their users.

 

Refuge recommends the following:

  4. Is the “duty of care” approach in the draft Bill effective?

The duty of care approach will not be fully effective without strengthening the duties relating to adults’ online safety. We welcome the duties within the Bill for regulated services to operate systems for users to make complaints and to easily report content – for which we make recommendations in section 1 – but it is apparent that the duties relating to adults’ online safety are currently too weak and will do little to protect women. For instance, duties to protect adults’ online safety, to provide additional reporting processes and to carry out adults’ risk assessments only apply to Category 1 service providers. The threshold for Category 1 services will be set at a later date by the Secretary of State, but it is likely that only the largest, high-reach social media platforms will fall into this category, meaning there will be fewer safeguards in place for users of medium-sized and smaller platforms. In comparison, the duties for providers relating to illegal content and children’s safety are likely to result in more effective protections for users, as they include duties to use systems and processes to minimise illegal content and to prevent children from encountering harmful content in the first place. The duties for adults’ online safety must be strengthened and brought in line with those for children’s online safety and illegal content.

Refuge recommends the following:

  5. Does the Bill deliver the intention to focus on systems and processes rather than content, and is this an effective approach for moderating content? What role do you see for e.g. safety by design, algorithmic recommendations, minimum standards, default settings?

Reporting and moderating systems

Minimum standards can play a role in improving approaches to content moderation. A dedicated online VAWG code of practice should set out minimum standards for online platforms to meet, especially for reporting and moderating processes, so that users know what to expect tech companies to do to prevent and address online harms. As outlined in section 1, urgent improvements are needed to reporting systems to ensure they are as quick, efficient and trauma-informed as possible and ensure that platforms are taking effective action. This is a much-needed step in transforming the response of online platforms to survivors of tech abuse and online VAWG. Current systems are not adequately tackling instances of online VAWG, meaning survivors have reduced confidence in using social media and other platforms and in sharing their personal details online; some are forced to come offline due to a lack of alternative options.

“I don’t even do any purchases online anymore. I used to use Amazon sometimes; I don’t even do that anymore. I don’t give out bank details. I’m quite cautious now. I don’t bother with the dating stuff now, I’ve just deleted it. I don’t want to do any of that” – Refuge client, on the impact of tech abuse.

We also suggest that tech companies invest in human moderators rather than fully automating reporting and content moderation systems. The highly contextual and nuanced nature of domestic abuse would likely make it difficult for algorithms to identify all forms of tech abuse effectively. We anticipate there will always be a role for human moderators, especially for online VAWG, and would regard with caution moves by platforms to rely solely on algorithmic processes as a means of meeting their duties of care within the Bill. Human moderators would need to receive training in all forms of violence against women and girls to ensure they understand the nature of domestic abuse and VAWG, are able to investigate reports sufficiently and can take a preventative approach to online VAWG. For further detail on Refuge’s views on the role of algorithms, please see sections 13 and 14.

Safety by design

The draft Bill does not do enough to motivate service providers to have regard to safety by design. A duty of care is currently included for providers to operate their services using proportionate systems and processes designed to prevent children from encountering harmful content and to minimise the presence of illegal content. The same duty must be extended to content that is harmful to adults; the duties relating to such content are currently far weaker, only requiring risk assessments which examine how the operation and design of the service may reduce or increase the risk of harm. Safety by design is vital to improving the safety of women and girls. Service providers who understand the forms and impact of VAWG could develop products and platforms built with women and girls’ safety in mind. Refuge is aware of many apps and devices that have enabled, or even encouraged, tech abuse as a result of poor and often inadvertent design choices. This includes platforms which allow untraceable or anonymous communications, or which have generic or default passwords or password recovery systems, meaning survivors can easily be locked out of their accounts by perpetrators. Dating apps are also often used to abuse women online, and to encourage women to meet offline, where the risk of physical harm escalates. A duty should be placed on all companies whose design choices could facilitate online VAWG to respond to these harms and to prevent them occurring in the first place. Both government and the regulator should strive to increase the extent to which companies must consider how their products can be used to perpetrate tech abuse, and encourage them to design safer functionalities and features.

Moreover, the government should consider using the Online Safety Bill to introduce a duty on online platforms to cooperate with one another where tech abuse and online VAWG is occurring across multiple platforms. It is very easy for perpetrators of domestic abuse to move from one platform or app to another to pursue a survivor across all her social media channels. If blocked on one platform, the perpetrator can simply move onto another or set up an account with a fake name. Yet online platforms collaborate very little with one another, even when the platforms are all owned by the same parent company, as the following survivor stories demonstrate. Platforms should be required to cooperate and work together to address the ease with which perpetrators can move from one platform to another with impunity.

 

A survivor supported by Refuge experienced tech abuse across several social media sites, including WhatsApp, Facebook and Instagram. All three platforms are owned by Facebook, but the survivor was required to log individual reports with each platform in turn, and Facebook refused to even discuss the report she had made to WhatsApp.

 

In another example, a survivor who was stalked by a perpetrator she met on a dating app secured a stalking protection order, but she does not know if the perpetrator has been removed from the app, or whether the company has shared details of his abusive behaviour with other dating apps.

 

Refuge recommends the following:

 

  6. Does the proposed legislation represent a threat to freedom of expression, or are the protections for freedom of expression provided in the draft Bill sufficient?

 

The impact of tech abuse on women’s access to the internet, and therefore freely expressing their views online, should be considered in discussions on the Bill’s impact on and protections for freedom of expression. As a result of the failures of the criminal justice system and online platforms to provide sufficient safeguards to survivors, women and girls are finding they have little alternative but to self-censor online and sometimes remove themselves entirely from online spaces. In some instances, survivors have been advised by police to come offline, suggesting victims should change their behaviour rather than holding the perpetrator of the abuse to account. Refuge research has revealed:

 

38% of women who experienced online abuse from a partner or former partner felt unsafe or less confident online as a result.[21]

 

Almost one in five women who received threats from a partner or ex-partner to share intimate images or videos said they left the house less and/or used social media less as a result of the threats. Research by Glitch and EVAW also found that, after facing online abuse, 48% of Black and minoritised women and non-binary people and 41% of white respondents reported spending less time online.[17] [18] The Bill has been subject to criticism for its perceived infringements on freedom of expression, yet these discussions fail to take into account the widespread silencing of women and girls already taking place. Suggesting that women’s rights to safety and to freedom of expression are in direct opposition is a false dichotomy which ignores women’s and girls’ right to free expression online.

 

Technology is increasingly being used by perpetrators of domestic abuse, and it is likely that, without action to improve protections for survivors online, more women and girls will be shut out of online life. This not only infringes on their freedom of expression and reduces their enjoyment of social media and other forums, but also affects their ability to perform day-to-day tasks in an increasingly digital world. Social media has been described as the town square of the modern age. Indeed, one in five girls and young people aged 11-21 said online forums and spaces had been an important source of support for them during the pandemic; this figure rises to 27% among LGBQ and 26% among disabled girls and young women.[22] With the average UK adult now spending more than a quarter of their waking day online, tech abuse and online VAWG have significant consequences – as set out in the survivor story below – for example for women seeking to promote their online businesses. Arguments that the Bill impedes freedom of expression do not consider the consequences for freedom of speech of not regulating our online spaces and forums. We therefore support the retention of legal but harmful content within the scope of the Bill.

 

“It puts me off going on my phone to be honest with you. I don’t want to put up with using social media anymore. It’s just too much to deal with; it impacts my mental health a lot” – Refuge client, on the impact of tech abuse.

 

Refuge recommends the following:

 

Content in Scope

  7. The draft Bill specifically includes CSEA and terrorism content and activity as priority illegal content. Are there other types of illegal content that could or should be prioritised in the Bill?

VAWG offences should be prioritised in the Bill to the same level as child sexual exploitation and abuse and terrorism content and activity. Many instances of tech abuse and online VAWG amount to criminal offences, such as online harassment and stalking, non-consensual sharing of intimate images or videos and, following a Refuge campaign, threats to share intimate images or videos. The criminal justice system response to domestic abuse in general is inadequate – successive inspections and reports have highlighted systemic failures to protect women and girls, and domestic abuse convictions fell by 37% between 2016 and 2020 – and in Refuge’s experience the police perform even more poorly when investigating and charging the online versions of these offences.[23]

The prevalence of illegal VAWG content online warrants a similar level of prioritisation to the other illegal content explicitly included in the Bill. More than one in four women will experience domestic abuse at some point in their lifetime, and in England and Wales 37% of all stalking and harassment offences recorded by the police in the year ending March 2020 were domestic abuse-related.[24] [25] Sadly, an average of two women are killed every week by a partner or ex-partner.[26] Indeed, Refuge often sees tech abuse raised in Domestic Homicide Reviews (DHRs), in whose quality assurance we have been involved for over ten years. DHRs are multi-agency investigations into the circumstances of homicides or suicides which have, or appear to have, resulted from abuse by an intimate partner or family member. The reviews make recommendations to public bodies and statutory agencies to improve the safeguarding of victims and prevent future homicides or suicides. Where tech abuse has played a role in a domestic homicide, DHRs should be able to make recommendations to the police and other statutory services and agencies.

In addition, legislative measures are required to address the failure of the criminal law to keep pace with technological change. Key legislation relating to tech abuse, such as the Communications Act 2003, the Malicious Communications Act 1988 and the Computer Misuse Act 1990, is outdated and frequently misunderstood by the police. For example:

Refuge supported a woman who had been physically, sexually, financially and psychologically abused by her former partner to gather evidence of the perpetrator hacking into her email and business Instagram accounts. The hacking left her unable to access either account or to run her business. The perpetrator also sent abusive messages to her clients while impersonating her, resulting in a loss of income. On reporting this to the police, the survivor was advised to report the issue to Instagram, as it was not the police’s responsibility to manage social media platforms, and was told that the email hacking did not amount to a criminal offence. With Refuge’s support, the survivor reminded the police that the hacking did fall under the Computer Misuse Act; the police responded that this offence was only applicable to terrorism and serious organised crime.

Therefore, focusing on terrorism without also elevating the priority of violence against women and girls in the Bill may risk reinforcing the narrative that online crime only concerns terrorism and child sexual exploitation and abuse, and does not apply to VAWG.

Social media companies are also failing to recognise tech abuse and online VAWG offences as the crimes that they are. A significant amount of advocacy is required by Refuge’s tech abuse team to persuade platforms that criminal behaviour has occurred and that effective action must be taken. The onus is currently on the survivor to collect evidence and report to the police. In some instances, survivors who have requested content and data from social media sites in order to compile evidence for the police have been informed that the police must lodge a formal request directly with the company in order to access the data. As set out previously, this is unlikely to occur if the police do not have the legislative tools or the understanding of tech abuse and online VAWG needed to investigate such offences. The inclusion of VAWG on the face of the Bill and an online VAWG code of practice would help both the police and tech companies to identify online VAWG perpetrated on platforms and to understand the actions to take in response.

Refuge recommends the following:

  8. Are there any types of content omitted from the scope of the Bill that you consider significant e.g. commercial pornography or the promotion of financial scams? How should they be covered if so?

The omission of online VAWG from the Bill is of significant concern. As there is currently no duty to specifically address domestic abuse or other forms of VAWG in the Bill, or any mentions of VAWG, survivors would need to show instances of tech abuse to be either illegal, harmful to adults or harmful to children. Given the widespread lack of understanding of domestic abuse among social media companies, and the issues outlined in section 1, this would be unlikely to result in a positive outcome for the survivor. Refuge recommends that online VAWG be explicitly named and recognised on the face of the Bill and that the regulator be required to develop a code of practice on online VAWG which would set out in detail the types of content that constitute online VAWG.

The Online Safety Bill may also be a suitable legislative vehicle to adopt recent recommendations made by the Law Commission on the criminal law governing harmful communications and intimate image abuse. Refuge supports the Commission’s proposal for a new ‘harm-based’ communications offence to replace the offences within section 127(1) of the Communications Act 2003 and the Malicious Communications Act 1988. The Commission’s review found the legislation to be ambiguous and outdated, and furthermore that the criminality threshold was often set too low, which chimes with our experience of supporting survivors who have been victims of offences covered by the Acts.[27] In addition, we strongly support the introduction of a single comprehensive base offence for the non-consensual taking or sharing of an intimate image of a person. This offence should have no additional requirement to prove the motivation of perpetrators, and should include an expansive definition of intimate images that reflects the diverse experiences of victim-survivors and minimises the barriers they may face when reporting this abuse to the police. We urge the government to adopt these proposals at the soonest legislative opportunity.

Refuge also supports recommendations on the commercial porn industry made by the VAWG sector joint submission to the Committee. Research by Dr Fiona Vera-Gray and Professor Clare McGlynn found that one in eight titles shown to first-time viewers of the most popular pornography websites in the UK described sexual activity that constitutes sexual violence.[28] We echo the following statement from the submission: “As Clare McGlynn and Erika Rackley[29] point out, rape porn and image-based abuse, as well as harming the individual ‘victim’ in a deeply gendered way, also cause ‘cultural harm’, in that they ‘may help to sustain a culture - a set of attitudes that are not universal but which extend beyond those immediately involved as perpetrators or victim-survivors of image-based sexual abuse - in which sexual consent is regularly ignored.’”[30] In particular, we support calls for commercial porn websites to be specifically named in the Bill as regulated services, to ensure clarity of scope, and for pornography that depicts, endorses or encourages the attitudes or behaviours underpinning VAWG to be recognised as content that is harmful to adults and children.

Refuge recommends the following:

 

  9. What would be a suitable threshold for significant physical or psychological harm, and what would be a suitable way for service providers to determine whether this threshold had been met?

The threshold for significant physical or psychological harm, in reference to content that is harmful to adults and children, must account for the subjective and contextual nature of domestic abuse. Specialist VAWG organisations should be involved in the process of determining the harm threshold to ensure this reflects the context of domestic abuse and subjective elements of harm. Frequently, online platforms judge reported content at face value, rather than alongside the history of domestic abuse and coercive control. If the threshold for harm is unable to account for context, many instances of tech abuse may not be deemed as harmful or in violation of platform community standards, despite the significant harm it causes survivors, as the survivor story below illustrates. Online VAWG must therefore be recognised as a specific harm within the Bill to ensure online platforms understand the forms and impact of these harms.

A survivor supported by Refuge was frequently contacted at a specific time each day by a fake account. She suspected it was her former partner contacting her, as the time had been of significance during their relationship. This caused distress and harm to the survivor, giving the impression that the perpetrator had not forgotten her and was still seeking to control, coerce and cause fear. An online platform viewing such content without understanding the context of domestic abuse would likely not view it as harmful.

The threshold for the “significant adverse” impact which content must have on an adult or child in order to meet the criteria of content that is harmful to adults or children should also reflect the enormous impact all forms of domestic abuse have on survivors, including children. Domestic abuse has a significant effect on survivors’ mental and physical health. Almost a quarter (24%) of Refuge’s clients have felt suicidal at one time or another, and survivors are up to three times as likely to develop a mental illness as women who do not experience domestic abuse.[31] [32] Survivors’ housing, employment, finances, and relationships with family and friends can also be affected. Domestic abuse is a key cause of women becoming homeless, and one in four survivors who experience economic abuse has debts in excess of £5,000.[33] [34] It is therefore vital that the threshold of “significant” impact sufficiently reflects the serious and long-term effects of domestic abuse.

In addition, the definition and threshold of “significant adverse physical or psychological impact” currently exclude harms that result from the financial impacts of harmful content. This could result in the exclusion of some forms of economic abuse. Research by Refuge and The Co-operative Bank found that economic abuse – whereby a perpetrator restricts a survivor’s ability to use, acquire and maintain money or other economic resources – affects approximately 39% of adults in the UK.[35] We have supported women whose perpetrators have targeted their online businesses or influencer accounts on social media, damaging their finances. Economic harms must be recognised in the Bill; to not do so risks dismissing the harmful experiences of survivors of economic abuse and excluding this form of abuse from the regulatory framework. As one survivor told us:

“My business is all online. He started posting stuff about me that wasn’t even true on (my publisher’s) public page. And that really upset me, I had to call my publisher and tell them the story” – Refuge client, whose ex-partner targeted her book publisher’s social media accounts.

Refuge recommends the following:

 

  10. Are the definitions in the draft Bill suitable for service providers to accurately identify and reduce the presence of legal but harmful content, whilst preserving the presence of legitimate content?

 

There are a number of issues with the definitions in the draft Bill relating to harm and to legal but harmful content that require attention. These definitions will be crucial to service providers’ interpretation of the Bill when enacting their duties of care; without due consideration, poor definitions could result in the maintenance of the status quo. Such a situation would mean tech abuse on online platforms continuing to flourish, with perpetrators not held accountable, and women and girls suffering the impacts on their mental and financial health and on their feelings of isolation and safety both online and offline. As previously mentioned, Refuge recommends that the government consult the specialist VAWG sector on the definitions and processes for determining harm, in order to ensure all forms of online VAWG are understood and captured in any process. The statutory definitions of domestic abuse and of coercive and controlling behaviour, as per the Domestic Abuse Act 2021 and the Serious Crime Act 2015, could also be utilised or referred to in the Bill’s definitions.

Firstly, vague concepts are used in the definition of content that is harmful to adults without further explanation, such as “reasonable grounds”. It is stated that the service provider must have reasonable grounds to believe there is a risk of the content having an adverse impact, but no detail is provided on what constitutes reasonable, nor what a provider or user would have to show to evidence this decision. It is also problematic that priority content harmful to adults is yet to be determined by the government, in consultation with the regulator. It is unclear on what basis content will be designated as priority, and this should be set out in the Bill to ensure transparency of objective.

A further concern with the definitions in the draft Bill is that the judgement of risk is placed in the hands of the service provider. Section 46 (3) of the Bill states that “(c)ontent is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities.” As evidenced at length throughout this submission, online platforms consistently fail to understand tech abuse as domestic abuse or to comprehend the subjective and contextual nature of these harms. Given this widespread misunderstanding of domestic abuse, placing the judgement of risk solely with service providers who lack specialist VAWG training is a key deficiency of the Bill. The harms of both domestic abuse and online VAWG should be explicitly included within the Bill.

As outlined in section 3 of this submission, section 46 (6) of the Bill begins to introduce an element of subjectivity into the test of what an adult of ordinary sensibilities is. However, this section should be strengthened to ensure that platforms read content in reference to the history of domestic abuse and VAWG, as well as to the person’s protected characteristics.

Finally, we welcome the inclusion of indirect harms in the definition of content that is harmful to adults. Perpetrators frequently use indirect means to contact, coerce and intimidate survivors. For example, they may contact the children, friends or family members of the survivor in order to cause her harm and distress. It is right that the Bill makes allowance for such indirect harms, in recognition that such contact also constitutes harmful content. As one survivor told us:

“The children have got a PlayStation and he was paying for it. So every time the children turn the PlayStation on, his name pops up on the screen. The children wanted to delete him off. I think (the tech abuse) is still ongoing because he’s paying for the PlayStation.” – Refuge client, describing her suspicions that her ex-partner has attempted to contact her children by creating accounts and inviting them to play on a gaming platform.

 

Refuge recommends the following:

 

 

Services in Scope

 

  11. The draft Bill sets a threshold for services to be designated as 'Category 1' services. What threshold would be suitable for this?

Refuge recommends that further clarity be provided on the precise threshold conditions for Category 1, 2A and 2B services. The Bill instructs the Secretary of State to make regulations specifying these conditions, outlining only that they will relate to the number of users and functionalities of a service and, for Categories 2A and 2B, any other factors the Secretary of State considers relevant. It is therefore unclear where the threshold will be set or which online platforms will fall into each category, although it appears likely that only the largest, high-reach platforms will meet the criteria for Category 1. This could damage the integrity of the new regulatory framework, as duties of care for adults’ online safety, such as providing additional reporting processes for content that is harmful to adults, will only apply to Category 1 services. Refuge is concerned that this could lead to a two-tier system of regulation, whereby users of smaller and medium-sized platforms may be exposed to higher levels of risk of harm. It is our experience that perpetrators of domestic abuse do not discriminate by the user base of a social media site or app when seeking to abuse a survivor.

Larger social media sites are commonly reported to our staff as platforms for abuse – indeed, Refuge research has revealed:

45% and 32% of women who experienced online intimate partner abuse reported this taking place on Facebook and Instagram respectively;[36]

An analysis of risk assessment comments in Refuge cases involving social media revealed “Facebook” to be the most commonly used term.[37]

However, perpetrators will use any means or platform available to contact and harass the survivor. For example, women supported by our frontline staff and tech abuse team have experienced abuse on platforms such as Muzmatch, which has around 500,000 users globally.[38] If the duties of care for adults’ online safety only apply to larger providers, perpetrators may learn to distinguish between platforms based on their categorisation in order to avoid increased scrutiny. All platforms, regardless of size, should have duties of care to protect their adult users from online VAWG in order to provide the greatest possible protection to survivors.

Refuge recommends the following:

  1. Are the distinctions between categories of services appropriate, and do they reliably reflect their ability to cause harm?

The distinctions between categories of service do not currently reflect their ability to cause harm. In particular, the number of users of a service has little influence on the platform’s ability to host abusive content that harms women and girls. Please see our response to the preceding question for further detail on Refuge’s views on the categorisation of services by size.

We would also support additional oversight, particularly at the design stage, of products with functionalities that particularly enable or encourage the dissemination of harmful content, such as those that encourage anonymous or untraceable communications. Some apps and platforms have in-built features, such as self-destructing content or the ability to send content randomly, which hamper evidence gathering when reporting content to the platform or police. Refuge’s tech team have seen numerous examples of women and girls experiencing abuse on these apps, including non-consensual intimate image sharing (so-called “revenge porn”), the unsolicited sharing of sexual images or “cyber-flashing,” and child sexual abuse and exploitation.

Further clarity should also be provided on the inclusion in, or exemption from, the scope of the Bill of certain functionalities, such as the sharing and receiving of voice messages on social media. Survivors supported by Refuge have experienced abuse through one-to-one voice messages and voice notes on Instagram and WhatsApp. Clarification is needed on whether these features, and indeed WhatsApp as a platform, would fall within the regulatory scheme. If such features and platforms are not included in the scope of the regulations, this could again lead to a two-tier system of regulation which would not reflect the reality of survivors’ experiences of tech abuse and online VAWG.

Refuge recommends the following:

Algorithms and user agency

  1. What role do algorithms currently play in influencing the presence of certain types of content online and how it is disseminated? What role might they play in reducing the presence of illegal and/or harmful content?

As referenced in section 5, the use of algorithms to reduce the presence of illegal and harmful content is problematic because many forms of tech abuse are highly contextual and difficult for algorithms to identify. We would urge caution in the use of algorithms, particularly in content moderation and reporting processes, and suggest that investment instead be focused on human moderation. For example, YouTube’s algorithms currently grant greater weighting to some user profiles where YouTube has agreed with the majority of their previous reports of inappropriate content, meaning future reports from those users are more likely to be accepted. We are wary of the adverse impacts of this system and of perpetrators using this weighting to their advantage, and argue that some form of human oversight is required.
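
To illustrate the kind of safeguard we have in mind, the sketch below models reputation-weighted report triage in which a low weighting can never cause a report to be discarded automatically, only routed to a human moderator. This is purely illustrative: the names, weights and thresholds are our own assumptions, not a description of YouTube’s or any other platform’s actual system.

```python
# Hypothetical illustration only: a simplified model of reputation-weighted
# report triage with a human-review safeguard. All names and thresholds are
# invented for explanation, not drawn from any platform's real system.

from dataclasses import dataclass

@dataclass
class Reporter:
    user_id: str
    reports_made: int    # total reports previously submitted by this user
    reports_upheld: int  # how many of those reports the platform agreed with

    @property
    def trust_weight(self) -> float:
        """Past agreement rate, used to weight new reports (0.0 to 1.0)."""
        if self.reports_made == 0:
            return 0.5  # neutral starting weight for new reporters
        return self.reports_upheld / self.reports_made

def triage_report(reporter: Reporter, fast_track_threshold: float = 0.9) -> str:
    """Route a new report. Crucially, a low weight never discards the report:
    it goes to the standard human-review queue rather than being auto-rejected,
    so survivors' reports are not silently filtered out."""
    if reporter.trust_weight >= fast_track_threshold:
        return "fast-track human review"  # still human-reviewed, just sooner
    return "standard human review queue"

print(triage_report(Reporter("user_a", reports_made=10, reports_upheld=9)))
```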

There may be some areas where algorithms and other technological advances could assist with reducing online VAWG. For example, the use of fake accounts by perpetrators is a common issue, as illustrated in the survivor story below. Women tell us they have been contacted by dozens or hundreds of fake accounts with fake user names, which they suspect to be the perpetrator. New accounts can be set up quickly with minimal effort, and survivors have been informed by some platforms that they are unable to take action against fake profiles where the perpetrator does not use his own name. There may be a role for software which can identify fake accounts created from the same IP address where one account is reported for abusive content; a possible sketch follows the survivor story below. Engagement with the specialist VAWG sector would be required to ensure such functionalities are designed to better support survivors and hold perpetrators to account. Overall, we suggest algorithms be used with caution.

“The people that he gets to stalk me - his friends. He’s using other people to put stuff on my Facebook. I reported all this (fake accounts) to Facebook. There’s nothing wrong they said. There’s no picture, there’s no friends on there (the fake account). It’s blatant someone’s done a dud profile.” Refuge client on her experience of online stalking and being harassed by fake accounts.
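
By way of illustration only, the sketch below shows the kind of shared-IP check described above: when one account is reported for abuse, other accounts created from the same IP address are surfaced as candidates for human review. The data structures and field names are assumptions for explanation; a real system would need to handle shared and masked IP addresses, which is why the sketch flags candidates rather than taking automatic action.

```python
# Hypothetical sketch: surface other accounts created from the same IP address
# as a reported account, as candidates for human moderator review (never
# automatic bans). The mapping below is illustrative data, not a real dataset.

from collections import defaultdict

# account_id -> IP address recorded at account creation (illustrative)
creation_ips = {
    "account_1": "203.0.113.7",
    "account_2": "203.0.113.7",
    "account_3": "198.51.100.2",
}

def accounts_sharing_ip(reported_account: str) -> list:
    """Return other accounts created from the reported account's IP."""
    by_ip = defaultdict(list)
    for account, ip in creation_ips.items():
        by_ip[ip].append(account)
    ip = creation_ips.get(reported_account)
    if ip is None:
        return []
    return [a for a in by_ip[ip] if a != reported_account]

print(accounts_sharing_ip("account_1"))  # ['account_2']
```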

Refuge recommends the following:

  1. Are there any foreseeable problems that could arise if service providers increased their use of algorithms to fulfil their safety duties? How might the draft Bill address them?

Refuge is concerned that the increased use of algorithms by providers may result in systems that are unable to understand the context-specificity of tech abuse and other forms of online VAWG. Assessing harms flowing from online VAWG requires a solid understanding of domestic abuse and other forms of violence against women and girls, as well as an ability to account for contextual and subjective elements of harm. Great sophistication would be required to build an algorithm capable of comprehending the subjectivity of online harms and the nature of domestic abuse and VAWG. If providers were to rely exclusively on algorithms to comply with the regulations, we fear this would maintain, or worsen, the current situation, whereby survivors’ reports of tech abuse are often rejected by platforms. Instead, tech companies should invest in human moderators and in comprehensive training in all forms of VAWG, so that staff are equipped to support users reporting online VAWG and able to effectively minimise its prevalence. Investment in human moderation in reporting processes could help deliver a more trauma-informed system and an improved user experience. Support for moderators, including clinical supervision, should also be provided to address vicarious trauma and other impacts of viewing harmful VAWG content. The increased use of algorithms cannot be the sole means by which providers fulfil their safety duties, and the Bill should recognise this. Investment, training and support for human moderators would be a preferable response from online platforms in carrying out their duties of care.

Refuge recommends the following:

The role of Ofcom

  1. Is Ofcom suitable for and capable of undertaking the role proposed for it in the draft Bill?
  2. Are Ofcom’s powers under the Bill proportionate, whilst remaining sufficient to allow it to carry out its regulatory role? Does Ofcom have sufficient resources to support these powers?

We have responded to both questions within this section.

The success of the regulatory framework depends on the regulator having the resources to enforce it effectively. Currently, the regulator’s enforcement powers and resourcing appear insufficient to enable it to carry out its functions. It is unclear what resourcing will be allocated to the regulator to staff its new role, which is of particular concern given the considerable existing regulatory functions of Ofcom, the proposed regulator, and the highly specialised knowledge that regulating online VAWG would require. In our response to the Online Harms White Paper consultation, Refuge expressed support for a new public body being established as the regulator, in order to guarantee that the regulator could develop expertise on the broad range of online harms covered by the duties of care and effectively hold tech companies to account.

The proposed enforcement powers in the draft Bill also require improvement if the regulator is to ensure that social media companies and other platforms pay regard to their duties and co-operate with the regulator. For example, criminal sanctions against senior executives are held in reserve in the draft Bill, and would only come into force through secondary legislation followed by a two-year wait. Sanctions also appear to apply only to non-compliance with information notices, rather than to all breaches of duties of care. If this is the case, companies could in effect “buy” their way out of regulation by paying financial penalties rather than facing criminal sanctions. The regulator should have powers to hold senior executives liable for breaches of duties from the commencement of the regulatory framework, as financial sanctions alone may not be effective given the vast resources of larger companies. In addition, the regulator should have the power to issue take-down notices. Survivor engagement research on image-based sexual abuse has found that for many victims their first imperative is for the material to be taken down and removed from the internet.[39]

There are a number of additional measures that should be introduced in the Bill, or by other means, to ensure the regulator is capable of undertaking its role:

Refuge recommends the following:

  1. How will Ofcom interact with the police in relation to illegal content, and do the police have the necessary resources (including knowledge and skills) for enforcement online?

The regulator and regulated services should work closely with the police and share information and data to assist with criminal investigations. A major barrier to reporting tech abuse is the difficulty of obtaining evidence from online platforms. Some women supported by Refuge who have made requests to platforms for evidence have been informed that the police must lodge a formal request for evidence directly with the company. The regulator could work with tech companies and the police to develop clear guidelines on how companies should co-operate with police and survivors wishing to report crimes committed on their platforms. An online VAWG code of practice should also set out clear requirements for platforms to provide law enforcement with the data they require to investigate and prosecute tech abuse offences.

From our experience of supporting thousands of women who have experienced tech abuse, it is apparent that the police do not have the resources to adequately investigate and enforce online VAWG crimes. Nor do they have a sufficient understanding of tech abuse, and as a result police officers can minimise the seriousness of VAWG perpetrated online. We recommend that training be rolled out to the police on the severe impact of tech abuse as a form of violence against women and girls, to ensure online offending is not treated as less serious than, or unconnected to, offline crime. The police must have sufficient resources, and access to appropriate technology, to promptly investigate VAWG crimes committed online. The government should also, wherever possible, facilitate effective working relationships between the police and specialist VAWG support services working with survivors.

Refuge recommends the following: 

  1. How much influence will a) Parliament and b) The Secretary of State have on Ofcom, and is this appropriate?

Parliament should be able to instruct the regulator on particular online harms for which a code of practice should be developed, although the regulator should also be able to draw up additional codes of practice of its own volition. Refuge strongly recommends that Parliament instruct the regulator to develop a code of practice on online VAWG, and that the code of practice be developed in consultation with specialist organisations.

  1. Does the draft Bill make appropriate provisions for the relationship between Ofcom and Parliament? Is the status given to the Codes of Practice and minimum standards required under the draft Bill appropriate, and are the provisions for scrutiny of these appropriate?

We recommend that VAWG specialist organisations be included in the list of organisations the regulator must consult before producing its codes of practice. Once codes have been drafted, there should also be further opportunities for scrutiny as lessons are learned. For example, key recommendations from Domestic Homicide Reviews should feed into the development of codes of practice.

  1. Are the media literacy duties given to Ofcom in the draft Bill sufficient?

Further consideration is required of the media literacy duties placed upon the regulator. These duties should be strengthened so that online VAWG is a central focus of literacy and awareness-raising activities. For example, the duty to promote media literacy among the public includes carrying out, commissioning or encouraging educational initiatives. These campaigns must reflect the harms and prevalence of tech abuse and other forms of online VAWG. Their focus should be on communicating that causing harm online is not acceptable in our society, and that in many cases such behaviour amounts to a criminal offence. It is also important for the impact of online VAWG on survivors to be incorporated into media literacy initiatives; this aligns with section 103(2)(b) of the draft Bill, which defines media literacy as including an awareness of the impact that online material may have, for example on the behaviour of those who receive it. Additionally, awareness campaigns should encourage members of the public to report harmful online content or behaviour when they come across it, and to take active steps rather than be bystanders to online harms. Such campaigns should always be created in consultation with the specialist VAWG sector, with care taken to avoid inadvertently educating perpetrators on new methods of abusing women and girls. Refuge would also suggest the government play a role in increasing the public’s understanding of online safety and tech abuse by funding and launching public awareness-raising campaigns, in partnership with the VAWG specialist sector.

The regulator will need to play a role in increasing awareness of the duties of care when they come into force. We would recommend the regulator seek to promote its role, the responsibilities of technology companies, and the rights people have when using online forums and spaces. In particular, the public should be informed of what they can expect platforms to do to prevent and respond to online harms, and what users should do if platforms do not meet these standards. The regulator should also provide guidance to regulated services on the ways in which platforms and devices can be used to perpetrate VAWG online, and on the impact of this harm.

Refuge recommends the following:

Conclusion

The Online Safety Bill is a major opportunity to transform the UK’s response to online violence against women and girls, and one which must be maximised. Online VAWG takes and devastates lives, often having long-term effects on mental and financial health, physical safety and social connections, as well as shutting growing numbers of women and girls out of online life. As evidenced throughout this submission, social media giants and the criminal justice system are falling far short in their efforts to protect women and girls online. Advice given to survivors to come offline or to block the perpetrator, both of which may elevate the risk of harm, illustrates the widespread lack of understanding of tech abuse and online VAWG. Without proactive legislative action to address the significant shortcomings in the current response to tech abuse and online VAWG, these harms are likely only to increase. Research has shown that young women are more likely to report experiencing online abuse, suggesting online VAWG is becoming more common among young people.[40] The affordability and availability of online platforms and other tech devices, and our increasing dependence on the internet, should compel us to act now to tackle this growing threat and better safeguard women and children today, and for future generations.

To strengthen the Bill, Refuge strongly advocates that the Joint Committee on the Draft Online Safety Bill recommend that:

21 September 2021


[1] This is also known as so-called “revenge porn” but Refuge does not endorse the use of this term as this implies some wrongdoing on the survivor’s part. It also implies the sharing of such images or videos has been done for the purposes of sexual gratification, when frequently this is not the case and the purpose is to coerce, control and humiliate the survivor.

[2] Analysis of issues reported to Refuge’s tech team, statistics for January 2020 – March 2021.

[3] Research was carried out by Opinium between 24th and 27th August 2021. The sample consisted of 2,264 UK adults aged 18+ and was weighted to nationally representative criteria.

[4] Statistics for July 2020 – March 2021.

[5] Opinium survey for Refuge.

[6] HM Government (2021), Tackling Violence Against Women and Girls, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1005630/Tackling_Violence_Against_Women_and_Girls_Strategy-July_2021-FINAL.pdf

[7] G7 United Kingdom (2021), ‘G7 Internet Safety Principles,’ https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/986161/Annex_3__Internet_Safety_Principles.pdf

[8] G7 United Kingdom (2021), ‘G7 Interior and Security Ministers: Ministerial Commitments, Annex 2: Protecting against online exploitation, violence and abuse,’ https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1016393/G7_London_Interior_Commitments__Annex_2_-_Protecting_against_Online_Exploitation__Violence_and_Abuse__PDF__192KB__4_pages_.pdf

[9] ONS (2020), ‘Domestic abuse prevalence and trends, England and Wales: year ending March 2020,’ https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/domesticabuseprevalenceandtrendsenglandandwales/yearendingmarch2020

[10] HMIC (2014), ‘Everyone’s business: Improving the police response to domestic abuse,’ https://www.justiceinspectorates.gov.uk/hmicfrs/wp-content/uploads/2014/04/improving-the-police-response-to-domestic-abuse.pdf

[11] Glitch UK and End Violence Against Women Coalition (2020), ‘The Ripple Effect: COVID-19 and the Epidemic of Online Abuse,’ https://www.endviolenceagainstwomen.org.uk/wp-content/uploads/Glitch-and-EVAW-The-Ripple-Effect-Online-abuse-during-COVID-19-Sept-2020.pdf  

[12] Girlguiding (2021), ‘Girls’ Attitudes Survey 2021: A snapshot of girls’ and young women’s lives,’ https://www.girlguiding.org.uk/globalassets/docs-and-resources/research-and-campaigns/girls-attitudes-survey-2021-report.pdf

[13] Ofcom (2020), ‘Online Nation 2021 report,’ https://www.ofcom.org.uk/about-ofcom/latest/media/media-releases/2020/uk-internet-use-surges   

[14] SafeLives (2014), ‘In plain sight: The evidence from children exposed to domestic abuse,’ http://www.safelives.org.uk/sites/default/files/resources/In_plain_sight_the_evidence_from_children_exposed_to_domestic_abuse.pdf

[15] ONS (2020), ‘Domestic abuse prevalence and trends, England and Wales: year ending March 2020.’ 

[16] CPS (2020), ‘CPS data summary Quarter 4 2019-2020,’ https://www.cps.gov.uk/publication/cps-data-summary-quarter-4-2019-2020 

[17] ONS (2020), ‘Domestic abuse victim characteristics, England and Wales: year ending March 2020,’ https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/domesticabusevictimcharacteristicsenglandandwales/yearendingmarch2020#sex

[18] European Women’s Lobby (2018), ‘Her Net Her Rights: Mapping the state of online violence against women and girls in Europe,’ https://www.womenlobby.org/IMG/pdf/hernetherrights_report_2017_for_web.pdf

[19] Refuge (2020), ‘The Naked Threat,’ https://www.refuge.org.uk/wp-content/uploads/2020/07/The-Naked-Threat-Report.pdf

[20] Glitch and EVAW (2020), ‘The Ripple Effect: COVID-19 and the Epidemic of Online Abuse.’

[21] Opinium survey for Refuge.

[22] Girlguiding (2021), ‘Girls’ Attitudes Survey 2021.’

[23] Calculated using CPS (2020), ‘CPS data summary Quarter 4 2019-2020,’ https://www.cps.gov.uk/publication/cps-data-summary-quarter-4-2019-2020 and the data published alongside the CPS VAWG Report 2018-19, available for download here: https://www.cps.gov.uk/cps/news/annual-violence-against-women-and-girls-report-published-0

[24] ONS (2020), ‘Domestic abuse prevalence and trends, England and Wales: year ending March 2020.’

[25] ONS (2020), ‘Domestic abuse and the criminal justice system, England and Wales: November 2020,’ https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/domesticabuseandthecriminaljusticesystemenglandandwales/november2020

[26] ONS (2020), ‘Homicide in England and Wales: year ending March 2019,’ https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/homicideinenglandandwales/latest#how-were-victims-and-suspects-related

[27] Law Commission (2021), ‘Modernising Communications Offences,’ https://www.lawcom.gov.uk/project/reform-of-the-communications-offences/  

[28] F. Vera-Gray, C. McGlynn, I. Kureshi and K. Butterby (2021), ‘Sexual violence as a sexual script in mainstream online pornography,’ The British Journal of Criminology, https://doi.org/10.1093/bjc/azab035

[29] McGlynn, Clare & Rackley, Erika (2017), ‘Image-Based Sexual Abuse,’ Oxford Journal of Legal Studies 37(3): 534-561

[30] Joint VAWG Sector Principles for the Online Safety Bill Submission

[31] Refuge (2019), ‘Domestic abuse and suicide: Exploring the links with Refuge’s client base and work force,’ https://www.refuge.org.uk/wp-content/uploads/2018/07/domestic-abuse-suicide-refuge-warwick-july2018.pdf 

[32] Chandan et al (2019), ‘Female survivors of intimate partner violence and risk of depression, anxiety, and serious mental illness,’ The British Journal of Psychiatry: https://www.cambridge.org/core/services/aop-cambridge-core/content/view/B33176643C1858B2D502E584D160F794/S0007125019001247a.pdf/female_survivors_of_intimate_partner_violence_and_risk_of_depression_anxiety_and_serious_mental_illness.pdf  

[33] St Mungo’s (2014), ‘Rebuilding Shattered Lives,’ https://www.mungos.org/publication/rebuilding-shattered-lives-final-report/  

[34] Refuge and the Co-operative Bank (2020), ‘Know Economic Abuse,’ https://www.refuge.org.uk/wp-content/uploads/2020/10/Know-Economic-Abuse-Report-2020.pdf

[35] Ibid.

[36] Opinium survey for Refuge.

[37] Analysis of risk assessment comments from Refuge client cases involving social media, statistics for January 2020 to May 2021.

[38] Muzmatch (2020), “Muzmatch Reaches 500,000 Users & Continues Growing,” https://muzmatch.com/en-GB/blog/dating/muzmatch-reaches-500000-users-continues-growing

[39] McGlynn, Clare and Rackley, Erika and Johnson, Kelly and Henry, Nicola and Flynn, Asher and Powell, Anastasia and Gavey, Nicola and Scott, Adrian (2019) 'Shattering lives and myths: a report on image-based sexual abuse,' Project Report. Durham University; University of Kent.

[40] Opinium survey for Refuge.