Written evidence submitted by The Office of the Children’s Commissioner (OSB019)

 

Children’s Commissioner’s interim findings on online peer-on-peer abuse and inappropriate content online

Introduction from the Children’s Commissioner, Dame Rachel de Souza

I began my term as Children’s Commissioner in March this year, shortly before the Everyone’s Invited website went live. Since then, over 50,000 stories have been shared on the website by brave young people, mostly girls, describing their experiences of sexual harassment and abuse – often perpetrated by their peers.

These stories are truly shocking. They show that peer-on-peer abuse is so common that young people simply accept it as part and parcel of daily life. Too often, sexual harassment is not challenged or taken seriously by the adults in children’s lives.

Much of the Everyone’s Invited testimony refers to behaviour taking place in schools (both state and independent). As a former school leader myself, I know that schools certainly need to do more to tackle it, along with their safeguarding partners and the wider education system. I welcomed the opportunity to sit on the reference group to Ofsted’s review of sexual abuse in schools and colleges, in which I pushed for better support for victims of peer-on-peer abuse in schools, along with greater oversight and clarity by Ofsted and Government in relation to this issue.

But this is a complex, deeply rooted, and ultimately societal problem. It is driven by harmful attitudes about sex, relationships, and gender, often held by adults as well as children. Schools alone cannot stamp out this behaviour – all of us have a role to play.

I am particularly concerned by the fact that much of this sexual harassment and abuse happens online. In some cases, online platforms directly facilitate abuse, such as when nude images are shared on social media or by private message without a young person’s consent. But online channels also contribute to the underlying beliefs and attitudes which drive abuse, most notably through exposing children to inappropriate content, including (but not limited to) pornography, much of which is extreme, violent, and degrading to women. As Children’s Commissioner, I make no judgement about what adults look at in their private lives. But I am clear about the need to reduce children’s access to content that can cause them harm.

Having shared my concerns with ministers across Government, I welcomed a commission from the Secretaries of State for Education and for Digital, Culture, Media and Sport to look into this very issue (see Annex 1). Their letter recognises the Children’s Commissioner’s “unique role in representing the rights, views and interests of children” and asks me to work with them to identify “whether there are actions which we can take now to further protect children before the Online Safety Bill comes into effect.” I was asked to provide specific solutions on the role of parents, carers, and wider family, as well as to ensure the voices of children are heard as the Online Safety Bill undergoes pre-legislative scrutiny.

This report summarises the work my Office has undertaken so far in response to this commission. It sets out my early findings, proposed solutions, and next steps. I am grateful for the support of everyone who has engaged with my team and me as we have developed our thinking, including Government, industry, schools, parents, and charities, and I look forward to continuing these conversations as this work progresses.

Dame Rachel de Souza

Children’s Commissioner

 

 

 

Work undertaken so far

This report is informed by the following work, undertaken by the Children’s Commissioner and her team since receiving the commission from Government in May 2021:

 


Key findings

  1. Children are routinely exposed to a wide variety of inappropriate content online, which contributes to the problem of peer-on-peer harassment and abuse. This includes pornography, but also heavily filtered images and other extreme body image content, content showing physical violence and gore, and content relating to self-harm, eating disorders, and suicide.
  2. Research by Revealing Reality for the BBFC has found that over half of 11-13 year olds have seen pornography at some point. This rises to two-thirds of 14-15 year olds and four in five 16-17 year olds. Some children see pornography at primary school age, often accidentally (e.g. through an advert or pop-up).[2]
  3. Online pornography often features violence and degradation. One study estimated that 41% of professional videos depicted violence against women, and a concerning minority featured non-consensual sex.[3] Parents are often unaware that this kind of content can be found so quickly and easily on the internet, and can be under the false impression that online pornography is similar in tone to a top-shelf magazine.
  4. Unsurprisingly, research shows that exposure to pornography shapes children’s attitudes and beliefs about sex and relationships. Research by Middlesex University for the CCO and NSPCC found that over half of boys, and 4 in 10 girls, think that porn is realistic.[4] Furthermore, numerous longitudinal studies have concluded that viewing pornography (especially violent forms) in adolescence increases the risk of a young person behaving in a sexually aggressive or coercive way.[5] A Government-commissioned review has found an association between porn usage and harmful attitudes and behaviours towards women and girls.[6]
  5. Responses to our Big Ask survey show that children themselves worry about the impact of online pornography, yet feel pressure to view it. For example, one boy told us:

“I was pressured into [] watching horrific pornography that effects how young boys behave towards and think they can treat women. As a boy myself, I was unable to understand the everyday struggle of the girls in my class, then one day I did. I was ostracised for not cat calling girls in the class, watching pornography or sexually assaulting any girls.” Boy, 16

Another child told us:

“I came across pornography on the internet and I think it turned my life upside down - it still affects my mental health.” Boy, 15

Children also told us that they use pornography to learn about relationships and sex, as they do not receive the right kind of information about this from their parents and teachers.

They also told us how social media affects their wellbeing and body image:

“Social media because it surrounds us it constantly […] it puts everyone down with fake information and hate and even I have it at 13 and I’m always on it.” Girl, 13

Another child told us:

“Social media as it pressures us all to be perfect; have a perfect body, perfect face, perfect health and perfect life in general.” Girl, 15

Children also told us how they wanted more protection online:

“I don’t feel I was informed of my online safety from a young age and that it was considered a priority. Technology and social media are constantly developing so why aren’t our laws and protections for children on these platforms updating with it?” Girl, 14

  6. In addition to consuming pornography, children are put at risk when nude images or videos of them are shared, sometimes without their consent. This material is child abuse imagery, and while it is important that children sharing it are not criminalised, it needs to be addressed. Girls spoken to by inspectors for Ofsted’s review reported that they can be contacted by up to 10 or 11 different boys a night to be asked for nude or semi-nude images.[7]
  7. Children’s viewing and sharing of pornography (and nude images/videos of themselves and their peers) is facilitated by a wide variety of online platforms. Most obviously, children can access this content via dedicated adult sites, often found using a search engine. They can also access it via social media, messaging apps, and online games. App store platforms play an indirect role in enabling children to download these apps.

It is important to note that most (but not all) of the services described above have terms of use that either prohibit children from accessing the service at all (as in the case of adult sites), or that set a minimum age (as in the case of most social media and messaging apps). Furthermore, posting or sharing pornography and other forms of inappropriate content is forbidden on some of these services.

  8. Even though most companies specify a minimum age in their terms of service, not enough is done to enforce these limits properly. In some cases, users are not required to create an account and sign in at all. In other cases, they must create an account, but their age is not properly checked when they do so. Most of these platforms use an age gate that simply requires users to submit their date of birth – a method that children easily and routinely circumvent by lying about their age. For example, the minimum age on most social media platforms is 13, yet research by Ofcom shows that 42% of children aged between 5 and 12 use social media.[8] Previous CCO research has found that nine in ten 12 year olds and six in ten 8 year olds say that they have used a messaging app or site with a minimum age of at least 13.[9] It is impossible to protect children online if we do not know who the children are.
  9. All of the online services we engaged with recognised the need for the industry as a whole to do more to protect children from seeing inappropriate content online, and were willing to speak to us about this issue.
  10. It is welcome that there is increasing regulation of online services, including through the ICO’s Age Appropriate Design Code and Ofcom’s regulation of Video Sharing Platforms (VSPs). Notably, the Online Safety Bill is soon to undergo pre-legislative scrutiny and will represent a significant step forward in protecting children online by putting a duty of care on many online services. However, these regulations could be enhanced to provide even greater protection for children, especially in relation to age verification and age assurance. Furthermore, the provisions set out in the Online Safety Bill are unlikely to come into effect until 2023-24 at the earliest.

 

The Children’s Commissioner’s proposals

Our work so far indicates that there is an urgent need to tackle children’s experiences of online peer-on-peer abuse and their exposure to inappropriate content, including pornography. As recognised in the commission we received from Government, urgent action is needed now, ahead of the Online Safety Bill being passed. In the coming weeks and months, the Commissioner will be talking to Government, industry, civil society, and other stakeholders about the following possible solutions:

  1. Improving the use of age verification and age assurance technology by online services.

Age verification and age assurance are not silver bullets that will prevent all children from seeing any pornography. The same is true of systems designed to prevent children from accessing harmful products in the offline world, such as alcohol and cigarettes. However, better use of age verification and age assurance online would make it significantly more difficult for children to access pornography, and reduce the likelihood of children stumbling across it accidentally.

Although there can be no one-size-fits-all approach to age verification and age assurance, there are certain broad principles that these systems should meet. For example, systems should be:

Adult sites and tech companies that take their responsibilities towards children seriously are already looking at ways they can improve their systems, and should be supported to do so. However, if progress is not made fast enough, the Government should consider stepping in and legislating ahead of the Online Safety Bill. This could happen through the following routes:

 

  2. Guidance for parents on supporting children with their online lives and relationships.

There is already a wealth of high-quality information available to parents about keeping their children safe and well online, provided by organisations such as Internet Matters, ParentZone, the NSPCC, and more. However, some of the most brilliant insights for parents are likely to come from young people themselves. The CCO is going to work with young people aged 16-21 to produce advice on what they would have liked their parents to have known about their online lives as they grew up. The themes covered by the guidance will be identified through consultation with young people, parents and charities, but will likely include healthy relationships, positive self-image, pornography, and cultures promoting sexual harassment among teens.

As part of this work, the team will be conducting focus groups and workshops with parents and teenagers. We have also written to leaders of charities with expertise across the digital, parenting, and education sectors to invite them to join a steering group that will help direct and sense-check our guidance.

We intend to publish our guidance before the end of the year, and will be putting together a strong communications campaign to maximise uptake among parents.

  3. A long-term funding agreement for the UK Safer Internet Centre.

The UK Safer Internet Centre is a partnership of three leading children’s internet safety charities – the Internet Watch Foundation, Childnet, and SWGfL. Together they provide education, advice, and support for parents, professionals, and children, including those who have been victims of online abuse. This includes a service that removes large volumes of child sexual abuse material from the internet, including material shared by children and young people themselves. The UK Safer Internet Centre receives 50% of its annual funding, £1.3 million, from the EU. This funding will be lost unless the Government secures it at the forthcoming Spending Review, at a cost of just 10p per year for each child under the age of 15.

The Online Safety Bill is a once-in-a-lifetime opportunity to reset the relationship between online services and their youngest users. The intention and spirit of the legislation is just as it should be – to place a duty of care on online platforms to keep vulnerable users safe and well, especially children. It is now time to make sure that the new regulatory regime will be as effective as it can possibly be. To that end, the Children’s Commissioner proposes the following measures to enhance the provisions already set out in the Bill:

  1. Age verification and age assurance

It is not yet clear whether the legislation will require platforms to employ age verification and/or age assurance systems. The Bill should give Ofcom the power to direct companies to use proportionate age verification/assurance technologies on platforms that pose a risk to children.

 

  2. Access to commercial pornography

Under the Government’s current approach, websites that do not host user content or enable users to interact appear to be out of scope of the duty of care. This means that some commercial pornography sites are currently out of scope. The Bill must ensure that all commercial pornography sites are captured and subject to the duty of care. The Secretary of State has indicated that the Government is actively looking at this, which is welcome.

 

  3. A dedicated complaints route for children

Children tell us that they want immediate support when things go wrong online. But far too often nothing happens when they report incidents of abuse and harmful content directly to platforms. 

Children are the largest vulnerable user group on the internet, and it is right that they should have direct access to share concerns and complaints with the regulator.

The Bill should make provision for Ofcom to establish a child-facing aspect of its regulatory function, to address specific complaints made by children and to gather evidence of new and emerging risks.

 

  4. Private messaging

We strongly welcomed the Government’s decision to include private messaging platforms within the scope of the duty of care. We know that direct messaging channels convey a high proportion of online child sexual exploitation and abuse – including abuse by peers.[10]

However, under the current plans, Ofcom’s power to direct companies to use technology to identify child abuse is not as strong as it could be. The Bill should grant Ofcom the power to direct companies to use accurate software to scan for child abuse and grooming.

 

  5. Cross-platform abuse pathways

It is rare for peer-on-peer sexual abuse to be committed on a single platform. It is common for a young person to be coerced into sharing a nude image on one messaging platform, from which it is downloaded and shared rapidly across multiple social media networks, messaging, and video sharing sites. The content spirals immediately beyond the victim’s, perpetrator’s, and original platform’s control.

The Bill should address the cross-platform nature of peer-on-peer sexual abuse by mandating a centralised, industry-wide approach to taking down child abuse content, and by placing a duty on platforms to share information about known offences.

 

  6. Enforcement of the duty of care

We welcome the range of sanctions that will be available to Ofcom, including fines of up to £18 million or 10% of annual global turnover, disruption of business activities, and ISP-blocking. However, criminal sanctions against senior managers will be available only when platforms fail to share information with Ofcom.

The Bill should give Ofcom power to impose criminal sanctions on senior managers in response to any serious breach of the duty of care, not just when platforms fail to share information.

It is crucial that children are fully informed when platforms have breached their duty of care, and that they understand the actions being taken to ensure their safety online. The Bill should require companies that breach their duty of care to communicate the breach, and the actions they are taking in response, in child-friendly language.

 

Next steps

New Government regulation of online platforms, including the forthcoming Online Safety Bill, represents a major step forward in keeping children safe from peer-on-peer abuse online. We have been delighted to begin conversations with Government (and other stakeholders) about how regulation can be made as effective as possible, and to identify solutions which protect children before it comes into effect.

Our work so far shows that there is still more to be done to enhance the efforts of Government, industry, civil society and others to protect children from online peer-on-peer abuse, and reduce access to inappropriate content. In the coming months, the Children’s Commissioner and her team will be undertaking further work including:

Tackling online peer-on-peer abuse will remain a key priority for the Office, and we will continue to work flexibly and collaboratively to identify solutions that can be acted upon.

 


Annex 1: Our commission from Government

24 May 2021

Dear Rachel,

There has rightly been much attention drawn over the last few weeks to the sexual abuse and harassment of children and young adults and in particular to the testimonies that have been posted on the Everyone’s Invited website. The government and our departments are committed to tackling this problem robustly wherever it arises.

The Government has already established a helpline in partnership with the NSPCC and commissioned Ofsted to conduct a review into sexual abuse in schools. We are pleased that you are a member of the Reference Group and look forward to the report due by the end of May. On 19 April, the Office for Students published a statement of expectations setting out the processes, policies and systems universities and colleges should have in place to prevent and respond to harassment and sexual misconduct. We are also considering what else may be needed in this space.

However, we recognise that schools, colleges and universities alone cannot solve this problem. There are wider societal and cultural issues at play. In particular, a large amount of pornography is available on the internet with little or no protection to ensure that those accessing it are old enough to do so. This, in turn, is changing the way young people understand healthy relationships, sex and consent. While being online can be a hugely positive experience for children and young people, there is growing concern about the impact of exposure to harmful or inappropriate content, such as pornography.

To tackle this problem, the government has an ambitious plan to make the UK the safest place to be a child online. A key part of this is a ground-breaking new system of accountability and oversight of tech companies that will be enshrined in law through the Online Safety Bill, which the government published this week.

The strongest protections in the new regulatory framework will be for children. Companies will need to prevent children from accessing inappropriate content, such as pornography, and minimise children’s exposure to behaviour such as bullying.

They will also need to support children to have a safe, age-appropriate experience on services that are designed for them. Companies that fail to protect children will face huge fines or, in the most egregious cases, be blocked in the UK.

Beyond regulation, we also know that users want to be empowered to manage their online safety, and that of their children, but there is insufficient support in place and they currently feel vulnerable online. The Government will be publishing the Media Literacy Strategy later this year. This will be an important vehicle to set a clear direction for our joint efforts to educate and empower internet users, including parents and children, to make more informed and safer choices online. We would be keen for your involvement to help steer this work, alongside stakeholders from across the media literacy landscape. This is a complex and multifaceted problem, with no simple solution.

We are clear that tech companies must not wait for regulation to protect children from harmful content. The government is already undertaking a range of initiatives to support companies to take action now to keep children safe online.

Given the Children’s Commissioner’s unique role in representing the rights, views and interests of children, we agreed we would work with you to get your input on whether there are actions we can take now to further protect children before the Online Safety Bill comes into effect. This should build on the work that Government has set out through the Online Safety Bill. We would also welcome your views on how to work with schools, parents and charities to support them around building strong social norms against underage access to pornography, around children using the internet safely and educating those groups on the impact that some internet content can have on healthy sex and relationships. The role of parents, carers and wider family is crucial in this and we would be grateful if you could advise on specific solutions to address this.

To begin this work we would like to invite you to a roundtable with tech companies, civil society, law enforcement and schools to discuss what more can be done ahead of legislation. As the Online Safety Bill undergoes pre-legislative scrutiny in Parliament, we would also welcome your support to ensure the voices of children are heard in that debate.

We look forward to discussing soon the proposed scope, process and timelines of this work, and working with you on this important challenge.

 

Rt Hon Gavin Williamson CBE MP, Secretary of State for Education

Rt Hon Oliver Dowden CBE MP, Secretary of State for Digital, Culture, Media and Sport

September 2021

 

 


 


[1] Recommendations of the Gender Equality Advisory Council 2021 to the Leaders of the G7, G7 Gender Equality Advisory Council, June 2021, link.

[2] Young people, Pornography & Age-verification, British Board of Film Classification, January 2020, link.

[3] What is the impact of pornography on young people? A research briefing for educators, PSHE Association, accessed 3 September 2021, link.

[4] “…I wasn’t sure it was normal to watch it…”, Middlesex University, June 2016, link.

[5] What is the impact of pornography on young people? A research briefing for educators, PSHE Association, accessed 3 September 2021, link.

[6] The relationship between pornography use and harmful sexual attitudes and behaviours, Government Equalities Office, February 2020, link.

[7] Review of sexual abuse in schools and colleges, Ofsted, 10 June 2021, link.

[8] Children and parents: media use and attitudes report, Ofcom, 28 April 2021, link.

[9] Access denied, Children’s Commissioner’s Office, December 2020, link.

[10] Access denied, Children’s Commissioner’s Office, December 2020, link.