Written evidence submitted by Catch22 (COR0152)

 

  1. About Catch22

 

Catch22 is a charity and social business, working to deliver and reform public services. For over 200 years we have worked across children’s social care and youth justice. Our services work to reduce violence, to support children at risk of exploitation, and to support those already affected and their families.  

 

Since 2019 we have co-delivered The Social Switch Project, equipping frontline professionals and youth workers with the confidence and knowledge to address the issues young people face online. The project builds on our research in this space, including Social Media as a Catalyst and Trigger to Youth Violence.  

 

Across all our services, the issue of online harms is growing rapidly, and during the Covid-19 period young people rely on the internet for both social and educational purposes. Before the start of the crisis, knife crime and serious violence were on the rise: according to ONS data there were 47,500 offences involving a knife in 2019, an 82% increase on 2014. Social media is a recognised catalyst for youth violence, and with young people confined to their homes and screens, this risk is growing, with potential repercussions as lockdown measures are eased. Catch22’s frontline practitioners are acutely aware of the growing risk to young people confined at home without the supervision they would usually have. 

 

1.1.  In April 2020, Catch22 hosted a panel discussion, Online Harms: Education or Regulation?, with representatives from Ofcom, Google and techUK, as well as youth workers and researchers. It was clear from that discussion that: 

 

1.1.1.   The large social media platforms are in the spotlight over addressing online harms, but there are many other players, including the organisations building the underlying technology, the devices, and the connectivity.

 

1.1.2.   There are huge gaps in education for parents, teachers and frontline professionals nationwide.  

 

1.1.3.   Good programmes exist and are in demand, but there needs to be a focus on expanding them effectively. The Social Switch Project is a clear example of success and needs to be expanded: to support more professionals, to give more young people digital opportunities, and to bring employers on board to offer young people work experience.  

 

1.1.4.   Online harms regulation must offer direct, constructive support to platforms.  

 

 

 

  2. What harms are we seeing? 

 

Catch22 has identified major issues across our services in three key areas related to online harms, and we see an urgent need for national programmes to address them: 

 

2.1.     Serious violence and the incitement of group aggression  

2.1.1.  Photos and videos of violence often show a group of young people victimising a young person associated with another group. This content has triggered reprisals, with Facebook, Instagram, Twitter and Snapchat all used as platforms.  

2.1.2.  Young children are given phones with no guidance or monitoring. 

2.1.3.  Witnesses, often to gang activity, are getting younger; in one Merseyside case, a 7-year-old witnessed harmful content shared by his brother. 

2.1.4.  Through Catch22’s Online Harms Consultation (conducted in May 2020), we have heard of graphic violent content seen online, from stabbings to a beheading.   

2.1.5.  Parents and carers have overheard provocative conversations during their child’s interactions on gaming platforms. 

2.1.6.  At-risk young people have reported an increase in provocative language online throughout the lockdown, and they expect violent reprisals may increase once it is lifted.

 

2.2.  Child sexual exploitation

2.2.1.   Young people are befriending strangers through apps such as Snapchat and forming relationships in which they are far more open about themselves than they would be in person. 

2.2.2.   Young teenagers share explicit videos and pictures due to peer pressure or because they believe they are in a relationship. We have worked with a 12-year-old who was groomed on Instagram, sent images to the offender and ran away from home. We know of similar cases that have resulted in suicide.  

2.2.3.   Our victim services have seen online conversations move into child sexual exploitation, deeply impacting victims’ mental health. 

2.2.4.   Children as young as 10 are sent explicit images (such as of genitals) over Instagram and manipulated to the point of sending explicit images of themselves.  

2.2.5.   Grooming through online games and their often unmoderated chat rooms.   

2.2.6.   Talk of self-harm/suicide on online forums. 

 

2.3.     Poor preventative education for parents and guardians 

2.3.1.  Drill tracks that provoke other groups have acted as a catalyst and trigger for violent reprisals, yet parents are often either unaware of what their children watch, or opposed wholesale to content which is not necessarily harmful.  

2.3.2.  Poor use of privacy settings and a lack of parental monitoring or education on how to monitor digitally.

2.3.3.  Guardians are unaware of the extent to which young people are speaking to strangers. Since the lockdown began, we have had numerous reports of grooming continuing and escalating to sexual contact. 

 

 

 

Case study: 

2.3.4.  Catch22 has been supporting a 14-year-old girl and her family following an incident in May 2020. As a fun activity during the Covid-19 lockdown, the girl’s parents allowed her to stay overnight in the family campervan in their private garden. The mother stayed with the child until approximately 2am. 

 

The girl had been talking to an adult male through a social media platform for some weeks. Once the mother returned indoors, and without her knowledge, the male joined the girl. He sexually assaulted her in the campervan before leaving a short time later. When the mother returned just four hours later, her daughter told her what had occurred. The girl and her family are receiving our support as they deal with the aftermath.

 

  3. What are we doing to intervene?  

3.1.     Catch22 involves victims, experts, and those at risk of harm in the design and continuous improvement of our services. We draw on our experience across a range of services, including child sexual and criminal exploitation, domestic abuse, youth violence and gangs. We use partnerships and stakeholder engagement to upskill and empower others in our communities to increase awareness of our services for potential or ‘hidden’ victims. 

 

3.2.     Catch22 is working to engage with technology and social media platforms to understand their safeguarding policies, and we regularly engage with researchers who are doing the same. However, while the platforms may be understanding and sympathetic, we have yet to see significant policy improvements.  

 

3.3.     Catch22 has thorough safeguarding procedures and policies in place, for example:  

 

3.3.1.   NCS Liverpool has one designated social media lead, in order to limit the investigation period should there ever be an allegation relating to online activity. While activities or challenges are being led through social media, multiple staff monitor live conversations and are ready to block any comments that could put our young people at risk. Safeguarding guidance and training for NCS online activity have not yet been issued by NCS Trust; however, we understand only one platform will be used.

 

3.3.2.   In our schools, disclosures of inappropriate or harmful activity online are reported to the safeguarding team and then usual safeguarding procedures are followed.  

 

3.3.3.   Catch22’s Stoke-on-Trent and Staffordshire CSE and Missing Service supports young people to understand how exploitation occurs. We work to increase a child's resilience by boosting their awareness, their self-esteem and by offering alternative choices and diversionary activities. When necessary, safeguarding referrals are made. 

 

3.3.4.   Merseyside Catch22 CSE Service works with children and their families, and visits schools and other youth services to educate both young people and teachers on what exploitation is and what sort of behaviour leads to exploitation. The entire service runs on the basis of a multi-agency approach.  

  4. The Social Switch Project  

 

4.1.1.  “I think it is difficult with the ever-changing apps that are being used. It's about us learning and understanding the platforms used by young people in order for us to talk to them about it appropriately.”  Liverpool Youth Worker responding to the Catch22 Online Harms Consultation (May 2020)

 

4.1.2.  The Social Switch Project delivers training to frontline professionals across London, including teachers, social workers, and youth workers, to enable open discussions about the content young people are engaging with online. It covers the basics of how various apps operate, as well as safeguarding concerns. As of May 2020, 375 London professionals have completed the training, with 94% of participants stating they would highly recommend it to colleagues. 100% confirmed that the training was relevant, that it had a positive impact on their skills development, and that they would act on the knowledge they had gained.

 

4.1.3.  Since lockdown, the programme has been delivered digitally in smaller groups.  

 

“The online version of the course meant we were right there in the middle of the services and apps that our young people are using now for their social lives and now, for their learning lives too. It taught us about what is known and, more to the point, what we don’t know.  

 

“I thought it was absolutely the right course for people like me and my colleagues. I work in safeguarding for schools across all phases. Our abilities are shaped by our experiences. Our experiences are quite adrift from what young people might be doing. I learned about apps and online environments I didn’t know about before. It allowed me to see it from the young people’s perspectives. I would recommend it very highly.” 

School Safeguarding Lead who attended The Social Switch Project  

 

4.1.4.  The project also trains at-risk young people in digital and social media leadership, opening up career pathways. Just this month, a 17-year-old male was accepted onto a Google apprenticeship following participation in The Social Switch Project. 

 

  5. What do we need to see?  

 

We know that online platforms cannot moderate all content, but there is plenty more that could be done to mitigate the harm we are witnessing.  

 

5.1.     Annual, up-to-date training for frontline professionals  

5.1.1.  Apps and technology are constantly changing, and training must keep pace.

5.1.2.  Encourage platforms to work with programmes that are addressing harm to build on preventative education.  

 

5.2.     Arming parents and carers with information to stay safe  

5.2.1.   Ensure that high-quality, accessible training is available to parents.  

5.2.2.   Expand projects like The Social Switch Project to address parents’ questions. 

 

5.3.     Real efforts to prevent underage users  

5.3.1.   i.e. ‘no under-16s’ must mean no under-16s. Requiring users to pass identity checks to access services or apps would prevent, for example, users posting racist abuse behind fake accounts or perpetrators posing as younger people. 

5.3.2.   This would also help prevent anonymous cyberbullying.  

 

5.4.     Investment in youth services  

5.4.1.  Enable young people and families to build relationships with fully trained youth workers through peer mentoring and youth clubs; during lockdown this can and should continue virtually. 

5.4.2.  It is within these trusted relationships, where respect is present, that young people will confide and may take on advice. 

5.4.3.   The current minimal investment leads to a ‘First Aid’ model: preserve life and move on, losing the principle of building meaningful relationships.  

5.4.4.   Frontline youth workers know there is no quick fix: “The sooner we accept this and adopt a model of not looking for fast outcomes, the more chance we have of maybe solving this issue.” 

 

5.5.     Intuitive software and more resources for responding to harmful content  

5.5.1.  Underpinned by well-evidenced safeguarding policies. 

5.5.2.   Blocking violent content.   

5.5.3.   Greater responsibility on social media and technology platforms to work with prosecution authorities: responding quickly and effectively to legal requests for information, and retrieving conversations for police both to support prosecutions and to prevent further harm to other victims. End-to-end encryption by default across platforms such as Facebook is a serious concern, likely to undermine efforts to catch sexual offenders. 

5.5.4.   Covid-19 has shown that what may have been deemed impossible is in fact possible, e.g. WhatsApp imposing an immediate limit on message forwarding to combat fake news. Why can more not be done to combat child sexual exploitation (CSE) and serious youth violence (SYV)? 

5.5.5.   School-wide education for teachers to deliver as part of the curriculum.  

 

  6. “For want of a better word, lip service has been paid to social media platforms and the laws that apply are ineffective; our young and vulnerable people are paying a very high price and all we can do is watch.” Liverpool Youth Worker

 

 

May 2020