Catch22 is a charity and a social business working to reform public services for the better. Across our education, social care and justice services, Catch22 witnesses the impact of harms which originate online and result in violence or the sexual and criminal exploitation of children and young people offline. In 2016, Catch22 published Safer Schools: Keeping gang culture outside the gates and in 2017, published Social Media as a Catalyst and Trigger to Youth Violence. In 2020, we published the results of our National Consultation into Online Harms, and the report written by academic Dr Faith Gordon, Children and Young People’s Experiences of Online Harms: ‘Acceptable Use’ and Regulation, is due for release shortly.
As well as delivering programmes focussed on violence reduction and providing one-to-one support for those affected by criminal and sexual exploitation, Catch22 and Redthread co-deliver The Social Switch Project, equipping frontline professionals and youth workers with the confidence to address issues faced online.
Youth workers, practitioners, children and young people, teachers, safeguarding professionals, tech companies and Police and Crime Commissioners regularly share their experiences and concerns with us surrounding actual harm caused by activity online.
Based on these insights, and from our in-depth research for our upcoming report, we have put together what we believe to be 6 fundamental principles for future regulation. We want to see future regulation prioritise:
For these 6 principles to be fully realised we need to see actions from companies, government and agencies:
- More quantitative and qualitative research to better understand the links between online and offline behaviour, and exactly what ‘harm’ is, as well as the identification of the capabilities of tech for ‘social good’. Transparency on the part of companies is essential and genuine engagement with researchers is required;
- A coalition of tech companies and experts, working together to address safety, regulation, intelligence and protection of freedom of speech in a safe online world – including real efforts to prevent children and young people experiencing unwanted content, unwanted contact and other harm, and the development of more intuitive software to do this – with children and young people at the heart of all decision-making;
- Large-scale pro-social programmes, supporting children and young people into meaningful opportunities, career pathways – including investment in youth services – providing the right relationships to mentor and support young people to navigate this world;
- Annual, up-to-date training for frontline professionals working with children and young people, so that they adequately understand the online world and are better able to support young people to make the right choices online;
- A way to arm parents, guardians, and carers with the right information to keep children and young people safe, which ensures that they continue to have freedom to gain and benefit from positives of evolving technologies and the online world. These approaches should acknowledge that the need for protection should not eclipse the agency and participation rights of all children and young people in digital spaces.
The COVID-19 lockdowns introduced physical distancing measures and restrictions on movement, and left more than 1.5 billion children and young people affected by school closures (UNICEF, 2020). Additional aspects of life have become ‘digital by default’, with children and young people having to turn to digital solutions for education, socialisation and play.
Reports estimate that internet usage has increased by over 50 percent in some parts of the world (World Economic Forum, March 2020), while others continue to experience the ‘digital divide’. Catch22 has helped to highlight the impact on some of the most marginalised children and young people in the UK, including young care leavers.
Online platforms have provided positive opportunities for engagement; however, increased time spent online has also resulted in increased exposure to risks and harm. This was confirmed by reported statistics in the UK, such as those from the Internet Watch Foundation (2020), the Home Office (2020) and research conducted by Gordon (2020-2021).
Our upcoming research
In 2021, thanks to support from the Mayor of London’s Violence Reduction Unit, Catch22 has conducted research as part of The Social Switch Project. The research was led by Dr Faith Gordon from the ANU College of Law, Australian National University and the Information, Law and Policy Centre, Institute of Advanced Legal Studies, London. This study involved extensive focus groups and interviews conducted with 42 children and young people (CYP) aged 10-22 years, with engagement throughout with a youth advisory group. 15 interviews were also conducted with stakeholders and professionals, from police, safeguarding, youth work, victim service provision, tech and gaming companies, regulators and the wider industry.
*Please note that the research report is due for release shortly and will be provided to the Committee in full for their attention*.
The following recommendations are derived from this larger research study, due to be published soon, and are being shared with the Committee in confidence. We request that these are not shared more widely until the release of the full report in November 2021.
Incorporating the insights from CYP, and the discussions with professionals, industry, policy makers and the international literature and policy review, the 7 ‘R’s have been devised to outline the recommended action points.
Tech professionals stated that qualitative research is essential for defining what “harm” is. In this current research CYP could identify ‘harm’ as: unwanted contact; unwanted content; unwanted use of data; and lack of resolution and redress.
With more transparency and engagement, the distrust and professionals’ feeling of always being “one step behind” could be addressed. Collaborations with independent academics can create new knowledge, generate more data on the capabilities of tech and therefore better enable society to gain insight. Platforms need to be more transparent, engage fully with researchers and organisations and show a genuine willingness to effect change.
Action: All reforms need to be based on evidence. For those that affect CYP, children and young people should be effectively consulted. Evidence about CYP should come from CYP.
Action: Given how rapidly platforms and technologies are evolving, resources need to be dedicated to independent research that is fully participatory and includes transparent input from tech companies.
What CYP often referred to as being important for them is clearly linked to their rights under the United Nations Convention on the Rights of the Child – including their right to find and share information and their right not to be exploited. Children’s rights discourse appears to be absent from much of the discussions and debates on online harms in the United Kingdom.
Action: The United Nations General Comment No. 25 on children’s rights in relation to the digital environment makes clear recommendations. The United Kingdom needs to engage more with the international children’s rights instruments and embed the international children’s rights frameworks into proposed reforms. For example, States parties should ensure that, in all actions regarding the provision, regulation, design, management and use of the digital environment, the best interests of every child is a primary consideration (UN General Comment, No. 25).
Action: Child rights impact assessments should be mandated and should embed children’s rights into legislation, budgetary allocations and other administrative decisions relating to the digital environment. The Government should promote their use among public bodies and businesses relating to the digital environment (UN General Comment, No. 25).
Regulation and the creation of a regulatory framework have been the central focus throughout the discussions and debates in the United Kingdom. The proposed legislative framework of the draft Online Safety Bill includes details of an independent regulator.
Regulation, however, is not a 'fix-all' solution. It will not address every aspect of online harm and is just one of many necessary measures.
Action: Future legislation should recognise ‘legal but harmful’ content and the need for a clear duty of care.
Action: For regulation to be successful, emphasis needs to be placed on areas such as education and development, addressing social inequalities, and the need for transparency by companies.
CYP felt that balancing freedom of expression, access to information and protection from exploitation was a major challenge for companies, the Government and wider society. CYP placed a lot of emphasis on the responsibilities of companies and felt that they should be held accountable for inaction.
A statutory duty of care placed on social media service providers with regard to their users, as proposed by the Carnegie UK Trust, has the potential to help with protecting CYP online.
Action: Companies should be responsible for the creation and maintenance of safe spaces online and they should be held accountable for inaction in addressing concerns.
Action: An independent and transparent oversight body is required for overseeing regulation, which can ensure that companies and individuals are held accountable. CYP need to be made aware of its existence and role, and all information and complaints processes need to be accessible for CYP.
Most CYP stated that this was one of the first times that they had been asked about their experiences online and what interventions they could suggest. CYP want more opportunities to express their opinions and for their suggestions to inform change. They also want opportunities to engage with those who design, maintain, and regulate online spaces.
Action: Policymakers, legislators, practitioners, and industry need to create greater opportunities for the opinions of CYP from a diverse range of backgrounds to be heard. They need to ensure that CYP’s experiences inform change in areas such as online safety, accessibility and the design and delivery of education. Australia’s ‘Safety by Design’ approach is an example worth considering.
Action: CYP want to be part of the design and delivery of education programmes. They want to be part of panels that tech companies, platforms, and gaming designers consult with when designing, developing and updating new products.
CYP have said that complaints to companies often go unaddressed or are met with delays. The delayed responses, often automated, were described as retraumatising and made CYP relive the original harmful experience.
A small number of CYP mentioned experiences of having their mobile phones taken away by the police for evidence-gathering purposes for months at a time, in response to serious incidents. For less serious incidents, professionals’ approach was too often about ‘getting offline’.
Action: CYP and their advocates want to see quick, appropriate, effective, and proportionate responses to online harms. They want personalised - not automated - responses and want to feel that companies are acting on complaints.
Action: CYP want law enforcement to outline from the outset how long they will require their phones and devices for, and they want swifter processing and better updates from law enforcement.
Action: Frontline professionals must be trained and prepared to respond to instances of online harms and to guide CYP towards embracing the opportunities digital worlds can present.
Educators and safeguarding professionals report that the lack of resources to respond to the rise in incidents of online harms leaves them feeling overwhelmed and concerned for the safety, health, and well-being of CYP.
Action: More resources are needed for those working in education and safeguarding, and adequate funding must be available for the provision of victim support to address harms originating online.
Action: CYP want reforms to PSHE education to include online behaviour, and professionals want to see more education on how to ‘self-regulate’, identify harmful behaviour and report such incidents.
Research Reference for above findings
Gordon, F. (2021). Online Harms Experienced by Children and Young People: ‘Acceptable Use’ and Regulation. Full Report. London: Catch22.
28 September 2021