POSR0014
Written evidence submitted by the Royal College of Psychiatrists
1.1 The Royal College of Psychiatrists is the professional medical body responsible for supporting psychiatrists throughout their careers, from training through to retirement, and for setting and raising standards of psychiatry in the United Kingdom. We work to secure the best outcomes for people with mental illness, intellectual disabilities and developmental disorders by promoting excellent mental health services, supporting the prevention of mental illness, training outstanding psychiatrists, promoting quality and research, setting standards and being the voice of psychiatry.
2. Background
2.1 In January 2020 the College published a report titled “Technology use and the mental health of children and young people.” The report explores the current evidence on the effects of technology use among children and young people. As well as looking at the impact of screen time, it examines how different types of screen use affect the mental health of children and young people.
2.2 In the report we made a number of recommendations, including several aimed at government and technology companies, to protect users from the risks associated with technology use. The report also provided practical guidance on this issue to children and young people, parents and carers, clinicians and teachers. When we launched the report, we became one of the first groups to call for regulators to have the power to fine social media companies up to 10% of their global annual turnover.
3.1 The Online Safety Bill gives Ofcom significant new responsibilities and powers, far more than was originally envisioned for the regulator. The Government must make sure that Ofcom receives investment proportionate to its task; otherwise, many of the positive protections introduced by the Bill will never be properly enforced.
3.2 Specifically, we need to make sure there is investment in people with the necessary expertise and skills. However, as Ofcom expands in response to these new duties, we are concerned that there may be increasing fragmentation within the organisation.
3.3 We are also concerned that Ofcom is being asked to regulate areas which have not been clearly defined; for example, terms such as ‘harm’ lack a clear definition.
3.4 As this is the first legislation of its kind in this space, there will need to be considerable learning throughout the process. The Government needs to hold regular reviews of Ofcom’s work to understand how it is regulating and to assess the impact of the legislation.
3.5 Effective regulation will require collaboration and coordination with other regulators, including internationally. Existing bodies, including the Advertising Standards Authority (ASA) and the Information Commissioner’s Office (ICO), potentially have a large degree of overlap with Ofcom’s responsibilities. We seek clarification on how these bodies will work together and what knowledge will be shared between the different agencies.
3.6 We also believe there is an opportunity for Ofcom to learn from existing legislation in other jurisdictions, specifically the European Union’s Digital Services Act.
3.7 We hope Ofcom is given the capacity to look at more preventive measures to protect people from encountering harm in the first place, and that areas which are not currently covered by the Bill, in particular the impact of loot boxes on children and young people, are reviewed.
4.1 Services may not have adequate systems in place to identify and remove harmful content as quickly as possible. Safety technologies may be costly for innovative start-ups, creating the risk that only very large, profitable companies can comply. Systems for sharing safety technology could support innovation, and a number of design features could support better identification of harmful content.
4.2 Services can act as delivery mechanisms for content: textual, aural or visual materials. The information conveyed by this content may create a risk of harm, and that risk may be exacerbated by the design of the service itself.
4.3 There are a number of design features and operational systems that can leave users feeling helpless and disempowered.
4.4 Linked to concerns regarding the nature of content delivered via services, there is a further concern that services may be designed in a manner that encourages excessive engagement. Such overuse may also exacerbate any risks associated with the content itself.
4.5 Design may also expose users to commercial exploitation. In addition to the risks of scams and cybercrime, which are under-represented in the Online Safety Bill, there are three main ways in which users may be exploited commercially.
5.1 We lack the data needed to establish reliable dose-response relationships between the presence of specific design features and the creation of online harms. This lack of data will significantly constrain the effectiveness of online safety regulation.
5.2 We suspect that many design processes can amplify such harms, but without large-scale industry data sharing it is very difficult to isolate and quantify the ‘active ingredients’ reliably. For example, business models often depend on advertising revenue: advertisers want to see engagement, and extreme content is engaging, so creators of extreme content are paid for that activity and recommender systems amplify it because it drives engagement.
5.3 We welcome the commitment in the Bill for Ofcom to produce a report on the ‘extent persons carrying out independent research into online safety matters are currently able to obtain information from providers of regulated services, exploring the legal and other issues which currently constrain the sharing of information for such purposes, and assessing the extent to which greater access to information for such purposes might be achieved.’ However, while this is an important step, the report may take up to two years to produce. The Government should prioritise this report and develop a protocol for data sharing, together with a relevant ethical framework for informed consent, as soon as possible. Currently, data is mined without explicit user consent, and no framework exists to guide research ethics committees.
5.4 One issue is the opacity of the technology: users do not understand how and why they receive the content they do. Qualitative research, including ‘Young people’s online engagement and mental health: the role of digital skills’, found the following impacts, particularly in relation to self-harm, eating disorders, sexual abuse and excessive use:
“young people with mental health needs are not vulnerable due to a lack of digital skills; they are highly skilled, but these skills are not sufficient due to:
The report concludes: “risky-by-design complex and opaque technological systems overwhelm the young people: there’s much that they cannot manage… For those with mental health difficulties, the consequences can be extreme.”
October 2023