Written evidence submitted by COST Action - European Network for Problematic Usage of the Internet (OSB0038)

This submission is made by the COST Action “European Network for Problematic Usage of the Internet” (www.internetandme.eu and https://www.cost.eu/actions/CA16207). We are a global network of over 100 of the world’s foremost scientists from around 40 countries, working in fields including (but not limited to) clinical medicine, psychiatry, psychology, neuroscience and allied disciplines to advance the understanding and treatment of Problematic Internet Usage. We are supported by COST (European Cooperation in Science and Technology) under the European Union Horizon 2020 framework programme. Among our outputs, we have produced a self-help booklet (https://www.internetandme.eu/resources/), including tips for parents on how to manage difficulties which frequently arise. In April 2021, we organised an all-day international conference, attended by expert delegates and over 300 members of the public from approximately 40 different countries. Participants were asked to complete a survey regarding internet use. The results showed that, while people were worried about social media use (particularly following informative programmes such as The Social Dilemma), they were clearly also concerned by the potential for harm associated with other forms of problematic internet usage giving rise to addiction, including excessive online gaming, gambling, pornography viewing and other online behaviours (see below).

While recognising that this is the first time the UK has tried to impose obligations on online platforms, and that it is therefore a positive initiative, we seek to draw attention to several key limitations of the Bill as proposed.

Summary: In its previous iterations, and before reaching this stage, the Online Harms Bill was the subject of considerable consultation. The original aims included an expectation that the Bill would focus on ‘harm’ and work backwards from that to impose obligations on organisations to take steps to minimise harm, including consideration of limiting screen time in some way, particularly for children. However, the Bill in its current form has restricted its scope: it now covers only ‘user-generated content’ and companies providing search services, and therefore lacks the capacity to regulate what companies (including games manufacturers) create and provide to users. It thereby overlooks an emerging and substantial potential source of harm linked to the strongly addictive potential of such products.


  1. Clarify and examine the Government's policy objectives.

The Government’s stated objectives (see its December 2020 response to the White Paper) reflect its ambition to make the UK ‘the safest place to go online’. They include:

These are all laudable aims, which we support. However, as stated in the summary above, those objectives must necessarily encompass the potential harm caused by excessive time spent online, particularly in the case of children and young people, whose brains are still developing and who are often more susceptible to addiction.

Harmful loss of control of internet usage, occurring in various fields (including, but not restricted to, gaming, pornography viewing, gambling, shopping, video-streaming, medical information seeking and social media use), leads to addiction among vulnerable people across the lifespan, which results in major harms (physical and mental) at the individual, familial and societal level.

Because of the above, specific forms of problematic usage of the internet are now recognised by leading public health bodies, including the World Health Organisation, as formal medical/psychiatric conditions and as a major global public health and wellbeing issue. The International Classification of Diseases 11th Revision (ICD-11) ratifies several costly and burdensome conditions linked to problematic internet use as diagnosable mental disorders. These include gaming disorder, gambling disorder, and compulsive sexual behaviour disorder. The key clinical features of these disorders include loss of control of the online behaviour, leading to neglect of other important areas of life in favour of the online activity and/or significant damage to multiple aspects of wellbeing, including psychological or physical health, relationships, and educational and occupational functioning.

Though people can develop different behavioural addictions at any age, the form of disorder is related to age and gender: young people tend to engage more in online gaming and video-streaming, females in online shopping and social media use, and males in online pornography viewing and gambling. For example, according to recent figures reported by market and consumer data organisations (which we have not independently verified), over 2 billion people worldwide, and around 40 million people in the UK, undertake online gaming, and a significant minority will suffer profound difficulties associated with loss of control of gaming behaviour. A longer time spent on video games predicts a tendency towards pathological gaming in the future. Estimates of the prevalence of the various forms of problematic (addictive) use of the internet vary widely across epidemiological studies, owing to differing methodologies and definitions, which makes consensus difficult to achieve; however, pooled analysis suggests that on average around 7% of the global population are affected by one or more internet-related addictive problems. Young people and those with pre-existing neurodevelopmental and mental disorders are disproportionately affected.

The cost and burden of these disorders are carried not only by the individual, in terms of distress, loss of education or employment, relationship breakdown, sleep disorders, depression, anxiety and suicidal behaviour, but also by family members carrying the emotional and financial burden of caring for the addicted individual, and by employers facing lost productivity through absenteeism and ‘presenteeism’ among affected employees. While the full health economic impact of disorders of problematic internet usage is yet to be fully understood, given that young people are disproportionately affected at the most crucial developmental stages of their lives, the impact is likely to be long lasting and to accrue over time. Clinical services, such as the recently opened National Centre for Gaming Disorder in London, are already greatly oversubscribed.

The right of vulnerable individuals to protection from the risk of harmful addiction is unarguable. Policies and practices aimed at protecting vulnerable groups and preventing the onset and development of problematic internet usage are needed to fulfil this critical public health role. This is where regulation and oversight would be of enormous benefit, particularly given the strong commercial incentives driving the delivery of online material with addictive potential. For example, it is reported that one bestselling online game of 2018/19, which 78 million people were playing at any one time, made its manufacturers over $5bn in profit (a figure we have not independently verified). The game is offered to consumers free at the point of access; players are then incentivised to keep playing via a variety of techniques so that in-app purchases can be made, allowing the manufacturer to recoup its outlay and then make a profit.

Moreover, the Government acknowledges that its approach is ‘unashamedly pro-tech’, Oliver Dowden MP himself having stated: ‘The government recognises that immersive technologies and content offer great potential for economic, cultural and social benefits to the UK. Through increasingly compelling narratives and realistic visuals, immersive products can offer engaging experiences to audiences, not just with the aim of entertaining but with the scope to challenge, educate and inspire them.’

It is perhaps then unsurprising that the Government has taken great pains to point out that ‘only 3% of UK businesses will fall within the scope of the legislation’ and has limited the businesses it proposes to regulate, accordingly.

However, even seemingly innocuous forms of online use, such as video-streaming, have been found to be harmful in vulnerable individuals, for example by damaging academic achievement among vulnerable students, while the introduction of commercial strategies designed to increase online engagement, including gambling elements such as ‘loot boxes’, has substantially exacerbated problematic internet use.

Notwithstanding this, there are good reasons to believe that regulatory practices, such as robust age verification of consumers (not just ratings systems), clear labelling of the scale of harm associated with different products, strategies to limit online access, and promotion of the availability of methods for self-exclusion, will have a protective effect. There is an additional need to rigorously assess the efficacy of such strategies and to support evidence-based policies and interventions as they emerge.

Therefore, there is a compelling argument for expanding the objectives of the Online Safety Bill to include regulation of a broader range of internet provider companies and their products, if it is to come anywhere near achieving its stated overarching aims. In sum, if the Government wishes to address online safety, it needs to address all areas of harm comprehensively, including those generated by commercial companies providing potentially addictive content such as (but not limited to) gaming, gambling, pornography, video streaming and medical information.


  2. Assess whether the Bill as drafted would achieve the Government's policy objectives.
     a. Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?

No. See 1 above regarding the harm associated with excessive use of the internet, which has been overlooked entirely. Furthermore, although an assessment of harmful content is envisaged (section 19), its implementation will be difficult, especially in cases where assessment must occur before United Kingdom users are able to access the service, or where it involves analysis of the complex algorithms used by service providers. We strongly encourage closer collaboration with academic researchers, as well as the establishment of advisory groups involving the participation of children, carers and educators themselves, to promptly inform the Government about any existing or emerging harms. Further, as mentioned above, the harm is often embedded in the content of the entertainment activities (e.g. games), which are especially designed to become addictive.


     b. Does the draft Bill make adequate provisions for people who are more likely to experience harm online or who may be more vulnerable to exploitation?

No. See 1 above. Additionally, the Bill only marginally touches on the harmful consequences for those suffering mental health issues and other vulnerable individuals who are not children. In this respect, stronger collaboration with health services should be encouraged, and more targeted strategies should be implemented according to the evidence emerging from our studies.


     c. Is the “duty of care” approach in the draft Bill effective?

Imposing a duty on companies themselves to ensure that users are not exposed to harmful content by way of regular risk analysis appears, in principle, to be a good way of making companies take responsibility for the online safety of users. But:


  3. Identify any unintended consequences of the Bill.

By overlooking and excluding regulation of the widespread provision of highly addictive content on the internet, the Bill unintentionally gives commercial organisations the go-ahead to increase their exploitation of vulnerable people without redress.


  4. Identify whether there are any gaps in the Bill.
     a. Are there any types of content omitted from the scope of the Bill that you consider significant e.g. commercial pornography or the promotion of financial scams? How should they be covered if so?

Yes. See 1 above. The Bill omits several forms of content thought to be potentially addictive and a source of serious harm for vulnerable people, including, but not limited to, gaming, gambling, pornography, and misleading pseudo-medical information.

     b. The draft Bill specifically includes CSEA and terrorism content and activity as priority illegal content. Are there other types of illegal content that could or should be prioritised in the Bill?

Yes. The Bill does not refer to the less explored part of the internet known as the “darknet”, which has recorded exponential growth over the years in illicit markets for non-licensed medical products, illicit drugs, pornography and human trafficking, among many others. Visually similar to commercial platforms such as eBay and Amazon, and enabled by the seamless transfer of funds through decentralised cryptocurrencies (mainly Bitcoin), darknet markets create considerable new business opportunities for organised crime groups worldwide, whose revenues from darknet-related transactions have doubled in less than three years. The flexibility and ease of adding activities and/or products in the absence of any regulation have made these platforms a new and major potential source of societal harm, further supported by lowered barriers to entry for individuals, including vulnerable groups.


     c. What would be a suitable threshold for significant physical or psychological harm, and what would be a suitable way for service providers to determine whether this threshold had been met?

The World Health Organisation, in the ICD-11, has set a clear, simple, standardised threshold for establishing the presence (diagnosis) of gaming disorder, gambling disorder and compulsive sexual behaviour disorder, based on clinical factors including loss of control of behaviour and the presence of associated psychological, physical and social harm. There is also an ICD-11 category that allows the clinical definition of hazardous gambling or gaming, referring to a pattern of online behaviour that appreciably increases the risk of harmful physical or mental health consequences to the individual or to others around them. These definitions provide clear and accessible thresholds that service providers and regulators can use to determine different levels of harm associated with problematic usage of the internet.

     d. The draft Bill applies to providers of user-to-user services and search services. Will this achieve the Government's policy aims? Should other types of services be included in the scope of the Bill?

No. For the reasons described above, the Bill needs to be extended to include companies providing commercial online services such as, but not restricted to, videogames, pornography, gambling activities, pseudo-medical (i.e. scientifically misleading or factually incorrect health-related) information, video-streaming, shopping and social media.

     e. Does the draft Bill give sufficient consideration to the role of user agency in promoting online safety?

No. The Bill does not take account of the fact that ‘user agency’ may be disproportionately reduced in groups of individuals with vulnerability to problematic internet use, and, relatedly, that aspects of cognition have been shown to be impaired in such groups. These include children and young people, people with neurodevelopmental disorders such as attention deficit disorders, and those with mental disorders such as anxiety disorders, substance addiction or obsessive compulsive and related disorders. Indeed, by definition, the process of addiction involves the loss of agency over the behaviour. Hence, preventative measures focused on these vulnerable groups are needed to support user agency and promote better self-management of time spent online.


  5. Make recommendations to improve the drafting of the Bill.

The COST Action “European Network for Problematic Usage of the Internet” (CA16207) recommends that the Bill be improved by the following actions:

1. Extend the scope of the Bill to include companies providing online services with the potential to cause harmful loss of control and addiction in vulnerable groups. These services include gaming, gambling, pornography, pseudo-medical information, shopping, social media and video streaming.

2. Introduce regulatory practices designed to promote user agency and improve healthy self-management of the internet among all internet users, while also reducing risks of addiction among vulnerable groups. These regulatory practices should include, but not be restricted to, measures such as robust age verification of consumers, clear labelling of the scale of harm (or potential harm) associated with different products, strategies to limit screen time in children and young people if their parents wish it, active promotion of methods for self-exclusion, and recommendations of where to go to seek help.

3. Establish a working group to monitor the success of the Bill with a particular remit to review its impact on critical public health indices related to problematic usage of the Internet.


17 September 2021