Written evidence submitted by Internet Matters (OSB103)

Internet Matters exists to help families benefit from connected technology.  We are a not-for-profit, funded by the internet industry, and we are pleased to bring leading brands together to focus on child safety and digital wellbeing.  We provide expert advice to parents, presented in a usable way: by age of child, by device, app or platform, or by issue.

We know that engaging with our content gives parents, carers and professionals the confidence and tools they need to engage with the digital lives of those they care for.  Having an engaged adult in a child’s life is the single most important factor in ensuring they are safe online, so providing those adults with what they need is a fundamental part of digital literacy.

We’re delighted to respond to this consultation and will be drawing heavily on our extensive research base, all of which can be found on internetmatters.org. We have confined our observations to our area of expertise: content defined as legal but harmful.

Reason for submission

With seven years’ experience of helping parents and professionals support young people in enjoying the benefits of the online world safely, Internet Matters holds valuable insight relevant to the Bill’s stated objectives, in particular:

 

       Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?

       Does the draft Bill make adequate provisions for people who are more likely to experience harm online or who may be more vulnerable to exploitation?

       Are the media literacy provisions given to Ofcom in the draft Bill sufficient and appropriate?

 

Internet Matters has established, evidence-based insight into online safety in the context of young people, as well as into the concerns, experiences and understanding of parents/carers.  We are experienced in delivering effective interventions for trusted adults supporting young people’s online lives.  Internet Matters has a wide network of stakeholders across industry, government, the third sector, professionals, academics, and families.  We have also honed effective ways to engage with industry around online safety solutions.

 

The Bill is challenging to understand: much is still to be defined, and Ofcom’s approach to implementation has yet to be clarified.  This makes scrutiny difficult, but there are points we wish to offer a view on so they can be factored in at the early stages of regime development.

 

Submission

 

1. Consider the breadth of the regulation’s scope, ensure minimum standards for all, and acknowledge the dynamic between causes and symptoms

 

Companies in scope

 

This Bill is primarily concerned with social and search functionality - a political decision; the scope of the Bill could have been more encompassing.  Moreover, even within that limited focus, the number of companies within scope is worryingly few.  Smart speakers are not currently in scope because they search and present findings for your ears rather than your eyes. Commercial content, advertising and in-game content are also exempt, despite their potential to cause harm to children. These are concerning omissions.

 

Regulation must also recognise the scale, pace and ubiquity of digital technology and the opportunities and challenges that presents. Digital consumption is not bound by the traditional limitations of physical production, and its potential for rapid growth is exemplified by Pokémon Go, an augmented reality mobile game, which achieved 50 million users in just 19 days from launch in 2016.

 

Minimum standards

While there is an understandable hesitance to overburden smaller providers with regulatory requirements, this misunderstands the nature of the technology sector.  Given the pace at which tech companies can grow, all providers should meet minimum safety standards from the start, not least so that, if they scale at pace, safety does not have to be retrofitted.

 

All toys and food must be safe for the purpose for which they were created, not simply those produced by large companies or on a large scale; the same should be true for online service providers. It is a recognised cost of doing business. Similarly, the Health and Safety Executive sets minimum standards for all businesses, even the self-employed.  Where harm can be significant for the individual, there is no justification for fully exempting smaller service providers.

 

A child impact assessment, rather than a risk assessment, may well be a good place to start, as it would require companies of all sizes to consider the likely positive and negative experiences children will have on their platforms, and enable them to create solutions which balance the benefits of being online against the risks.

 

Root causes

For the Bill to be truly successful, separate work should consider the societal causes of many of the issues the Bill attempts to address.  As an example, racism is abhorrent, and part of the problem is how algorithms circulate racist content; however, the bigger issue is the societal conditions that lead people to form these views and believe they are acceptable.  Societal causes, and not only their symptoms, need to be addressed to challenge the issues most effectively.  Simply relying on tech platforms changing their algorithms will not and cannot solve the problem.  At best, legislation can address the challenges of amplification and speed of take-down - which would be worth doing.

 

 

2. Include Age Verification for Adult Content in the Bill

 

Age verification for adult content is not currently part of the Bill.  Whether this is part of a complex concession strategy or simply an oversight, it needs to be fixed.  In 2018, in preparation for Part 3 of the Digital Economy Act, we conducted research with parents of 4-16 year olds on their views about, and their children’s exposure to, online pornography.

 

Our report, We need to talk about pornography, details our findings which include:

       Parents think online pornography is more extreme and explicit than other forms of media, and are afraid it will lead to unrealistic expectations of what sex involves.

       Parents fear the potential implications for sexual behaviours if their children watch pornographic content over a prolonged period.

       Parents are afraid that teenagers use pornography as a way of learning about sex.

       Parents are concerned that online pornography affects the way women and girls are perceived.

       Parents are worried that pornography could skew children’s idea of an ideal body image. In addition, they are worried about the detrimental effects it could have on their children’s physical health if they were to go to extremes in an attempt to reach their body goals.

 

In addition to our research findings, there is increasing concern that easy access to pornography is driving the rise in peer-to-peer sexual assaults in education settings, as highlighted by the Everyone’s Invited website.  For all of these reasons, continuing to allow children to stumble across online pornography is an abdication of our responsibility to protect children.  To exclude it by omission or commission is simply indefensible.

 

3. Anonymity on Platforms and in Games

 

The Bill also fails to tackle the thorny issue of online anonymity.  Whilst we recognise this is a genuinely challenging issue - there is an understandable rationale for protecting the privacy of women and children at risk of violence, of journalists and of whistle-blowers - that argument is weakened for online places either intended for, or habitually visited by, children.  Children should know for certain who they are talking to, sharing interests with or gaming with.  The Bill is silent on this issue.

 

Platforms could hold a register of all users, regardless of account name, and use it to enforce their terms and conditions.  Furthermore, such a register would ensure that accounts held by users who frequently break the rules cannot be deleted and reinstated under a different name within moments. It will be crucial for any such register to be securely held and shared between platforms.

 

4. Ensure regulation does not unintentionally disadvantage vulnerable young people

 

Our evidence demonstrates that vulnerable children and young people (VCYP) are at greater risk of encountering harm online than their non-vulnerable peers.  It is therefore vital that the particular needs, drivers, experiences, benefits and risks of VCYP are well understood and accounted for within the regulatory framework.  In line with the Age Appropriate Design Code, it is also important that this is achieved in a way that minimises the collection of their personal data.

 

The Bill contains some provision for determining content harmful to VCYP, requiring providers to take into account a child’s particular circumstances if known. Providers will need to navigate carefully the tension between this and the aspiration of the Age Appropriate Design Code to minimise data collection and use by default.

 

The Bill also expects that, where content is likely to affect certain children, providers should consider the child to have those characteristics (instead of the default ‘ordinary sensibilities’). To meet this expectation, providers will need to maintain a robust understanding of what is likely to ‘affect’ VCYP. Transparency reports should be a place where provisions for vulnerable users must be cited.

 

Our evidence also shows that online connectivity is a crucial source of wellbeing for VCYP - helping care-experienced CYP to maintain trusted relationships as they move placements, providing shared experiences and support amongst the LGBTQ+ community, and allowing CYP with disabilities time and activities without labels.

 

As the framework and implementation of the regulation are developed, consideration of and consultation with young people (in line with General Comment No. 25) and relevant stakeholders should be built into the process to avoid unintentionally limiting their access to the benefits of online life.

 


 

5. Provide enhanced media literacy interventions for vulnerable young people

 

Given their heightened risk of encountering harm, the Bill could and should make provision for VCYP to benefit from enhanced online safety education or targeted media literacy interventions.

Research commissioned by DCMS prior to the publication of the Media Literacy Strategy identified at least 170 media literacy interventions already operating in the UK.  The mapping exercise report from April 2021 concluded that media literacy provision for people from more vulnerable groups was limited. The subsequent Strategy document noted that, of the 170 UK initiatives identified, only 3% were targeted at disabled people, 3% at those from a disadvantaged socio-economic background and 1% at those from the LGBT community.

 

Whilst the Media Literacy Strategy rightly highlights the need to better serve vulnerable communities, more work needs to be done to identify, prioritise, fund and evaluate interventions.  When closing these gaps, as with making best use of existing programmes, it will be vital that action is coordinated so that provision is unified rather than fragmented, which would dilute effectiveness and confuse those we are trying to support.  As a unifying and enabling force, DCMS and Ofcom could help with evidence, evaluation and funding, working with trusted partners.

 

The Bill doesn’t make provision for any of this - which is a missed opportunity.

 

6. Definition of Harms - Legal but Harmful

 

The Bill refers to forthcoming work to define harms - this is axiomatically both complex and critical. Consideration must be given, and clarity provided, on:

       How direct the causal link to the harm has to be

       Where the threshold of harm should be (how widely or deeply applicable to a person of ordinary sensibilities)

       How to account for the impact of cumulative harm (the distress caused by one comment, versus many)

       How the register of harms can be easily, transparently and fairly updated

 

This is not work that should be delivered in isolation - Internet Matters has, for example, seven years’ worth of detailed research into parental perceptions and experiences of a range of harms, viewed through the lenses of prevalence and severity.  This sort of data could be used to inform the development of this policy.

 

7. Ensure there are appropriate expectations on and of parents/carers

 

Media literacy should empower parents/carers, not burden them with additional responsibilities.  The regulatory regime relies heavily on terms and conditions; in any context, but especially online, these are not routinely read or understood.  To have real-world value, terms and conditions should be understandable and delivered to the user at relevant times and in practical ways. It will be up to platforms to present them in a way that parents can and do engage with, so that informed consent carries meaning rather than being something to be clicked before access is granted.

 

While it will be important not to unduly burden parents/carers, there will be areas where it is appropriate to have expectations of the role.  For example, active consideration should be given to allowing children to nominate a parent/carer or other engaged adult to handle report and redress steps on their behalf.  This may not work for every child, but it would address the challenge for the vast majority. Enabling parents to advocate for their children in this way would be a meaningful step forward. Research into the connotations of the term ‘report’, and into CYP-friendly phrasing, may help encourage CYP to act for themselves when ready, but it will be useful to have scope for parental support until then.

 

To ensure parents’/carers’ expectations of the Bill are accurate, it will be important to manage the messaging for parents/carers around the Bill.  Too much hyperbole could lead parents to conclude they can drop their guard, or leave them disappointed when they realise the Bill is one step in an iterative process to reduce online risk and harm.  In addition, while the aspiration of the Bill is for the UK to be the safest place in the world to be online, safety messages should be paired with highlights of the benefits of being online in order to avoid driving families offline.

 

 

8. The Role of Ofcom in Transparency Reporting and Media Literacy

 

Internet Matters is pleased that Ofcom has been appointed Online Harms Regulator; we recognise that the scale of the challenge put to it is significant.  However, much of the detail of its role is still to be determined and published. There are two areas where the definition of Ofcom’s role would benefit from early clarification: transparency reporting and media literacy.

 

Whilst transparency reports are a good idea, their effectiveness must be kept under constant review to prevent tech companies reporting only against criteria they themselves determine.  We recognise the complexity of a universal set of metrics for a diverse sector and so are not calling for Ofcom to create a template.  However, Ofcom must be both robust and public in its review of these reports for them to have any meaningful impact.  Reporting requirements must be relevant and proportionate to the potential impact the platforms have.

 

For media literacy, it is key to remember that a mature online safety sector already exists in the UK, and Ofcom’s value, building on its existing role, will be in adding to what already works rather than replicating it.  For Ofcom to adopt a campaign model of its own would duplicate existing efforts and therefore waste time, talent and money.  We would urge Ofcom to lead the thinking on evidence and evaluation, and to explore creative and innovative ways to reach underserved and harder-to-reach communities.  The not-for-profit sector is not funded in a way that promotes experimentation, but Ofcom could usefully explore innovative ways to address the harder challenges.

 

9. Collaboration is key – draw on the expertise of others

 

A consultative and collaborative approach is essential for a workable regime that maximises what the online safety sector already offers and avoids repeating existing work at a slower pace and with less investment.  Ofcom has a critical role to play: to convene, to share best practice, to lead the thinking on evaluation, and to support the sector in plugging the gaps.

 

There is no one organisation that serves the needs of all audiences well, but for parents and carers supporting children and young people, and for the workforce supporting vulnerable children and young people, Internet Matters has a significant amount of insight and expertise.  Internet Matters is also uniquely placed, as a mature industry collaboration, to make a material difference to media literacy programmes.  We would be delighted to share our knowledge and expertise and to work in partnership with DCMS and Ofcom as appropriate.

 

Concluding comments

 

The Bill is a welcome development in the battle to keep children safer online.  There is much that is good, and much room for improvement.  The aspiration is for the UK to be a safe place to be online, but in order to avoid driving families off the internet we have to pair safety messages with the benefits of being online and position the Bill as the next step in a long-term commitment to reduce online risk and harm. Internet Matters remains keen to support the introduction of an effective regulatory environment which, in tandem with technological innovation and an effective media literacy campaign, could transform children’s digital experience.

 

September 2021