Written evidence submitted by Parent Zone (OSB0124)

Who we are

Parent Zone is a social enterprise providing advice, knowledge and help to shape the future for children in the online world. We run projects designed to help families make the most of digital life. 

Our response

Our response is based on the work we do with families and the professionals who support them. It reflects the concerns and aspirations we hear through our projects, research and help service.

We would be happy to discuss any of the points made in this response and to provide any additional information or context. Please send any questions to Emma@parentzone.org.uk.

Introduction

Our research in October 2020 found that 77% of parents said that digital technology had helped them through lockdown[1]. While access was unevenly spread, there was strong evidence that families were developing a new relationship with digital services for work, education and play. Their awareness of dependency was coupled with a desire to see the digital world at least as well regulated as the offline world.

Parents have for many years expressed concern and confusion about the lack of regulation of online platforms and services. They are perplexed by a system that strictly regulates under-18s going to the cinema to see a film but allows children open access to hardcore pornography online. They are frustrated when their children run up large bills gaming, and dissatisfied when told that they simply need to ‘talk to their children’ more.

Parent Zone therefore welcomes this Bill as bringing an end to self-regulation and giving families the comfort of oversight. As we move towards legislation, the needs and expectations of parents should be front of mind; the Bill needs to be legible to parents, and enforceable.

 

Will the Bill meet its aim of making the UK the safest place to be online?

The internet is a complex ecosystem. Children make no distinction between being online and offline as they move seamlessly between social media platforms, messaging apps, and games, often on multiple devices at the same time. The ambition to make the UK the safest place to be online is welcome, but we are concerned that too narrow a focus - on user-generated content on social media sites and search - excludes some of the most upsetting content online for children, including commercial pornography and gaming. Behaviours such as grooming and bullying, which are not specifically mentioned in the Bill, must not be overlooked when assessing the potential for harm.

Parent Zone believes it is not possible to make the UK the safest place to go online without including commercially produced content, such as that published on Pornhub. We would also like to see gaming platforms and markets - like Steam - fall within scope.

 

Duty of Care

The concept of a ‘duty of care’ creates an expectation that platforms will be legally required to act to forestall harm to users. This is an admirable ambition but its presentation in the Bill is overly complicated and muddies the proposal.

The model for the duty of care is based on that of health and safety legislation. The more closely it resembles that model, the better it will be.

The duty of care for children enshrined in the Children Act, meanwhile, calls on professionals and organisations to ensure that “all reasonable steps are taken to ensure the safety of a child or young person involved in any activity or interaction for which that individual or organisation is responsible. Any person in charge of, or working with children and young people in any capacity is considered, both legally and morally, to owe them a duty of care”.

Where children are concerned, Parent Zone believes this should apply online as well. In addition to requiring platforms and services to scan for harms and pre-empt them, the duty of care should mean that once organisations become aware that children are at risk (for example, that they are seeking out content about suicide or posting inappropriate sexual content), they are legally required to act in those children’s interests.

It is not currently clear how Ofcom will ensure that companies’ own risk assessments are adequate. The recent Wall Street Journal exposé of Instagram[2] demonstrated how new functionalities can give rise to harm: in that case, the ‘like’ button encourages children to submit photos of themselves for approval. The platform knew that this was causing significant psychological harm but chose to ignore it because the functionality served other purposes. Companies’ risk assessments must consider functionalities that afford both good and bad behaviour in the round. (In-stream payments are another example: we deal with this in more detail below.) The thresholds for harms have also yet to be determined: how will the regulator deal with systems that are fundamental to the business model, and may indeed have attractions for users, but which may also cause significant harm, especially to children? Until these issues are resolved, the Bill will remain a vague ambition, and so many well-meaning words.

Some functionality - including encryption - makes it easy for platforms to claim they are unaware of problems, while simultaneously preventing external oversight. Relying on consumer complaints is one way to mitigate that, but consumers will need independent routes to raise their concerns, and they will need to be heard if encryption means no one can actually see what’s going on.

Finally, we are concerned that the duty of care will not cover new and emerging platforms that may be high-risk by design. OnlyFans grew from 7.5 million users to 85 million in less than a year by facilitating in-stream payments for sex work, attracting large numbers of young women. The different duties imposed on category 1 and category 2 platforms seem to leave a site like OnlyFans outside the more rigorous requirements of the Bill.

New and emerging services can facilitate significant harms in a very short space of time. An overarching duty of care, as originally proposed, would bring them within scope. It would reinforce the importance of safety by design, and, additionally, create a more resilient tech sector, less vulnerable to legislative changes at home and abroad.

 

Content, behaviour and wider harms

We welcome the strong emphasis on children in the Bill. Parents expect the law to be able to protect their children and they will feel let down if their expectations aren’t met. It is still unclear how effective age-assurance technology will be in distinguishing levels of maturity and age. If the instruments are too blunt, children may be age-gated out of services and will respond by migrating elsewhere, pushing young people’s digital risk-taking further from parental oversight. Without adequate age assurance, it is difficult to see how the measures designed to protect children will be effective. We would like to see more emphasis placed on parental-consent technology, including the option to connect accounts and a requirement to seek parental consent for younger users.

The focus on user-generated content leaves significant gaps. Financial harms, grooming, and cyberflashing are all growing causes of concern for parents. We are seeing anecdotal evidence of young people being targeted with get-rich-quick schemes that exploit their curiosity about cryptocurrencies and their susceptibility to misinformation from influencers. The regulator will need both the means and the mandate to conduct effective forward-scanning of harms.

We would like to see greater clarity about how regulators will work together. It is unclear to us how the gambling regulator and the financial services regulator will mesh into an effective regulatory framework.

There is insufficient emphasis in the Bill on misogynist, racist, and other hateful content. The toxic atmosphere online is a disincentive to participation for children, especially girls. For this reason we see the ‘legal but harmful’ designation that runs through the Bill as extremely unhelpful. Activity online is either harmful or it isn’t. If it is harmful, it requires a response.

In particular, platforms and services should not be able to plead radical indifference to ethics and truth. The current model of promoting content on the basis of its virality results in the broadcasting of the most divisive and outrageous content. This is not a free speech issue: people should still be allowed to say bigoted and untruthful things. But they don’t have the right to say them to millions; nor to have them spread at speed around the globe precisely because they are outrageous, and so increase ‘engagement’.

We see this as a child protection issue. The febrile and toxic aspects of social media deter children from civic participation.

Gaming is a notable omission from the Bill, since for large numbers of children, especially boys, gaming is the internet. Parent Zone has been investigating the links between online gaming and gambling since 2018 and has published two independent research reports: one on skin gambling[3] and the other on the wider economic ecosystem in gaming, including loot boxes[4]. We are concerned by techniques derived from the gambling industry designed to keep children playing and paying, and by the rise of new gambling-like functions such as betting in-stream ‘Channel Points’ on Twitch. Neurodevelopmental research[5] suggests that children and young people are particularly susceptible to tactics now used in gaming, such as reward removal, that mimic gambling. Other countries are far ahead in regulating loot boxes, and until the links between gaming and gambling are addressed, the UK cannot conceivably be the safest place for children to go online.

 

A better digital world for families

The current generation of children will enter a workforce dominated by technology. Digital activity will inflect every aspect of their lives. Making sure that children have the best possible outcomes in a digital world should be an even more pressing ambition than making the UK the safest place to go online.

 

Effects of algorithms on media literacy

It is all very well teaching media literacy in school, but if children - especially those growing up in families with extreme views - are only exposed to ‘information’ that confirms existing prejudices, it is very hard for them to exert agency and make judgements about what they are seeing.

Personalisation algorithms should fall within scope of the Bill. The tendency of algorithms to move users to ever more extreme positions is, in our view, a child protection issue. While we accept the proprietary nature of algorithms, we believe that we need to move towards a system of independent audit (not unlike a financial audit) of how those algorithms are directed to shape views and behaviour.

Media literacy should be supported with initiatives that promote user agency in the same way that food labelling encourages healthier eating choices. Relying on algorithms based on ‘a child of ordinary sensibilities’, as proposed, ignores the fact that the children most vulnerable online are those who are vulnerable offline. It is possible that, under such a system, there may be only marginal improvements in safety for the majority, while the needs of the minority who experience the most severe harms are not tackled.

Horizon scanning to design out harms

In addition to the horizon scanning required by risk assessments, we would like to see the introduction of legally enforceable professional standards for developers, bringing the engineers and builders of the digital world into line with the structural engineers, builders and electricians of the physical world. Safety by design, including privacy by default, should be a key starting point for new services. Minimum standards could and should be defined.

Complaints

Providing consumers with the means of making independent complaints would give families the same standard of independent mediation and redress as they get from other regulated bodies. We note that a family will be able to complain to Ofcom about a programme they see on broadcast media, but not about content they see online.

The Secretary of State and legitimacy

The proposed powers of the Secretary of State to direct Ofcom threaten to undermine its independence. Parliament needs to be involved in setting priorities for Ofcom, and needs to be assured that Ofcom will carry those priorities out independently.

The Bill is so difficult to critique partly because it only establishes an architecture and gives very limited indications of what the regime will actually look like. We would like to see the Secretary of State indicate ‘priority content’ as soon as possible so that Ofcom can get on with its work, reducing the time before the legislation takes effect. Subsequently, new priority content should come out of Ofcom’s research, in consultation with parliament and the Secretary of State, and not be at the command of the Secretary of State.

 

Conclusion

As it stands, the Bill is both abstract and overly complex. A return to an overarching duty of care would make some of its complexities easier to disentangle. Too much falls outside its scope. We are particularly concerned about pornography and gaming, not least gaming’s links with gambling. We fear the government will refer to other proposed legislation to deal with these harms, and that this will only increase confusion, since it is not clear how the network of legislation will mesh together. We hope scrutiny of the Online Safety Bill will simplify the duty of care, which would help undo some of the anomalies and make the Bill more responsive to harms, especially new harms arising from innovative functionalities.

 

22 September 2021


[1] Left behind in lockdown: How financial and digital divides are affecting family life during COVID-19 restrictions, Parent Zone, November 2020

[2] Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show, Wall Street Journal, September 2021

[3] Skin gambling: teenage Britain’s secret habit, Parent Zone, June 2018

[4] The Rip-Off Games: How the new business model of online gaming exploits children, Parent Zone, August 2019

[5] Grant JE, Adolescent problem gambling, in Gambling problems in youth (eds JL Derevensky, R Gupta), pp. 81-98, Berlin: Springer, 2005