Written evidence submitted by the APPG Coalition (OSB0202)

September 2021

Introduction

We write as Chairs and Officers of seven APPGs with an interest in the draft Online Safety Bill. Our APPGs, which represent a broad range of perspectives on digital regulation, are supported collectively by almost 300 Parliamentarians. Each APPG in our alliance focuses on different aspects of the online safety agenda, including protecting children online, promoting digital literacy and safeguarding democracy. We are all supportive of the Government's online safety agenda.

 

We share the Government’s ambition for the UK to be the safest place to be online. The Online Safety Bill has the potential to be a world-leading piece of legislation, setting the bar high for other nations. Regulation in this space is inevitable, and companies in scope should welcome the UK’s lead, knowing how we respect both innovation and freedom of speech. A safer internet will be good for business as well as for users. We are all supportive of an agenda which promotes responsible innovation.

 

While we all admire the intentions behind the Bill, we feel that, to be truly world-leading, the draft Bill requires some targeted revision. Some wording in the draft is unclear, creating uncertainty for services in scope and for users, and there are areas in which we believe the Bill must go further. There are also omissions which we hope will be addressed in the final draft. Without these amendments, elements of the Bill risk undermining the UK’s efforts to keep us all safe online. With that in mind, we set out below our headline considerations on the draft Bill.

 

Response

 

Duty of care

 

Rather than apply a statutory duty of care to harmful content, the draft Bill applies separate duties to different categories of content. This is a different approach to that set out in the Online Harms White Paper. The advantage of a broad duty of care is that it focuses on tackling the systems driving harmful content rather than regulating the content itself. It would require companies to operate in the best interests of their users, with safety-by-design features which mitigate harm upstream rather than relying post hoc on content takedown.

 

While the categories of harmful content in the draft Bill are broad, they expose the Bill to legitimate concerns about free expression. This is particularly true of Clause 11, “Safety duties protecting adults: Category 1 companies”. Whereas the safety duties for illegal content and for content that is harmful to children (Clauses 9 and 10) require services to operate “systems and processes” to reduce harm, Clause 11 defers the management of harm to Category 1 services, leaving them free to manage such content however they choose as long as they set out their approach clearly in their terms of service. This is a continuation of the status quo, with private companies deciding what legal content is visible, and with no scrutiny of how such moderation decisions are made.

 

The inclusion of content that is harmful to adults is an important facet of the Bill, which should not be limited only to criminal content. However, it is important that the management of “legal but harmful” content is not deferred to company terms and conditions. It would be better for the safety duties to be aligned, focusing on “systems and processes” across the board, to ensure the regime is both consistent and genuinely respectful of human rights. In addition, Ofcom should be empowered to intervene to ensure companies’ terms and conditions are sufficiently robust.

 

Algorithmic curation of content

 

We are encouraged to see the draft Bill acknowledge the power of algorithms in several instances. Algorithms are the driving force behind what we see and how we behave online. They show us content we choose to see and much that we do not. They recommend what we might like and nudge our user journeys, shaping who we meet online and what we might believe to be true. They are fuelled by data about user interests and engagement, in turn feeding users more content to keep them engaged. Their incentive - to extend user engagement times in order to sell more ad space - is not always compatible with society’s broader interests.

 

The risk assessment duties set out in Clause 7 require services to take into account “(in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service”[1]. This applies to all three content duties (illegal content; content harmful to children; and content harmful to adults). This explicit inclusion in the risk assessment duties is welcome.

 

It is unclear whether algorithms are included in the definition of “systems and processes” set out in Clauses 9 and 10. Algorithmic content moderation has been shown to penalise minority groups, given the well-documented biases inherent in algorithms. “Systems and processes” should not mean that platforms simply deploy more algorithms to moderate content, but that they apply more nuanced harm reduction measures to mitigate harm. This should be clarified in the draft Bill or the associated codes of practice.

 

To truly understand the role of algorithms in promoting harmful content, the regulator must be given clear powers to investigate the algorithms of services in scope. At present, there is a massive asymmetry of information. The companies have all the data and tools needed to track, measure and evaluate these harms - indeed these tools are a core part of their business - but they make none of them available to public oversight. While transparency reports certainly play a role, they are insufficient on their own. Without access, regulators are forced to rely on the companies to police themselves. Ofcom needs the in-house expertise and the ability to carry out an algorithm inspection with the consent of the company, or, where the company does not provide consent and there are reasonable grounds to suspect it is failing to comply with its requirements, to use compulsory audit powers. Chapter 5 of the Bill may be the correct place for such wording. There is precedent for UK regulators having such powers, with the ICO, IPCO and FCA all holding similar enforcement capabilities.

Enforcement

Ofcom must be sufficiently empowered to enforce this agenda at pace. Ofcom’s principal duty is “to further the interests of citizens”, putting the needs of the citizen ahead of those of the consumer. This principle should be upheld when considering the nature and scope of the powers available to the regulator.

As above, Ofcom should also be empowered to intervene to ensure companies’ terms and conditions are sufficiently robust. There must be high standards that Ofcom can impose on companies whose self-determined terms and conditions are not up to scratch. Ofcom must also be able to use director liability as a last resort, and this power should be granted in primary legislation. Leaving such a sanction to secondary legislation would not afford it the due and proper scrutiny it warrants, given the severity of the penalty.

Definition of harm

The definition of harm in the draft Bill is limited to the risk of harm to an individual adult or child. This differs from the Online Harms White Paper, which proposed “prioritising regulatory action to tackle harms that have the greatest impact on individuals or wider society.”[2] Many platforms in scope of these regulations are social networks. They encourage users to find a sense of group identity online in a way that would be more difficult in the offline world. This offers many benefits, but it also means it is far easier to target, access and influence groups online than it is offline.

The Bill should account for the collective impact of harmful content and make suitable provisions for mitigating such impact. This may be through revisiting the definition of harm, through amended risk assessment duties or other means. 

Finally, to bolster the protections for individuals online, the Government needs to demonstrate its commitment to specific changes in the law, particularly around intimate image abuse. The duty of care does not improve the criminal law’s capacity to provide effective redress in individual cases. This should be addressed in order for the new regulatory framework to provide thorough protections.

Content of democratic importance

Clause 13 of the draft Bill makes separate provisions for “content of democratic importance”. This includes content which is intended to “contribute to democratic political debate in the United Kingdom”. We support initiatives to protect democratic debate and subjective judgment, which are fundamental to our democracy. We would want to ensure, however, that the language in this clause does not allow “democratic importance” to become a cover for the vitriol and hate which so many policymakers and public figures witness online. The public profile of parliamentarians, particularly women MPs, leaves them exposed as targets of personal abuse often disguised as political debate. The interpretation of this clause could leave figures in the public eye, and those engaging in democratic processes, even more vulnerable to abuse.

Age Verification and Assurance

The Government’s focus on protecting children online is a welcome and important step in tackling some of the most egregious breaches of trust online. The best routes to protecting children are a comprehensive definition of harms, algorithmic inspection powers and a wider definition of who is protected.

Age assurance also has a part to play, but it must be considered a tool and not an end in itself. There are many calls for age assurance online – particularly in relation to adult content, data protection and age-restricted goods and activities – and there is widespread dismay at the failure to implement Part 3 of the Digital Economy Act (DEA). While we recognise the Government's view that the scope of the DEA was too narrow, the prospect of waiting another two or three years for protections for children from adult content and contact is simply untenable.

At present, in the absence of a regulatory or statutory code, each online provider is deciding for itself what levels of privacy and efficacy are required, with the inevitable outcome that age assurance is poorly understood and little trusted. There are many technological approaches, and a growing market of commercial age assurance providers, but what is urgently needed is a governance system which sets expectations for those coming into the market.

This urgently requires the introduction of a statutory code of practice for age assurance that will ensure any age assurance system protects the privacy of users; is proportionate; is appropriate to the capacity and age of a child; is secure; provides appropriate mechanisms for users to challenge or change decisions; is accessible and inclusive to users with protected characteristics; does not unduly restrict access to services to which children should reasonably have access (e.g. news, health and education services); provides sufficient information for a user to understand its operation; is effective in assuring the actual age or age range of a user; anticipates that users may provide inaccurate information; and is compatible with data protection legislation. It is essential to put trust back into the system for children, teachers and parents – whilst allaying the privacy concerns of adult users.

Anonymity

The Bill is silent on anonymity despite the Full Government Response to the Online Harms consultation recognising that many users are subject to vicious abuse from anonymous accounts when engaging online. This is particularly true for girls, women and minority groups, and for Parliamentarians and people in the public eye. For those with intersecting identities, the abuse is even more acute. Meanwhile, there are many instances where anonymity is critical in protecting personal safety such as for whistleblowers, journalistic sources and survivors of abuse, and protections in these circumstances should be preserved.

There are approaches and solutions which account for different needs and scenarios. Government should be open to solutions which address anonymous abuse whilst protecting freedom of expression and the legitimate use of anonymity online by certain groups. There are a range of options, many of which are already applied by some companies to some users. The issue shouldn’t be viewed as a binary choice between total anonymity or no anonymity at all.

Media literacy

Media literacy is an integral part of delivering a meaningful programme of change, enabling users to engage critically online and to ask the right questions. We were pleased to see the Online Media Literacy Strategy published this summer. The media literacy programme must be delivered as a cross-departmental initiative. The Government must not expect a “one size fits all” solution to work in such a complex area. Delivering an independent programme which accounts for different ages, demographics and computer literacy standards will take time.

Media literacy should not be considered an alternative to systemic changes to the design and responsibilities of products and services. Users should not be held responsible for the harms caused by businesses. The solution to Covid disinformation will not sit in media literacy alone, but in a system-wide approach in which technology also plays its part. Literacy must also cover a broad range of subjects, including data and algorithmic literacy. Given the focus on children in the draft Bill, there is surprisingly little insight into the role of teachers, Ofsted, and the Department for Education in developing and delivering a media literacy programme in schools. We would expect to see a comprehensive approach to implementation that engages these groups, along with the necessary funding and training to ensure its success.

Similarly, civil society has flourished, with NGOs, academics and teacher-led initiatives developing robust, tested and regularly evolving tools for educators. It is important that the Government’s approach to media literacy embraces the vibrancy of this work, seeking out the best resources that already exist and making them available to schools. We hope to see Departments come together with civil society and industry to develop a meaningful and nuanced programme which maximises the harm reduction impact of the Bill.

These are just some of the perspectives our APPGs share on the online harms agenda. We will individually and collectively engage with the Government on our areas of interest and concern, and hope that some of these issues can be addressed before the Bill reaches Parliament. While we want the Bill to move at pace, we also want to ensure that it is fit for purpose in reducing harm to all vulnerable groups and society as a whole.

Yours sincerely,

APPG on Digital Regulation and Responsibility

APPG on Electoral Campaigning Transparency

APPG for Media Literacy

APPG on UN Women

APPG on Women in Parliament

APPG for Children's Media and the Arts

APPG on Social Media

 

7 October 2021


[1] Draft Online Safety Bill, Clause 7

[2] Online Harms White Paper - GOV.UK