Damian Collins MP
House of Commons
London
SW1A 0AA

Chris Philp MP

Minister for Technology and the Digital Economy

4th Floor

100 Parliament Street
London
SW1A 2BQ

 

www.gov.uk/dcms

enquiries@dcms.gov.uk

MC2021/17951/DC

13 October 2021

Written evidence submitted by Under Secretary of State, Chris Philp MP, Minister for Technology and the Digital Economy (OSB0248)

Dear Damian,

 

Thank you for meeting with my officials on 16 September, and for your follow-up correspondence on some of the key concepts within the draft Bill.

 

The government recognises that any legislation addressing user-generated content has the potential to affect users’ freedom of expression. As a result, the government has put in place safeguards to ensure that service providers are required to interpret their duties in a way that minimises any interference with their users’ right to freedom of expression.

 

I have responded to your questions in turn below. My officials will also share an ECHR memorandum with the Committee, setting out further detail.

 

Question 1: Clause 12

 

The first set of questions in your letter relate to clause 12, which sets out the duties on user-to-user services that pertain to protecting users’ rights. Under clause 12, all in-scope service providers are required to take into account the importance of protecting freedom of expression when deciding on and implementing their safety policies and procedures. This protection is strengthened by clause 31(5) and (6), which require Ofcom to ensure that all codes of practice it prepares are designed to reflect the importance of protecting the right of users and interested persons to freedom of expression. The steps in the codes of practice must incorporate safeguards to protect the importance of freedom of expression. An in-scope service provider will be treated as complying with its safety duties if it acts in accordance with these steps.

 

The Committee firstly asked why clause 12 places duties on user-to-user services to “have regard” to freedom of expression, rather than ensure actions are, for example, “consistent with” it. This is because “consistent with”, and other similar formulations, might suggest that service providers owe a duty to their users under the ECHR or Human Rights Act 1998 (HRA). The ECHR only imposes obligations in relation to freedom of expression on public bodies, and private actors are not required to uphold freedom of expression. Private actors, including in-scope service providers, have their own common law freedom of expression rights. This means that platforms are free to decide what content should and should not be on their websites within the bounds of the law, just as a supermarket can decide to remove unsuitable content from its community noticeboard. As such, it is more appropriate to ask them to “have regard” to these concepts than to require them to be compliant or consistent with them. “Have regard” also captures our expectation, set out in clause 13(2), that companies balance freedom of expression rights with their implementation of the broader safety duties.

 

Turning to the Committee’s next questions (1c and 1d): the fact that service providers in scope of regulation will be private entities or persons is also why the ECHR is not expressly referred to in clause 12, and why the draft Bill places a duty to have regard to “the importance of” freedom of expression rather than to freedom of expression itself. As noted above, private actors do not have an obligation to protect freedom of expression in the way the government and public bodies do. The use of “the importance of” therefore reflects the extent of the duty for service providers as private actors.

 

The final questions on clause 12 asked for further detail about how this provision can be complied with, and whether there are draft codes of practice that can be shared. The duties have been set out at a high level because the harms, the risks and the best ways of mitigating them can change rapidly. Ofcom will set out in codes of practice how service providers can fulfil their duties, including detail on how companies of different risk levels and capacities can comply.

 

The codes will be for Ofcom to draft, and at the moment we are unable to share a working draft. However, the draft Bill includes a number of steps that must be followed when developing the codes, and Ofcom will have a duty to consult interested parties. It will also have a statutory requirement to consult organisations with specific knowledge or expertise related to online harms.

 

Question 2: Responsibilities to protect freedom of speech

 

Your letter asks whether clause 12 delegates responsibility to protect freedom of speech to service providers. To confirm, clause 12 does not make service providers responsible for protecting the right to freedom of expression itself, nor the final arbiters of it. As set out earlier in this response, the intention is to avoid a situation where the duty might be interpreted as obliging service providers to owe a duty directly to users to protect their freedom of expression. Instead, the intention is that service providers, when implementing their substantive duties to make their services safer for users, must do so in a way that takes into account the principle of freedom of expression within the law.

 

Ofcom will be required to set out in codes of practice how service providers can comply with their duties. Ofcom will set out steps which incorporate safeguards for freedom of expression, as appropriate, in these codes. In order to be treated as compliant, service providers can follow these steps, or they can take alternative steps which Ofcom assesses meet the same objectives.

 

In addition, as a public authority under section 6 of the Human Rights Act 1998, Ofcom already has an obligation not to act in a way which is incompatible with individuals’ rights to freedom of expression. As such, Ofcom will need to ensure that its codes of practice and its enforcement action are consistent with freedom of expression rights under the ECHR and HRA.

 

Question 3: Clauses 45 and 46

 

Finally, the Committee asked about clauses 45 and 46, which define content that is harmful to children and adults. We do not consider that the definitions of harmful content set out in clauses 45 and 46 are unclear or vague. The concepts in these tests can be applied using the ordinary English meaning of the words.

 

It is important to note that the tests in clauses 45 and 46 do not apply to “priority” content; they apply only to other types of harmful content that companies identify in their risk assessments. The “priority” harmful content will be set out in secondary legislation, which will make clear the categories of content harmful to children and adults that platforms must address. We expect these priority categories to cover the majority of harmful content that companies will need to address under their children’s and adults’ safety duties. This approach balances the need for legal certainty with the need for the regime to remain flexible and future-proofed. Furthermore, Ofcom will apply these definitions first, when preparing its risk assessment (clause 61) and developing its guidance to companies.

Equally, we do not believe these definitions will give rise to a risk of excessive take-down of legal content or infringements of Article 10. The definitions of content that is harmful to children (in clause 45) and content that is harmful to adults (in clause 46) will be used within the framework of protections for Article 10 rights described above.

 

Finally, the Joint Committee asked about the standard associated with “significant adverse psychological reaction” and the threshold for it. In terms of the test, the standard to be applied by service providers in assessing whether content is harmful is that the service provider must have “reasonable grounds to believe that there is a material risk of significant harm to individuals”. We do not consider that there is a lack of clarity over the threshold for a qualifying adverse psychological reaction. The “significant adverse … psychological impact on an individual” test, read having regard to the ordinary English meaning of these words, clearly excludes trivial reactions but does cover significant adverse impacts which fall short of causing clinically diagnosable psychological conditions. This standard is set at a high level in order to avoid encouraging excessive take-down of less harmful or innocuous content, and to allow the regime to operate at scale.

 

As explained above, compliance with Article 10 is achieved through a comprehensive set of protections; specific definitions, and particular aspects of those definitions, need to be read in that context.

 

Thank you again for your time, and please do not hesitate to contact my officials if you have any further questions.

With best wishes,

Chris Philp MP

Minister for Technology and the Digital Economy

 

3 December 2021