Written evidence submitted by Professor Jacob Rowbottom (OSB0126)

 

 

1. This written evidence considers the issues relating to freedom of expression and media freedom. The points made refer to user-to-user services, though similar points may arise with equivalent obligations on search services. My comments are focused on the drafting of the Bill, as opposed to the policy underpinning it. While the text below considers the possible effects of the Draft Bill, I also make the following suggestions:

- Remove the power to direct Ofcom to modify a code of practice (or provide an alternative process) under clause 33 [5-6].

- Consider a wider definition for content of democratic importance [13].

- Clarify the goal and text of clause 13(3) [14-15].

- Consider an expedited complaints process for election-related content [16].

- Consider whether the term ‘generated’ in clause 14(8)(b) is sufficiently flexible to cover journalistic activities deserving of protection [20].

- Consider the grounds for complaints under clauses 14 and 15 (and who should determine those grounds) [7] and [19].

 

 

Free speech and the broader framework of the Draft Bill

 

2. The starting point for protecting free speech lies in the careful drafting of the various obligations. While a level of imprecision is inevitable in this area, the harms and duties should be drafted to ensure that the impact on expression rights is no more than necessary.

 

3. Defining the scope of the obligations will be a challenging task, even in relation to ‘illegal content’. For some offences, the legality of a publication will not be evident solely from the content of the material. For example, a regulated provider may not be well placed to assess the mens rea or the defences in relation to certain offences.[1] The definition of illegal content under clause 41 seeks to address this problem by setting the standard in terms of the provider having ‘reasonable grounds to believe’ the use of the content amounts to a relevant offence. That standard means that the duties may apply in relation to content that would not be found to be illegal in a court.[2] The duty may also apply to content that is illegal under the letter of the law, but which is unlikely to be prosecuted in practice.

 

4. Much will be left to the regulator’s codes of practice, for example to specify factors providing reasonable grounds to believe content is illegal. Such an approach is to be expected in a system of regulation and is not objectionable in itself. My point is that even with duties relating to illegal content, the proposed regulatory system will differ from a direct application of the criminal law to digital publications (and could thereby raise different speech issues). The point also underlines the central role played by the regulator and its codes, which will do much to determine the impact on freedom of expression.

 

5. The framework also gives considerable powers to the Secretary of State. For example, clause 33(1)(a) gives the Secretary of State the power to direct the regulator to modify a submitted code of practice to ensure that it ‘reflects government policy’. A modified code must be laid before Parliament once the Secretary of State is ‘satisfied’ with the modification. On reading the Draft Bill, it appears that the modifications are not subject to the consultation process specified in clause 29.[3] If that reading is correct, a key safeguard will be absent in relation to the power to direct the regulator. The use of the power is limited by clause 33(2), which provides that a direction will not require Ofcom to include a provision about a ‘particular step recommended to be taken by providers of regulated services’. Aside from that limit, the clause gives the minister a broad power in relation to the content of the codes.

 

6. The potential for such a ministerial direction in relation to the codes raises significant free speech issues (given the central role of the codes in defining the workings of the regulatory system). Such a power could provide a recipe for ‘mission creep’, in which the minister feels under pressure to respond to headlines and controversies by directing a change in a code. The power should be removed, and (if necessary) an alternative process for modification should be devised that gives ministers less direct power over such sensitive matters.[4]

 

7. The provider must operate a complaints procedure and take appropriate action when complaints are made. Under clauses 15(4)(c) and (d) the provider should allow for complaints to be made by a user where content has been taken down or where action has been taken against a user in relation to illegal/harmful content. The ability to challenge such decisions is an important safeguard for free speech and secures a level of accountability. One question in the design of the regulations is on what basis a complaint can be made. Would the complaint be an appeal (asking the provider to reconsider the application of the published terms to the particular case) or would the complaint be limited to certain grounds? Will this be a matter for the regulator or the provider? Such questions of detail make a difference to the rights of the user.

 

 

Clause 12 – freedom of expression

 

8. Clause 12 provides a safeguard for freedom of expression in the performance of certain duties required elsewhere in the Draft Bill. Services are required ‘to have regard to the importance of’ freedom of expression. The codes of practice will specify steps to fulfil this obligation.[5] A user will also have a right to complain where the service has failed to comply with the obligation.[6] The addition of clause 12 is welcome and aims to provide a counterweight to the other regulatory pressures, so that providers are less likely to err on the side of caution and take the most restrictive action to limit the risk of being found in breach of a duty.

 

9. The effect of the safeguard is likely to be limited, especially when set against the more specific safety obligations elsewhere in the Draft Bill. The measure is cast in procedural terms, so that it is fulfilled where freedom of expression is given due consideration. The duty applies only when deciding and implementing policies and processes to comply with other obligations under the Draft Bill.[7] A provider is free to be more restrictive of speech rights in relation to policies and processes that are not designed to comply with the duties under the Draft Bill (and some services may claim that such controls are an exercise of the company’s own speech rights). For example, even though the Draft Bill does not mandate systems to deal with (legal) offensive content, the service can still choose to remove such content from its platform under its own terms. As a result, the clause cannot guarantee a particular outcome or substantive protection for speech rights. While important, clause 12(2) is a limited duty, which ensures free expression is not left out of a provider’s reasoning when meeting the regulatory obligations.

 

10. There are additional obligations for category 1 services under clause 12. Clause 12(5) provides that the service must specify the ‘positive steps’ taken after an impact assessment. Does the phrasing of the clause presume that the service will take positive steps following an impact assessment? There could be situations where a category 1 provider has regard to freedom of expression, carries out an impact assessment and chooses not to take any positive steps in response.

 

 

The affirmative duties on category 1 providers

 

11. Clauses 13 and 14 impose separate duties on service providers in relation to freedom of expression. The duties are not limited to decisions and actions taken to implement duties elsewhere in the Bill. The two clauses regulate the provider’s own systems and processes, for example where a post is taken down for a breach of the company’s internal (i.e., non-regulatory) content moderation standards. These clauses are therefore distinct from the parts of the Draft Bill that deal with harm prevention. Instead, the two clauses seek to qualify the power of the technology companies to regulate expression more generally. While limited, these obligations are potentially an important development in constraining private censorship.

 

 

Clause 13 – content of democratic importance

 

12. Clause 13 will impose a duty on category 1 providers ‘to operate a service using systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account’ in relation to certain decisions. The duty does not guarantee a particular outcome for a user and a provider remains free to take action in relation to the content as long as free expression is taken into account in the process.

 

13. ‘Content of democratic importance’ is defined as that which appears or is ‘specifically intended to contribute to democratic political debate’ in the UK.[8]  That definition would cover the scrutiny of government, debates on government policy and electoral content. However, the definition is narrower than that for the content receiving heightened protection under Article 10 of the ECHR. It is not clear whether the clause will cover a wider range of content, such as scrutiny of large private companies. A good argument can be made that such content is of democratic importance, though some clarification may be helpful. This could become important in cases where a social media company removes a post that is critical of a large company.

 

14. Clause 13(3) imposes a duty to ensure that ‘the systems and processes mentioned in subsection (2) apply in the same way to a diversity of political opinion’. The Department for Digital, Culture, Media and Sport has stated that the clause will ensure the protections ‘apply equally to different political viewpoints’.[9] From this, I take it that the goal of clause 13(3) is to ensure that the obligations will not be applied in a way that promotes favoured viewpoints (or disadvantages the disfavoured). I am not sure the clause as drafted achieves this result. One question is what it means for the systems to ‘apply in the same way’. Clause 13(5) will already require that the terms of service in relation to this safeguard are applied consistently. I wondered whether clause 13(3) is seeking to address biases or discrimination against viewpoints that arise even when the terms are applied consistently. I think the terms used could be clarified.

 

15. A further question is what it means for terms to apply in the same way to ‘a diversity of political opinion’. Would this permit the arbitrary treatment of an individual political viewpoint as long as the systems work fairly (or non-arbitrarily) in relation to a range of other viewpoints (thereby applying in the same way to a ‘diversity of political opinion’)? Is this the desired result of the clause? Again, I think the goal and phrasing of the provision require clarification.

 

16. An expedited complaints mechanism (similar to that in clause 14) for electoral content should be considered. For example, a decision to remove election messages could have time-sensitive effects that cannot easily be remedied after polling day. An expedited process to correct any errors may be particularly important in such circumstances.

 

 

Clause 14 – protection of journalistic content

 

17. Clause 14 sets out an affirmative duty on category 1 services in relation to journalistic content. While various duties under the Draft Bill are not applicable in relation to news publisher content (as defined in clauses 39 and 40), there are still various situations where journalism requires protection. There is journalistic content that is not produced by a recognised news publisher, which will be affected by the regulatory measures. The systems and processes required under the regulations to combat illegal and harmful content could also have knock-on effects for media content that falls within clauses 39 and 40.[10] The provider’s own content moderation standards (imposed independently of the regulations) could also be applied to journalism. Clause 14 could offer protection in these various contexts.

 

18. Clause 14(2) provides for a duty to operate a service using systems and processes ‘designed to ensure that the importance of the free expression of journalistic content is taken into account’ when making certain decisions. This part of the clause is analogous to clause 13(2). A key difference in clause 14 lies in the requirement for a dedicated and expedited complaints process in relation to certain decisions relating to journalistic content.[11] Clause 14(5) also provides for a swift remedy where a complaint is upheld.

 

19. What is not clear from the current draft is on what basis a complaint can be made under clause 14(3) or (4). There are various ways a complaint could be framed, such as (1) a failure to identify the content as journalistic, (2) a failure to take the importance of the free expression of journalistic content into account when making the decision, (3) a failure to give due weight to the free expression of journalistic content when making the decision, or (4) that the decision should be reconsidered as a whole (an appeal).[12] I wondered whether the grounds for complaint would be left for the regulated provider to specify (under the duty to provide terms under clause 14(6)(c)) or whether the regulator would specify the possible grounds for complaint. This point may need clarification in the legislation.

 

20. Clause 14 applies to content that ‘is generated for the purposes of journalism’. One question is whether that formulation would exclude the republication of material generated for a purpose other than journalism. For example, a prison inmate may take footage on a mobile phone (in breach of prison rules) showing disruption inside a prison and post that content on social media. Would clause 14 apply to a news service that reposts the material online to make a point about standards inside prisons? The question is whether the post by the journalist would amount to the generation of fresh content or whether it would be a republication of content generated for a non-journalistic purpose. The issue is important as republishing statements and material generated by others is a key part of journalism.[13] The point is worth clarifying.

 

21. The term ‘purposes of journalism’ is not given further definition in the Draft Bill, but the term is used in other legislation.[14] In Sugar, Lord Walker defined the ‘purposes of journalism’ (in the context of the Freedom of Information Act 2000) as ‘output on news and current affairs’ (including sport).[15] One question is whether the term should follow this definition or cover a broader range of content (such as showbusiness news, reviews of films, etc). A second question is whether journalism should be defined solely by the content, or whether there should also be a procedural element to the definition (requiring the publisher to act in accordance with the standards of journalism, taking steps to verify, etc).[16] The matter could be left to the regulator to provide guidance, but the approach taken will have a significant impact on the scope of the provision.

 

22 September 2021



 


[1] Similar issues can arise with the E-Commerce Regulations 2002 in relation to knowledge of illegality. However, that standard does not impose liability on a service, but determines whether the company is shielded from liability.

[2] Or there may be content that would be found to be illegal once investigated and prosecuted, but which a provider has no reasonable grounds to believe is illegal and therefore is not covered by the regulatory duty.

[3] Clause 33(6) refers to requirements under clause 32, but there is no reference to the procedures under clause 29.

[4] There are provisions elsewhere in the Draft Bill for the revision of codes, which may already enable necessary modifications.

[5] Clause 36(5).

[6] Clause 15(4)(b)(ii). Clause 15(8) provides a reminder that clause 12(2) is relevant to the duties of reporting and redress, though it is not clear what that provision requires in relation to complaints processes.

[7] Clause 12(6).

[8] Clause 13(6)(b).

[9] Letter from Caroline Dinenage MP to Lord Gilbert, 16 June 2021.

[10] See Graham Smith’s helpful explanation in ‘Carved out or carved up: the draft online safety bill and the press’ Inforrm, 30 June 2021: https://inforrm.org/2021/06/30/carved-out-or-carved-up-the-draft-online-safety-bill-and-the-press-graham-smith/

[11] The provision is distinct from clause 15(4)(b)(iv), which imposes a duty to operate a system for complaints where a provider is not complying with the clause 14 duties.

[12] In the written evidence submitted by the Department for Digital, Culture, Media and Sport and the Home Office [57], the process is described as an appeal.

[13] Along such lines, see the classic case of Jersild v Denmark (1995) 19 EHRR 1 at [35].

[14] PACE 1984, s.13; Freedom of Information Act 2000.

[15] In BBC v Sugar (No 2) [2012] UKSC 4, the term had to be read alongside additional words referring to ‘art or literature’, so the statutory context is different from that of the Draft Bill.

[16] Prior to 2013, defamation law relied on certain professional processes to define ‘responsible journalism’. Such factors may, however, be difficult for a regulated provider to assess on a case-by-case basis.