{"HashCode":-849872376,"Height":841.0,"Width":595.0,"Placement":"Header","Index":"Primary","Section":1,"Top":0.0,"Left":0.0}

 

English PEN – written evidence (FEO0043)

 

 

House of Lords Communications and Digital Committee inquiry into Freedom of Expression Online
 

Introduction

 

English PEN is a writers’ association. Through our campaigns and programmes, we promote literature across frontiers and defend the right to freedom of expression. We are the founding centre of the PEN International network of 140 centres in 110 countries. In 2021 we celebrate our centenary year.

 

We welcome the Committee’s timely inquiry into freedom of expression online and welcome the fact that representatives of freedom of expression NGOs were among the first to be invited to give evidence to the Committee. We broadly support the oral submissions of ARTICLE 19, Index on Censorship and Global Partners Digital. This submission contains some supplementary observations that we hope the Committee will take into account when making any recommendations to the Government and Parliament.

 

The International Context

 

Although we appreciate that the Committee is principally concerned with how freedom of expression operates within the United Kingdom and its sphere of influence, we believe it is useful to briefly consider the international context.

 

Internationally, as in the UK, the Internet is the front line for free speech. Many of the cases on which English PEN is actively campaigning concern individuals who have been prosecuted or persecuted for words written online. To pick just two examples from PEN International’s extensive case list: the poet and human rights campaigner Ahmed Mansoor has been imprisoned in the United Arab Emirates since March 2017 on vague charges of “insulting the status and prestige of the UAE and its symbols” and “publishing false reports and information on social media”; and the award-winning journalist Maria Ressa from the Philippines, who was among those honoured by TIME as Person of the Year in 2018, was convicted of ‘cyber libel’ in 2020 over a story published in 2012 on the news website Rappler, of which Ressa is a co-founder and director.

 

When making proposals on matters that affect freedom of expression in the United Kingdom, we urge the Committee to consider the international implications of any recommendations. As a permanent member of the UN Security Council and a progenitor of the European Convention on Human Rights, the United Kingdom sets an example: its laws are looked to by other countries. Rights-abusing regimes often seek to excuse their activities by drawing attention to British human rights violations, or to illiberal British laws. It is therefore crucial that any laws or regulation affecting the exercise of free speech in the UK are precise and proportionate. Vague and over-broad measures will not only harm the free speech rights of people living in the UK but will also impact the rights of individuals all over the world.

 

Online Harms

 

The Committee’s questions regarding user generated content, the regulation of ‘lawful but harmful’ content, and whether platforms should have a legal duty to protect freedom of expression, mirror those posed by the Government’s Online Harms consultation of 2019, to which English PEN responded jointly with Scottish PEN.[1] In our response, we expressed concern that the ‘duty of care’ regulatory model proposed by the Government was inappropriate. In particular, we noted that the Government had not proposed a clear definition of ‘harm’ that a regulator could guard against. We wrote:

 

A new regulatory body cannot be expected to maintain effective oversight of online platforms unless it has a clear definition of the harms it seeks to prevent. Regulation cannot be ‘targeted and effective’ when the purported harms are hazily defined. A culture of ‘transparency and accountability’ cannot be promoted among the online platforms when there are no clear rules on what content is and is not acceptable.

 

In our view, the Government’s proposed model would lead social media companies to over-moderate content out of an abundance of caution. They would also be likely to deploy automated moderation technologies that have no awareness of context and cannot replicate human judgement. Such changes would have a disproportionate and detrimental effect on freedom of expression.

 

As an alternative, we recommended the adoption of a ‘rights-based’ regulatory model, where a regulator would ask social media companies to develop terms of service that protect not only the right to safety, but also freedom of expression and privacy rights.

 

We remind the Committee that a ‘rights-based’ approach has already found favour with some policy-makers. In its report Challenging Hateful Extremism (October 2019), the Commission for Countering Extremism proposed that a ‘rights-based’ approach be at the centre of any counter-extremism strategy.[2] If online discourse is to be regulated, it would make sense to align the regulatory principles with related Government activities by adopting a similarly rights-based approach.

 

Harmful Online Communications

 

The Law Commission recently conducted a consultation on Harmful Online Communications.[3] The Commission proposed the abolition of the section 127 offences in the Communications Act 2003 c.21, to be replaced with a ‘harm-based’ offence.

 

In our response to the consultation, we expressed support for the removal of offences based on concepts of ‘offence’ or ‘annoyance.’[4] An offence linked to tangible, demonstrable harm to the recipient of a communication is to be preferred, and more likely to be justified under the provisions of ECHR Article 10(2).

 

However, we have reservations about the ‘serious emotional distress’ standard proposed by the Law Commission. Because the proposed standard makes no reference to clinical criteria, we fear it will be too subjective, vague and over-broad, and will therefore cause a significant ‘chill’ on freedom of expression.

 

Furthermore, the proposed new offences do not appear to distinguish between social media messages that are ‘broadcast’ indiscriminately to followers/subscribers, and messages targeted at one or more particular individuals. These two kinds of messages are conceptually very different, with very different potential to cause psychological harm. Attempting to craft a law that covers both kinds of message is likely to create ambiguities, and therefore a ‘chill’ on free speech.

 

Anonymity

 

We strongly support the right of individuals to use social media, and to operate their own websites, anonymously or pseudonymously.

 

Anonymity is sometimes crucial for freedom of expression, especially in ‘whistle-blower’ cases. The protection of journalistic sources has been acknowledged by the European Court of Human Rights as a crucial aspect of freedom of expression,[5] and the reasoning that underpins that right in relation to journalism extends, we believe, to whistle-blowers who publish directly using online tools.

 

We note that Parliament has already granted the police and security services extensive powers under the Investigatory Powers Act 2016 c.25 and the Data Protection Act 2018 c.12 to acquire identifying information for the purposes of crime detection. Meanwhile, the Courts have a well-developed procedure for compelling Internet Service Providers to release identifying information about anonymous users: Norwich Pharmacal orders are routinely made in defamation, harassment, privacy and copyright cases.

 

In addition, section 5 of the Defamation Act 2013 c.26 establishes a detailed procedure through which the identity of the authors of allegedly defamatory content posted anonymously may be disclosed.[6]

 

English PEN has reservations about the scope of all these measures, and would recommend reforms mandating that the Courts pay particular regard to freedom of expression and privacy rights, as well as the public interest, when considering whether to grant an order that compels operators to reveal personal information about their users.

 

Moreover, we strongly urge Parliament to resist the introduction of any measures which compel social media companies to enforce a ‘real name’ policy on their platforms. Some companies may choose to implement such a policy as being in keeping with their brand or the nature of their service, but pseudonymous services must be available for those who need them.

 

Online vs Offline

 

The Committee asks respondents whether online and offline content should be treated differently. In our response to the Law Commission consultation on Harmful Online Communications, we expressed support for the idea that online and offline content should be treated equally under the law.[7]

 

However, we ask whether ‘online vs offline’ is the most salient divide. The practical distinction between online and offline content is becoming increasingly blurred, with most written content appearing either exclusively in digital form, or in both printed and digital formats.

 

 

15 January 2021


 


[1]              ‘English PEN and Scottish PEN call on British government to rethink its approach to Online Harms’ englishpen.org 9 July 2019 https://www.englishpen.org/posts/news/english-pen-and-scottish-pen-call-on-british-government-to-rethink-its-approach-to-online-harms/

[2]              Commission for Countering Extremism Challenging Hateful Extremism, 7 October 2019 https://www.gov.uk/government/publications/challenging-hateful-extremism

[3]              Law Commission Project: Reform of the Communications Offences https://www.lawcom.gov.uk/project/reform-of-the-communications-offences/

[4]              ‘English PEN responds to Law Commission consultations’ englishpen.org 12 January 2021 https://www.englishpen.org/posts/campaigns/english-pen-responds-to-law-commission-consultations/

[5]              Goodwin v United Kingdom (1996) 22 EHRR 123

[6]              The Defamation (Operators of Websites) Regulations 2013, SI 2013/3028

[7]              ‘English PEN responds to Law Commission consultations’ englishpen.org 12 January 2021 https://www.englishpen.org/posts/campaigns/english-pen-responds-to-law-commission-consultations/