The News Media Association – written evidence (FEO0058)

 

House of Lords Communications and Digital Committee inquiry into Freedom of Expression Online

 

The News Media Association [NMA] is the voice of UK national, regional and local newspapers in all their print and digital forms – a £4 billion sector read by 49.2 million adults every month in print and online.  Our members publish over 900 news media titles, from The Times, The Guardian, the Daily Mail and the Daily Mirror to the Yorkshire Post, Kent Messenger and the Monmouthshire Beacon.  Our membership spans the industry from the largest groups to small, independent, family-owned companies publishing one or two local titles.

 

 

Q3.              Is online user-generated content covered adequately by existing law and, if so, is the law adequately enforced?  Should ‘lawful but harmful’ online content also be regulated?

 

1.1.              There are numerous criminal offences which curb the publication of material by anyone in any medium – whether blogger, newspaper, broadcaster, book or magazine publisher, or private individual – in print, online, or via broadcast.  News publishers already have well-established, effective systems in place to govern user-generated content on their sites, which are also subject to well-established, consistent, transparent, independent regulation by IPSO and by the criminal and civil law (including laws relating to terrorism, obscenity, public order and incitement to hatred).

 

1.2.              NMA members take pains to ensure that their sites provide a space for intelligent, free-ranging debate.  On press freedom grounds, and in accordance with the industry’s principled opposition to state control – reflected in its opposition to the Leveson Report and the Royal Charter system of press regulation – the newspaper industry opposes any direct statutory oversight of its regulatory systems being applied to its websites and content, including user-generated content.

 

1.3.              It is important that press publishers are not – whether deliberately or inadvertently – brought within the scope of any new regime aimed at the tech platforms, either in respect of their own websites or when their material is carried by third parties.  In the UK, publishers already shoulder legal responsibility, augmented by efficient voluntary, self-regulatory and other frameworks.  Successive Secretaries of State and Ministers have accepted that there is no need for further regulation of an already well-regulated industry.

 

 

Q4.              Should online platforms be under a legal duty to protect freedom of expression?

 

2.1.              Platforms should be responsible for the content they host and curate in the same way that traditional publishers are.  As the House of Commons Digital, Culture, Media and Sport Committee said in 2019: “Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites”.

 

2.2.              Under the e-Commerce Directive, social media companies are exempt from liability for the content they host if they “play a neutral, merely technical and passive role” towards it. The recent moves by companies such as Twitter, Facebook, Snapchat, Twitch and Reddit to ban Donald Trump’s personal and campaign accounts show that they take proactive editorial decisions, with Twitter announcing that it was permanently suspending the US President’s account “due to the risk of further incitement of violence”. The Times takes the view that, rather than acting out of a sense of civic duty, “tech giants have a wary eye on forthcoming regulation from a Democrat-controlled White House”[1] – a prospect which clearly provides an impetus for platforms to signal acceptable baseline standards of behaviour.

 

2.3.              Any inhibition on an individual’s right to freedom of expression should satisfy the tripartite test of legality, legitimacy, and necessity set out in Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR).  The application of these principles to the corporate context of platform moderation is likely to pose a number of challenges: if platforms define prohibited categories of content too broadly, they risk suppressing content in excess of what is necessary in pursuance of a legitimate aim; conversely, if they define the categories too narrowly, they risk facilitating harmful online environments.  Platforms must balance competing rights and interests in accordance with the test of necessity. Where an online platform has become a dominant channel for public communication, enabling users to disseminate their content to a wide audience, it has a responsibility to protect individuals and communities.  However, measures to protect online speakers must avoid oppressive content moderation if the Internet’s ‘vast democratic forums’ are to remain usable.

 

Q9.              How could the transparency of algorithms used to censor or promote content, and the training and accountability of their creators, be improved? Should regulators play a role?

 

3.1.              In the Committee’s report Breaking News? The Future of UK Journalism (27 November 2020) it was recognised that “There is a fundamental imbalance of power between news publishers and platforms. Due to their dominant market position, Facebook and Google can stipulate the terms on which they use publishers’ content. This includes whether and how much they pay for news appearing on their platform, which news sources their algorithms rank most highly and how much notice they give publishers of changes to these algorithms. Algorithms are a product of the human value judgments of their designers, but there is a lack of transparency about them and designers’ possible biases” [§254].

 

3.2.              The NMA has welcomed (inter alia) the CMA’s recommendations in its Online platforms and digital advertising: Market study final report (1 July 2020) that: (1) platforms should be required to explain the operation of search and news feed ranking algorithms and advertising auctions, and to allow audit and scrutiny of their operation by the DMU; and (2) platforms should be required to give fair warning of changes to the operation of algorithms where these are likely to have a material effect on users, and to explain the basis of those changes.

 

Q11.              To what extent would strengthening competition regulation of dominant online platforms help to make them more responsive to their users’ views about content and its moderation?

 

4.1.              The relevance of a platform’s dominance is reflected in the jurisprudence of the European Court of Human Rights, which has circumscribed the obligations of States to respect and ensure the right to freedom of expression in a range of contexts based on whether those whose expression has been restricted have access to viable alternative platforms to exercise their expression [see, for example, Animal Defenders International v. the United Kingdom, App. No. 48876/08, Eur. Ct. H.R. (2013) §§ 21, 41-43; Cengiz and Others v. Turkey, App. Nos. 48226/10 and 14027/11, Eur. Ct. H.R. (2015) §§ 51-55].

 

4.2.              A fundamental issue in online content moderation is how the corporate imperatives of platform governance – namely growth and profit – can be reconciled with the broader public interest, including the public interest in freedom of expression.  Plainly, platforms’ speech policies are rooted in particular corporate values which influence how they moderate content on their sites and thus affect what content is surfaced online. Lack of oversight and transparency can lead to ‘censorship creep’, discrimination and inconsistency of application.  The parameters of free speech should not be a matter purely of corporate discretion.  Online platforms have often been criticised for failing to consult properly with users, civil society groups and the general public with regard to the development, transparency and oversight of their moderation rules. Any measures that can enhance stakeholder engagement are to be welcomed.

 

4.3.              On press freedom grounds, the newspaper industry rejects indirect regulation and censorship of its content by the platforms and other entities as a result of their own compliance practices under the new online harms regime – whether by algorithmic or human intervention, and whether by failing to surface content, preventing publication, or taking down material published on third parties’ websites.

 

4.4.              The Online Harms Bill will introduce a statutory regulator for social media content. Successive Secretaries of State and Ministers have accepted that publishers – and their independent systems of press regulation – should not be subject to the new regime and its regulator.  This now needs to be translated into the legislation itself.  The Government intends that the compliance practices of entities subject to the regime should observe ‘freedom of expression’ requirements, but this does not negate the need for the news industry exemption or meet the news industry’s objections of principle and practice.  Direct and indirect censorship may not be detectable in practice; the procedures will be subject to the judgement and oversight of the statutory regulator (presumably itself subject to judicial review or further legal action); and any challenges may involve lengthy and costly processes which publishers would be unlikely to invoke.  News is, of course, a perishable commodity.

4.5.              The Government plans to establish a Digital Markets Unit to strengthen digital competition regulation and rebalance the relationship between online platforms and news publishers via enforceable codes of conduct.   We welcome the conclusion of the Committee in ‘Breaking News?’ that “The Government should set up the proposed Digital Markets Unit as a matter of urgency and ensure that it has the powers and resources it needs. The possibility that the establishment of the Digital Markets Unit could be delayed until 2022 or later is unacceptable”.  Any delay in setting up the DMU could prove fatal for some publishers who are facing existential challenges that have only been accelerated by the current pandemic.  The news media industry is not in a condition to wait for wholesale competition reform regulating all digital markets from media to retail. Urgent pro-competitive reform to rebalance the platform-publisher relationship and restore competition to the digital advertising market is needed to secure a sustainable future for news.

 

4.6.              As increasing numbers of people in the UK use Facebook and Google to access news and information through platforms and aggregators, it is vital that these spaces are saturated with information that is accurate, trustworthy and engaging. The NMA proposes that the codes of conduct should include: (1) a mechanism by which publishers may secure payment for their content by platforms; and (2) obligations to carry and surface the industry’s trusted news (subject to publishers’ consent). If these requirements were included in the Online Harms Bill, the legislation would not merely help to eradicate harmful information from the internet but would actively support and sustain the dissemination of reliable, trusted information from responsible news publishers.

 

4.7.              Publishers’ inability to realise a fair return for their content is an important matter for consumers because, as the CMA has often acknowledged, it is “likely to reduce their incentives and ability to invest in news and other online content, to the detriment of those who use and value such content and to broader society”.  If left unchecked, this problem could result in communities across the UK being left without dedicated quality news outlets. Now more than ever, securing a sustainable future for high quality journalism requires an end to the duopoly’s anticompetitive practices.

 

 

15 January 2021


 


[1]              https://www.thetimes.co.uk/article/the-times-view-of-extremism-online-antisocial-media-f70zh2cdh?shareToken=e1f61eb8b477277966d2192c10d1e6c9