
Call for Evidence

The House of Lords Communications and Digital Committee, chaired by Lord Gilbert of Panteg, is to hold an inquiry into freedom of expression online. The committee invites written contributions by Friday 15 January 2021.

The committee expects to hear from invited contributors in public sessions from December 2020 to March 2021 inclusive before publishing a report. The Government responds in writing to select committee reports.

Aim of the inquiry

The Communications and Digital Committee wishes to investigate how the right to freedom of expression should be protected online and how it should be balanced with other rights.


Debates and exchanges of information and content increasingly take place online. The internet has enabled ways of searching, publishing and sharing information with others that were not previously possible.

Freedom of expression is a fundamental right protected by Article 10 of the European Convention on Human Rights. It is also protected under common law and in the International Covenant on Civil and Political Rights. Historically, this right has been understood in terms of what is spoken by individuals or what is written or said in the media. Content posted online arguably occupies an ambiguous middle ground between these two. The right to freedom of expression includes people’s ability to freely search for, receive and communicate information, whether this is face-to-face or mediated across time and space. It comes with responsibilities related to avoiding harm to individuals and groups.

The founders of Facebook and Twitter have both described their platforms as a digital equivalent of the public square.[1] As the U.S. Supreme Court has noted, such websites “can provide perhaps the most powerful mechanisms available to a private citizen to make his or her voice heard.”[2] Yet these ‘public squares’ are controlled by private companies, which are free to ban or censor whomever or whatever they wish and whose platforms shape both the nature and visibility of communications transmitted across them.

Huge volumes of user-generated content are uploaded to platforms each day, making artificial intelligence increasingly important in moderation decisions.[3] This raises questions about algorithms’ effectiveness and possible biases, including the diversity of their designers and whether they should be subject to external audits. For example, two studies in the U.S. found evidence of widespread racial bias, with algorithms more likely to identify posts by African-Americans as hateful.[4] Google has been found to rank racist and sexist content highly in search results, which may have the effect of excluding and silencing individuals accessing online spaces.[5]

In recent years, there have been many high-profile controversies about action taken by platforms. These include Twitter banning Graham Linehan, creator of Father Ted and The IT Crowd; Twitter preventing users from tweeting a story by the New York Post; Facebook banning the famous ‘napalm girl’ photograph from the Vietnam War before reversing its decision; YouTube taking down videos which “go against World Health Organisation recommendations” on Covid-19; and Instagram limiting the visibility of posts by black and plus-size women.[6]

Websites have also been criticised for not doing enough to remove content which breaks the law or community standards. More than 1,200 companies and charities, including Adidas, Unilever and Ford, suspended their advertising on Facebook in July 2020 to pressure the company to “stop valuing profits over hate, bigotry, racism, antisemitism, and disinformation.”[7] Facebook has set up a 20-member oversight board. The board will have the final say in its review of ‘highly emblematic’ content moderation decisions on Facebook’s platforms.[8] Members include Alan Rusbridger, former Editor of The Guardian, Helle Thorning-Schmidt, former Prime Minister of Denmark, and Endy Bayuni, Senior Editor at The Jakarta Post.[9]

The Government aims, through its upcoming Online Harms Bill, to make the UK the safest place in the world to go online.[10] How this legislation should balance responding to harms with protecting freedom of expression is contentious. Other developments include the Government’s plans to establish a Digital Markets Unit to strengthen digital competition regulation, the Law Commission’s consultation on reform of online communications offences, and the growing global debate about whether platforms should be liable for the content they host in the same way as publishers, as in the dispute over Section 230 of the Communications Decency Act 1996 in the United States.[11]


The committee seeks responses to the following questions to form the written evidence for its report. Contributors need not address every question and experts are encouraged to focus on their specialism. Other issues may be discussed provided that their relevance is explained. Submissions which have been previously published will not be accepted as evidence. However, published material may be referenced where relevant.

The committee encourages people from all backgrounds to contribute and believes that it is particularly important to hear from groups which are often under-represented. The committee’s work is most effective when it is informed by as diverse a range of perspectives and experiences as possible.

  1. Is freedom of expression under threat online? If so, how does this impact individuals differently, and why? Are there differences between exercising the freedom of expression online versus offline?
  2. How should good digital citizenship be promoted? How can education help?
  3. Is online user-generated content covered adequately by existing law and, if so, is the law adequately enforced? Should ‘lawful but harmful’ online content also be regulated?
  4. Should online platforms be under a legal duty to protect freedom of expression?
  5. What model of legal liability for content is most appropriate for online platforms?
  6. To what extent should users be allowed anonymity online?
  7. How can technology be used to help protect freedom of expression?
  8. How do the design and norms of platforms influence freedom of expression? How can platforms create environments that reduce the propensity for online harms?
  9. How could the transparency of algorithms used to censor or promote content, and the training and accountability of their creators, be improved? Should regulators play a role?
  10. How can content moderation systems be improved? Are users of online platforms sufficiently able to appeal moderation decisions with which they disagree? What role should regulators play?
  11. To what extent would strengthening competition regulation of dominant online platforms help to make them more responsive to their users’ views about content and its moderation?
  12. Are there examples of successful public policy on freedom of expression online in other countries from which the UK could learn? What scope is there for further international collaboration?

The committee encourages interested parties to follow the progress of the inquiry on Twitter @LordsCommsCom and at:

[1]             Mark Zuckerberg, ‘A Privacy-Focused Vision for Social Networking’, Facebook (6 March 2019); Jack Dorsey, Twitter (5 September 2018)

[2]             Supreme Court of the United States, Packingham v. North Carolina (October 2016)

[3]             David Kaye, Speech Police: The Global Struggle to Govern the Internet (June 2019) p 64

[4]             Shirin Ghaffary, ‘The algorithms that detect hate speech online are biased against black people’, Vox (15 August 2019)

[5]             Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (2018)

[6]             Molly Blackall, ‘Twitter closes Graham Linehan account after trans comment’, The Guardian (27 June 2020); Abram Brown, ‘Twitter Won’t Let The New York Post Tweet Until It Agrees To Behave Itself’, Forbes (19 October 2020); Sam Levin, Julia Carrie Wong and Luke Harding, ‘Facebook backs down from “napalm girl” censorship and reinstates photo’, The Guardian (9 September 2016); Jon Levine, ‘YouTube censors epidemiologist Knut Wittkowski for opposing lockdown’, New York Post (16 May 2020); Nosheen Iqbal, ‘Instagram row over plus-size model forces change to nudity policy’, The Guardian (25 October 2020)

[7]             Seb Joseph, ‘As the Facebook boycott ends, brand advertisers are split on what happens next with their marketing budgets’, Digiday (3 August 2020)


[9]             Adam Smith, ‘Facebook announces oversight board members that can overrule Zuckerberg’, The Independent (7 May 2020)

[10]            HM Government, Online Harms White Paper (April 2019) p 5

[11]            Mark Jones, Zoe Bartlett and Matt Giles, ‘UK to create a Digital Markets Unit’, Mondaq (11 July 2019); Law Commission, Harmful Online Communications: The Criminal Offences – Summary of the Consultation Paper (September 2020); Derek E. Bambauer, ‘Trump’s Section 230 reform is repudiation in disguise’, Brookings (8 October 2020); Kommerskollegium, Platform liability in the EU: a need for reform? (2020)



Submissions should be made through the online form at:

Please bring this document to the attention of groups and individuals who may not have received a copy direct, including those who have not previously engaged with Parliament.

The deadline for making a written submission is 23.59 on Friday 15 January 2021.

Concise submissions are preferred. A submission longer than six pages should include a one-page summary. Paragraphs should be numbered. All submissions made through the written submission form will be acknowledged automatically by email.

Submissions which are accepted by the committee as written evidence may be published online at any stage. When it is published as written evidence a submission becomes subject to parliamentary copyright and is protected by parliamentary privilege. Submissions which have been previously published will not be accepted as evidence.

Once your submission has been accepted as evidence you will be notified by a further email, and at this point you may publicise or publish it yourself. In doing so you must indicate that it was prepared for the committee, and you should be aware that your publication or re-publication of your evidence may not be protected by parliamentary privilege.

Personal contact details will be removed from evidence before publication, but will be retained by the Committee Office and may be used for specific purposes relating to the committee’s work—for instance to seek additional information.

The committee may invite individuals and groups who have submitted written evidence, as well as others, to answer questions in a public session. These oral evidence sessions are usually held in Westminster but currently take place virtually due to the Covid-19 pandemic. They are broadcast online and transcripts are also taken and published.

Substantive communications to the committee about the inquiry should be addressed to the clerk of the committee, whether or not they are intended to constitute a formal written submission.

This call for written evidence has now closed.
