Written evidence submitted by the RSA



03 September 2021

The RSA's submission to the DCMS online safety and online harms inquiry

About the RSA, background, and reason for submission

  1. The RSA (the Royal Society for the Encouragement of Arts, Manufactures and Commerce) believes in a world where everyone is able to participate in creating a better future. Through our ideas, research and a 30,000-strong Fellowship, we are a global community of proactive problem solvers, sharing powerful ideas, carrying out cutting-edge research and building networks and opportunities for people to collaborate, influence and demonstrate practical solutions to realise change.


  2. Since 2018 the RSA has been investigating issues that arise at the intersection of technology and society. This line of work led us to our ongoing investigation into misinformation and disinformation, and into what could meaningfully be done to remedy the individual and societal harms they cause. Our final report is due for publication in the coming weeks, and our responses below are built and adapted from this research.

Response to questions:

What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?

  3. The draft Bill does a good job of providing a structure that takes a systemic view of online harms to individuals. However, we believe the Bill says too little about the potential for harm caused by misinformation and disinformation, and nothing at all about the collective or societal harms they cause.


  4. To first give a sense of scale, consider:


  5. Information, and misinformation, now circulates on a previously unknown scale, yet sits within a space checked only by the platforms themselves. In aggregate, these individual cases of misinformation and harm can lead to collective or societal harms; we refer to this as the erosion of the ‘public square’. By collective harms we mean harms that accumulate over time and erode important societal values, such as social cohesion, social trust, trust in expertise, or trust in accepted facts. For example:


  6. Yet such movements can have a slower effect on trust in science, technology, or the electoral process. This can in turn lead to declining trust in the authorities responsible for implementing the supposedly ‘dangerous’ technologies. This is where collective or societal harm has occurred. RSA pre-pandemic polling estimated that around 6 percent of people believe, or are inclined to believe, in three major anti-scientific conspiracies: that 5G is harmful, that vaccines are harmful, and that global warming is fake[13] — a group we dubbed ‘the hoaxers’.


  7. Providing specific scope for Ofcom, or any subsidiary bodies, to investigate and hold remit over collective harms would allow them to put in place procedures that limit the spread of this misinformation and the damage to the public square.
  8. Second, RSA pre-pandemic polling showed that 71 percent of the public want a “stronger independent regulator on the quality of news”. This statistic should of course be treated with caution, not least because a pandemic has occurred since, but also because the public will hold highly varying views on what a “stronger independent regulator” means. Nevertheless, it speaks to a general anxiety about the quality of our information ecosystem and the insufficient mechanisms of redress available to rightly concerned citizens.


  9. However, the Bill does little to address growing public unease about the quality of information online. Instead it retains the current system, in which platforms remain the primary and only monitors and purveyors of information on their sites, with the exception of matters of national security, where the government rightfully retains powers. Platforms, as we have shown, are increasingly influential and increasingly marred by misinformation. We must, of course, remain cautious: there is plenty to be concerned about in the idea of government bureaucracies having sign-off on our social interactions and how we express ourselves. Yet the current system cannot be said to be the preferred approach either.


  10. We therefore feel that, within the framework of the Bill, a new system of shared and pluralist governance over both the societal and individual harms caused by misinformation is required. This forum should represent a wide range of views by combining the inputs of citizens, Ofcom, platforms, traditional media, civil society (including fact checkers), researchers, and other experts. We propose that an Office for Public Harms be formed, superseding the advisory committee on misinformation and disinformation proposed in the Bill.

Our recommendations for protecting the public square and improving the veracity of information online

  11. We recommend that the Bill include an explicit remit to investigate systemic issues online which lead to societal harms, i.e. beyond its current scope of individual harms.


  12. We recommend this is done through an independent body, the Office for Public Harms, made up of a pluralist collective of stakeholders including: citizens, Ofcom, platforms and wider industry, traditional media, civil society, researchers, and other experts.


  13. The Office would:


     a. Have responsibility to investigate and analyse societal harms caused by misinformation. It would do this through transparency reports, information requests, and through harms submitted to it by the public or by organisations. The Office would then publish its findings publicly, inform platforms of the issues it finds, and advise Ofcom on potential changes to the procedural and systemic factors within Ofcom’s remit.


     b. Act as a ‘misinformation ombudsman’, able to investigate and suggest remedies for individual cases of harm caused by misinformation or disinformation, or where content is felt to have been unfairly removed or demoted. This would occur only where the platforms’ own remediation processes are felt to be unsatisfactory. We feel this measure is more appropriate than what is currently suggested in the Bill because it offers a multi-stakeholder backstop to an otherwise primarily platform-controlled online information ecosystem.


     c. Retain freedom of expression as a primary objective, alongside ensuring the veracity of information online. The Office should not seek to remove content, but instead suggest remedies that affect content’s algorithmic amplification.


     d. Be funded in a similar arrangement to the Advertising Standards Authority, whereby platforms pay a levy towards its running costs and work.


     e. Be informed by citizens’ deliberations and panels, to generate wider legitimacy for its work.

[1] Gottfried, J and Shearer, E. (2016) News Use Across Social Media Platforms 2016. [online] Washington DC. Pew Research Center. Available at: www.pewresearch.org/journalism/2016/05/26/news-use-across-social-media-platforms-2016/ [Accessed 3 September 2021].

[2] Ofcom (2021) News Consumption in the UK: 2021. [PPT] Available at: www.ofcom.org.uk/__data/assets/powerpoint_doc/0026/222479/news-consumption-in-the-uk-2021-report.pptx [Accessed 19 August 2021].

[3] MIT Sloan School (2018) Study: False news spreads faster than the truth [online] Available at: https://mitsloan.mit.edu/ideas-made-to-matter/study-false-news-spreads-faster-truth [Accessed 3 September 2021].

[4] Kornbluh, K., Goldstein, A. and Weiner, E. (2020) New Study By Digital New Deal Finds Engagement With Deceptive Outlets Higher On Facebook Today Than Run-Up To 2016 Election. [online] The German Marshall Fund of the United States. Available at: www.gmfus.org/blog/2020/10/12/new-study-digital-new-deal-finds-engagement-deceptive-outlets-higher-facebook-today [Accessed 22 October 2020].

[5] Brennen, JS, Simon, FM, Howard, PN and Nielsen, RK. (2020) Types, sources, and claims of COVID-19 misinformation. [online] Available at: reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation.

[6] Bickert, M (2021) How We’re Taking Action Against Vaccine Misinformation Superspreaders. [online] Facebook. Available at: https://about.fb.com/news/2021/08/taking-action-against-vaccine-misinformation-superspreaders/ [Accessed 3 September 2021].

[7] See: Goodman, J and Carmichael, F (2020) Coronavirus: Bill Gates ‘microchip’ conspiracy theory and other vaccine claims fact checked [online] BBC News. Available at: www.bbc.co.uk/news/52847648 [Accessed 31 August 2021]. And see: Schraer, R (2021) Covid vaccine: Fertility and miscarriage claims fact-checked [online] BBC News. Available at: www.bbc.co.uk/news/health-57552527 [Accessed 31 August 2021].

[8] UK Government COVID Dashboard (2021) Vaccinations in England. [online] Available at: coronavirus.data.gov.uk/details/vaccinations?areaType=nation&areaName=England [Accessed 3 September 2021].

[9] Office for National Statistics (2021) Coronavirus and vaccine hesitancy, Great Britain. [online] Available at: www.ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/healthandwellbeing/bulletins/coronavirusandvaccinehesitancygreatbritain/28aprilto23may2021 [Accessed 3 September 2021].

[10] BBC Reality Check (2021) Capitol riots: Who broke into the building? [online] BBC. Available at: www.bbc.co.uk/news/55572805 [Accessed 3 September 2021].

[11] Lawrence, D and Davis, G (2020) QAnon in the UK: The growth of a movement. [online] London: Hope not Hate. Available at: www.hopenothate.org.uk/wp-content/uploads/2020/10/qanon-report-2020-10-FINAL.pdf [Accessed 2 September 2021].

[12] Hern, A (2020) 5G conspiracy theories fuel attacks on telecoms workers. [online] London: The Guardian. Available at: www.theguardian.com/business/2020/may/07/5g-conspiracy-theories-attacks-telecoms-covid [Accessed 2 September 2021].

[13] Full findings will be released in our forthcoming report.