Written evidence submitted by the RSA
03 September 2021
The RSA submission to DCMS online safety and online harms inquiry
About the RSA, background, and reason for submission
- The RSA (Royal Society for the encouragement of Arts, Manufactures and Commerce) believes in a world where everyone is able to participate in creating a better future. Through our ideas, research and a 30,000-strong Fellowship we are a global community of proactive problem solvers, sharing powerful ideas, carrying out cutting-edge research and building networks and opportunities for people to collaborate, influence and demonstrate practical solutions to realise change.
- Since 2018 the RSA has been investigating issues that arise at the intersection of technology and society. This line of work led us to our ongoing investigation into misinformation and disinformation, and what could meaningfully be done to remedy the individual and societal harms they cause. Our final report is due for publication in the coming weeks and our responses below draw on and adapt this research.
Response to questions:
“What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?”
- The draft Bill does a good job of providing a structure that takes a systemic view of online harms to individuals. However, we believe the Bill says too little about the potential for harm caused by misinformation and disinformation, and nothing at all about dealing with the ‘collective’ or ‘societal’ harms they cause.
- To first give a sense of scale, consider:
- Social media is increasingly cited as an influential source of news in polls – a 2016 study from Pew Research found that 62 percent of adults in the US name social media as a news source. In the UK, 49 percent of adults consume news via social media. Of those that do, 71 percent use Facebook and 46 percent use Twitter.
- On social media platforms like Facebook, false stories have a greater reach and likelihood of virality than true stories. Falsehoods are 70 percent more likely to get retweeted than truths. These falsehoods reach 1,500 people on average six times quicker than accurate news stories.
- Since the start of 2020, engagement with online producers of false content has grown by 102 percent.
- Outputs from fact-checking organisations saw an increase of 900 percent between January and March 2020.
- Facebook states that since the start of the pandemic it has removed 20 million pieces of content for breaking its rules on Coronavirus misinformation.
- Information, and misinformation, now circulates on a previously unknown scale, and yet sits within a space checked only by the platforms themselves. In aggregate, these individual cases of misinformation and harm can lead to collective or societal harms; we refer to this as erosion of the ‘public square’. By collective harms we mean harms that accumulate over time and erode important societal values, such as social cohesion, social trust, trust in expertise, or trust in accepted facts. For example:
- Misinformation on public health measures. A clear example is the noticeable rise in recent years of anti-vaccination beliefs fuelled by spurious or outright false information online, particularly Coronavirus vaccine misinformation. At the time of writing, 88 percent of English adults (16+) had received their first vaccine dose, a high figure internationally, but this falls to 63 percent among those aged 25-29. Of women who are vaccine hesitant, 31 percent cite worries about fertility as a factor.
- Misinformation on the democratic system. The attack on the US Capitol in January 2021 attests to what can occur when widespread disbelief in democracy emerges. Hope not Hate, in October 2020, estimated that 5 percent of the UK public claimed to be QAnon supporters.
- Misinformation on science and technology. The UK in particular saw instances of anti-5G protests and sentiment during 2020, which resulted in phone masts being destroyed and telecommunications engineers being attacked.
- Such movements can also have a slower, corrosive effect on trust in science, technology, or the electoral process, which in turn can erode trust in the authorities responsible for implementing the supposedly ‘dangerous’ technologies. This is where collective or societal harm has occurred. RSA pre-pandemic polling estimated that around 6 percent of people believe, or are inclined to believe, in three major anti-scientific conspiracies: that 5G is harmful, that vaccines are harmful, and that global warming is fake. This is a group we dubbed ‘the hoaxers’.
- Providing specific scope for Ofcom, or any subsidiaries, to investigate and have remit over collective harms would allow them to put in place procedures which limit the spread of this misinformation and limit the damage to the public square.
- Second, RSA pre-pandemic polling showed that 71 percent of the public stated that they want a “stronger independent regulator on the quality of news”. This statistic should of course be taken with much caution, not least because a pandemic has occurred since, but also because the public will have highly varying views on what a “stronger independent regulator” means. Nevertheless, it speaks to a general anxiety about the quality of our information ecosystem and the insufficiency of our mechanisms of redress for rightly concerned citizens.
- However, the Bill does little to address growing public unease about the quality of information online and instead retains the current system, whereby platforms continue to be the primary and only monitors and purveyors of information on their sites, with the exception of considerations of national security, where the government rightfully retains powers. Platforms, as we have shown, are increasingly influential and increasingly marred by misinformation. We must, of course, remain cautious: there is plenty to be concerned about in the idea of government bureaucracies having sign-off on our social interactions and how we express ourselves. Yet the current system cannot be said to be the preferred approach either.
- We therefore feel that within the framework of the Bill a new system of shared and pluralist governance on both societal and individual harms caused by misinformation is required. This forum should represent a wide range of views, by combining the inputs of citizens, Ofcom, platforms, traditional media, civil society including fact checkers, researchers, and other experts. We propose that an Office for Public Harms is formed which supersedes the current advisory committee on misinformation and disinformation, as proposed in the Bill.
Our recommendations for protecting the public square and improving the veracity of information online
- We recommend that the Bill include an explicit remit to investigate systemic issues online which lead to societal harms, i.e. beyond its current scope of individual harms.
- We recommend this is done through an independent body, the Office for Public Harms, made up of a pluralist collective of stakeholders including: citizens, Ofcom, platforms and wider industry, traditional media, civil society, researchers, and other experts.
- The Office would:
- Have responsibility to investigate and analyse societal harms caused by misinformation. It would do this through transparency reports, information requests, and through harms being submitted to it by the public or by organisations. The Office would then publish its findings publicly, inform platforms of issues it finds, and advise Ofcom on potential changes to the procedural and systemic factors within Ofcom’s remit.
- The Office should also act as a ‘misinformation ombudsman’, able to investigate and suggest remedies for individual cases of harm caused by misinformation or disinformation, or where content is felt to have been unfairly removed or demoted. This would only occur where the remediation processes on the platforms themselves are felt to be unsatisfactory. We feel that this measure is more appropriate than what is currently suggested in the Bill because it offers a multi-stakeholder backstop to an otherwise primarily platform-controlled online information ecosystem.
- Retain freedom of expression as a primary objective, alongside improving the veracity of information online. The Office should not seek to remove content but instead suggest remedies which affect content’s algorithmic amplification.
- Be funded in a similar arrangement to the Advertising Standards Authority, whereby platforms pay a levy to fund its running and work.
- Be informed by citizens’ deliberation and panels to generate wider legitimacy behind its work.
Gottfried, J. and Shearer, E. (2016) News Use Across Social Media Platforms 2016. [online] Washington DC: Pew Research Center. Available at: www.pewresearch.org/journalism/2016/05/26/news-use-across-social-media-platforms-2016/ [Accessed 3 September 2021].
Ofcom (2021) News Consumption in the UK: 2021. [PPT] Available at: www.ofcom.org.uk/__data/assets/powerpoint_doc/0026/222479/news-consumption-in-the-uk-2021-report.pptx [Accessed 19 August 2021].
MIT Sloan School (2018) Study: False news spreads faster than the truth. [online] Available at: https://mitsloan.mit.edu/ideas-made-to-matter/study-false-news-spreads-faster-truth [Accessed 3 September 2021].
Kornbluh, K., Goldstein, A. and Weiner, E. (2020) New Study by Digital New Deal Finds Engagement with Deceptive Outlets Higher on Facebook Today than Run-Up to 2016 Election. [online] The German Marshall Fund of the United States. Available at: www.gmfus.org/blog/2020/10/12/new-study-digital-new-deal-finds-engagement-deceptive-outlets-higher-facebook-today [Accessed 22 October 2020].
Brennen, J.S., Simon, F.M., Howard, P.N. and Nielsen, R.K. (2020) Types, sources, and claims of COVID-19 misinformation. [online] Available at: reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation.
Bickert, M. (2021) How We’re Taking Action Against Vaccine Misinformation Superspreaders. [online] Facebook. Available at: https://about.fb.com/news/2021/08/taking-action-against-vaccine-misinformation-superspreaders/ [Accessed 3 September 2021].
See: Goodman, J. and Carmichael, F. (2020) Coronavirus: Bill Gates ‘microchip’ conspiracy theory and other vaccine claims fact checked. [online] BBC News. Available at: www.bbc.co.uk/news/52847648 [Accessed 31 August 2021]. And see: Schraer, R. (2021) Covid vaccine: Fertility and miscarriage claims fact-checked. [online] BBC News. Available at: www.bbc.co.uk/news/health-57552527 [Accessed 31 August 2021].
UK Government COVID Dashboard (2021) Vaccinations in England. [online] Available at: coronavirus.data.gov.uk/details/vaccinations?areaType=nation&areaName=England [Accessed 3 September 2021].
Office for National Statistics (2021) Coronavirus and vaccine hesitancy, Great Britain. [online] Available at: www.ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/healthandwellbeing/bulletins/coronavirusandvaccinehesitancygreatbritain/28aprilto23may2021 [Accessed 3 September 2021].
BBC Reality Check (2021) Capitol riots: Who broke into the building? [online] BBC. Available at: www.bbc.co.uk/news/55572805 [Accessed 3 September 2021].
Lawrence, D. and Davis, G. (2020) QAnon in the UK: The growth of a movement. [online] London: Hope not Hate. Available at: www.hopenothate.org.uk/wp-content/uploads/2020/10/qanon-report-2020-10-FINAL.pdf [Accessed 2 September 2021].
Hern, A. (2020) 5G conspiracy theories fuel attacks on telecoms workers. [online] London: The Guardian. Available at: www.theguardian.com/business/2020/may/07/5g-conspiracy-theories-attacks-telecoms-covid [Accessed 2 September 2021].
 Full findings will be released in our forthcoming report.