Written evidence submitted by Dr. Sander van der Linden (University of Cambridge),

Mr. Jon Roozenbeek (University of Cambridge),

Mr. Ruurd Oosterwoud (DROG),

Associate Professor Josh Compton (Dartmouth College),

Professor Stephan Lewandowsky (University of Bristol)

 

The Science of Prebunking: Inoculating the Public Against Fake News

 

1 About the authors

-          1.1: Dr. Sander van der Linden is the Director of the Cambridge Social Decision-Making Lab at the University of Cambridge. His work focuses on studying influence, persuasion, inoculation, and the spread of fake news and misinformation, and he has won numerous public interest and research awards for his work in this area.

-          1.2: Jon Roozenbeek is a PhD researcher at the University of Cambridge. His work mainly covers the media landscape in the so-called Donetsk and Luhansk People’s Republics in eastern Ukraine after 2014. He also researches online disinformation, inoculation theory, political polarisation, and social media.

-          1.3: Ruurd Oosterwoud is the founder of DROG, a Netherlands-based organisation developing innovative campaigns, workshops and educational programmes aimed at diminishing the impact of disinformation. He has worked on the topic of disinformation since 2015, focusing at first on research as well as policy advice.

-          1.4: Dr. Josh Compton is Associate Professor in the Institute for Writing and Rhetoric at Dartmouth College, USA and an authority on inoculation theory and persuasive communication. He has published over 50 articles and chapters and has received a number of research and teaching awards.

-          1.5: Professor Stephan Lewandowsky has been researching misinformation and “fake news” for decades and has published many peer-reviewed articles on the issue, including a popular handbook on debunking techniques. He has previously provided expert testimony to the committee.

 

2 Definition: Fake News

-          2.1: The terms ‘fake news’, ‘disinformation’, ‘propaganda’, and ‘misinformation’ are often used interchangeably to describe news stories that are false or use deception.

-          2.2: ‘Fake news’ is not an ideal term, as information need not be fake (or news) to be misleading. Our preferred term is therefore ‘disinformation’.

-          2.3: We define ‘disinformation’ from a psychological perspective as ‘false or misleading information that is intended to deceive its audience’. This may include a political agenda but is not limited to state or non-state actors.

-          2.4: We use this definition because, in contrast to information that is simply incorrect (misinformation), ‘intent’ has important psychological connotations [1, 2].

3 The problem: Public impact

-          3.1: Disinformation can undermine the democratic process by confusing, misleading, or polarising people, and can hamper evidence-based decision-making, posing risks to, for example, public health, trust in government, and international relations [1-6].

-          3.2: Public opinion reports show that a majority of the public are left confused about basic facts and struggle to know which news to believe and trust, particularly online [7].

-          3.3: Importantly, beliefs based on disinformation, once acquired, are difficult to correct, even when people acknowledge that they have been misinformed [3].

4 Solutions: When technology meets psychology

-          4.1: Given the scope of the problem, we call for technological solutions grounded in psychological principles, an interdisciplinary area known as “technocognition” [4].

-          4.2: Our knowledge of human cognition suggests that the development of better debunking techniques (i.e., correcting misconceptions after the fact) is unlikely to be sufficient; even when corrections are issued, the damage is often already done. In fact, the “continued influence effect” suggests that corrections are often ineffective, as people frequently continue to rely on debunked disinformation [3, 8].

-          4.3: Instead, the “inoculation” approach focuses specifically on the process of preemptive debunking (i.e., “prebunking”) of disinformation.

-          4.4: Inoculation follows a biological metaphor: just as injections containing a small weakened dose of a virus can trigger antibodies in the immune system to confer resistance against future infection, the same can reasonably be achieved with information, by cultivating “mental antibodies”.

-          4.5: Inoculation theory [9] originated in the psychological study of how propaganda influences public opinion, and many studies have demonstrated that public attitudes can be inoculated against (unwanted) persuasion [10, 16, 17].

-          4.6: Studies by two of the authors show that inoculating audiences against disinformation has also proven effective in the context of highly politicized issues such as climate change [6,11].

-          4.7: A pilot study by two of the authors suggests the possibility of developing a general ‘vaccine’ against disinformation [12]. We theorised that instead of merely passively receiving information, prompting participants to actively think about how fake news is produced and how audiences may be misled could have beneficial effects on participants’ ability to recognise and resist fake news.

-          4.8: The results from this relatively small pilot study were tentative but positive, and prompted us to explore this angle further online.

5 DROG

-          5.1: DROG develops novel educational solutions to disinformation based on the philosophy that we should learn how to deal with disinformation rather than merely seeing it as a risk. Together with Dr. van der Linden and Mr. Roozenbeek, DROG created an online ‘fake news’ game called “Bad News”[1].

-          5.2: The game is intended as a general “inoculation” against disinformation.

-          5.3: In the game, players learn how to recognise six common techniques used in the creation of disinformation: impersonation, the use of emotion, polarisation, conspiracy theories, discrediting opponents, and trolling. These techniques are based on a review of the academic literature on disinformation techniques [12] and NATO StratCom COE’s ‘Digital Hydra’ report on online disinformation [13].

-          5.4: An important element of the game is that it is not ideologically charged. Playing enables citizens across the political spectrum to learn without feeling targeted.

-          5.5: The game launched on February 20th, 2018, and garnered considerable interest and press attention worldwide [14].

-          5.6: As of this writing (one week post-launch), approximately 80,000 people have played the game.

-          5.7: In order to evaluate its efficacy, we ran an online, in-game survey experiment that tested the ‘Bad News’ game as a generalised ‘inoculation’ against disinformation. The survey tested participants’ ability to recognise fake news and deception.

-          5.8: The full results of the study will be published in the next few months. Initial results (with a sample of 751 participants) show that the game is effective in conferring resistance against common techniques used in the spread of disinformation. After playing, people were significantly better at recognising impersonation, conspiratorial content, and deflection embedded in previously unseen fake headlines.

-          5.9: Crucially, just as disinformation can spread rapidly, the ‘vaccine’ can be shared socially too [2, 15], offering the possibility of herd immunity against disinformation, which could also protect those who did not directly receive the inoculation.

6 Recommendations

-          6.1: Familiarising audiences with common techniques used in the spread of disinformation could proactively empower the public, potentially lowering the ‘stickiness’ and effectiveness of disinformation before it is encountered.

-          6.2: Focusing on techniques rather than the background of the people and organisations responsible for spreading disinformation avoids a number of common pitfalls present in many anti-disinformation efforts. For example, there is no need for a ‘Ministry of Truth’ that determines what is and what is not ‘fake news’. Instead, people can be empowered at the individual level to discern real from fake news.

-          6.3: Thus, we recommend that Parliament not focus on ‘correcting’ disinformation after the fact but rather on preventing it from taking root in the first place.

-          6.4: Considering the effectiveness of inoculation as a tool to promote resistance to fake news and unwanted (mass) persuasion attempts, we recommend the implementation of prebunking strategies and “technocognition”-inspired interventions similar to the ‘Bad News’ game in educational, civil, and professional settings.

 

March 2018

 

References

[1] van der Linden, S. (2017). Beating the hell out of fake news. Ethical Record: The Proceedings of the Conway Hall Ethical Society 122(6), 4-7.

 

[2] van der Linden, S., Maibach, E., Cook, J., Leiserowitz, A., & Lewandowsky, S. (2017). Inoculating against misinformation. Science 358(6367), 1141-1142.

 

[3] Lewandowsky, S., Ecker, U. K. H., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13, 106–131.

 

[4] Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.

 

[5] Poland, G. A., & Spier, R. (2010). Fear, misinformation, and innumerates: How the Wakefield paper, the press, and advocacy groups damaged the public health. Vaccine, 28, 2361–2362.

 

[6] van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges 1(2), 1600008.

 

[7] Barthel, M., Mitchell, A., & Holcomb, J. (2016). Many Americans believe fake news is sowing confusion. Pew Research Center. http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/

 

[8] Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303–330.

 

[9] McGuire, W. J., & Papageorgis, D. (1961). The relative efficacy of various types of prior belief-defense in producing immunity against persuasion. Journal of Abnormal and Social Psychology, 62, 327–337.

 

[10] Banas, J. A., & Rains, S. A. (2010). A meta-analysis of research on inoculation theory. Communication Monographs, 77(3), 281–311.

 

[11] Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE, 12(5): e0175799.

 

[12] Roozenbeek, J., & van der Linden, S. (2018). The fake news game: Actively inoculating against the risk of misinformation. Journal of Risk Research. doi: 10.1080/13669877.2018.1443491

 

[13] NATO StratCom COE. Digital Hydra: Security implications of false information online. https://www.stratcomcoe.org/digital-hydra-security-implications-false-information-online

 

[14] BBC News. Game helps players spot 'fake news'. http://www.bbc.co.uk/news/technology-43154667

 

[15] Compton, J., & Pfau, M. (2009). Spreading inoculation: Inoculation, resistance to influence, and word-of-mouth communication. Communication Theory, 19(1), 9–28.

 

[16] Compton, J., & Ivanov, B. (2013). Vaccinating voters: Surveying political campaign inoculation scholarship. Annals of the International Communication Association, 37(1), 251-283.

 

[17] Compton, J. (2013). Inoculation theory. In The Sage handbook of persuasion: Developments in theory and practice (pp. 220–237).

 

 


[1] The game is freely available here: www.getbadnews.com