Written evidence submitted by Hadley Newman

Hadley Newman is a strategic communications advisor who leads national and international communications campaigns across a broad range of policy areas for public and multilateral organisations. He specialises in strategic communications planning, and the empirical analysis of target audiences. Hadley is a published commentator on communication governance, and his doctoral research at Heriot-Watt University explores the influence of targeted communication within information operations.



This paper is tailored to the knowledge needs of the Public Accounts Committee (a Commons Select Committee), in response to its inquiry into Government preparedness for the COVID-19 pandemic, specifically ‘lessons for government on risk’. It focuses on recognising the differences between disinformation, misinformation, malinformation, and information (DMMI) – particularly differences in veracity and intent to harm – which will become increasingly important in managing the public’s fear, anxiety, and the risk associated with misinformation. This is hardly a matter of semantics: the meaning of these terms, and the weight they carry, must be considered when shaping policies, especially as they bear on pre-scripted, signed-off messages and communication protocols.



Disinformation, Misinformation, Malinformation, DMMI, Strategic Communication, Information Operations.



Introduction and Background

1.                   Since 2008, the UK Government’s National Risk Register has cited an influenza pandemic as the UK’s top non-malicious risk. Despite this, the Public Accounts Committee (PAC) has observed “an astonishing failure to plan appropriately, especially in relation to the national economy”. In November 2021, the National Audit Office (NAO) published a report entitled “The government’s preparedness for the COVID-19 pandemic: lessons for government on risk management”.


2.                 This paper focuses on key evidence in the NAO report; namely, that “departments’ pandemic plans and business continuity plans did not set out all the processes and responses required to maintain government operations during the pandemic”. In particular, 50% of all plans lacked “pre-scripted, signed-off messages or communication protocols, such as for dealing with fear, anxiety and misinformation” (NAO 2021 p. 43).


3.                 Misinformation is closely associated with disinformation and malinformation, but the terms are not synonymous, should not be used interchangeably, and the associated content and substance of the messages – not to mention the messengers and their intent – should be treated very differently. Yet it is questionable, at best, whether the differences between disinformation, misinformation, malinformation, and information (DMMI) are generally understood, and arguably there is a need to “provide the means to enable clarity in the fog of confusion” (Newman 2021, p. 2). Though often used interchangeably, disinformation, misinformation, and malinformation differ both linguistically and conceptually. This paper adopts the following definitions: disinformation is false information created and shared with the intent to cause harm; misinformation is false information shared inadvertently, without the intent to cause harm; and malinformation is information with some basis in truth that is spread with the intent to cause harm.


4.                 Information – in particular, “its utility, effect, and management” (Tatham, 2008) – is central to strategic communication. Countering misinformation, and DMMI more broadly, requires agility and innovation, as both are fraught with interconnected threats and risks with the potential to undermine social, national, and international stability (Tatham, 2008). As the Secretary of State for Health and Social Care (2020) noted, “a critical part of tackling disinformation is providing accurate, fair, and objective positive information”.


DMMI Matrix


5.                 The DMMI Matrix was created to demonstrate the differences between disinformation, misinformation, malinformation, and information, and appears below (Figure 1).



Figure 1: DMMI Matrix. Source: Newman (2021)


6.                 The design is user-friendly and easy to interpret, as it resembles well-known risk maps (e.g., the Risk Heat Map). Further, it adopts the PHIA Probability Yardstick – the standard mandated across the UK intelligence assessment community (Irwin and Mandel, 2020) – to provide clarity and reduce the need for subjective interpretation. The synonyms in the Probability Yardstick anchor the terms at the core of the matrix to defined probability ranges (the probability of intent to harm, the likelihood of the information being false, and the level of severity). A more detailed version of the DMMI Matrix, which also includes numerical values, is available upon request.
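To illustrate how a yardstick of this kind translates assessed probabilities into standard terms, the sketch below maps a probability to an approximate PHIA band. The upper bounds are approximations of the published ranges, and the boundary handling is a simplifying assumption: the real yardstick leaves deliberate gaps between bands, which this sketch smooths over.

```python
# Illustrative sketch of a PHIA-style probability yardstick.
# The bounds approximate the published ranges; the published yardstick
# leaves deliberate gaps between bands, which this sketch smooths over
# by assigning each probability to the nearest band at or above it.
PHIA_YARDSTICK = [
    (0.05, "remote chance"),
    (0.20, "highly unlikely"),
    (0.35, "unlikely"),
    (0.50, "realistic possibility"),
    (0.75, "likely or probable"),
    (0.90, "highly likely"),
    (1.00, "almost certain"),
]

def yardstick_term(p: float) -> str:
    """Return the yardstick term for a probability p in [0, 1]."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    for upper, term in PHIA_YARDSTICK:
        if p <= upper:
            return term
    return "almost certain"  # unreachable given the 1.00 bound; defensive
```

Used this way, an analyst’s estimate of, say, 0.85 would be reported as “highly likely” rather than as a bare number, which is the consistency the matrix borrows from the yardstick.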


7.                 Within the DMMI Matrix, veracity and intent are plotted on a two-axis grid and converge under the information rating to give a high-level overview of the information landscape. Each of the 25 cells represents a different level of severity. The original version of the DMMI Matrix (Figure 1) uses a colour scheme similar to a Risk Heat Map to indicate severity: cells in the top-right are red (‘severe’), with severity decreasing towards the bottom-left through dark orange (‘high’), light orange (‘significant’), yellow (‘moderate’), and green (‘negligible’). The DMMI Matrix can also present severity across different dimensions of information, such as clusters of risks by DMMI category, or by platform (e.g., the risks presented by a single, concise message or campaign).
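To make the structure of the grid concrete, the sketch below treats intent to harm and likely falseness as 0–4 scores and assigns each of the 25 cells to one of the five severity bands. The averaging rule is an assumption for illustration only; the published matrix assigns cells by inspection, and the detailed version with numerical values is available from the author.

```python
# Illustrative 5x5 DMMI-style grid: two axes converge on a severity band.
SEVERITY_BANDS = ["negligible", "moderate", "significant", "high", "severe"]

def dmmi_severity(intent: int, falseness: int) -> str:
    """Map a cell of the 5x5 grid to a severity band.

    intent:    0 (no intent to harm) .. 4 (clear intent to harm)
    falseness: 0 (true) .. 4 (entirely false)

    The round-half-up average used here is a hypothetical rule chosen
    so that the top-right cell (4, 4) is 'severe' and the bottom-left
    cell (0, 0) is 'negligible', as in the colour scheme described above.
    """
    if not (0 <= intent <= 4 and 0 <= falseness <= 4):
        raise ValueError("scores must be in the range 0..4")
    # Integer round-half-up average, avoiding float rounding quirks.
    return SEVERITY_BANDS[(intent + falseness + 1) // 2]
```

Under this rule, malinformation (largely true but spread with clear intent to harm) lands in the ‘significant’ band, which is broadly in line with the worked example in paragraph 12 – though again, the exact cell assignments are the author’s, not this sketch’s.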


8.                 As Newman (2021) noted, the following questions can serve as a useful point of departure when implementing the DMMI Matrix:

  1. Does the information have the intent to cause harm?
  2. Is the information false? This is “to ensure that users can easily flag content that they suspect or know to be false, and which enable users to understand what actions have been taken and why” (UK Government, 2020).
  3. What, if any, is the level of severity?
  4. Does this constitute a material risk?
  5. What is the range of acceptable variance from (any) established operating metrics?


Importance of Nomenclature

9.                 Along with intent and veracity, the importance of nomenclature is apparent from the matrix. In its report, the NAO (2021) discusses the importance of communication protocols for addressing misinformation. However, from the definitions in paragraph 3, and from a mapping exercise using the DMMI Matrix, it is evident that disinformation is considerably more dangerous than misinformation, and that malinformation is similarly insidious. Since these terms are not synonymous, and instances of each carry significant political implications, they need to be countered in different ways. It is not enough to have a generic communications plan in place, as referred to in the NAO report, with “pre-scripted, signed-off messages or communication protocols, such as for dealing with fear, anxiety and misinformation” (NAO, 2021, p. 43). Rather, it is necessary to have a robust strategy to counter sophisticated, targeted information operations that endeavour to foster disorder and fuel chaos.


10.             To be clear, misinformation is the inadvertent sharing of false information, without the intent to cause harm. During the first wave of the COVID-19 pandemic, social media was awash with well-meaning, if not medically sound, advice that aimed to safeguard public health. For example, the President of Argentina was quoted as stating that the World Health Organization recommended drinking lots of hot drinks, because the heat can kill the virus (Lucía, 2020). President Fernández did not intend to cause harm; however, the WHO never made this statement, and scientists noted that drinking extremely hot drinks can damage the throat. Another example of well-intentioned misinformation began when a British man in China claimed that he had beaten the virus by combining honey and whiskey, as reported by the Daily Mirror (Jolly, 2020). Once the story began circulating in Kenya, the Governor of Nairobi distributed small bottles of Hennessy cognac in care packages, claiming that the alcohol was a throat sanitiser (Lange, 2020). Within two months of the story appearing on social media in Iran, where alcohol is banned, over 700 Iranians had died of methanol poisoning after drinking illegal alcohol to fight off COVID-19 (Jolly, 2020).


11.             Although the original story was true and carried no intention to cause harm, it quickly transformed from a feel-good story into a pernicious piece of misinformation. The sharing of the misinformation – and its tragic, unintended consequences – is what resulted in it being rated ‘High’ to ‘Severe’ on the DMMI Matrix.


12.             Similarly, while malinformation often contains some truth, it is spread with the intent to exact reputational harm and to discredit an individual’s claims. For example, in the United States, Chief Medical Advisor Dr. Anthony Fauci has been compared to the Nazi physician Josef Mengele and the Italian dictator Benito Mussolini (Kaonga, 2021). While these malicious ad hominem attacks contain only the smallest grain of truth (i.e., he is a doctor, and so was Mengele; he is short in stature, and so was Mussolini), even these small measures are enough to give substance to the lies.


These messages are intended to discredit Dr. Fauci and undermine his scientific claims, if not negate them. On the DMMI Matrix, therefore, this would be rated a ‘Significant’ threat. Moreover, combating this type of malinformation, especially when it contains ad hominem attacks threatening a critically important message, requires a different approach from responding to misinformation.


13.             There was a remarkable volume of disinformation about the COVID-19 vaccines, often emanating from anti-vaccine groups. The claims that the vaccines “were not properly tested or developed” and that they “do not work” are pervasive pieces of disinformation (according to the WHO). Both claims would be given a ‘Severe’ rating on the DMMI Matrix, as they are entirely false, misleading, and aim to cause harm by dissuading people from getting vaccinated.


14.             Recognising erroneous information and then categorising it as disinformation, misinformation, or malinformation is the critical step that precedes the implementation of a robust strategy to counter sophisticated, targeted information operations (even where communications protocols already exist). This crucial step was overlooked in the NAO (2021) report. The UK Government (2021) produced the RESIST 2 toolkit, which was designed to deal with “types of manipulated, false, and misleading information”. Essentially, “when developing and delivering a response to false information, it’s most important to focus on the harm it can do”. While this certainly has its place in guiding policy, the elements of DMMI differ in substance, intent, and impact, and need to be countered in different ways.


Conclusion & Recommendations

15.             Public education about the differences between DMMI is critical to the public’s understanding of vaccine efficacy and, more importantly, of how vaccines work. This education might even include the need to demonstrate that microchips cannot be injected with the vaccination! As Professor Jim McManus, the president of the Association of Directors of Public Health UK, notes: “The way in which we, as a country, talk about vaccine hesitancy is sloppy… People have genuine questions about the vaccines – and I wouldn’t call those people anti-vaxxers or vaccine hesitant at all. I would call them, simply, people with questions” (Barber, 2021). Alternatively, paraphrasing Professor Chris Whitty, if there is a lone-wolf anti-vaxxer trying to spread misinformation, you do not engage with the content – you do not “give them air” (Barber, 2021).


16. By way of a summary, in response to the call for ‘Government preparedness for the COVID-19 pandemic: lessons for government on risk’, all plans should be founded on inbuilt resistance. This should sit within a larger strategy underpinned by authority and institutional power. Fundamental measures, such as fact-checking and reporting, should be enhanced through actions that diminish the credibility of adversarial messaging. This includes sustained support of independent professional journalism, establishing narrative discipline, and investing in media buying. Furthermore, the strategy may demand Government countermeasures against the production and distribution of such information activity. The Government-led use of legal and appropriate countermeasures should not constitute a form of state-imposed censorship, but could include deterrence by denial, legislation, and algorithmic limits on the dissemination of the information, deployed through UK cyber capabilities.



Barber, H. (2021) Vaccine misinformation having ‘limited impact’ in UK, says public health chief. The Telegraph.

Irwin, D. & Mandel, D. (2020). Variants of Vague Verbiage: Intelligence Community Methods for Communicating Probability.

Jolly, B. (2020). "Coronavirus: British man who caught virus 'beat flu with glass of hot whisky'". The Mirror.

Kaonga, G. (2021) Fox Hosts Compare Anthony Fauci to Josef Mengele, Benito Mussolini. Newsweek.

Lange, J. (2020). "The governor of Nairobi is putting Hennessy in residents' coronavirus care packages". This Week.

Lucía, M. (2020). "Alberto Fernández: 'La OMS recomienda que uno tome muchas bebidas calientes porque el calor mata al virus'". Chequeado.

Newman, H. (2021) Understanding the Differences Between Disinformation, Misinformation, Malinformation and Information – Presenting The DMMI Matrix.

Secretary of State for Health and Social Care (2020) Anti-vaccination Disinformation Online. Hansard Volume 684: debated on Tuesday 17 November 2020.

Tatham, S. (2008). Strategic Communication: A Primer.

UK Government (2019) Online Harms White Paper.

UK Government (2020) Consultation outcome: Online Harms White Paper. Updated 15 December 2020.

UK Government (2021) RESIST 2 Counter Disinformation Toolkit. Updated 24 November 2021.


January 2022