Written evidence submitted by Dorje Brody
RE: Disinformation and ‘fake news’: Interim Report (HC363, 29 July 2018)
I am writing in connection with the above report, to which I would like to add some comments that might be of interest to members of the Committee. My name is Dorje Brody. I am a Professor of Mathematics at the University of Surrey, and I specialize in research into applications of communication theory to various areas of the physical and social sciences. Together with a colleague, Dr David Meier, I have recently carried out research on modelling 'fake news', and I would like to communicate to you some of the findings and implications of this work.
As you will no doubt be aware, over the past two years many research outputs have emerged, primarily from computer scientists, on the detection, prevention, and retrospective analysis of fake news. While this is important, it is equally important to model fake news, with a view to scenario analysis, impact studies, and strategic planning. My own research is concerned with this latter topic. In particular, in collaboration with Dr Meier, I have introduced a precise scientific definition of fake news, along with a model for studying its impact, which has allowed us to initiate, for the first time, a comprehensive scheme for scenario analysis of that impact.
Our main preliminary finding, which we believe may be of relevance to policy making, is broadly an optimistic one: namely, that a degree of awareness and sophistication is sufficient to mitigate much of the impact of fake news, even if one cannot say precisely which pieces of news are fake. Our paper reporting these findings is publicly available at https://arxiv.org/pdf/1809.00964.pdf. While the paper is a technical one aimed at applied mathematicians, to grasp one of its main messages it suffices to consult Figure 3, which shows the impact of fake news in an election scenario, compared with how voters would have behaved had a sophisticated filter been in place to mitigate the risks. In fact, the overall efficiency with which one could in principle mitigate the impact of fake news came as a surprise to us. However, the degree of sophistication required probably goes beyond what the general public is capable of handling, and this has implications for policy making.
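For readers who wish to see the flavour of the approach, the following is a schematic sketch only; the precise formulation, in particular the treatment of the disinformation terms, is set out in the paper cited above. The idea is that the information reaching the electorate about a quantity of interest X (for instance, the suitability of a candidate) arrives as a signal obscured by noise and, when fake news is present, by deliberately misleading terms; a sophisticated observer forms the best estimate of X by filtering, that is, by conditioning on everything received so far:
\[
\xi_t \;=\; \sigma\, t\, X \;+\; B_t \;+\; (\text{disinformation terms}),
\qquad
\hat{X}_t \;=\; \mathbb{E}\bigl[\, X \mid \xi_s,\; 0 \le s \le t \,\bigr],
\]
where \(\sigma\) measures the rate at which genuine information is revealed and \(B_t\) represents ordinary noise.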
Scientifically speaking, the phenomena surrounding the dissemination of deliberate disinformation (i.e. fake news) can best be understood in the context of communication theory. New developments in society generate new types of problems for analysis in the mathematical sciences, and our work has highlighted the emergence of some of the new categories of challenges that must be addressed if society is to build effective strategies to combat the forces of fake news. This area of research, which is currently under-represented in the UK, is highly mathematical in character. As such, an important role that policy makers and legislators can play here is to encourage the UK research funders in the mathematical sciences (e.g., EPSRC) to support research in this direction. Otherwise, the UK will not be well placed to tackle these problems, in particular the major challenge of state-sponsored disinformation.
One of the Recommendations in your report concerns support for research into the identification of fake news via improved 'fact-checking'. This is undoubtedly extremely important, but I would like to stress that fact-checking is only part of the story. Identifying a bogus story tells one very little about how such a story might affect a democratic process, or about the wider potential impact of disinformation; whereas, for strategic planning, one requires detailed information concerning the impact of disinformation on the everyday operations of a democratic society. Put differently, as it stands the Recommendation in this otherwise well-written and carefully thought-out report strikes me as somewhat defensive: to combat the issues we face, a more proactive approach to scientific research seems appropriate.
To illustrate this point, it may be helpful to compare the situation with the efforts made in climate science to tackle global warming. Certainly, if a catastrophic weather event occurs, one has to build a defence to minimize the damage (the analogue of fact-checking). However, it is equally important to conduct a comprehensive scenario analysis for the purpose of policy making (e.g., setting targets for CO2 reduction). Indeed, with the model at hand, we are now able to ask quantitative questions such as, 'What is the likelihood of flipping the election outcome, given that false stories are released with such and such frequency?' A toy illustration of this kind of calculation is sketched below.
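By way of illustration only, the short simulation below shows the shape such a scenario calculation can take. It is a deliberately simplified toy, not the model developed in our paper: all names, parameters, and numerical values in it are hypothetical, chosen merely to show how one might estimate the probability that fabricated stories flip an election outcome, and how awareness of the overall rate and slant of fabrication, without knowing which individual stories are fake, reduces that probability.

# Toy illustration only: NOT the model of the paper cited above.
# All parameter names and values are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def simulate_election(n_voters=1001, n_stories=60, fake_rate=0.2,
                      true_margin=0.1, fake_shift=1.0, noise=1.0,
                      aware=False):
    """Return True if fake news flips the outcome in one simulated election."""
    # A fraction fake_rate of circulating stories are fabricated and are
    # centred on -fake_shift rather than on the true margin.
    is_fake = rng.random(n_stories) < fake_rate
    centres = np.where(is_fake, -fake_shift, true_margin)
    stories = centres + noise * rng.standard_normal(n_stories)

    votes_for_leader = 0
    for _ in range(n_voters):
        # Each voter sees a random subset of stories plus some private noise.
        belief = rng.choice(stories, size=15, replace=False).mean()
        belief += 0.3 * rng.standard_normal()
        if aware:
            # An 'aware' voter cannot tell which stories are fake, but knows
            # the overall rate and slant of fabrication and de-biases:
            # E[story] = (1 - fake_rate) * margin - fake_rate * fake_shift.
            belief = (belief + fake_rate * fake_shift) / (1.0 - fake_rate)
        votes_for_leader += belief > 0.0

    # With true_margin > 0 the leading candidate ought to win; a flip is a loss.
    return votes_for_leader <= n_voters // 2

def flip_probability(n_runs=300, **kwargs):
    return float(np.mean([simulate_election(**kwargs) for _ in range(n_runs)]))

if __name__ == "__main__":
    print("P(outcome flipped), unaware voters:", flip_probability(aware=False))
    print("P(outcome flipped), aware voters:  ", flip_probability(aware=True))

Varying the frequency and slant of the fabricated stories in such a simulation is precisely the kind of scenario analysis referred to above; the paper cited earlier carries out this analysis within a proper filtering framework rather than by the crude de-biasing used here.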
With these considerations in mind, the main points of my letter can be summarized as follows:
1. There are researchers, both within and outside UK higher education institutions, who have studied various aspects of fake news and who may be able to assist the UK Government in building its strategy for addressing the problem.
2. Preliminary scenario analysis shows that the dangers of fake news can be mitigated to a significant extent, but only by methods requiring a rather high level of sophistication, demanding further research developments in the mathematical and data sciences alike.
3. Research in this area is currently under-represented in the UK. Given the importance of the problem, the urgency of the current environment, and the likelihood of future escalation, it seems desirable to initiate a drive to encourage targeted mathematical research in this area in the UK.
4. In particular, the type of research to be promoted should not be confined to the 'defensive' mode of fact-checking: it should also encompass scenario analysis, impact studies, and strategic planning.
Should any clarification or further elucidation be required on the points raised above, I would of course be happy to assist.
Sincerely yours,
Dorje Brody
Professor of Mathematics
University of Surrey