Written evidence submitted by Trilateral Research

Trilateral Research, a London-based SME, has participated in more than 100 EU-funded projects. It is currently coordinating ATHENA, a three-year Horizon Europe project on foreign information manipulation and interference (FIMI), with 14 partners from 11 countries, which is conducting 30 case studies of FIMI.

In view of Trilateral’s interest in curtailing disinformation generally and FIMI in particular, we are pleased to provide our comments and recommendations in response to the Joint Committee on the National Security Strategy’s inquiry into Defending Democracy.

What are the actual and perceived threats to the UK’s democracy, and from where do those threats originate?

2024 has been called the global year of elections, with nearly 70 countries going to the polls and nearly 4 billion people eligible to vote. Starting with Bangladesh on 7 January, the polls include six of the world’s 10 most populous nations: India, the United States, Indonesia, Pakistan, Russia and Mexico[1]. The sheer volume of electoral activity is a testament to the progress made in advancing political freedom and democracy worldwide[2]. Nevertheless, many of these elections are fraught with threats, among them the war in Ukraine, authoritarian regimes and widespread information manipulation.

Although the date of the next UK general election has not yet been announced, Prime Minister Rishi Sunak has said it will be held in the second half of 2024. The UK faces a combination of old and new threats to its democracy and to the upcoming election, among which are the following.

First, the UK’s domestic political parties and politicians have a responsibility to maintain the decorum and authenticity of political discourse. Information manipulation used by political parties to discredit opposition, amplify falsehoods to a wider audience or gain public support poses a serious risk. A recent example is an AI-generated image of Donald Trump surrounded by several smiling black men, an entirely fabricated scene intended to convey the impression that Trump is a friend of African Americans and that they should vote Republican[3]. This illustrates how political parties or their supporters use disinformation tactics targeting particular communities. Such tactics to set political narratives and discourse are used both during and beyond election cycles, and hence must be addressed as an ongoing threat. We recommend that political parties be held accountable, by way of fines or other sanctions, for spreading manipulated information to deceive voters. In addition, we recommend that politicians and political parties be transparent about their use of generative AI content for advertising and campaigning, and that this be communicated to citizens in simple and accessible language.

Second, as noted in the UK National Security Bill fact sheet dated 12 February 2024, foreign interference can pose a serious threat to the UK electoral process by exerting undue influence or manipulating information. Foreign information manipulation can influence the outcome of an election, advance the interests of a foreign government and shape public perception of the state. Information manipulation can be covert (e.g., through fake accounts) or overt (e.g., through state-backed media)[4]. The UK has repeatedly raised concerns about countries such as Russia, whose security services have engaged in sustained cyber campaigns to meddle in the country’s next general election[5]. We recommend that the UK, through its global coalition with the USA and Canada, work with other established and emerging democracies to set out guidelines and regulations for attributing foreign interference. In addition, we suggest the development of robust technological countermeasures to track, trace and report foreign interference in the online space. The EU-funded ATHENA project[6] is contributing to such efforts. It is creating a FIMI detection platform comprising a toolbox for analysis, a knowledge graph to store the data, and a dynamic dashboard that enables policymakers to monitor FIMI activities and devise effective counterstrategies.
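To illustrate how a knowledge graph can store FIMI observations for later analysis, the following minimal Python sketch records observations as subject–predicate–object triples and supports wildcard queries. The class name, field names and example data are illustrative assumptions for this submission, not the ATHENA project’s actual design.

```python
from collections import defaultdict

# Illustrative sketch only: a tiny in-memory knowledge graph of FIMI
# observations stored as (subject, predicate, object) triples.
# All names and example data are assumptions, not ATHENA's real schema.
class FimiKnowledgeGraph:
    def __init__(self):
        self.triples = set()
        self.by_subject = defaultdict(set)  # index for fast per-subject lookup

    def add(self, subject, predicate, obj):
        """Record one observation, e.g. ('campaign_42', 'uses_tactic', 'fake_accounts')."""
        triple = (subject, predicate, obj)
        self.triples.add(triple)
        self.by_subject[subject].add(triple)

    def query(self, subject=None, predicate=None, obj=None):
        """Return triples matching the given fields; None acts as a wildcard."""
        pool = self.by_subject[subject] if subject is not None else self.triples
        return [
            (s, p, o)
            for (s, p, o) in pool
            if (predicate is None or p == predicate) and (obj is None or o == obj)
        ]

kg = FimiKnowledgeGraph()
kg.add("campaign_42", "uses_tactic", "fake_accounts")
kg.add("campaign_42", "targets", "UK_election")
kg.add("campaign_7", "uses_tactic", "state_media")
print(kg.query(predicate="uses_tactic"))  # all recorded tactic observations
```

A dashboard of the kind described above could then be driven by such queries, e.g. listing every campaign recorded as targeting a given election.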

Third, the abuse of social media poses a huge threat to electoral processes. In his 2020 book Lie Machines[7], Philip Howard discusses how social media sites are complicit mechanisms for producing, distributing and marketing “political lies”. Social media accelerates the spread of manipulated information. The 2016 US presidential election, which installed Donald Trump as President, and the UK’s referendum decision to leave the European Union (‘Brexit’) the same year were watershed moments for public perception of the online platforms’ role in elections. A recent CNN report[8] found enterprises in Vietnam that help clients artificially boost online traffic and social media engagement in the hope of manipulating algorithms and user perceptions. Known as “click farms” or “troll farms”[9], these operations exploit low labour and electricity costs and thousands of connected devices to amplify political messages and spread disinformation during elections. Political agents use them to influence elections by flooding social media platforms with messages from fake accounts that mimic genuine information and deploy effective disinformation tactics to manipulate voters. Even UK society, with its comparatively high-quality media, is vulnerable to information manipulation by click farms and troll farms. The electorate urgently needs easy access to credible, accurate and easily understandable information. We recommend measures to educate and empower citizens and help them discern when they are consuming manipulated information or content amplified via troll farms or click farms. We recommend developing accessible questionnaires, like those in the ATHENA project, for public use to help individuals recognise encounters with FIMI in their daily lives.
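One crude but instructive signal of the click-farm amplification described above is many distinct accounts posting near-identical text. The sketch below flags such coordinated repetition; the normalisation step and the threshold are illustrative assumptions, not a description of any platform’s actual detection system.

```python
from collections import defaultdict

# Illustrative sketch, not a production detector: flag messages whose
# normalised text is posted by many distinct accounts, a crude signal of
# click-farm / troll-farm amplification. Threshold and normalisation are
# assumptions for the example.
def normalise(text):
    """Lowercase and collapse whitespace so trivially edited copies match."""
    return " ".join(text.lower().split())

def flag_amplified(posts, min_accounts=3):
    """posts: iterable of (account_id, text) pairs.
    Returns the set of normalised texts pushed by at least
    `min_accounts` distinct accounts."""
    accounts_per_text = defaultdict(set)
    for account, text in posts:
        accounts_per_text[normalise(text)].add(account)
    return {t for t, accs in accounts_per_text.items() if len(accs) >= min_accounts}

posts = [
    ("acct1", "Vote NO on Sunday!"),
    ("acct2", "vote no on  sunday!"),
    ("acct3", "VOTE NO ON SUNDAY!"),
    ("acct4", "Lovely weather today"),
]
print(flag_amplified(posts))  # flags only the coordinated slogan
```

Real coordinated-behaviour detection would also weigh posting times, account ages and network structure; the point here is simply that coordinated amplification leaves measurable traces.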


How secure and resilient are elections across the UK, when it comes to foreign interference?

Ranging from overt acts of war such as the Russian invasion of Ukraine to sophisticated information manipulation via fake news and disinformation campaigns, foreign interference can be hugely detrimental to a country’s democratic processes. The UK has been vocal about alleged foreign interference from Russia[10] and China[11], and understands that no country is immune to foreign interference via information manipulation.

The UK’s legal framework for countering foreign interference rests on the Foreign Influence Registration Scheme (FIRS), introduced by amendment to the National Security Bill (the Bill). The FIRS has a twofold purpose: first, to strengthen the resilience of the UK political system against covert foreign influence and, second, to provide greater assurance around the activities of specified foreign powers or entities[12]. Under the FIRS, organisations and individuals carrying out political influence activities on behalf of a foreign state are required to register under the scheme or face a criminal sanction of up to two years’ imprisonment, a fine or both[13].

Foreign interference has also been designated a priority offence in the Online Safety Bill, requiring services to assess the risk of this activity taking place on their service and to put in place proportionate measures to effectively mitigate and manage the risks[14]. In the Online Safety Bill, foreign interference includes foreign information manipulation and interference (FIMI). The ATHENA[15] project is analysing, in 30 case studies, the impact of FIMI on democracy, the attackers’ tactics, techniques and procedures (TTPs), and the behavioural and societal effects of FIMI. It is also developing novel countermeasures to combat FIMI. Based on initial findings of ATHENA, and recognising that FIMI campaigns can negatively impact societal values and undermine political processes, we recommend that laws and policies be bolstered with advanced AI-informed countermeasures, media literacy programmes and constant engagement with citizens.

The UK is part of a global coalition of democracies combating disinformation campaigns by foreign governments[16]. The US, UK and Canada signed a framework to counter foreign state manipulation, with the aim of addressing disinformation as a national security threat and coordinating government and civil society responses. The coalition encourages information sharing and joint data analysis. Its success will rely on practices of information and data sharing, which is a thorny area. Based on Trilateral’s experience in coordinating CC-DRIVER[17], an EU-funded project on the human and technical drivers of cybercrime, we recommend that the UK strengthen rules around data trusts and data sharing in compliance with the General Data Protection Regulation (GDPR). Further, in the long term, the UK and other governments could develop open-source repositories with a virtual data lab and collaborate with researchers, civil society organisations, advocacy organisations, fact-checkers and citizen bodies to make the analysis of data accessible and understandable. Furthermore, there should be provisions to make this information available across various media, both online and offline.

In addition to forging the above alliances and promulgating new laws, we strongly recommend building citizen resilience to manipulated information and FIMI in order to defend democracy. It is important that citizens have the skills to identify manipulated information and to recognise how it might be attempting to sway their voting intentions.

What role are emerging technologies, such as generative AI, expected to play in upcoming elections?

Emerging technologies, including generative AI, have brought unprecedented advancements that have revolutionised the way information is created, consumed and manipulated. Their sophistication blurs the lines between fact and fabrication, truth and falsehood, with profound consequences for democratic processes. A few examples indicate that emerging technologies pose a systemic risk to electoral processes: the Cambridge Analytica scandal[18], the manipulation of American voters by the Russian Internet Research Agency (IRA) in the 2016 elections[19], the deepfake in Slovakia’s recent pre-election campaign, and the 2022 national election in Brazil, where malicious AI-generated deepfakes, with fabricated images and videos of leading candidates in scandalous situations, were used to undermine the election[20].

Emerging technologies allow for digital personalisation, in which users are targeted with content tailored to their interests and sensitivities[21]. With large amounts of user data readily available, it is easy to generate profiles of personal likes, dislikes, ideology and psychology and to use them for micro-targeted messaging that manipulates large sections of the population to influence elections. This was particularly evident in the 2016 US elections, where social media content with racial undertones, in advertisements, memes and tweets, targeted African Americans with an eye toward generating resentment against minorities, co-opting participation in protest behaviour and even convincing individuals to sit out the elections[22].

In the Indian parliamentary elections of 2019, natural language processing (NLP) capabilities were deployed to manipulate political debate at scale[23]. Competing AI systems were specifically programmed to generate disinformation, predominantly on Facebook, tailored to different regions and languages, and were used to play to regional biases, particularly the Hindu-Muslim communal divide. Generative AI content poses a major risk for the 2024 Indian elections, as generative AI tools have made it cheap and easy to spread disinformation that can mislead voters and potentially influence the outcome.

The threats posed by generative AI content are manifold[24].

Thus, generative AI poses the risk of degrading the overall information environment[25]; it can be “used to target existing vulnerabilities in election operations or voter engagement by scaling tried and tested interference playbooks”[26].

We recommend a multi-pronged approach to combating the challenges posed by generative AI.

16 March 2024

[1] https://www.aljazeera.com/news/2024/1/4/the-year-of-elections-is-2024-democracys-biggest-test-ever

[2] https://thehill.com/opinion/international/4399987-the-free-world-should-celebrate-2024-as-a-landmark-year-for-democracy/

[3] https://www.bbc.co.uk/news/world-us-canada-68440150

[4] https://www.iri.org/resources/combating-information-manipulation-a-playbook-for-elections-and-beyond/

[5] https://www.aljazeera.com/news/2023/12/7/uk-accuses-russia-of-attempted-election-interference-through-cyberattacks

[6] ATHENA project fact sheet, HORIZON, CORDIS, European Commission (europa.eu).

[7] Howard, Philip N. Lie machines: How to save democracy from troll armies, deceitful robots, junk news operations, and political operatives. Yale University Press, 2020.

[8] Photographer steps inside Vietnam’s shadowy ‘click farms’ | CNN

[9] https://mpra.ub.uni-muenchen.de/109634/

[10] https://www.theguardian.com/politics/2023/dec/07/russian-spies-targeting-uk-mps-and-media-with-cyber-interference

[11] https://www.bbc.co.uk/news/uk-politics-66780515

[12] Draft guidance on the Foreign Influence Registration Scheme (accessible) - GOV.UK (www.gov.uk)

[13] https://www.politico.eu/article/uk-narrow-scope-political-influence-criticism-national-security-bill-foreign-influence-registration-scheme/

[14] https://www.ofcom.org.uk/research-and-data/online-research/assessing-the-risk-of-foreign-influence-in-uk-search-results

[15] ATHENA project fact sheet, HORIZON, CORDIS, European Commission (europa.eu).

[16] US leading global alliance to counter foreign government disinformation | Cyberwar | The Guardian

[17] Home | CC-DRIVER (ccdriver-h2020.com)

[18] https://www.theguardian.com/news/series/cambridge-analytica-files

[19] https://time.com/5565991/russia-influence-2016-election/

[20] https://www.france24.com/en/live-news/20240308-brazil-seeks-to-curb-ai-deepfakes-as-key-elections-loom

[21] https://www.brookings.edu/wp-content/uploads/2020/06/The-role-of-technology-in-online-misinformation.pdf

[22] https://www.brookings.edu/wp-content/uploads/2020/06/The-role-of-technology-in-online-misinformation.pdf

[23] https://edam.org.tr/en/cyber-governance-digital-democracy/the-role-of-technology-new-methods-of-information-manipulation-and-disinformation

[24] https://www.brookings.edu/articles/the-impact-of-generative-ai-in-a-global-election-year/

[25] https://www.gsb.stanford.edu/sites/default/files/publication/pdfs/white-paper-2023-ai-and-elections-best-practices_0.pdf

[26] https://protectdemocracy.org/work/generative-ai-make-election-threats-worse/

[27] Preparing for Generative AI in the 2024 Election: Recommendations and Best Practices Based on Academic Research | Stanford Graduate School of Business