Electoral Commission – written evidence (DAD0058)

The Electoral Commission is the independent body which oversees elections and regulates political finance in the UK. We work to promote public confidence in the democratic process and ensure its integrity. This includes using our expertise to make and advocate for changes to our democracy, aiming to improve fairness and transparency.

We have answered the Committee’s questions that are directly related to our regulatory remit or where we have contextual information the Committee may find useful.

1. How has digital technology changed the way that democracy works in the UK and has this been a net positive or negative effect?

Digital technology is leading to fundamental shifts in the way democracy works both in the UK and abroad. Voters in the UK can now register to vote online, which has significantly improved access to this process, and we have recently published the findings from feasibility studies showing how technology could support strengthened registration through better use of public data. Electronic counting of ballots is now used for certain elections in the UK. In some countries voters can cast their ballots online, and adopting such technology has been discussed here in the UK.

But the biggest change to UK democracy in recent years is the rapid increase in the use of digital tools to campaign at elections and referendums. This is evident in the spending reported to us by political parties and campaigners after recent elections. At the 2015 UK Parliamentary general election, campaigners spent an estimated[1] 23.9% of their total advertising spending on digital advertising. That figure rose to 42.8% at the 2017 election.

Campaigns that communicate effectively with voters are essential to well-run elections and referendums. Digital campaigning is a net positive for voters and campaigners. Its lower cost has helped smaller parties and campaigners reach voters. Crowdfunding technology has supported smaller parties and independent candidates to raise money for deposits and campaigning. It allows campaigners to talk to a wider section of voters about the issues that matter to them.  

But there is a negative side to digital campaigning. In 2018 we commissioned research into the public’s understanding of political finance regulation and digital campaigning. Participants said that targeted online messages were helpful when they contained information that was interesting or relevant to them. But they were also concerned about the use of their personal data, about targeted messages spreading false or misleading information, and about who was funding these messages.

Our annual research measuring public attitudes echoed these concerns. It showed that 46% of respondents thought inadequate regulation of political activity on social media was a problem, with 21% thinking it was a serious problem.

2. How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?

There is currently little understanding of how social media companies design their algorithms, apart from statements by the companies themselves. This heightens concern that algorithms prioritise sensational content that may be polarising or inflammatory because users are more likely to click on it.

The companies’ efforts to improve transparency of political content have so far focussed on paid advertising. Campaigners pay for adverts to be targeted at specific audiences or promoted higher in ‘news feeds’ or search results. Voluntary action means these adverts are now labelled as promoted political content, but not all election campaigning is done through paid advertising. Election campaign material can be placed on ‘owned assets’ (such as websites or social media) and “liked” or shared by followers. The companies’ algorithms may promote the “liked” or shared content to others connected to the user. This is known as unpaid organic content.

Some commentators have suggested that an independent body should audit the algorithms to ensure that the companies are acting ethically and responsibly. This would provide more accountability about the way they work. However, the operation of the tech companies’ algorithms affects not just political content and activity, but the entire digital ecosystem. Any algorithmic regulation would need to be holistic.

3. What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?  

Digital literacy skills are becoming increasingly important in our lives. The ability to consider critically any election material we see online is only one aspect of this.

The voters who took part in our public opinion research said they adopted a sceptical attitude to the campaign material they saw online. They recognised that not all information online is true and that not all sources are credible. But it was also clear that they were unlikely to be aware of the extent to which they may be influenced by digital material. For example, some younger voters did not recognise certain issues-based digital content as being political, or as linked to a specific campaign.

Free speech means that regulating all political content online is neither feasible nor desirable, so enhancing digital literacy skills is vital. Some countries, including Canada and Australia, are already taking steps to raise public awareness and develop media literacy to help the public engage with political campaigns.

In its final report on fake news and disinformation, the DCMS Select Committee recommended that the UK Government coordinate a more united strategy to promote digital political literacy. Although the Government did not take up this suggestion in its response, we are currently considering how this work might be taken forward.

4. Would greater transparency in the online spending and campaigning of political groups improve the electoral process in the UK by ensuring accountability, and if so what should this transparency look like?

Electoral law should be updated to reflect modern campaigning techniques. These updates should increase transparency about digital campaigning for voters and for the Commission, and enhance the regulatory powers available to us to enforce those rules. This would give voters more information about how campaigners are trying to influence their votes and help to ensure compliance. These changes need to be implemented alongside the UK Government’s wider online harms agenda and further measures taken by social media companies.

5. What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?

It is important to recognise the distinction between political campaigning and election campaigning. Political campaigning can happen at any time with a wide variety of aims, including influencing public opinion or the decisions of politicians. Election campaigning is a sub-set of this, covering campaigns that are intended to influence how people vote. The spending rules only apply during a period of time ahead of a vote, known as the ‘regulated period’. We are responsible for ensuring compliance with these rules, but do not regulate wider political campaigning.

Some commentators have argued that political advertising should be regulated on a year-round basis. Such a move would require careful thought and scrutiny, in relation to free speech as well as the practicalities of enforcing the new rules.

Digital campaigning

Digital campaigning helps voters to receive campaign messages they are interested in, but there is a risk of anonymous UK campaigners and foreign entities using social media to try to influence voters. Additional regulation of election advertising is needed to make it more transparent. However, it needs to be proportionate so that it does not stifle open political debate. In our 2018 Digital Campaigning report, we made a number of recommendations to improve the regulation of digital campaigning. We set out the main recommendations from that report below, along with any developments on them since its release.


Online campaign materials must be required to have an imprint stating who has created them, as is the case for printed material. An imprint is a short piece of text on election material that identifies who is behind it. We are pleased that the UK Government has committed to implementing a digital imprint requirement and to publish legislative proposals by the end of the year, and that a digital imprint requirement also features in the Referendums (Scotland) Bill.

Social media companies

The social media companies should continue to work with us to improve their political advertising policies, to label election and referendum adverts on their platforms, and to ensure that their political advert databases comply with UK electoral law.

At the 2019 European Parliament election, Facebook, Twitter and Google voluntarily published advert libraries/reports for election advertising in accordance with the EU Code of Practice on disinformation. Facebook’s operates on a year-round basis. These advert libraries were a positive first step. We used them for our monitoring work and they made it easier to identify who was paying to advertise.

These companies included new advert labelling or ‘Paid for by’ disclaimers for some political advertising on their platforms and channels during the European Parliamentary election campaign. This made it easier to identify who was spending money to influence voters.

Social media companies that operate in the UK should continue to develop their advert policies and libraries, and ensure that they have them in place for the next set of national level elections or any future referendums in the UK, and thereafter. They should ensure that their election advertising policies fit the definitions of election campaigning in electoral law. This may need to become a legal requirement.

There should be a formal requirement for social media companies and other digital platforms to run these kinds of libraries and reports, which could be enforced as appropriate, possibly through the UK Government’s proposed Online Harms regulator. The proposed online harms regime should set common standards and obligations for what the social media companies publish, including what is defined as political campaigning, and what information is required about how adverts are targeted.

Other recommendations

The Commission’s investigatory and sanctioning powers should be strengthened to deter campaigners from breaking the rules:

We need further powers to obtain information from others, beyond those we regulate, where it is in the public interest to do so. That would allow us to deal with compliance issues in real time, and to compel, for example, social media companies to give us information about the source of an online campaign.

A higher maximum fine should be available for the most serious offences to act as an effective deterrent. As a benchmark, a higher fine should be in line with the powers available to other UK regulators such as the ICO, which under the Privacy and Electronic Communications Regulations is able to impose fines of up to £500,000.

6. To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?

If campaigners are spending money on encrypted messages and election or referendum advertising in private groups, that spending must be declared to the Commission. Those messages should also appear in the social media companies’ advert libraries. However, we do not believe it would be appropriate to regulate private conversations about politics in messaging apps or private social media groups, any more than it would be our role to regulate private face-to-face political conversations. Consideration should be given to exempting individuals who express their personal political opinions online from any new rules requiring imprints on digital material.

7. What are the positive or negative effects of anonymity on online democratic discourse?

Transparency is key to a healthy democracy. Anything that undermines transparency, such as anonymous campaigning, risks harm to the integrity of elections and voters’ confidence in the political system. We have included relevant recommendations, such as on digital imprints, elsewhere in this response.

Transparency around donations to political parties and campaigners is important, including the continued application of the permissibility rules. Some online services or cryptocurrencies – such as Facebook’s proposed Libra – may not give campaigners a way of checking permissibility. If campaigners cannot obtain enough information to check donations given in this way, they should not use the service.

8. To what extent does social media negatively shape public debate, either through encouraging polarisation or through abuse deterring individuals from engaging in public life?

The Committee on Standards in Public Life has looked at the increasing intimidation of public figures and recommended that the Government consult on a new offence in electoral law of intimidating parliamentary candidates and party campaigners.

In our response to the resulting consultation, we agreed that allowing electoral sanctions to be applied as well as criminal sanctions could act as a strengthened deterrent against intimidation. We said that the Government should consider whether increasing the maximum sentence for serious offences relating to elections, as recommended by the UK’s Law Commissions, would also act as a strengthened deterrent against intimidation.

The UK Government has since announced its plans for a new electoral offence, with sanctions including a ban on standing for public office for five years. We will support work to ensure this offence is workable and proportionate.

9. To what extent do you think that there are those who are using social media to attempt to undermine trust in the democratic process and in democratic institutions; and what might be the best ways to combat this and strengthen faith in democracy?

There is a risk of bad actors using social media to try to disrupt elections and affect their outcome by depressing turnout. Voters in various countries have received messages telling them that polling day has changed. The major social media companies have created special teams ahead of electoral events to combat such misinformation, working in cooperation with electoral bodies such as the Commission.

10. What might be the best ways of reducing the effects of misinformation on social media platforms?

Improving digital literacy is an urgent priority for reducing the effects of misinformation, as noted in our answer to question 3. Imprints on digital campaign material will also help. They will allow voters to see who is trying to influence them so that they can make informed decisions about whether to trust particular sources.

11. How could the moderation processes of large technology companies be improved to better tackle abuse and misinformation, as well as helping public debate flourish?

Facebook, Google and Twitter have introduced tools for users to report adverts that should have been labelled as political content but were not. In future, the companies could make it clearer to users how to do this.

12. How could the Government better support the positive work of civil society organisations using technology to facilitate engagement with democratic processes?

Before elections and referendums, the Electoral Commission runs its own voter registration campaigns. We also partner with civil society organisations which want to promote voter registration to specific demographics. We would be happy to share our experiences and knowledge with the Committee and the UK’s governments.

13. How can elected representatives use technology to engage with the public in local and national decision making?  What can Parliament and Government do to better use technology to support democratic engagement and ensure the efficacy of the democratic process?

Being registered to vote is a key foundation of democratic engagement. We recently published the findings of feasibility studies about reforming the process for registering voters. They examined the potential for using technology to improve the system, and found that these reforms were feasible and could be implemented without radically changing the way electoral registration works in the UK.

What positive examples are there of technology being used to enhance democracy?

Please see our answer to the first question.



[1] It is worth noting that these percentages do not show the full picture of digital advertising at elections. They are based only on spending data for the most well-known digital platforms, as reported to us by registered campaigners.