Bethany Shiner – written evidence (DAD0013)

 

 

  1. This evidence is submitted in my capacity as Lecturer in Law at Middlesex University, London. I have previously published research on how overlapping areas of the law apply to micro-targeted political campaign material[1] which I hope will address some of the Committee’s questions and provide some background information about the issues associated with micro-targeting. I have responded to the numbered questions below.

 

Summary

  1. Legislative reform must be made with core principles such as fairness, reasonableness and electoral integrity in mind. Reforms must also be compatible with the European Convention on Human Rights (ECHR), which includes the right to freedom of expression (Article 10 ECHR) and the right to freedom of thought (Article 9 ECHR). Furthermore, ad hoc reforms may not address the central issue, which is the widespread deceptive use of data and the circumvention of electoral law and direct marketing regulations to steer people’s democratic decision-making in a context of high disenfranchisement and low political participation.
  2. There is a risk of introducing ‘quick fix’ reforms which actually suppress political expression online. Article 10 ECHR is a qualified right, meaning the law can limit or interfere with the exercise of the right, but the grounds on which the law may do so need to be balanced against the necessity and aim of the law (protecting democratic engagement; limiting the spread of misinformation and disinformation; preserving election integrity) and need to be proportionate to that aim (including how it may be balanced against other rights). As such, reforms that block advertisements based on their content, or that block certain human social media accounts, might not be compatible with Article 10 ECHR. Legislative protection of the democratic process must ensure fairness on behalf of the electorate, not just between political parties and campaign groups.
  3. My primary recommendation is that any initiatives to limit the spread of mis/disinformation must target the method of communication, i.e. limiting the use of bots (total automation) and cyborgs (partial automation), or ensuring that these automated accounts are clearly marked as bots or cyborgs; compiling online databases containing all political advertisements (something Facebook has already initiated to a limited extent, and a version of which is set out in the Elections and Referendums (Advertising) Bill); monitoring the use of personal data for political purposes much more closely; and/or banning dark advertisements.[2]
  4. In tackling the effects of mis/disinformation, it is helpful to think about the right to information, which is contained in Article 10 ECHR alongside the right to freedom of expression. The right to information could extend to the lack of transparency surrounding the source of money for political groups and campaigns (such as the alleged dark money for DUP advertisements placed in the London Metro ahead of the UK-EU Referendum). It may also apply to the content of advertisements: for example, if contradictory policy pledges are sent to different people, this throws into question how the electorate is meant to vote in the free expression of their opinion based on the merits of the candidates or parties standing for election.
  5. In short, these forms of communication undermine and erode the individual’s ability to exercise their own judgement when basing democratic decisions on publicly available information. There is a balancing act: the right of the senders of material to promote their political policies and ideology must be weighed against the necessity for the electorate to be able to rely on the integrity of the method and content of the material sent to them, which is meant to inform their democratic decision-making.

 

Responding to the Committee’s questions

Question 4. Would greater transparency in the online spending and campaigning of political groups improve the electoral process in the UK by ensuring accountability, and if so what should this transparency look like?

  1. Yes. The regulation of the different caps on spending for local and national elections must be strengthened to avoid the funnelling of money into swing seats, which should fall under local spending but might be paid for out of the national budget.[3] The Elections and Referendums (Advertising) Bill sought to address this and offers an example of how the law may be amended.
  2. Rules on foreign donations must be tightened, as the UK-EU referendum campaign revealed. Since then, there have been concerns that foreign and automated donations have been made through PayPal,[4] making it possible to escape the prohibition on accepting donations in excess of £500 from anyone other than individuals on the electoral register, or political parties, companies, trade unions or similar organisations registered in the UK.
  3. Spending requirements should also cover general digital campaigns: for example, if the UK were to witness political parties or campaign groups using platforms like WhatsApp, this should also fall within the category of spending that requires itemisation.
  4. Practically, a new process should aim to make it easier for political parties to track, calculate and declare their spending nationally and locally. Spending returns should remain accessible online.
  5. The Electoral Commission may also reconsider the rules on large private donations. Although this aspect of the UK’s electoral spending laws has not been challenged in the European Court of Human Rights it is worth considering whether the law is still compatible with Article 3 of Protocol 1 ECHR.[5] There are recent examples of unfairness surrounding the financing and operation of the UK-EU referendum campaign including large donations by private individuals and difficulties in identifying the true source of donations.[6]

 

Question 5. What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?[7]

  1. Online targeted advertising, known as micro-targeting, is regulated by overlapping frameworks including data protection law, direct marketing regulations, electoral law and, indirectly, campaign spending limits.
  2. There is an assumption that micro-targeted advertisements influence the way people vote, i.e. the party they vote for. There is no evidence of this, and it would be incredibly difficult to measure, as voting is a necessarily private act and people have difficulty explaining their decisions in rational terms. In addition, research shows that the vast majority of people make up their minds about how to vote before the regulated period has begun, meaning political advertisements and communications often serve to reinforce an intention already formed.
  3. However, there is some impact which, I think, operates through voter mobilisation and voter suppression. The effect, therefore, is upon whether people vote, not how. This is greatly influential when efforts to mobilise or suppress voters are married with insights into which regions of the country can swing an election. It is possible for political parties to direct their communication strategy to a narrow and selected group of people who have been categorised as undecided or persuadable and as living within a swing constituency.
  4. The fact that micro-targeted advertisements are not made public means they are harder to challenge and to hold accountable. Research commissioned by the Electoral Commission with members of the public highlights the negative effect of these methods of communication: “If they are going to send one message to me about immigration say and a completely different one to him, it is a dirty game really.”[8]
  5. It is legal for political parties, candidates and campaign groups to use personal data, and micro-targeted advertisements are lawful. The Data Protection Act 2018 recognises the democratic value in using personal data for the purpose of enhancing democratic communication. Section 8(e) provides a lawful basis for processing personal data that is “necessary for the performance of a task carried out in the public interest”, including “an activity that supports or promotes democratic engagement” such as communicating with electors, campaigning activities, and opinion gathering inside and outside election periods.
  6. However, exactly how political parties can rely on section 8(e) needs clarification. ‘Democratic engagement’ provides a very broad basis for the use of data which could be used to legitimise opaque micro-targeting.[9] The data processing associated with the democratic engagement basis often reveals political opinions, which are special category personal data attracting higher protections, through the process of combining freely given information with other data sets, such as the electoral register, commercially available modelled consumer data and publicly available data, to make a prediction about that individual’s lifestyle, habits and political views (an automated process in itself). However, it is unclear whether this inferred data is subjected to the necessary additional safeguards, which include the requirement for there to be a ‘public interest’ justification for the processing of special category data. In some political parties’ privacy notices this justification is cited very generically, seemingly implying that by existing as a political party there is automatically a ‘public interest’ justification.
  7. A phenomenal number of advertisements are disseminated online that all share a similar message but vary in subtle yet important ways. Responses to and interactions with these advertisements are monitored so that they can be altered in a constant iterative process to ensure maximum impact. This means that until the optimum advertisement is identified through this process of iteration, all prior viewings of this content are experimental. The scale of this experimentation must be limited.
  8. An online imprint requirement would provide some accountability as long as the imprint includes: who has paid for the material, who has made the material, and on whose behalf the material is disseminated. There are several limitations to the regulatory impact of imprints: it may still be easy to conceal the true source of material, especially if the source is foreign but the funding or design of the material is funnelled through UK political campaign groups or candidates; and it is possible to circumvent imprint requirements by relying on organic content or content disseminated by supporters (known as ‘influencers’) rather than official parties, candidates or campaign groups.
  9. Therefore, imprints must not be the only reform. There are other possible methods of regulating political advertising which focus on the method of communication and target bots and other forms of automated communication (see paragraph 4 above).
  10. The Government has been urged to produce a statutory code of practice but has not done so. I have previously written on the merits of this.[10]
  11. It should be made easier for legal actions to be brought which challenge these practices. This can be easily achieved by immediately incorporating Article 80(2) GDPR into the Data Protection Act 2018, which would enable any body, organisation or association to lodge a complaint with the supervisory authority, independent of a data subject, if data rights have been infringed as a result of data processing. Currently, such actions can only be brought when data subjects have authorised such bodies to do so, yet individual data subjects may not always realise when their rights have been breached. Such an amendment would allow voter suppression tactics to be challenged, for example.

 

Question 6. To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?

  1. Because WhatsApp messages are encrypted and private, they tend to be beyond the scrutiny of electoral authorities or independent fact-checkers. Content forwarded through the system carries no context about where it originated, but benefits from the trust of coming from a known contact.
  2. WhatsApp has applied its own processes to tackle the spread of mis/disinformation by limiting the number of times the same message can be forwarded to other group chats. Content is nonetheless spread by individuals working for campaigns who send messages out to a large, cultivated network of people, and through automation.[11] The most obvious problem with the use of WhatsApp as a campaigning platform is that it is difficult to hold the creators of such content to account: the content is sealed off behind encrypted messages, and its authors could escape any imprint rules.
  3. It must be noted that the impact of using encrypted messaging and private groups to spread information ahead of an election is more damaging in countries where the majority of people do not have ready access to the internet to verify the credibility of information shared via messages.[12]
  4. That is not to say this is not a problem in the UK. Participants in a focus group commissioned by the Electoral Commission explained that they felt immune, “disillusioned”, “desensitised”, and “bamboozled” by contemporary online campaigning methods even though most people can engage in their own research to verify information shared online.[13]
  5. In addition, when news spreads via closed and encrypted messaging apps, independent verification becomes harder and more prolonged for journalists, who will struggle to identify the source of the information. This may feed into the cycle of cynicism and detachment amongst a frustrated and overwhelmed electorate,[14] alongside increased levels of hyper-engagement because of the emotionalised or provocative nature of the content.

 

Question 10. What might be the best ways of reducing the effects of misinformation on social media platforms?

  1. Build in systems that create breathing space for users online, for example creating pauses before unverified content can be shared.
  2. Ensure the rights of data subjects (e.g. the right to object and the right to explanation) are much more widely and clearly disseminated on social media platforms as well as guidance on how to exercise those rights.

 


[1] ‘Big Data, Small Law: How Gaps in Regulation are Affecting Political Campaigning Methods and the Need for Fundamental Reform’ [2019] 2 Public Law 361-378

[2] Dark advertisements are only seen by the recipient (unless the recipient chooses to share the advertisement).

[3] Tambini et al, “The new political campaigning” (London School of Economics and Political Science, 2017), LSE Media Policy Project Series, Media Policy Brief 19, p.12, http://eprints.lse.ac.uk/71945/ 

[4] The Electoral Commission, ‘Recommendations for The Brexit Party – financial procedures for incoming funds’ https://www.electoralcommission.org.uk/sites/default/files/2019-07/FOI-159-19.pdf, published following a Freedom of Information Act request.

[5] Bowman v UK App No 24839/94 (1998) 26 EHRR 1 was related to another, now outdated, element of campaign spending limits.

[6] For example, Arron Banks is the largest political donor having given £1 million to UKIP and £8 million to Leave.EU and is reportedly continuing to provide donations to Nigel Farage.

[7] The answer provided here also addresses questions 8 and 9. See ‘The applicable legal framework’ section of the enclosed article (‘Big Data, Small Law’).

[8] Political finance regulation and digital campaigning: a public perspective. GfK UK report for qualitative research findings (The Electoral Commission, April 24, 2018) https://www.electoralcommission.org.uk/media/2565

[9] For example, the Conservative party’s privacy policy for recent advertisements and questionnaires relied on the democratic engagement basis for processing personal data. The privacy policy makes clear that data submitted by the individual will be processed alongside other data sets like the electoral register, purchased consumer data and other publicly available data to make predictions. See https://www.conservatives.com/Privacy.

[10] See the reform section in the ‘Big Data, Small Law’ article and evidence given to the All Party Parliamentary Group on electoral campaigning transparency.

[11] Caio Machado and Marco Konopacki, ‘Computational Power: Automated Use of WhatsApp in the Elections – Study on the use of automation tools to boost political campaigns digitally in the 2018 Brazilian elections’ (Medium, October 26, 2018) https://feed.itsrio.org/computational-power-automated-use-of-whatsapp-in-the-elections-59f62b857033

[12] Luca Belli, ‘WhatsApp skewed Brazilian election, proving social media’s danger to democracy’ (The Conversation, December 5, 2018) https://theconversation.com/whatsapp-skewed-brazilian-election-proving-social-medias-danger-to-democracy-106476

[13] Political finance regulation and digital campaigning: a public perspective. GfK UK report for qualitative research findings (The Electoral Commission, April 24, 2018) https://www.electoralcommission.org.uk/media/2565

[14] For example, 71% of people surveyed claim they are avoiding news about Brexit. See: Reuters Institute Digital News Report 2019, https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2019-06/DNR_2019_FINAL_1.pdf (pages 25-27).