Submission from Emma Goodman, Professor Sonia Livingstone and Professor Robin Mansell on behalf of LSE’s Truth, Trust and Technology project (T3), 14 October 2019.
Summary
1. We have responded below to the questions most relevant to our work in this area, highlighting the key points that we hope the committee will take into consideration.
How has digital technology changed the way that democracy works in the UK and has this been a net positive or negative effect?
2. Although it is difficult to assess the net effect because digital technologies are so pervasive and fast-changing, it is possible to signal areas in which there are serious concerns.
3. What is clear is that we should be wary of nostalgia for a perfect bygone era that never actually existed. As noted by Alice Thwaite (2019) “Information environments have never lived up to the democratic ideal, and voters have never been good at choosing governments that reflect their interests.”[1]
4. Digital technology has significant potential to change the way that the voting public understands political issues.
5. For those equipped with sufficient critical digital and media literacy knowledge and skills, these changes represent positive developments that offer enhanced opportunities to participate meaningfully in a democratic society. However, digital technology can also exacerbate existing inequalities in education and status, and many people who lack adequate knowledge in this area risk relying on biased, opaque or misleading information when making democratic decisions.
6. Looking specifically at elections, the main goal of electoral law in the UK is to ensure a level playing field by capping spending, enforcing tight control of political advertising on broadcast media, ensuring transparency in campaigning and preventing outside interference. Digital technology has the potential to undermine these efforts in various ways.
7. Based on our work in this area, we concluded that the information crisis is systemic and requires a long-term, coordinated institutional response. Multiple actions are urgently needed to protect the interests of individual citizens and to safeguard democracy in the UK.
How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?
8. The design of algorithms, coupled with the structure and business model of social media companies, does shape and influence online debate: in the 'attention economy', social media tends to favour the emotive and sensational over the factual, the negative over the positive, and the outrageous over the mundane.[8] Recommendation algorithms readily send users down paths that expose them to more of the same views, often becoming increasingly extreme.[9] People with extreme fringe viewpoints, such as those opposed to vaccination, are thereby enabled to have a disproportionately loud voice.
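To illustrate the dynamic described above, the sketch below shows a deliberately simplified, hypothetical engagement-based ranking. The field names and weights are our own illustrative assumptions, not a description of any platform's actual system; the point is that content which provokes reactions rises to the top irrespective of its accuracy.

```python
# Hypothetical, simplified sketch of engagement-optimised ranking.
# All names and weights are illustrative assumptions, not any platform's real algorithm.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int


def engagement_score(post: Post) -> float:
    # Reactions that generate further exposure (comments, shares) are weighted
    # most heavily; the accuracy of the content plays no part in the score.
    return post.likes + 3 * post.comments + 5 * post.shares


def rank_feed(posts: list[Post]) -> list[Post]:
    # Emotive or outrageous material that attracts reactions is surfaced first,
    # which is how it can gain disproportionate reach.
    return sorted(posts, key=engagement_score, reverse=True)
```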
9. Social media companies are also in a position – should they wish – to offer different terms and services to different political (and other) actors, and even to deny access to certain individuals or organisations. They could, for example, make it easier for a political party with which their business or ideological interests align to reach its supporters, or harder for a party with which they disagree.
10. Far greater transparency and accountability in the design and function of algorithms are essential, even though they are challenging to implement and guarantee in private companies with proprietary technology operating in a competitive environment. Social media and other tech companies were not designed to play a significant role in the public sphere. Their codes of practice are not designed to make their algorithms transparent, and their proprietary algorithms lack independent oversight.
11. The LSE Truth, Trust and Technology Commission (T3) report[10] calls for a regulator to help to safeguard citizens' rights while ensuring an accountability framework is in place. Its mandate would include interventions such as ensuring that credible content is up-ranked and warning against misleading content. The regulator would enhance transparency by requiring platforms to report on how content curation judgements are made and by making this information public. This would enable citizens and other stakeholders to know who is making decisions about online moderation and curation (and on what basis), and whether those decisions are timely and effective.
12. Transparency in itself does not equal accountability, particularly in an area where the technology is so complex that even those who work with it struggle to understand it. As our colleague Alison Powell has argued, increasing the 'explainability' of algorithms could pave the way for more accountability by increasing public understanding of how they make particular decisions in particular contexts.[11]
13. Researchers must be given more access to data on the algorithms used by major tech companies. There is a paucity of reliable data on what is happening on the platforms, and a methodological challenge in analysing what data are available. The academic community currently lacks access to the data held by platforms that are essential for undertaking research that can help to hold them to account.[12] Evidence, underpinned by independent research, is crucial for examining biases in the platforms' technical and human operations.
What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?
14. There is clear evidence that greater education for media literacy has a direct impact on improving people's ability to access and judge information.[13] With growing evidence of the damage caused by misinformation and disinformation, a new and increased commitment to media literacy is needed that addresses the context as well as the symptoms[14] – a media literacy that is fit for confronting the informational complexities facing citizens of the early 21st century.
15. Our T3 report calls for the Government to mobilise an urgent, integrated, new programme in media literacy, both for children in schools – for example, a compulsory media literacy module in citizenship classes – and for adults in further and vocational education, as well as for parents, teachers and the children's workforce. This could be funded by a digital platform levy as part of the work of the regulator, and should include digital media literacy training for politicians. (Any contributions from platforms to media literacy work should be channelled via an independent organisation, rather than communicated directly from platforms to the public, and should be independently evaluated.)
16. Media literacy messages need to be based on a reliable set of principles that are understood by news providers, educators and platforms, and they need to be implemented effectively and consistently over time in an unstable world. Ofcom’s media literacy research enterprise should be tasked with developing an evaluation toolkit for all media literacy initiatives to use, and Ofcom should also publicise the results and draw out the lessons for future initiatives.
17. In schools, media literacy should be the fourth pillar of education alongside reading, writing and maths. The Department for Education should lead an inclusive educational framework to build digital literacy: at present, educational provision in UK schools is insufficient for digital and media literacy, especially as regards critical literacy. As recently argued in LSE's Media Policy Project Brief #22, there are five pressing challenges facing the current legislation, national curriculum and teaching resources.[15]
18. A new integrated programme in media literacy also needs to reach adults not in education or training. Both platforms and civil society organisations need to be incorporated into a programme that could include the provision and use of media literacy toolkits to integrate media literacy into wider social activism and services. An independent regulator or other organisation should coordinate work with the BBC and other public service broadcasters, libraries, the National Literacy Trust and the tech platforms, ensuring that particular effort is made to reach vulnerable and hard-to-reach groups. With the demands on people's media literacy constantly outpacing what they can keep up with, the media literacy gap will continue to grow unless substantial efforts are made.[16]
19. As Professor Sonia Livingstone has stressed, when embarking on any efforts to improve public understanding we should be mindful of the extent to which we ‘responsibilise’ the individual: the politics of media literacy risks not only burdening but also blaming the individual for the problems of our highly complex digital environment.[17]
Would greater transparency in the online spending and campaigning of political groups improve the electoral process in the UK by ensuring accountability, and if so what should this transparency look like?
20. As outlined above, the regulatory framework dealing with election campaigning in the UK aims to ensure a level playing field primarily by regulating, and making transparent, the amount of money that political parties and other campaigners spend and the amount of time they are given on broadcast media. It is therefore crucial that online spending and campaigning are reported in detail to improve transparency.
21. While Facebook has increased its monitoring and identification of political advertising to enforce its existing guidelines,[18] these efforts lack transparency and are no substitute for publicly accountable regulation. Regulators themselves have recognised this and have argued for increased powers:
- In 2018, the Electoral Commission called for increased transparency on who is paying for online political advertisements, clearer reporting requirements on how political parties spend their money and the power to levy larger fines on those who break the rules.[19]
- In the same year, the Information Commissioner’s Office’s report, Democracy Disrupted, said there was a ‘significant shortfall in transparency’ and made policy recommendations for addressing ‘personal information and political influence’.[20]
22. The public has limited awareness of how individuals are targeted by political parties. Ofcom evidence suggests that while people may be aware that their data are being used to allow third parties to tailor messages to them, they know less about data brokerage systems, the involvement of third parties, and how their data are used or monetised.[21] We support the Electoral Commission's call to make it clear to voters who is paying to influence them online.[22]
23. The ability to target specific people within a particular geographic area gives parties the opportunity to focus their attention on marginal voters in marginal constituencies. This means, in practice, that parties can direct significant effort – and therefore spending – at a small number of crucial seats. Yet, though the social media spending may be targeted directly at those constituencies, and at particular voters within those constituencies, the spending can currently be defined as national, for which limits are set far higher than for constituency spending. This necessarily undermines the principle of a level playing field at a local level.
What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?
24. Message targeting encourages contact and engagement only with those who are deemed worthy of political campaigning, for example those in marginal seats or judged to be undecided voters. This raises the question of what happens to those who are not regarded as strategically important. Groups less likely to vote risk being further disenfranchised if they do not see campaign messages, and there is also a risk of a compounding effect over time.
25. Data on past elections are often used to inform future campaigning, so groups seen as not worth the resources are likely to be bypassed again, while those already seen as 'decided' are likely to receive information only from their affiliated party, if at all (since contacting them might be considered a waste of resources). If democratic societies flourish through the free flow of information, which in turn allows citizens to consider issues on balance, then any move to restrict information flows risks exacerbating polarisation. As Karpf (2012) noted, advances in technology which allow message targeting remove a "beneficial inefficiency" that has in fact aided the public sphere.[23]
26. The ability to micro-target political messages also increases the likelihood that parties and candidates campaign on wedge issues, such as immigration and welfare, which are highly divisive in a public forum but also have the ability to mobilise voters.[24] Research in the US has shown that candidates are more likely to campaign on these wedge issues when the forum is not public.[25] This raises questions about the impact that this type of precise, hidden campaigning and asymmetric information flows have on the polarisation of citizens. Message targeting speaks to the individual concerns of citizens as part of a group, while the legitimate concerns of opposing groups are discredited or dismissed. Because these messages are played out largely in secret, they cannot be challenged or fact-checked.
27. Ultimately, electoral regulation in the UK is diffuse and unfit for purpose: responsibility is spread across a number of institutions, resulting in regulatory blind spots. The self-regulatory arrangement in place for political advertising is not sustainable when social media are making paid advertising a much more important element of the UK's political communication.
28. In the Truth, Trust and Technology report, we said the Government should act urgently to introduce legislation supporting a mandatory code for political advertising before the next election. In addition, the Government should introduce legislation to enable the Advertising Standards Authority (ASA) and the Electoral Commission to create a new standards code for political advertising online. A regulator should help to coordinate evaluation of the impacts of micro-targeting (as well as general advertising) to ensure that guidelines and limits are appropriate for use in political contexts.
29. A regulator should also help to encourage the introduction of a UK political advertising directory and monitor outcomes of the initiatives of relevant institutions to ensure that databases such as Google’s and Facebook’s Ad Libraries are independently overseen. As Privacy International stresses in its recent report, companies should provide users with meaningful, granular information about why they are being targeted by an advertiser or a campaign, and allow access to much more information than is currently available.[26] Mozilla has compiled a list of guidelines, endorsed by dozens of researchers, that platforms’ ad archive APIs “must meet in order to truly support election influence monitoring and independent research.”[27]
30. The Electoral Commission needs the powers to act quickly in response to emerging risks, including the power to require spending information and accountability for online advertisements placed during elections and referendums by foreign organisations and individuals. Legislation is needed to ensure greater transparency about the sources of information produced and circulated on the platforms during an election. Legislation should include provisions, subject to assessment of impact, for levying heavier fines on organisations or individuals who break the law.
31. We also recommend:
● Addressing the mismatch between the constraints on television advertising and the lack of constraints online
● Clarifying guidance for the use of targeted messaging online, particularly with regards to enabling transparency and public scrutiny
● Obliging platforms to offer equal access and equivalent services to campaigners at equal pricing
● Encouraging self-regulation by candidates and parties of campaign messaging online, in order to reassure voters that campaigns will not adopt intrusive or manipulative propaganda techniques
32. If the policy framework is not updated, the ability of the 'rules of the game' to ensure that elections are free, fair and legitimate will increasingly be called into question. The UK should not find itself going to the polls again before the legislative framework has been modernised.
What might be the best ways of reducing the effects of misinformation on social media platforms?
33. Key to reducing the effects of misinformation is limiting its reach. An important step for tech companies is to recognise their responsibility for amplifying misinformation through their recommendations. This is gradually starting to happen, but there is a long way to go in defining and establishing the boundaries between free speech and free reach.[28]
34. In an ideal world, companies would adjust their recommendation algorithms to ensure that reliable information is prioritised over the goal of achieving a high volume of engagement. However, as their business models are founded on capturing attention, this is unlikely to happen quickly: the 'attention economy' prioritises engagement of any sort over the supply of quality information.
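As a purely illustrative sketch of what such an adjustment might look like, a ranking signal could blend engagement with an independent assessment of source reliability. The function, weights and the notion of a credibility score below are our own assumptions for exposition, not a proposal for, or description of, any specific platform.

```python
# Hypothetical sketch: blending an independent source-credibility signal into
# ranking, rather than optimising for engagement alone. All names and weights
# are illustrative assumptions.
def blended_score(engagement: float, credibility: float,
                  credibility_weight: float = 0.7) -> float:
    # 'credibility' is assumed to come from an independent assessment in [0, 1];
    # engagement is normalised so that no volume of reactions can fully
    # outweigh a low-credibility source.
    normalised_engagement = engagement / (1.0 + engagement)
    return (credibility_weight * credibility
            + (1.0 - credibility_weight) * normalised_engagement)
```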
35. Providing links to reliable, evidence-based information, as many tech companies are now doing when users seek or encounter misinformation on vaccines, for example, is a welcome step forward. However, a lack of information is often not the problem.[29] There is a huge amount of evidence for the benefits of immunisation, but presenting it to dedicated anti-vaxxers will not necessarily change their minds. A social media redirect to an authoritative source is unlikely to be sufficient to have an impact on those with deeply entrenched viewpoints.
36. On a more positive note, evidence suggests that trust indicators used by news organisations can have a statistically significant effect on readers' perceptions of the credibility of online news sources. Such indicators should be investigated further and implemented more widely to help readers differentiate between credible and unreliable sources of information.[30]
How could the moderation processes of large technology companies be improved to better tackle abuse and misinformation, as well as helping public debate flourish?
37. For a start, evidence on the incidence of abuse and misinformation, and on how it is moderated, must be independently produced, with academic rigour, for the benefit of the public and other stakeholders, including the platforms. A regulator should play a key role in facilitating the relationship between the platforms and researchers in the UK, and provide a forum for debate with a wide range of stakeholders and the public. It should coordinate closely with research centres such as the UK Centre for Data Ethics and Innovation, which has a key role in establishing codes of conduct for the future design of the technologies that support platform services.
38. A regulator could encourage content moderation based on agreed principles and a taxonomy of misinformation, disinformation and mal-information (including hate speech), taking into account the source, content and intent of the material involved. Platforms should have a duty to identify the source and identity of posts and accounts in a way that allows anonymous but verified user accounts, in order to protect vulnerable categories of users and addresses used in closed apps. A regulator could oversee and monitor practices for filtering, the removal of user accounts or content, flagging, warnings, the up- or down-ranking of material, and changes to algorithms or their design and accessibility. It would provide annual assessments of the outcomes of moderation activities to Parliament, the Electoral Commission, Ofcom and other relevant bodies, and to the public.
[1] Thwaite, A. (2019). Literature Review on Elections, Political Campaigning and Democracy. [ebook] Oxford: Oxford Internet Institute, Oxford University. Available at: https://oxtec.oii.ox.ac.uk/wp-content/uploads/sites/115/2019/09/OxTEC-Literature-Review-Alice-Thwaite-Report-25-09-19.pdf
[2] Ofcom (2019). Adults: Media use and attitudes report 2019. [ebook] London: Ofcom. Available at: https://www.ofcom.org.uk/__data/assets/pdf_file/0021/149124/adults-media-use-and-attitudes-report.pdf
[3] A study found that: “respondents who were less politically efficacious and less interested in politics received relatively greater cognitive and influence benefits from dual screening the debate.” Chadwick, A., O’Loughlin, B., & Vaccari, C. (2017) Why People Dual Screen Political Debates and Why It Matters for Democratic Engagement, Journal of Broadcasting & Electronic Media, 61:2, 220-239, Available at: https://www.tandfonline.com/doi/full/10.1080/08838151.2017.1309415
[4] Rainie, L., Smith, A., Anderson, M. and Toor, S. (2018). Activism in the Social Media Age. [online] Pew Research Center: Internet, Science & Tech. Available at: https://www.pewinternet.org/2018/07/11/public-attitudes-toward-political-engagement-on-social-media/
[5] Fox, R. (2019). Government squandered public education opportunity with dismissive response to anti-Trump State Visit e-petition. [online] Hansardsociety.org.uk. Available at: https://www.hansardsociety.org.uk/blog/government-squandered-public-education-opportunity-with-dismissive-response
[6] A recent survey in the US showed that 62% of Americans believed that social media companies had too much control over the mix of news that people see, and 55% say that the role social media companies play in delivering the news on their sites results in a worse mix of news. Shearer, E. and Grieco, E. (2019). Americans Are Wary of the Role Social Media Sites Play in Delivering the News. [online] Pew Research Center's Journalism Project. Available at: https://www.journalism.org/2019/10/02/americans-are-wary-of-the-role-social-media-sites-play-in-delivering-the-news/
[7] Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.
[8] Acerbi, A. (2019) Cognitive attraction and online misinformation, Palgrave Communications volume 5, Article number: 15. Available at: https://www.nature.com/articles/s41599-019-0224-y#Bib1
[10] LSE Commission on Truth, Trust and Technology (2018) Tackling the Information Crisis: A Policy Framework for Media System Resilience, LSE, London. Available at: http://www.lse.ac.uk/media-and-communications/truth-trust-and-technology-commission/The-report
[11] Powell, A., Joshi, A., Carfantan, P., Bourke, G., Hutchinson, I. and Eichholzer, A. (2019) Understanding and Explaining Automated Decisions. Available at SSRN: https://ssrn.com/abstract=3309779 or http://dx.doi.org/10.2139/ssrn.3309779
[12] Bastos, M. and Walker, S.T. (2018) ‘Facebook’s data lockdown is a disaster for academic researchers’, 18 April. Available at www.city.ac.uk/news/2018/april/facebook-data-academic-research; Hotham, T. (2018) ‘Facebook risk starting a war on knowledge’, The Conversation, 17 August. Available at https://theconversation.com/facebook-risks-starting-a-war-on-knowledge-101646
[13] Kim, E.-M. and Yang, S. (2016) Internet literacy and digital natives’ civic engagement: Internet skill literacy or internet information literacy?, Journal of Youth Studies, 19(4), 438–56. doi:10.1080/13676261.2015.1083961; Martens, H. and Hobbs, R. (2015) How media literacy supports civic engagement in a digital age, Atlantic Journal of Communication, 23(2), 120–37. doi:10.1080/15456870.2014.961636; Hobbs, R. (2010) Digital and media literacy: A plan of action. Available at http://mediaeducationlab.com/sites/mediaeducationlab.com/files/Hobbs%20Digital%20and%20Media%20Literacy%20Plan%20of%20Action.pdf; Frau-Meigs, D. and Hibbard, L. (2016) Education 3.0 and internet governance: A new global alliance for children and young people’s sustainable digital development, Paper Series no 27, Centre for International Governance Innovation and Chatham House. Available at www.cigionline.org/sites/default/files/gcig_no27web_0.pdf
[14] Jeong, S.H., Cho, H. and Hwang, Y. (2012) Media literacy interventions: A meta-analytic review, Journal of Communication, 62(3), 454–72. doi:10.1111/j.1460-2466.2012.01643.x.
[15] Polizzi, G. and Taylor, R. (2019) Misinformation, digital literacy and the school curriculum. Media Policy Brief 22. London: Media Policy Project, London School of Economics and Political Science. Available at http://eprints.lse.ac.uk/101083/
[16] Livingstone, S. (2008) Engaging with media – A matter of literacy?, Communication, Culture & Critique, 1(1), 51–62. Available at http://eprints.lse.ac.uk/4264
[17] Livingstone, S. (2018). Media Literacy: what are the challenges and how can we move towards a solution?. [Blog] LSE Media Policy Project. Available at: https://blogs.lse.ac.uk/mediapolicyproject/2018/10/25/media-literacy-what-are-the-challenges-and-how-can-we-move-towards-a-solution/
[18] Leathern, R. (2018) Shining a light on ads with political content, 24 May. Available at https://newsroom.fb.com/news/2018/05/ads-with-political-content/
[19] The Electoral Commission (2018) Digital campaigning: Increasing transparency for voters. Available at www.electoralcommission.org.uk/__data/assets/pdf_file/0010/244594/Digital-campaigning-improving-transparency-for-voters.pdf
[20] Information Commissioner’s Office (2018) Democracy disrupted? Personal information and political influence. Available at https://ico.org.uk/media/action-weve-taken/2259369/democracy-disrupted-110718.pdf
[21] Ofcom (2018) ‘Eight in ten Internet users have concerns about going online’, 18 September. Available at www.ofcom.org.uk/about-ofcom/latest/features-and-news/eight-in-ten-have-online-concerns
[22] Electoralcommission.org.uk. (2019). Transparent digital campaigning. [online] Available at: https://www.electoralcommission.org.uk/who-we-are-and-what-we-do/changing-electoral-law/transparent-digital-campaigning
[23] Karpf, D. (2012) The MoveOn Effect: The Unexpected Transformation of American Political Advocacy, Oxford University Press.
[24] Barocas, S. (2012) The price of precision: Voter microtargeting and its potential harms to the democratic process. In Proceedings of the first edition workshop on Politics, elections and data (pp. 31-36). ACM.
[25] Sunshine Hillygus, D & Shields, T. (2009) The Persuadable Voter: Wedge Issues in Presidential Campaigns, Princeton University Press.
[26] Privacy International. (2019). Social media companies have failed to provide adequate advertising transparency to users globally. [online] Available at: https://privacyinternational.org/long-read/3244/social-media-companies-have-failed-provide-adequate-advertising-transparency-users
[27] The Mozilla Blog. (2019). Facebook and Google: This is What an Effective Ad Archive API Looks Like – The Mozilla Blog. [online] Available at: https://blog.mozilla.org/blog/2019/03/27/facebook-and-google-this-is-what-an-effective-ad-archive-api-looks-like/
[28] Diresta, R. (2018) Free Speech Is Not the Same as Free Reach, Wired Magazine. Available at: https://www.wired.com/story/free-speech-is-not-the-same-as-free-reach/
[29] Joubert, M. and van Schalkwyk, F. (2019). Why anti-vaccine beliefs and ideas spread so fast on the internet. [online] The Conversation. Available at: https://theconversation.com/why-anti-vaccine-beliefs-and-ideas-spread-so-fast-on-the-internet-111431
[30] The Trust Project. (2018). Trust Indicators boost readers’ perceptions of news credibility. [online] Available at: https://thetrustproject.org/trust-indicators-boost-readers-perceptions-of-news-credibility/