Doteveryone – written evidence (DAD0037)
Doteveryone1 is a think tank that champions responsible technology for the good of everyone in society. Our work explores how technology is changing society, shows what technology that considers its social impact can look like, and builds communities and networks to improve the way technology shapes our world.
1. How has digital technology changed the way that democracy works in the UK and has this been a net positive or negative effect?
- Digital technology is accelerating the dynamics of democracy. Research shows that the life-span of data is getting shorter online, with content receiving less collective attention and debate.2 Issues are becoming popular more rapidly, and interest fades away at a similarly increasing rate.
- 24-hour social media has shattered the traditional news cycle and lowered the barriers to entry for anyone who wants to create and participate in political debates. Fierce competition to set the news agenda has led to an attention arms-race, where increasingly sensationalised and disposable content wins out.3
- Technology has enabled campaigners to tailor different messages to different audiences based on sophisticated data analytics and behavioural profiling, fragmenting what was once a single party line into thousands of micro-targeted manifestos. Scrutinising political arguments and claims is made exponentially harder by their sheer volume and the opacity of campaigns to those outside of targeted groups.5
- Important political issues requiring deep and nuanced public debate are drowned out amidst this pace, noise and fragmentation. This flux has pushed many past their breaking point: 61% of Brits say that the need to read and keep track of information from too many sources is a major concern in their daily lives,6 and 35% are actively avoiding the news.7 The online ecosystem is designed to treat people as consumers of fast-news, not active participants in civic debate.
- A review by the Committee on Standards in Public Life also finds that “social media has been the most significant factor accelerating and enabling intimidatory behaviour in recent years”.8 The structure of these platforms - where private arguments now take place in the digital public eye, with an audience of millions of onlookers - lends itself to a culture of hostile takedowns that fosters political instability.
- Rather than intervening against these damaging dynamics of online democracy, the UK’s political institutions have instead invested significantly in building up expertise in digital campaigning, predictive modelling and data science.
- During the 2017 election an estimated £3.16 million was spent on Facebook alone (this figure is an estimate in the absence of regulations requiring parties to report digital spend).9 The recruitment of Nick Clegg by the platform is emblematic of a wider recent trend of British politicians and political advisors moving to work in big tech firms,10 showing how valuable knowledge of the British political market is to the tech sector.
- In the race to control the digital media agenda some political institutions routinely disregard the law. The November 2018 Information Commissioner’s Office (ICO) investigation into the use of data analytics in political campaigns uncovered “a disturbing disregard for voters’ personal privacy...and significant issues, negligence and contraventions of the law”.11 The Digital, Culture, Media and Sport Committee has raised doubts that behavioural manipulation through microtargeting of campaigns is lawful.4
- Parliament and democratic institutions must realise and reaffirm their responsibility to push back on the aspects of digital technology that inhibit reasoned democratic debate online, and to hold each other accountable to higher standards of conduct when engaging in these debates that go beyond mere compliance with the law.
- Much of this change can only realistically be achieved through regulatory reforms, which we outline in our response to consultation questions four and five.
- We recommend a review of the Standards in Public Life12 and parliamentary Code of Conduct13 to clarify how they translate online, focusing on areas including the gathering and processing of data, social media conduct and the creation and dissemination of content. This review must also look at strengthening accountability mechanisms, to ensure the Parliamentary Commissioner for Standards, Committee on Standards and Committee on Standards in Public Life have the resources and digital capabilities needed to uphold these standards.
2. How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?
- The issues of algorithm-driven misinformation and information manipulation take up much of the oxygen in policy debates around the impact of technology on democracy.
- The European Commission has identified this issue as a priority through their Action Plan on Disinformation.14 The Department for Digital, Culture, Media and Sport Select Committee has said that “[the internet] carries the insidious ability to distort, to mislead and to produce hatred and instability...when these factors are brought to bear directly in election campaigns then the very fabric of our democracy is threatened.”15
- Despite the urgency of the political rhetoric, research into the effect of misinformation on real-world behaviour and voting intentions suggests its impact is relatively marginal.
- Research into American Twitter users found that the spread of misinformation was highly concentrated and relatively uncommon – 0.1% of users accounted for more than 80% of shares of false content, and the proportion of articles on the average user’s feed identified as misinformation was just over 1%.16
- A study into misinformation during the Brexit referendum found that “just two of the many misleading claims made by politicians during the referendum were found to be cited in 4.6 times more tweets than the 7,103 tweets related to Russia Today and Sputnik and in 10.2 times more tweets than the 3,200 Brexit-related tweets by the Russian troll accounts”.17
- What is clear however is that the explosion of debate around the impact of digital misinformation on democracy has coincided with a dramatic decline in trust across all institutions implicated in the debate.
- Our People, Power and Technology research has shown the public demand for greater accountability from technology companies. Two thirds say government should be helping ensure companies treat their customers, staff and society fairly.18 Edelman’s Trust Barometer finds only 36% of the UK public trusts search engines, platforms and government, with this figure falling to 32% for journalists.19
- Technocratic fixes for bots sharing misinformation do little to address the broader issue of growing hyper-scepticism and mistrust. We recommend that accountability for technology companies should be seen holistically, focusing on more than the technical design of algorithms. Where algorithms are a focus, action should address the bigger-picture issues we outline in our response to consultation question 1.
- Instead, we should encourage extensive transparency from platforms, set out broad obligations to act responsibly through regulation, and create mechanisms for the public to assert their values upstream in the design of digital technologies and assert their rights when they have been breached. Our responses to consultation questions 3,4,5 and 11 outline our recommendations for bringing about these changes.
3. What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?
- Understanding the relationship between digital platforms, information and political institutions is fundamental to participating in modern democracy. But our People, Power and Technology research20 shows common blindspots in digital understanding.
- 45% don’t understand that adverts are targeted based on social media data, and 62% don’t realise the news they see online is influenced by this information. 41% say some news websites and apps can’t be fully trusted but read them anyway, while only 30% say they make an effort to view websites with differing political standpoints.
- These findings clearly show that lack of digital understanding is a universal concern; not only an issue for young people and schools.
- Our Engaging the Public with Responsible Technology programme explores ways to empower the public to navigate and scrutinise the online world. We will be publishing our initial findings in October 2019 and we would welcome the opportunity to discuss our findings further with the Committee upon its publication.
- This research - which encompassed a literature review, an expert roundtable, and an online diary and qualitative research with members of the public - has found building digital understanding through rote learning and one-size-fits-all media literacy campaigns is challenging.
- Many people are overwhelmed by the opacity and complexity of online services, and struggle to keep up with the pace of digital change. With the next generation of misinformation techniques including “deep-fakes”21 and speech synthesis evolving rapidly, the gap between technology and the public is only set to widen.
- More broadly, awareness campaigns suffer from a crisis of evaluation, with only 1% of public health campaigns evaluated by looking at behavioural, rather than attitudinal, change.22
- Other leading voices in the sector have also expressed concerns around digital literacy. Data and Society founder danah boyd has questioned whether media literacy is, in fact, undermining democratic debate by encouraging individuals to find their own truths at the expense of reason, saying “[Media literacy] is a form of critical thinking that asks people to doubt what they see. And that makes me very nervous”.23
- Professor Sonia Livingstone has called media literacy the ‘policy of last resort’ stating “we cannot teach what is unlearnable, and people cannot learn to be literate in what is illegible...we cannot teach people data literacy without transparency, or what to trust without authoritative markers of authenticity and expertise. So people’s media literacy depends on how their digital environment has been designed and regulated.”24
- We therefore caution against placing too much responsibility on individuals to improve and maintain their digital understanding and literacy, however well-intentioned.
- Action within government and the tech sector must instead be focused on supply-side changes to make digital services more legible and transparent. These actions could include smarter engagement to give people information at point of need (explaining trade-offs between personal data sharing and benefits of using a service at the point of consent, for example), creating feedback loops that explain consequences of online choices to users and creating accessible avenues for people to raise questions about their online experience - which could be answered by platforms or fellow users.
- Government must also proactively foster the ecosystem of trusted news providers and civil society organisations that can support people to understand their digital rights and seek redress where they have been breached.
4. Would greater transparency in the online spending and campaigning of political groups improve the electoral process in the UK by ensuring accountability, and if so what should this transparency look like?
5. What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?
- Our contribution to the Electoral Reform Society’s Reining in the Political ‘Wild West’: Campaign Rules for the 21st Century report sets out our recommendations for reforming the regulation of digital political campaigning.25 We have, alongside Full Fact, also called for reforms in this area ahead of the European elections in April 2019.26
- This regulation is in urgent need of reform. The Electoral Commission is hamstrung by an outdated remit and limited resources, whilst other aspects of digital campaigning fall in the gaps between the Information Commissioner’s Office, Advertising Standards Authority and Independent Press Standards Organisation.
- The Digital, Culture, Media and Sport (DCMS) Select Committee, Cabinet Office,27 the Electoral Commission,28 internet companies and the Institute of Practitioners in Advertising (IPA) are amongst other notable organisations that have also called for reform in this area.
- The Political Parties, Elections and Referendums Act 2000 must be amended to give the Electoral Commission the flexibility to apply its offline powers online by:
- Ensuring the government delivers on its promise29 to extend the ‘imprint law’ for offline campaign material to cover digital political campaign material. The imprint should explain clearly that an ad is political in nature, and identify both the author and funders.
- Establishing a public register of online political ads, provided in real time, in machine readable format and with full information on content, targeting, reach and spend to guarantee transparency.
- New data regulations are also needed to:
- Introduce minimum allowable group sizes for online targeted advertising, to be decided through public consultation. Microtargeting ads to an audience of one fatally undermines external scrutiny and fact-checking that are cornerstones of democratic debate.
- Enforce a “right to an explanation” for political adverts, giving individuals the right to request information on why they have been targeted and on the basis of what characteristics.
- For these reforms to be effective it is vital that the Electoral Commission builds the digital capabilities needed to anticipate and respond to future developments in digital campaigning. We recommend that the Electoral Commission work with organisations including the Centre for Data Ethics and Innovation and the Better Regulation Executive to develop the horizon-scanning capacities needed to identify emerging challenges such as “deep-fakes” and the next generation of behavioural profiling practices.
- Parliament must also urgently review the level of resources needed by the Electoral Commission to successfully deliver these reforms and resource it accordingly.
11. How could the moderation processes of large technology companies be improved to better tackle abuse and misinformation, as well as helping public debate flourish?
- The imminent Online Harms regulatory framework presents an unprecedented opportunity for improving technology companies’ approach to content moderation.
- Our Digital Duty of Care briefing30 and Online Harms White Paper response31 set out our recommendations for improving these processes.
- We recommend that the forthcoming online harms regulator set best practice for content moderation. This should be enforced through powers to scrutinise online services’ internal processes - auditing the accuracy of technology designed to automatically identify and take down extremist speech, or spot-checking complaints handling procedures, for example. The quality and efficacy of these processes should then be publicly reported to improve accountability and encourage improvement where performance lags behind industry standards.
- We also propose independent oversight of platforms’ content moderation decisions. The scale of social media platforms poses a challenge to oversight of individual decisions, as it will likely be prohibitively expensive to establish a body able to hear all content-related complaints. It may however be feasible to develop an Ombudsman-style body to adjudicate on content takedown decisions if its scope is suitably narrow - for example, only mediating cases relating to abuse of a political figure, or pieces of content that have been broadcast to a minimum threshold of users.
- In instances where content has affected large numbers of people we recommend the body be given powers to issue collective redress, with powers to compel platforms to proactively inform everyone that has been exposed to the relevant content about a takedown decision, explaining why that decision has been made and signposting them to any relevant sources of support.
- Beyond improving the quality and oversight of moderation processes, there is also an urgent need to empower the public and civil society groups to shape the community standards of online services upstream.
- The distribution of responsibilities for defining content standards is currently the subject of intense debate in the UK and internationally.
- Our People, Power and Technology research shows the public feel they have little agency to influence community standards online or hold social media companies accountable for respecting them: 43% say they consent to online services despite having concerns, because “tech companies will do what they want anyway”.
- Large platforms - with an extensive track-record of unethical practice32 and deficit of public trust - must not be responsible for defining what constitutes acceptable content and arbitrating on freedom of speech.
- Facebook have recently proposed establishing an “Independent Governance and Oversight Board” to set their content policies.33 Article 19 and the UN Special Rapporteur for freedom of speech David Kaye, propose the establishment of “Social Media Councils”34 to act as a multi-stakeholder forum for setting content moderation standards. A network of civil society organisations have proposed the Santa Clara Principles on Transparency and Accountability in Content Moderation35 as a starting point for a global standard on moderation processes.
- However, we believe that content standards must ultimately be defined according to the public’s values, which must be asserted by parliament via legislation founded on rigorous democratic debate.
- We acknowledge that this approach is not without risks. Government must not be given powers to regulate free speech arbitrarily, and civil rights groups have raised legitimate concerns that creating such powers will embolden authoritarian regimes in other nations to use similar legislation as a means of cracking down on dissent.36
- To safeguard against these risks we recommend that content standards legislation is grounded in United Nations human rights principles, and that formal structures are put in place for members of the public to raise complaints about unfair moderation decisions and defend their rights to freedom of speech. Existing oversight bodies and rights laws in the UK, including the Human Rights Act, the Equality and Human Rights Commission, the Joint Committee on Human Rights and international human rights courts, will be able to provide checks and balances against problematic applications of content laws.
6. To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?
7. What are the positive or negative effects of anonymity on online democratic debate?
8. To what extent does social media negatively shape public debate, either through encouraging polarisation or through abuse deterring individuals from engaging in public debate?
9. To what extent do you think that there are those who are using social media to attempt to undermine trust in the democratic process and in democratic institutions; and what might be the best ways to combat this and strengthen faith in democracy?
10. What might be the best ways of reducing the effects of misinformation on social media platforms?
12. How could the Government better support the positive work of civil society organisations using technology to facilitate engagement with democratic processes?
13. How can elected representatives use technology to engage with the public in local and national decision making? What can Parliament and Government do to better use technology to support democratic engagement and ensure the efficacy of the democratic process?
14. What positive examples are there of technology being used to enhance democracy?