Dr Gillian Bolsover – written evidence (DAD0046)


I am a Lecturer in Politics and Media at the University of Leeds. I research the impact of digital technologies on political and social life in different political systems and the changing nature of citizenship in the Internet age. The majority of my research focuses on China and the US either individually or comparatively to understand how digital technologies are used in different ways and have different effects in different political systems. I am just finishing a book manuscript that investigates how increasing commercialisation and growing authoritarianism are changing the balance of political power in the Internet age. I enclose below brief responses to a number of the questions put forward in this call for evidence. These opinions are my own and not intended to represent the views of my institution or other affiliations.

How has digital technology changed the way that democracy works in the UK and has this been a net positive or negative effect?

1.1. The way that digital technologies have changed politics can be divided into four main areas corresponding to the features of these technologies: the spaces these technologies provide, the data they produce, the way they work (e.g. based on algorithms) and the functionalities they afford.

1.2. As a space, digital technologies were hoped to provide a new public sphere and invigorate offline civic life. They have become a key place in which politics now takes place but, particularly due to the dominance of digital technologies by powerful commercial entities, these spaces do not match up to the hopes that people had for these technologies to positively affect politics. The commercialisation of digital spaces has accelerated the entertainmentisation of politics and vastly increased political polarisation, as individual political activity must be shaped to meet the commercial needs of these companies. This situation, plus the pervasiveness of digital technologies in modern life, leads to cognitive dissonance on the part of individuals and resignation to the status quo (e.g. it is too hard to get off Facebook, it is too expensive to pay for journalism).

1.3. The vast amounts of data produced by digital technologies have enabled new and powerful corporations, with monopolistic economies of scale, whose business models are based on the monetisation of human behaviour (this argument is laid out in the popular 2019 book The Age of Surveillance Capitalism by Harvard Business School Professor Shoshana Zuboff). It is the combination of this surveillance capitalist approach, based on abundant data and the computing power to analyse and predict from that data, and channelled through the algorithms and affordances of digital giants such as Facebook and Google, that leads to targeted political marketing such as that of Cambridge Analytica. This furthers the entertainmentisation of politics and the political polarisation mentioned above. In addition, targeted political advertising based on analysis of abundant digital data helps to normalise extreme views by playing on and exacerbating emotions to ensure the maximum appeal of this system of advertisement.

1.4. These technologies work based on algorithms that tailor content to each individual user within a system in which functionalities are restricted by the (commercial) platform. In this system, each user of digital technologies receives different content. The vast majority of theories of how politics works rely on individuals having access to a relatively small number of relatively impartial news media that report similar issues and facts (as theorised by Benedict Anderson). Digital technologies have fundamentally undermined these assumptions, leading to examples such as the one detailed by Eli Pariser in The Filter Bubble, in which two people with similar demographic profiles searching for the word Egypt in Google around the time of the protests in Tahrir Square received very different information: one largely news about the protests and one information about holidays. Citizen bodies cannot form when the readily available information consumed is that different. In addition, arbitrary design choices of digital technologies can have big and unintended effects. The choice to limit Twitter posts to 140 characters, which was based on the length of an SMS, fundamentally changes the kinds of information that can be shared and the discussions that can be had. The quality of debate is often better on Chinese social media (where the 140-character limit was maintained) partly because approximately five times as much information can be conveyed in 140 Chinese characters. These design choices are compounded by the scale of these platforms: for instance, an A/B test on Facebook during the 2010 US mid-term elections generated an estimated 340,000 additional votes by showing some users the faces of six of their friends who had clicked an 'I voted' button on the site (Bond et al., 2012).
The reliance on black-box commercial sites for political functions means that control over how politics works is lost to states and citizens, with the companies that create and control these digital technologies not equipped to evaluate or deal with the potential power of tiny choices (such as the Facebook 'I voted' button). This plays into the changes already mentioned: the entertainmentisation of politics, political participation as gratification rather than civic responsibility, political polarisation, cognitive dissonance and resignation to the status quo in terms of the ownership, use and functionality of digital technologies.

1.5. Lastly, the functionalities of these technologies have afforded new opportunities for citizen oversight and organisation. There are notable benefits here, such as the use of cell phone video to document police brutality against African Americans in the US, propelling the issue into the public consciousness. However, political participation on digital technologies tends to produce leaderless movements that, although massive, suffer from problems such as the lack of a coherent platform, meaning that massive political participation may not achieve any change (e.g. the Women's Marches in the US). This has led to increased volatility and turbulence in the political system, as massive movements can arise unpredictably, but it also means that participants can be disillusioned when their activity does not lead to results. Digital technology fuelled political participation tends to flow through channels outside formal politics. Many do not see their activity as political, and this increases disillusionment with the political system. The greater strength of Brexit positions compared to party affiliation among the British public is a testament to the increased strength of issue publics and how they break down existing ways of doing politics. These movements empower and create fertile ground for political candidates who run against the existing system and existing processes, which is extremely dangerous for the long-term health of democracy. Politicians have moved online in response to citizen movements there, but communication and campaigning over digital technologies undermine professionalism, change the tone and substance of politics and increase its volatility and entertainmentisation. The functionalities of technology have also been exploited to manipulate public opinion for political or commercial gain through misinformation, disinformation and plain distraction, which is dealt with under a later question.

How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?


2. The main effects have been:
2.1. An entertainmentisation of politics: political information must compete online in an attention-grabbing environment, rewarding politicians, providers and information types (e.g. opinion rather than reporting) that can and will compete for attention. The democratic functionalities of social media mean that a year-long piece of investigative journalism gets the same space and presentation as a blog post on a partisan 'junk news' site or an offhand comment from a celebrity. In this system, hyper-partisan quasi-news providers have flourished. This presents a more widespread and complex problem than simple mis- or disinformation. In the online click-based system, it does not matter whether a user reads or believes the information; if a user clicks, the creator is rewarded. The previous 24-hour news cycle of cable TV has become a more intrusive notification-based cycle.

2.2. Political polarisation: algorithms tailor content to users. This has resulted in less diversity of opinions and voices in the information consumed, leading to political polarisation, breaking down broad communities of citizens and normalising extreme opinions.

2.3. Loss of control: even those who create these algorithms cannot predict what they will do. This is a major problem (and potential) of artificial intelligence. These algorithms often tend toward a lowest common denominator (e.g. a music recommendation system will tend toward mainstream popular bands with the broadest appeal) with high emotional impact (as the system functions simply on clicks rather than effect, it pays to incite emotion and reaction whatever the form). These algorithms also exacerbate existing structural problems, e.g. Microsoft's chatbot that replicated racist messages found in the Twitter posts it was programmed to learn from, or the fears that the use of AI in policing will exacerbate existing bias against minority populations: that bias results in datasets that present an impression of higher levels of illegality in these populations, for instance because their members are more likely to be the subject of a stop and search. These algorithms (created by commercial companies) mean that the state and citizens lose control over how the system works. Citizens are disempowered because they do not know why they see the information they see online and have little control over how they are profiled. The AI algorithms used by these companies for profiling are designed to fill in the blanks on a user's profile, e.g. guessing race or gender from videos watched, removing the ability to opt out of the system.

2.4. Unfortunately, trying to make these algorithms more accountable undermines their potential as well as their pitfalls. I believe the fundamental problem is not the algorithms themselves but largely their ownership (i.e. they are designed and implemented by commercial companies for commercial purposes). Yes, there are problems with the use of algorithms by states (as in the policing example above), but the more pressing issue is who uses the vast majority of algorithms and for what reasons. Technologies themselves are largely neutral; the question is who is using them, why and to what end.

What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?

3. I believe that we need to be thinking more widely than digital literacy. Thinking about digital literacy takes the current system as a given and asks how we need to educate people to function within it; we need to be thinking more about how to change the system. We also need to think wider than the Internet or digital technology, to the broader system that both precipitated and has been created by these technologies. As discussed above, core aspects of this system are the commercialisation and entertainmentisation of both political and social life. The movement to this system has been based on the idea of trading privacy and freedom for efficiency, predictability and safety. While digital literacy is important, I would argue that it is of wider and more profound importance for education to counteract these messages of efficiency, predictability and safety coming from a capitalist system that profits from them, less from the sale of products to the consumer and more from the sale of data produced by the consumer during the use of these products. Civic education needs to emphasise that a sacrifice of time, money, ease and comfort is necessary for a healthy democracy. Our biggest problem is not that people don't know that they are being exploited by social media companies, or that reducing the consumption of meat and dairy is a necessary part of preventing further climate change; the biggest problem is that people are unwilling to make the changes in their lives necessary to bring their private lives in line with their personal ethics.

To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?

4. I strongly support the movement towards the encryption of personal information and communication. While recognising that encryption hides data that could be of use to law enforcement to prosecute or prevent crimes, for instance data about illegal activities, I believe that on balance the personal liberty and right of an individual to have ultimate control over their personal data is more important. I accept the risk that encryption might be used to plan and execute crimes that could affect my personal safety and security in order that the way I conduct my own life might be free from surveillance and profiling. Democracy is predicated on the fundamental liberty of individuals; perhaps a benevolent dictator would in the short term produce faster and better results, but we do not endorse this system because (a) it goes against fundamental principles of liberty and freedom that are worth risk to preserve and (b) it is too easily abused or slips from benevolence to malevolence. Arguments against encryption are similar. It has been argued that technology companies should provide a back door so that encrypted messages can be accessed by, for instance, law enforcement officials. However, this system is too easily abused. A back door provided for a legitimate and well-meaning democratic government can be used just as easily by an authoritarian one, and provides a route of attack for exploitation. It also invites function creep: once established as a means to investigate, for instance, terrorist attacks, it becomes easy to argue that it should be used to prevent lesser and lesser crimes. Digital data never disappears and is readily processable in ways that far outpace the pre-digital era. As the situation regarding homosexuality in Brazil shows, we are always only one government away from a previously safe behaviour becoming unsafe.
Using encrypted communications for data that will never go away preserves the freedom and liberty of people against future changes. Encryption, on balance, will help individuals reclaim freedom and bolster democracy rather than undermine it.

What might be the best ways of reducing the effects of misinformation on social media platforms?

5. Misinformation is not a new problem in society, and the proliferation of hyper-partisan, emotional information (propaganda) is more of a problem than strictly factually incorrect information.

5.1. One important way to reduce the effects of misinformation is to support the provision and proliferation of traditional journalism. This is not about traditional journalists adapting to digital technologies, as that entails a movement towards attention-grabbing, easy-to-process information. It is, rather, about re-recognising a civic responsibility to be well informed. The BBC is important, and we should aim to (re-)build trust in and consumption of the platform. Investigative journalism and journalistic training should be supported. Library closures need to be reversed. Newspaper subscriptions should be available at schools, community centres and doctors' offices, integrated, where possible, into service provision (such as through school assignments) and subsidised for low-income individuals and families.

5.2. The second important way is to emphasise that the problem of misinformation comes from the commercial nature of the system. It is telling that a great deal of the misinformation produced around the 2016 US Presidential Election was created not by people who had an interest in one side or another winning, but by Macedonian teenagers who realised how much money they could make writing salacious headlines that people would click on. To the extent that we continue to access political information through commercial social media sites, we will always have a problem with misinformation, as misinformation performs well in grabbing attention, encouraging click-through and giving audiences entertainment satisfaction. The consumption of political information cannot be filtered through structures, algorithms and processes that are primarily commercial. One way to fight this is to create and encourage non-commercial and civic-minded alternatives. Right now, this is next to impossible because of the economies of scale of digital technologies and their resultant network effects (e.g. it is hard to get off Facebook when all your contacts are still using Facebook). In order to facilitate the breaking of monopolies and the cultivation of non-commercial and civic-focused alternatives, there needs to be portability of user data between platforms and the ability of users on different platforms to communicate. Digital communication technologies have become a vital infrastructure. Early commercial builders of the London Underground did not want to allow passengers to interchange between lines built by different companies. Early phone companies did not want to allow users on one provider to call users on another. The ability to interchange seamlessly between infrastructure built and owned by different companies is vital to the utility of these services. The same is now true of our digital technologies.
If users were able to move their contacts from one platform to another and to communicate with users of different platforms within the same messaging services, we could move away from economies of scale and support the development of alternative choices to the current near-global monopolies.


6. In sum, my core argument is not about digital technologies themselves but about their ownership. The majority of the current issues arise because of the commercial ownership of these technologies, rather than any particular underlying principle of the technology. We need to facilitate competition as well as non-commercial and civic-minded alternatives to current technologies, and work to reinvigorate and rehabilitate traditional media and civic spaces and life. Currently, commercial media sell us the idea that they are sufficient and efficient for democratic engagement. This is not true; it is just their advertising message. Politics cannot function largely through commercial spaces and processes. The idea of giving up freedom and privacy for efficiency, predictability and safety needs to be fought, and we will need to accept that it will be less efficient and less predictable to reclaim democratic political life for civic bodies from commercial digital technology providers, who have made large amounts of money by taking private ownership of political processes and functions that need to be public.