Ms Christina Eager – written evidence (DAD0002)
1. How has digital technology changed the way that democracy works in the UK and has this been a net positive or negative effect?
1.1 Much of the political debate has moved online and is therefore inaccessible to some people. This has to be a bad thing for democracy.
2. How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?
2.1 It’s not the design of the algorithms but the uses to which they are put. A kitchen knife is a useful tool, until you use it to kill somebody. Unfortunately the actual algorithms are treated as commercially valuable and are kept secret, to the extent that nobody, possibly not even the developers (if they use modularisation, which I expect is the case), knows what the full algorithm does. More scrutiny of them might be helpful. Maybe they should be open source and not proprietary.
3. What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?
3.1 It is a failure of education that has led to massive issues with democracy. Although there is a requirement to teach digital literacy and online safety, there’s no guarantee that it is taught well or absorbed by the pupils.
3.2 The population generally is not equipped with the skills necessary to check the veracity of anything they read. That isn’t a failure of only IT/Computer Science lessons (for those people who have had them) but of all subjects. Analysing sources is fine in History or Geography or any academic subject, but it is not taught as a life skill, which it now most definitely is. This is the single most important issue: voters of all ages must be equipped with the skills to fact-check and to analyse what they are being told. Actually this is also true for lots of other things.
3.3 People generally don’t understand data and numbers enough to make sense of them. Lots of people don’t seem to understand what a majority is or what 50% is or that if 75% of the population has something then 25% don’t. You can see this inability to analyse in lots of places in public debate. People generally don’t understand the meaning of the key terms being used and of course they can be used incorrectly or used to obfuscate rather than enlighten.
3.4 In schools this must be included in at least the Key Stage 4 curriculum and not be a subject parents can opt out of. A free course/session should be offered via Adult Education Centres, community centres, health centres, sports clubs or anywhere people gather so that the communities can learn this new life skill. It should be seen as un-trendy or stupid not to check facts. Maybe something along the lines of previous health campaigns should be established to make people more aware of the issues. If it worked for AIDS and smoking and seat belts it could work for this.
4. Would greater transparency in the online spending and campaigning of political groups improve the electoral process in the UK by ensuring accountability, and if so what should this transparency look like?
4.1 Yes, as long as groups could not cheat by hiding their spending. The Electoral Commission or another body must have oversight of all expenditure, with rigorous checks and the ability to impose truly punitive sanctions on organisations and individuals who break the rules. The current rules as I understand them are not fit for purpose and the sanctions are relatively insignificant. All donations to political parties should be a matter of record (though not necessarily a public record - that would lead to more problems).
5. What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?
5.1 It’s not the advertising alone, although that seems to be a huge issue and is recognised to be massively influential. It’s also “editorial” pieces and “factual” pieces. There is no way of measuring the impact of those. We can check spending on formal advertising, but what about product placement on apparently apolitical websites?
5.2 You can’t control who creates a website. Anybody can do it; my students do it as part of their coursework every year. Punitive sanctions against website hosting companies who allow certain kinds of content might be an option, but who is to decide what is acceptable and what is not?
Privacy and anonymity
6. To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?
6.1 Quite a large challenge, and not in a good way, BUT this has to be balanced against the right to privacy. There’s no easy answer. How private is private? Encrypted messages are not automatically bad, but they can encourage people to discuss “bad things” because they reduce the likelihood of being found out. There is a need for private groups, for example for families to share photos or groups of friends to keep in touch. Nobody can un-invent encryption.
7. What are the positive or negative effects of anonymity on online democratic discourse?
7.1 Negative: Abuse because people hide behind anonymity and will say things online they don’t dare say in public to somebody’s face.
7.2 Positive: Participation for people who don’t want their employer being able to see what they are saying. I work in a school. I keep my online presence private using an alternative name so that students or their parents don’t try to follow me or find out where I live. It’s a way of keeping my public in-school life and my private life separate. It’s important that I can do that. I don’t need some enraged parent turning up on my doorstep or going to my employer to complain because of something I have said online. I am not going to deny my democratic right to engage in political discussion.
8. To what extent does social media negatively shape public debate, either through encouraging polarisation or through abuse deterring individuals from engaging in public life?
8.1 Polarisation is obviously increased, and we have a generation of people, possibly more than one generation, who don’t get their news from newspapers or TV (see https://www.ofcom.org.uk/research-and-data/tv-radio-and-on-demand/news-media/news-consumption) but only online, so they never see alternative opinions. Never ever. So when somebody says something that differs from what they think, it’s shocking to them that different opinions exist. It’s too easy to get into a nice little bubble with like-minded people and never be challenged, which reduces the quality of political debate enormously.
8.2 The ability of social media to expose people to abuse must put a lot of people off participating, and it is a common complaint that social media is full of trolls and bots. It is. That might be something to consider in a regulatory framework.
8.3 Ofcom data suggests 37% of users trust social media. That’s a significant proportion of social media users and enough to skew the debate if they are given inaccurate data/facts/interpretations or even downright lies.
9. To what extent do you think that there are those who are using social media to attempt to undermine trust in the democratic process and in democratic institutions; and what might be the best ways to combat this and strengthen faith in democracy?
9.1 There will always be such people. Whatever one person uses for good another will pervert. More people need to feel engaged with democracy, and they need to know, or have faith, that what they are being told is the truth, that their vote will have been legally obtained, and that they haven’t been conned by a special interest group which thinks they are stupid enough to believe what they are told.
10. What might be the best ways of reducing the effects of misinformation on social media platforms?
10.1 More information. Lots of verified and verifiable facts freely available, with people who have the skills to understand them, interpret data and think critically about what they are being told. That last bit is the most difficult. I am old and cynical and disbelieve most of what people (especially politicians) tell me. Younger or less cynical people won’t check or challenge.
11. How could the moderation processes of large technology companies be improved to better tackle abuse and misinformation, as well as helping public debate flourish?
11.1 Better oversight. There has to be regulation, because you can’t rely on the owners of the companies to act in the public interest. They act in their own interest and their shareholders’, not ours.
11.2 One person’s abuse is another person’s fair comment. That’s true on the streets as well as in conversation or political debate. It’s one of our freedoms as a democratic society.
11.3 Misinformation could be checked easily enough, but it’s often a case of interpretation.
11.4 More moderation might be an option: make it mandatory for all groups to have a moderator. But then who would moderate the moderators, and how would they be selected?
12.1 The biggest issue, as I see it, and the one that this questionnaire completely ignores, is the international dimension. There is nothing the UK can do about social media companies that are based outside the UK. Not. One. Thing. We can regulate and legislate and all the rest of it about what happens to people and organisations here, but Parliament can’t even get Mark Zuckerberg to come and talk to it. There must be some international agreement about standards of privacy and posting. It should cover bots and trolls and have the teeth to do something about the countries that allow such activities to persist (there will always be countries that do).
12.2 Since we can’t control what goes on outside our borders, we have to control and improve what goes on inside. That’s where the education comes in (see paras 3.1 – 3.4). That is the single most important thing we can do to improve the quality of political debate in this country: equip UK citizens with the skills to deal with the online world and the threats and promises it holds.