
 

Select Committee on Democracy and Digital Technologies

Corrected oral evidence: Democracy and Digital Technologies

Tuesday 29 October 2019

10.35 am

 


Members present: Lord Puttnam (The Chair); Lord Black of Brentwood; Lord German; Lord Holmes of Richmond; Baroness Kidron; Lord Lipsey; Lord Lucas; Lord Mitchell; Baroness Morris of Yardley; Lord Scriven.

Evidence Session No. 3              Heard in Public              Questions 31 – 44

 

Witnesses

I: Alex Krasodomski-Jones, Director, Centre for the Analysis of Social Media, Demos; Caroline Elsom, Senior Researcher, Centre for Policy Studies; Rachel Coldicutt, CEO, Doteveryone.

 



 

Examination of witnesses

Alex Krasodomski-Jones, Caroline Elsom and Rachel Coldicutt.

Q31            The Chair: Good morning and welcome. As you will know, this session is open to the public. A webcast of the session goes out live and is subsequently accessible via the parliamentary website. A verbatim transcript of your evidence will be taken and put on the parliamentary website. You will have the opportunity to make minor corrections for the purpose of clarification or accuracy. Perhaps you would like to introduce yourselves for the record, and then we can begin with the questions.

Alex Krasodomski-Jones: I am the director of the Centre for the Analysis of Social Media at Demos.

Caroline Elsom: I am a senior researcher at the Centre for Policy Studies think tank.

Rachel Coldicutt: I am CEO of Doteveryone.

The Chair: Thank you very much indeed for being here.

Q32            Lord Black of Brentwood: I have a nice, simple and broad question to start. How is technology shaping democracy in the UK?

Alex Krasodomski-Jones: I suppose the question is which technology. Our focus is primarily on digital technology and the way the internet, and perhaps more specifically social media, is changing. If I were to summarise it in two ways, one would be about the pace and speed of what has been called “the great acceleration”, this idea that everything is moving much faster. The news cycle is accelerating, with impacts on people’s ability to keep up. The amount of information out there is far greater than any single human can take in at any one moment.

That has a secondary effect, which talks to some of the questions about political advertising, of a breakdown in what I might call a common reality, in the sense of a common set of facts, on which democracy sits. If we cannot agree on the basics of what is going on, it is very difficult to compromise on those facts. That is a key symptom of digital change.

Caroline Elsom: I am glad you mentioned the great acceleration. That is the title of a book written by the director of the Centre for Policy Studies, Robert Colville, which is exactly about how the internet is speeding up all things, particularly when it comes to democracy.

The main thing to say is that technology is shaping democracy in a really positive way on the whole. People are now walking around with a printing press, a broadcast station and a place of assembly in their pockets, if they have a smartphone. People can now participate in democracy in a way that was not possible five or 10 years ago. That is inherently positive. People can express themselves and challenge or criticise ideas. They can access democratic debate much faster. You do not have to wait for the 6 o’clock news or the 10 o’clock news to know what is going on in Parliament on any day. It is instant. It is on Twitter. You can immediately interact with that and have your say on what is happening.

It is also important in the way politicians can interact with their constituents or laws can interact with the outside world. You can instantly get to a particular group of people who are interested in a democratic issue, no matter where they might be. You can get that message out there in a way that has not been possible previously. On the whole, tech is shaping democracy in a very positive way.

Rachel Coldicutt: I have a slightly different view. Business has been allowed to shape a lot of the infrastructure of news and information communication. It has done so at a speed that governance is not able to catch up with. The space that the state held is shrinking and the influence of business is moving into areas where it has perhaps not been as intensely felt.

In terms of how that is changing the public, you have two extremes. There are people who have constant, non-stop performative engagement with news, who are using up all their democratic energy in simply trying to understand the things that are happening, which is then not really turning into traditional political activity. At the other end, Reuters has shown that 35 per cent of people are no longer engaging with news at all.

We are in a world where it is very hard to understand what is happening at any moment. It is very possible for all of us here to have a glimpse of a completely different reality at any moment of the day. That displacement has meant that it has become very easy to manipulate. We have all probably experienced days in the last four years where there has just been too much news, where the world at 8 am has looked completely different from the world at 8 pm, and it feels like a year has passed in a day. The space that public service and governance hold has shrunk a lot.

Lord Black of Brentwood: That is a very fair point. The world can change in an hour, as we have seen in recent times. I am very glad that you mentioned the positive side of digital technology. That can often be overlooked, because we are looking at the harms that happen all too frequently.

One aspect that I would be grateful for your views on is online abuse. Is the level of abuse that is directed against politicians debasing politics so much that it puts people off coming into it in the first place?

Caroline Elsom: It is putting quite a lot of people off, not only from putting themselves up for Parliament but from engaging in politics at all, because all they see is nastiness. If you look at the replies on an MP’s Facebook page or Twitter, all you can see is pretty nasty content, rather than people actively engaging in positive or constructive debate.

That is not necessarily a reason to sweep out the good with the bad. There are plenty of bad things online, but there is plenty of very positive criticism as well. It is about encouraging people to take responsibility for what they are putting out there. As part of the moderation process, there could be systems for up-voting positive and constructive criticism, rather than leaving it all together in a big thread on Twitter.

Q33            Baroness Morris of Yardley: You talk about the pace and scale of it, and the amount of things out there. That is undeniable. If you compare this with campaigning and politics in the pre-digital age, are we doing the same things in a different way or different things in a different way? Does that make sense? If you take any one thing, I can make an argument that it used to happen when I first stood for Parliament, in just about the pre-digital age. Are you saying that, yes, it used to happen, but it is now pacier and quicker, in which case there is an issue about whether it is qualitatively better or worse?

The other argument is that digital technology allows campaigners, politicians and politics in general to do different things to have impact. I just wondered where you thought the balance lay in those two approaches.

Rachel Coldicutt: I have two thoughts. First, there are lots of perfectly good rules that are not being applied to the internet. In particular, the responsibilities of individuals in public life ought to be the same online as off.

Microtargeting is a very different issue that comes up, and there has not been enough diligence on that. Looking at the timings involved in the Cambridge Analytica debacle, it took nearly three and a half years for the Information Commissioner to go in and seize the data. The amount of activity that could have been happening in half an hour is extraordinary. Specifically, there need to be new rules on data and targeting. Offline rules about the transparency of ads need to be transferred. It would be possible to have the infrastructure there for external scrutiny, but it just has not happened.

The Chair: That is a very natural segue into the question from Lord Lipsey.

Q34            Lord Lipsey: The regulatory scene in Britain is at best patchy and at worst non-existent, although that is not a totally bad thing because it gives the media the freedom of expression that a regulator might cut back on. What do you think of the present regulatory system? Where would you strike that balance between freedom and regulation?

Alex Krasodomski-Jones: This was the focus of the Online Harms White Paper. One proposal was to have a new regulator. Whether that would sit within Ofcom, we do not know. Going on what was presented earlier this year, we have a significant way to go in regulating these companies. It is a challenge, given how international they are. We still have not answered the question about what these technology companies actually are. The old-hat debate is whether they are a platform or a publisher, and the answer appears to be neither. Answering that question will be key to empowering a regulator to go further.

My assessment of the current state of play is that the ICO remains woefully under-resourced. It seems to be the consensus that our electoral laws are fundamentally out of date for the digital age. In the same way, the ICO, under the existing regulation, is very under-resourced and needs more resource.

Caroline Elsom: That is certainly the case. There are at least five regulators working in this space already. If we are talking about either extending the powers of Ofcom or bringing in a whole new regulator, there is a real risk that we end up over-regulating and damaging the really brilliant tech community in the UK. We have some of the best and brightest businesses coming to this country to set up tech firms. If we set a playing field where any new players also have to have fact-checking, moderating and verification processes, that is a much bigger burden on smaller companies than on the big tech giants which the proposals in the Online Harms White Paper go after.

It is really important to bear in mind the unintended consequences of perhaps going too far with regulation. New platforms might emerge in the next five years that are even better at enhancing and informing democratic debate. We may never see those companies if we set the bar too high for entering and being a platform for helping democratic debate.

Rachel Coldicutt: First, in our research we have seen a real lack of digital expertise and confidence in the regulators, which quite often have to look backwards for longitudinal evidence. There is no feel for it. “Something dodgy is happening. I will ring up”—we need to get to that kind of pace so that things happen in real time.

There is a real issue that very often, when there is a problem, government looks to the platforms to solve it. In the last week, Jack Dorsey has said that the internet is a nation state and Mark Zuckerberg has had opinions on free speech. They are moving into government and governance, and government is not moving into the thing it knows how to do, which is to look after its citizens. That is a failing.

Lastly, there is a real issue about listening to people. At Doteveryone, we have done lots of work on redress. If you have experienced a harm online, to whom do you raise it and how? There needs to be more evidence gathering on the fly that is based on things that have happened to people as opposed to studies that come out years afterwards. Fundamentally, there is a lot there, but there is not enough communicating or enough resilience.

Q35            Lord Mitchell: Alex, you mentioned the speed of acceleration of digital in general. It seems exponential to me; it is growing faster and faster and faster. Here are we, as a legislature, trying to keep pace with what is happening, but I have the feeling that we are always in the wake. It is very hard, and they are getting further and further away from us. What can we do to make sure that we are up with them and can protect our population?

Alex Krasodomski-Jones: It is hugely challenging. You are right. I appreciate you describing it as protecting our population. Too often we hear the argument that, because technology moves so fast, we must look to alternatives to regulation and law, such as resilience, education and boosting civil society. Those are fundamentally important. Nevertheless, the power of the state is hugely important here. You keep up with the pace of change by resourcing a regulator appropriately and doing what can be done to future-proof technology.

If, as we have heard from Baroness Morris, at least some of these activities are historical activities that have been part of politics for an awfully long time, surely it is possible to think about that going forward. In the meantime, you just have to move faster.

Rachel Coldicutt: There is a case for just having people who understand how it works. It is not a mystery. It is not terribly hard. We would not be here if this were about pharma; there would be lots of people to draw on. The issue is that many of the people who have the expertise are in industry, and it needs to be attractive for them to move into regulatory roles.

Q36            The Chair: Caroline, are you concerned that the large digital companies are possibly hiding behind free-speech arguments rather than using them legitimately?

Caroline Elsom: Yes, that is somewhat the case. That is not necessarily a reason to curtail the good work that they are doing in creating platforms for other people to have a voice. There is a case for us pushing for platforms to be more transparent about how, say, their targeting of advertising works, so that the people who are legislating can see exactly how their systems are working.

At the moment, if you click on a political advert on Facebook, you can click on the “Why am I seeing this?” button and see who is seeing the advert. Are the majority women, where do they live, and how old are they? We do not see the other side, which is how their algorithm works and what characteristics people advertising online have put in to reach you. It is a very difficult balance. If there was an easy answer, we would already have arrived at it.

I am slightly cautious about the tendency to bash the big tech companies in forums such as this, when they are providing a great platform. As I said at the beginning, in many ways they are providing a positive platform for people to share their views, engage, criticise and be part of the debate.

Alex Krasodomski-Jones: We have heard the words “good” and “positive” many times this morning. I urge you to think about who decides what “good” looks like. I believe that the UK, and to some extent Europe, has ceded control over what “good” looks like. What is our vision for a positive internet?

We have spent the last 10 years talking about online harms and all the bad things. We have been very reactive to what has been going on, and as a result we have lost power. There are visions of a positive internet coming out of Beijing, Moscow and Silicon Valley. They are deciding where the boundaries of free speech lie and what democratic participation should look like online.

I feel that we are falling behind here. We do not have a vision for a good internet that reflects the principles of liberal democracy as we understand them. That is a real weakness, because we cede control and suddenly we are not the people making decisions about limitations or otherwise on free speech.

As a side note, I would again talk to the question of how international these platforms are. Facebook, Google and Twitter are under great pressure on the US side to be more liberal in the policing of speech, while in dealing with the EU or Germany there is a lot more pressure on them to police things. They are juggling all these different things at once. We have to find our path through those different commitments with each of these big platforms.

Q37            Lord Mitchell: I would like to talk about transparency. Is greater transparency needed for online political campaigning? If so, what would it look like? Should there be further regulation of online political campaigns beyond transparency? What form should this take? Again we are comparing what has been happening in the press or television advertising previously with what we have today. I would be interested in your comments.

Alex Krasodomski-Jones: Yes, absolutely. Again, we need to set the standards for what we expect. We have already heard about ad archives and a collection of all the political adverts that are going out. This is vital in a world of targeted advertising.

To talk to Baroness Morris’s point about whether this is the same thing but newer or something fundamentally different, it is fundamentally different when a million different groups receive a million different messages coming from one party. We used to have a party-political broadcast that went out on BBC2. You had one shot; you had your pitch. Now you have thousands of micro-pitches, each going to different people. How on earth are we going to keep that accountable without a sense of what is actually going out? That is fundamental.

The other question about transparency is transparency to whom. As we have heard, the public need access, but researchers, civil society, regulators and the Government also require some kind of transparency. Over the past five or six years, we as a research unit have found our ability to keep track of what is going on in these online spaces, with a few exceptions, getting significantly worse, as more and more of the tools and technologies that once allowed us to monitor these spaces and to have some sense of what is going on in these public fora are withdrawn. The cynic in me says that it is because what is going on in these platforms is often quite problematic.

Caroline Elsom: I am perhaps slightly more optimistic than Alex in this field. The platforms have made a lot of changes over the last year. Because we have not had an election in that time, we have not fully seen how well these new tools will stand up to a huge, huge flurry of political activity online. It has been a very busy year for politics, but we have not yet seen these new tools stress-tested in an election space.

Twitter, Google and Facebook all now have ad libraries so that you can go back through after the fact and see who has paid for adverts. People have to be verified now in order to put up adverts, so you can see where they are from and who is paying for them. Blue ticks are much more widespread on all different types of accounts. I believe that in the last week Facebook has expanded its ad library to include people who are putting out political posts about social issues. You do not have to be a formal party to be included in the ad library. Extinction Rebellion, for example, will be included; it is not just formal party structures.

Having said that, no, it is not yet perfect. As I mentioned briefly earlier, transparency is really important in showing how these platforms are working from the input side rather than just the outcome of how ads are being targeted. That is very important. In the US, there is evidence that ads are being pushed to particular racial groups. Because you cannot see what input has gone in to reach that particular racial group, it is very hard to know exactly how these messages are being targeted, so transparency is key in this debate.

Q38            The Chair: Caroline, you are right. A lot of this information will be available after the fact. Do you think that your view may change if, at Christmas, we found ourselves staring at a contested election?

Caroline Elsom: Quite possibly, and I am ready to have my mind changed on the basis of what information comes out as and when the next election happens.

The Chair: If that occurs, where should politicians put their energies to make sure that could not happen again?

Caroline Elsom: We would need to look very seriously at how we compel social media platforms to be much more transparent about how their algorithms are working and, as I said, how people’s inputs create the outcomes that we can already see in ad libraries or by using the “Why am I seeing this?” function.

It is very difficult, because so much has changed over the last year. Most of these ad library functions came in only in January. They have been tweaking them all the way through this time, too. They did not come out as a perfect tool and they are by no means perfect at the moment. Many gaps have been identified. The Brexit Party, for example, has been very slow to show up on the ad library. They are not perfect; they are still tweaking them.

Yes, I may change my mind as and when an election happens, if it becomes very clear that the functions are not working and it is not transparent enough for you to be able to see who is funding ads and how people are being targeted.

The Chair: To reinterpret Alex, we are operating in a dangerous time.

Caroline Elsom: Yes, absolutely, it is a dangerous time. How dangerous it is can also be overplayed. With the flurry of people being very worried about Russian interference in elections, particularly in the US, the evidence so far (Alex has directly worked on this, so perhaps I will let him explain more about it) does not seem to suggest foreign interference in the UK. Take the flurry of activity after the Brexit referendum. The majority of the tweeting and retweeting activity from Russian accounts happened on 24 June. So, on the information available, it does not appear that there was big foreign interference in that.

Rachel Coldicutt: First, any regulator needs to be independent. It will never be a politically good time to do this. Waiting for political agreement is likely to make it never happen.

A lot of the time, the debate is captured by First Amendment-style thinking from technology companies. I speak to lots of people at technology companies in Silicon Valley who do not know that the First Amendment is not a global thing. We need a framework of rights in the UK whereby we have a way of navigating whether and how the right to freedom from abuse is to be held more highly than the freedom of speech. We have talked about ad libraries and those things, but this is not all about the platforms. It is about the people who are posting the adverts and politicians who are behaving badly. There needs to be accountability for those individuals and parties, and it has to be meaningful and real.

Lastly, we ought to remember that, to go to Caroline’s point, none of YouTube’s or Facebook’s activity is done out of altruism. YouTube made $116 billion of revenue last year. In the next 12 months, Facebook will make $2 billion from political ads in the States. When thinking about how to incentivise better behaviour, as part of the mix there is not only transparency to consider but also doing something interesting on revenue.

Baroness Kidron: I quickly wanted to ask you about your thoughts on the perception of transparency and trustworthiness. As you are speaking, I understand you to be saying that it is quite complicated to say exactly when interference did or did not come, but there is a perception that there was some interference. I would be interested if you could unpick that a little and say how people trust the democratic process. You say that you cannot get at the data. Does that worry you? Do you see what I mean? There is a sense in which we do not know. Is perception a big player in this? Maybe it is not.

Alex Krasodomski-Jones: Yes, we do not know what we do not know. There has been hysteria over some aspects of what took place. The Russia example is a very good one. Russia certainly tried to influence Brexit, or I think it did, but not online. It was presumably through much more traditional influence work, whatever the investigations are into money and all the rest of it. It is not something I know a huge amount about, but our and other people’s research has suggested that the online aspect of this disinformation or misinformation push, targeting that in particular, was not hugely significant.

But we do not know what we do not know, and that is not acceptable. I cannot accept the idea that Facebook, Google and Twitter can tell us what an ad library or transparency looks like. That is for us to decide. I am no pessimist, I am an optimist, but I want us to set the standards for what “good” looks like. These ad libraries have been stress-tested in elections across Europe and the globe. Either they do not exist or they have been found woefully inadequate. Twitter has announced that it will probably not run ad libraries for the next big democratic event, which is a shame, because Twitter is normally very good on transparency.

We need to understand quite clearly, as Rachel says, who can tell us what we need to know. I would point the Committee at organisations such as Doteveryone. Mozilla has a fantastic document that says, “This is what a good ad library looks like”. Take it to the companies and say, “Do this”. We cannot just sit here and say, “We’ll take what we are given”. That is not how this can work.

Q39            Lord Scriven: This question goes to what you have just said, Alex, and what Rachel said earlier about the Bill of Rights. Are you saying that we are spending too long on who, when and how and not concentrating on what needs to be regulated? If that is the case, how can legislators and government, from what you have seen internationally, help with what “good” looks like, and the balance between freedom of speech and what is not right? You are all urging that we focus on the structure rather than the base, which is what we are actually regulating, as well as how it is regulated.

Rachel Coldicutt: Politically, quite often the problem is that technology is seen as linked to prosperity and innovation. Nobody has a brief within government to look at technology holistically. It is carved up into bits that are intimately related but then treated totally differently.

We have done some thinking about what a social contract looks like now. Unfortunately at the heart of this there needs to be political choice. Are we going to be libertarian? Are we going to leave everything up to individuals so that it does not matter if you do not understand what is happening, or if you have to organise your own healthcare because it is all there and it is up to you? Are we going the other way and thinking about technology as a means of the state? Fundamentally, there is a spectrum there and somebody has to choose, whereas now both those things are happening and neither can work.

Lord Scriven: I hear that message from all three of you in a way. Are you saying that without that decision on what “good” looks like, as you described, Alex, we will not get to the fundamentals of being able to regulate and having a playing field for the platforms that balances the rights of individuals as well as what it is like to work within a democratic society? Is that what you are saying is fundamental and is being missed in this whole approach?

Rachel Coldicutt: There is a bunch of relatively easy, tactical, obvious things that ought to happen anyway, which could carry on. A lot of those are in the new paper that DCMS brought out earlier this year. There is a lot of trying to solve the whole thing, which means that nothing is happening. There are tactical things that could happen, and there has to be a vision overall.

Lord Scriven: What key tactical issues do you see as a priority that must be tackled from the big picture in this field?

Rachel Coldicutt: We have all spoken about ad libraries, data stores, transparency and the possibility for scrutiny. To Alex’s point, if standards are created which technology companies then have to be compliant with, it takes away a lot of the talking and the getting out of it. I would also like to see more direct scrutiny of the political parties and politicians, particularly in relation to political targeting.

Q40            Baroness Morris of Yardley: This is a question on digital literacy. I assume that you all think we need it, so I will take that as given. We all worry about the lack of political participation at the moment. That is a national worry. How important is digital literacy in improving political participation in particular? Where is it standing in those things?

Secondly, what would meaningful improvements actually look like? We talk about it a lot, but have you broken it down into the elements that would need to happen if we were to have a population that was more digitally literate?

Caroline Elsom: The work on digital literacy, particularly in schools, has rightfully had a lot of focus on things like teaching kids how to code and the nuts and bolts of how the internet and technology work. What has been missing in this is looking at the psychological side of things, so that people understand how they are being targeted with ideas and how to scrutinise and think critically about the information they are being presented with. The Nordic countries are doing quite well on this at the moment, particularly because their education on digital literacy really focuses on critical thinking and teaching people to be more inquiring about what they are seeing.

To the second part of your question, about meaningful improvements, we now need to see the educating of the educators. Very little is known about how digitally literate teachers are. This is not at all to bash teachers, but they need training on digital literacy, and not just ICT teachers or PSHE teachers but teachers of every subject. Teachers need to have specific training on how they can promote digital literacy. If they are sending children home to do homework online, it is not just about telling them not to answer their essay by using Wikipedia but about asking them to be inquiring and to look at who set up or who is funding the websites they are getting information from. It is about teaching an inquiring mind, as much as it is about teaching children very technical skills for using the internet.

Rachel Coldicutt: I do not know. The problem we have now is that things are very easy to use and hard to understand. If I had to assert the right to explainability for every algorithm I have encountered to date, it would probably take me a year to understand them.

Although it is important that people are equipped with literacy, that is only meaningful in a system where we can seek redress. Currently, I could spend a lot of time understanding things, but I have no ability to act on them. There is a real problem here. We are in a moment where national governments do not know what to do with Facebook. That cannot be transferred to individuals. Digital literacy is important in a mix of things, but it can help us only so much, particularly as we are getting into a world of deep fakes.

We are currently living in a world of entirely mixed messages. Screen time is bad, but people need to do more things on the internet. Going back to the point of vision, nobody really knows what we are educating people to do or why. Not only does there need to be better redress, but there needs to be better public service infrastructure, particularly for news. I would like to be able to know where to look at any moment to understand the things that have happened. There is a danger in putting it all back on to individuals.

Alex Krasodomski-Jones: I am with Rachel. I do not really know. I would encourage the Committee to look at work by the Institute for Strategic Dialogue, Doteveryone and Glitch. We at Demos did some work a while ago looking at this. Three or four years ago, our message was always to be more cynical. That rather backfired, because now the culture appears to be that you just question everything and nothing is true. It is the Gerasimov doctrine or something like that. That is not good.

My focus would be elsewhere. I do not know whether it is contained in digital literacy, but you alluded to it: what is the culture of participation online? How does one be online? This is not about skills or critical thinking. Well, it is about skills and critical thinking, but it is also about laying out a sense of how you participate as a citizen in an online space and online community. How do you take part in that? We are not there yet. Maybe that takes time, because these technologies are reasonably new.

The best work I have seen I feel has been trying to tackle that question: how should I be an active participatory citizen in this online space? Is that space set up in such a way as to make me feel a sense of civic pride and a pride in being part of this community?

Baroness Morris of Yardley: That sounds attractive, and it might happen over time. When you were answering that question, I do not know whether you were thinking about young people or old people. If we are worrying about digital literacy, the group that is not digitally literate is old people, who will never engage online but who have to understand democracy in an online world.

I do not know what to do about that. We cannot start persuading the section of the population that is not digitally literate to get so far along the road that they would be able to both master and understand it. I accept what you say about young people; your analysis is right on that. I do not have the answer, but have you given that any thought?

Alex Krasodomski-Jones: The kids are all right. I mean, they are not, but—

Baroness Morris of Yardley: At least they have the skills to use the things.

Alex Krasodomski-Jones: The most obvious example is fraud and scamming. The most vulnerable people on the internet at the moment are those like my parents, who came into the internet a little later and have not grown up in this world. They are in fundamental need of support, redress—that is another good one—and, where possible, some kind of upskilling.

Rachel Coldicutt: There is a duty for service providers, particularly in financial services, to protect their customers.

Baroness Morris of Yardley: Yes, there is.

Rachel Coldicutt: I speak as a person who changed the remit of our organisation from being about digital literacy to understanding the impact that technology has on the world more widely, because, as an individual, knowledge is not going to give you the power that you might need to have control here.

The Chair: I am a bit worried about time. Caroline, your answer was very helpful, but the submission from the CPS is quite thin in this area. Would you like to have a look at it, maybe bolster it and come back to us?

Caroline Elsom: Yes, absolutely.

The Chair: Thank you very much indeed.

Q41            Baroness Kidron: I would like to know what role civil society should play in ensuring democratic flourishing. I wonder whether, in your answers, you could also address the issue of asymmetry of resources.

Rachel Coldicutt: One problem is that a number of public services and functions of government could be more effective. At the moment, I run a charity of 15 people. We go toe to toe with the policy people from the big technology companies. Actually, there is an extent to which it should not be up to civil society.

As well as redress, which I keep going on about, I would like us to think about the role of public service broadcasting or public service news. What is the role of the BBC in this? I do not know whether that counts as civil society. It feels to me as though it is rather underused and underpowered. We are leaning on lots of tiny organisations when, with a movement of resource and expertise, there could be more happening both centrally, within government and regulators, and elsewhere.

Caroline Elsom: What you said there about news is really important. I guess it depends what you mean by “civil society” in the question, but there are already organisations moving into this space, such as NewsGuard, which has a browser extension that offers nutritional labelling next to news sites. In the same way that you would see a traffic light system on food that you buy, they put a little shield up that can be green or red. If you hover over it, you can see news organisations rated on whether they repeatedly publish false content, whether their headlines are deceptive, whether they are trying to generate clickbait deliberately, whether they are gathering information in a responsible way and whether they are revealing if there are any conflicts of interest in who they are hosting on their site.

There is a really important point here about identifying news sites that are putting out content with a pattern of behaviour that is repeatedly poor rather than looking at fact-checking individual articles. Perhaps NewsGuard may not fall exactly under civil society, but it is about encouraging responsible journalism, which is part of the civil society debate.

Alex Krasodomski-Jones: I do not have a huge amount to add. You are right to identify the huge asymmetry in resources. The Government could perhaps play a role in readjusting that. I have heard talk about a digital services tax. More needs to be invested in civil society organisations that are trying to understand this space or provide policy research—obviously I would say that—but also in those that are trying to provide services of redress or educational materials. They move faster. Especially in a space that moves as fast as digital technology does, you need people who are agile and who really get this stuff. Civil society, and to some extent academia, would be a good place to start there.

Baroness Kidron: On top of that, one thing that happens to those civil society groups is that they get bundled up with tech money and resource. Is there a role for government in supporting civil society more thoroughly so it can be a counterpoint to both tech and government? You are making the point about speed and expertise.

Alex Krasodomski-Jones: Yes, for sure. We have taken money from technology companies as part of our work, but I would much prefer that we were able to lean on a neutral source of funding. If the Government were able to provide that, it would be fantastic.

Rachel Coldicutt: Yes, absolutely. I probably spend as much time fundraising as I do researching.

Q42            Lord Lucas: What should best practice in content moderation that satisfies the duty of care look like? What role should the Government play in improving moderation processes?

Rachel Coldicutt: It is important to think about moderation as the last resort rather than a first resort. If we employ safety by design, bring more scrutiny to business models and think about how to create positive communities rather than policing after the fact, half the battle is won. One problem with moderating is that, if nothing else, a whole industry has been created of people who spend hours and hours every day exposed to completely horrific and terrible things. These people tend to be in Asia. They are not visible to us. They are an expendable entity.

It is much more important to put effort into codes of conduct and incentivising the platforms to have less extractive business models, rather than putting government funding into going in and tidying up after them. There is another issue here of the potential for algorithmic bias in automating moderation. I am in favour overall of a more regulated system upstream, in order that moderating is tidying up as opposed to solving all the issues.

Caroline Elsom: On the moderation point, we can already see how these policies have been used in places like Germany, with the NetzDG law. One in six of all Facebook’s moderators are now based in Germany since they brought in the new laws there. If you are a platform with more than 2 million subscribers, you have to take down within 24 hours things that are considered hate speech or face a fine of up to €50 million. You can see how platforms will move their moderators to where the law is, because they do not want to get caught out.

It is important for content that has been taken down to be kept somewhere and be publicly available so that people can analyse how well the platforms are doing at moderating content. If content is removed by a platform and archived in a back office so that only that company can see it, there is no space for academics and researchers to see exactly what has been taken down and how well people are doing with that.

There is also a risk with overzealous moderation that you can create opinion martyrs. Take, for example, Tommy Robinson. People made much more of a fuss over him because his content was routinely taken down. When I was thinking about this question, I was taken back to thinking about Nick Griffin on Question Time many, many years ago and the massive furore over the fact that he was given a platform on national television to talk about his views. The reaction to that was that it shone a light on exactly the sort of hateful views he was putting out into the world, and people reacted to that in a very negative way.

Exactly the same happens on social media. Some of the things that are said that are not very pleasant but perfectly legal need to be left online so that people can make up their own minds about whether a party or a political movement is the right thing to follow. If we simply take everything down all at once, we do not let people say, “Hang on. That organisation is really hateful and I don’t want to be a part of that”. There needs to be some moderation in what we moderate. I think that is what I am trying to say here.

Alex Krasodomski-Jones: On that last point, we come back to the question of the fundamental difference. If you are on Question Time, you have one shot; it is your party-political broadcast. In the online world, very often you inhabit spaces where you will be surrounded only by people who agree with your views and you will never be challenged.

This is a hugely interesting question. There are a number of ways to come at it. At the highest level it is about legitimacy. Who decides whether content stays on a platform? At the moment, it is the platforms, and that does not seem to be working, or at least people do not seem to be happy with the current situation. When a piece of content is removed, whether that was a legitimate thing to happen is not clear at the moment.

In these conversations we have spoken a lot about content. I would suggest that this is problematic. Obviously, there is illegal content. Whether it is CSAI, a terrorism offence or whatever, if it is illegal content it has to be removed, and there are processes and systems to do that. We should be careful about treating other kinds of online harm in the same way we treat content. We will play a never-ending game of whack-a-mole if we try to remove stuff that is not illegal but is potentially offensive or harmful. We will never get there.

Lastly, we just do not know what the best system for moderating an online space is. Think about how we approach policing. Lots of different models for policing have been tried in history, and the one we have ended up with was a sort of attempt at community policing, where you have a policeman being part of a community. Compare that model to the Facebook model, the Google model, the Twitter model or whatever it is, and it is a very top-down and almost authoritarian approach to content.

I am an Aston Villa supporter, so I have an Aston Villa fan forum. On that forum, we decided on a couple of people who would be responsible for that community. We elected them, and they were the sort of bobby on the beat for that thing. Everybody knew them and they were part of the community. People respected their decisions. The whole system is fundamentally different.

I would encourage tech companies, civil society organisations and research groups to explore, I am going to say it again, what “good” looks like when it comes to content moderation systems and which ones are better than others.

Q43            Lord Scriven: I was listening to you until I heard you were an Aston Villa fan, Alex.

Alex Krasodomski-Jones: I am sorry.

Lord Scriven: We have talked until now about democracy particularly at the national level. We have talked a lot about national issues and the framework. I come from a local government background originally. I want to know how digital technology can provide opportunities to improve people’s engagement, not just the information they are getting, et cetera. How can it improve people’s engagement with democratic politics at all levels? Democratic politics is not just about voting. You can be an active citizen at all levels of government. Should the same approach be taken at local, regional and national levels to help to achieve this?

Rachel Coldicutt: You have a very different opportunity at local level to engage people with the fabric of life and think about what infrastructure means. If we think about the things everybody cares about, such as bins, rather than technology being connected to discourse that people may have little time for or interest in, what if the services people use are demonstrably improved because of the technology? What if people on the same road can use technology to coordinate the day on which glass is collected as opposed to paper?

Rather than making this about debate, can we think about how technology might be used in communities to bring people together? In a world where people speak to their neighbours on WhatsApp and Facebook, how can we start to create new norms? We have seen that 35 per cent of people are not engaging with news. They are not going to start engaging with local news, but they may start engaging differently with people who live near them.

Caroline Elsom: There are some really good examples of how this is working at a local level. Take Reykjavik in Iceland. After the banking collapse in Iceland and the political storm that followed, the council realised that trust in politicians at a local level in Reykjavik, as well as a national level, was really poor, so they came up with Better Reykjavik and Better Neighbourhoods in Reykjavik.

The idea is that a citizen can go online and submit an idea to the Reykjavik Construction Board, and then the Construction Board has to do a feasibility study. The projects that are feasible, which pass all the usual tests, are put to a public vote. Not only can people have the ideas organically, but they are then voted upon. Let me give you some examples that have come out of this scheme. They put in a youth centre in a disused power station, which had been consigned to brownfield land that would wait for a developer. The community was able to build on that land. They have also built shelters for homeless people under this scheme. There are tentative signs that it is starting to rebuild trust in politicians.

There are all sorts of examples of how this is working around the globe. Yes, Reykjavik is a much smaller city than London, but it could translate into towns across the country where the population is smaller and coherent community groups can put in these ideas, which can be put through feasibility studies. There are examples from all around the world where this is happening already, and we can do similar things here.

Alex Krasodomski-Jones: The web changed everything; it changed the way we shop, the way we communicate with one another and the way we fall in love. The one thing it has not changed is the way we make democratic decisions. The contrast between our digital politics and our lack of digital democracy is becoming ever more jarring. That is a real problem and one that will become ever starker, as people feel they are participating politically but are unable to participate democratically.

That can change at a local level. Caroline’s example is a really good one. My colleague Carl recently stuck something on the BBC about vTaiwan, which is worth a look. It looks at how Taiwan implemented a system called Pol.is to do more democratic, participatory decision-making.

What the contours of a change to our democracy look like does not really depend on the technology. The technology needs to be built to solve whatever problem we have. We just build the kit. The will has to be about how we update democracy in the first place in such a way as to encourage people to participate. We need to start at a local level with a clear sense of what our ambitions for improved democratic participation would look like and then get a load of techies to build the kit for that.

Q44            Lord German: This is the magic wand question. If the Government could choose to do one thing to ensure democracy is supported and not undermined in the digital world, what would it be? I do not care in which order you want to wave the wand.

The Chair: Can I make a suggestion, as we are tight on time? If you would like to write to us on that particular question, it would probably help us and may well help you, rather than trap you into it. What is the one thing you would like to see us do as a Committee?

Alex Krasodomski-Jones: We need urgent electoral reform around political advertising.

Caroline Elsom: To the digital literacy point I made earlier, make sure that teachers are equipped to teach digital literacy in schools.

Rachel Coldicutt: We need a joined-up ministerial brief across the piece.

The Chair: It would be very helpful if you could enlarge on those three points in writing. Thank you all very much indeed. I am sorry we have kept you. I did my best to keep to time, but I failed. Thank you.