Corrected oral evidence: Consideration of government’s draft Online Safety Bill
Thursday 21 October 2021
11.50 am
Watch the meeting: https://parliamentlive.tv/event/index/4df31e2e-50c7-4b92-a745-a5fddc227498
Members present: Damian Collins MP (The Chair); Debbie Abrahams MP; Lord Black of Brentwood; Lord Clement-Jones; Lord Gilbert of Panteg; Baroness Kidron; Darren Jones MP; Lord Knight of Weymouth; John Nicolson MP; Dean Russell MP; Lord Stevenson of Balmacara; Suzanne Webb MP.
Evidence Session No. 9 Heard in Public Questions 143 - 147
Witnesses
II: Gavin Millar QC, Specialist in Media Law, Matrix; Alison Gow, President, Society of Editors (appearing virtually via Zoom); Matt Rogerson, Director of Public Policy, Guardian Media Group, and News Media Association; Peter Wright, Editor Emeritus, DMG Media.
Gavin Millar QC, Alison Gow, Matt Rogerson and Peter Wright.
Q143 The Chair: Good morning. My apologies to the witnesses for our late start. We overran and were also delayed by a technical break of about 15 minutes during the first panel; hopefully, that will not repeat itself.
Implicit in the drafting of the Bill, it would appear, is the idea that journalism is of inherent democratic importance and that, therefore, journalism and journalists’ organisations have protections from the application of the duty of care principles in the Bill. Do the members of the panel feel that the protection that exists for journalistic content and journalistic organisations is sufficient and that the definitions are clear enough, and do you have concerns about suppression of journalistic content as a consequence of this Bill becoming law? Perhaps Peter Wright could start us off.
Peter Wright: Yes, indeed, I would be very happy to. My reading of the Bill is that we are protected in the first case by the fact that the duty of care does not apply to our content, both on our own websites and when it is distributed on search and social media. There is no obligation on the platforms to censor our content. However, the problem comes because there is also no compulsion on them not to. Clearly, the authors of the Bill envisage that they will block and take down items of content because the journalistic protections are there, and the journalistic protections specifically apply to news publisher content.
We have to look at the journalistic protections and ask how effective they are. In my view, they are not effective. The problem with any sort of moderation of content by social media companies is that they will do it by algorithm. The Bill puts them under threat of very, very heavy penalties, possibly even criminal penalties, as we heard yesterday, and their inevitable response to that will be to set the parameters of any moderation they do as widely as possible. It is human nature; they will want to protect themselves. They will use a very blunt instrument. I saw in Google’s submission that it says its algorithms are very poor at understanding context and it is going to find moderating journalism particularly difficult. We also know from articles in the Wall Street Journal over the weekend that Facebook’s artificial intelligence is very poor at moderating this type of content.
What does the Bill demand of them? That they take freedom of expression into account. That can mean almost anything. It is left to the platforms to determine how they do it, what rules they set. From what I have seen of how Facebook has been trying to moderate journalism in the USA, it is doing it for completely different reasons. It had an advertising boycott last year that has prompted it to do this. It is arbitrary; it often fails to understand the nature of the content; it is imposed without any sort of process; it is not in line with English legal thinking on journalism, which is that the editor must take responsibility for what he or she publishes and pay the consequences afterwards; and it is blocking before people have had an opportunity to read it. They also outsource their decisions to fact checkers, of which there are large numbers, and some of which appear to be single-issue lobby groups in another guise.
I am hugely sceptical about it. In my view, the exemption needs to be made a positive exemption, so that the duty of care obliges platforms not to moderate journalistic content when it is produced by recognised news publishers, for which there is a very good definition in the Bill.
The Chair: And the basis for that is that there are other avenues available to challenge a publication over its content.
Peter Wright: Indeed. We are fully subject to the law. We are also subject to regulation. In the case of our company, our titles are regulated by IPSO and our journalists are obliged to follow a code of conduct. Other companies have slightly different arrangements. The Guardian follows the same code of conduct as us but is not a member of IPSO. Redress is available. We only employ trained journalists. We have teams of lawyers. We are not the problem this Bill is trying to address.
Enormous energy has been spent over the last decade debating regulation of the press, and enormous energy continues to go into refining and improving defamation law. All the necessary procedures for dealing with journalistic content, when produced by proper, responsible news publishers, are in place. This Bill raises the danger of setting up an entirely separate and different system of regulation. We end up in a position where an article may not be contravening the law and may not be contravening the editors’ code of practice, but may still be contravening an internet company’s terms of service written in California, which is entirely different.
The Chair: Alison Gow, do you agree with Peter Wright that there should be a positive exemption for journalism from the provisions of the Bill?
Alison Gow: Thank you for inviting me. I agree with Peter. I think he spoke very fully on the matter.
I come from a regional background. I have always worked in local and regional news; the implications there are really wide-ranging. The time and effort it takes, even now, to resolve issues that arise from takedowns should not be underestimated. No matter how good a relationship you might have with a platform, and we have good relationships at my employer, Reach plc, not all regional brands have the same level of access. It is so time-consuming to go back and submit and reargue points around it. Quite often, the game will not be worth the candle, especially in smaller news teams—let alone the idea of there being more layers.
As Peter said, algorithms are an incredibly blunt instrument. We know that by the time human moderation gets involved the issue has very often moved on. The moderators are incredibly overworked and it might be days before you get a response, by which point the story is lost. Given the time it takes to resolve an issue currently, and our knowledge of how differently the platforms can interpret what is news, what is in the public interest and what is of interest to the public, more layers will stifle even further how we serve content to readers.
The Chair: Matt Rogerson, would you agree with what the other witnesses are saying?
Matt Rogerson: I would. I think the Government has tried, in its intent, to exclude premium publisher/news publisher content from the scope of the Bill. As to whether it has achieved that in the drafting, the NMA has views on that, which it has submitted to the Committee.
I think the positive duty is essential because the reality is that the platforms’ track record shows they are not the biggest fans of news. On Facebook we are going to see news transitioning out of News Feed and into a completely separate news tab, which could be good for news publishers, or it could be the equivalent of the BBC stopping its 6 pm news bulletin and saying it is doing a news channel instead. We do not know what role news will play on those platforms in the future, so having a positive duty for them to carry news and not to block news is really essential. There is a further question about whether there should be a positive duty on them to carry high-quality news, but I think that sits separately, within Ofcom’s view of what the future of media plurality looks like.
We have to deal with fact checkers on Facebook. I am sure they are well meaning, but often they comprise academics rather than journalists. We have had a few instances over the past few years in relation to climate articles, where a piece of journalism based on an academic report was perfectly valid and accurate, but the fact checker found that we did not unpack a particular concept in a way that they thought a reader could understand. Even though we had discussions with the fact checker to try to resolve the issue once it was labelled, we had no conversation with the fact checker before that judgement was made. They just label on Facebook that the article is misleading or false. That creates a false perception about journalism on that platform, which we are unable to correct. Conversations about that particular article started in October last year. If you share that article on Facebook today, it still says that the article is misleading or false. You talked in the last session about media literacy and digital literacy. If you have that sort of blunt labelling by people who do not understand journalism, it is extremely problematic.
The Chair: Thank you. Gavin Millar, if a social media platform decided that, in its own opinion, a news article was harmful but not illegal and decided to remove it, despite the general presumption against journalistic content being included in the duty of care provisions, do you feel that would be likely to attract legal challenge in the future?
Gavin Millar: If the Bill were to become an Act in this form, there is a lot of scope for legal challenge. You can have macro or micro challenges to pieces of legislation, as you know. With the micro challenges to some particular aspect of the Bill, people come from different angles. Some people may want to challenge the way it interferes with their privacy; and others perhaps, as I think is much more likely, will want to challenge the way it interferes with their freedom of expression.
Those challenges, not to the whole operation of the Bill but to a big chunk of it, would look at how the Act had worked in practice, as indeed would a macro challenge. You would have evidence about the impact of the Bill. It would cover more or fewer of those problems, depending on the way the challenge was framed. I am not saying they are straightforward to bring, but I think they will happen, and lawyers will identify strong cases for challenges to interferences with privacy and with freedom of expression. The premise of the challenge is that this is a state measure, so it has an impact in interfering with those rights, for people, groups of people or publishers, whatever it may be. It invites those sorts of challenges.
Would you mind if I answered the original question in a slightly different way from my colleagues? I do not have their industry interest in this. Journalism is a type of freedom of expression. We always defend journalism under the banner of freedom of expression. It is not just any old type but a very important type of freedom of expression, both in our common law and in convention law, and indeed in America under the First Amendment. It is strongly protected because it is high-value speech.
The question in legislation and regulation is normally, starting from the presumption that you are allowed to say it and that it is strongly protected, how you justify a particular interference with it. You give a degree of primacy, in fact a significant degree of primacy, to journalistic speech. We then have to stand back and look at this Bill to see how it approaches that problem. How is this legislation approaching the problem?
I have a fundamental difficulty, once you understand that premise about how freedom of speech and journalistic speech works, with the way this Bill is doing it. You could answer the original question on a very narrow basis, which is to look at Clause 14 and the bit in Clause 12 that deals with freedom of expression, and say they are a bit mealy-mouthed, a bit watery and it is only a duty to have regard to the importance of blah, blah, blah. There is no proper definition of journalistic speech.
A much more fundamental question is whether the approach in the Bill is right, given how the premise about the importance of freedom of speech is articulated. I find the introduction of the concept of the duty of care into this area extremely difficult as a lawyer. We have a duty of care in our common law. It is the basis of the law of negligence. It is very, very simple. The idea is that the defendant does not cause foreseeable harm, on the facts, to the claimant. It is not a qualified or conditional obligation on that person.
Here, the duty of care has been transposed into a completely different context, which is regulation by service providers. Unlike in the law of negligence, you have other people over there, the users, whose fundamental rights are being affected. Immediately, once the Bill adopts that approach, it has to create a conditional form of duty of care where you have to start balancing the interests of the people over there—the users, the journalists, the readers of the material. It seems to me that the Bill is constantly playing catch-up to try to do that in a slightly inadequate way, in Clauses 12, 13 and 14. Basically, it is the wrong way round, and that is very, very concerning for anybody who cares about freedom of speech and journalism. I know the Carnegie Trust has suggested it, but it does not seem to me that it is the proper approach.
The Chair: On that point, you will have heard the discussion Baroness Kidron and I had earlier with Professor Wilson about the central role of the civil law in this case. There is illegal content, which we have discussed, where the law is really clear that certain types of content could be illegal—the sort of speech that would be illegal. Then there is the broader area of harmful but legal, and how that is defined.
Should we look at the existing negligence law and say that the regulator may give a view as to whether a company has been negligent because it has failed to act against harmful content that was likely to cause harm and did cause harm? Therefore, the way of resolving legal but harmful content would not be through a regulatory intervention but through the civil courts. The regulator would give a view, based on the risk register and on its own opinion, as to whether it thought the company was negligent or not, but the final decision would be taken by the courts if someone chose to go down that road.
Gavin Millar: It would get complicated. Going to court is always complicated; it is always expensive and time-consuming. I thought the idea behind the Bill when I originally saw it mooted was that it would identify some distinct types of harm and clearly define them. We all know what they are: the self-harm stuff, the racial abuse, the stuff that is harmful to children. You could define them clearly, and then prescribe in the Bill how you are supposed to deal with that and how you are supposed to prevent that happening. In that sense, it is quite simple.
Instead, we have this huge edifice involving duty of care being imposed on providers. We have an entire regulatory structure. We have the Secretary of State having all these reserved powers so nobody knows how it is going to work in practice. Now we have the Carnegie Trust saying that perhaps we should have a general overriding duty of care on the providers, which will lead us goodness knows where.
I would have thought the better thing would be to have primary legislation that identifies the harm and identifies how you deal with it, and if they are not dealing with it in that way, there is a mechanism, possibly to go to a regulator or possibly to go to court, for the Government, a regulator or individuals or a group of individuals to challenge the failure of the provider to tackle that designated harm in that designated way. That is what I thought we would be doing, but the Bill goes much further than that.
The Chair: I have one more follow-up. In your view, would you be satisfied that a schedule of harmful content could be defined in legislation or through statutory instruments of Parliament, and could include areas of content that are not otherwise specifically illegal but would be designated as harmful through the process of passing this Bill?
Gavin Millar: I do not agree with the use of a Bill like this to deal with a long shopping list of criminal offences. I think it is a potential disaster, because, if you look at them, the offences are incredibly complicated. The police and the CPS would struggle with deciding whether those offences had been committed. Here we are giving a “reasonable grounds to suspect” power to a social media provider to say whether a piece of content involves one of those offences being committed. They are not going to be able to do that. It just will not be possible. You are left in the situation where they decide: do we interpret and apply it restrictively, expansively, narrowly? It gives them far too much discretion and power. I do not think illegal content should be for this Bill. It should be a matter for the police and the courts.
On legal but harmful, yes, of course, as a matter of social policy, you as politicians and parliamentarians can say, “There is a societal problem here: we need to address it in a Bill in this way”. You identify it and prescribe a solution to it. That has been going on for decades and centuries. I do not think it is impossible. I think it is more desirable from the point of view of people understanding what the law is doing, and I think it would be a better way of doing it.
Q144 John Nicolson: I am a journalist by profession. That is what I did before I became an elected politician. People sometimes say to me, “How do you qualify as a journalist? How do you become a journalist?” The truth is I am utterly untrained. I just got a lucky break on BBC youth programmes. I never did shorthand or any of the other traditional things. Peter, could you set out for us what the difficulties are in defining exactly what a journalist is?
Peter Wright: You have put your finger on it, and I am afraid it is all caught up with freedom of expression. There is no absolutely clear definition of a journalist. It is not like being a doctor or a lawyer. You have to look at the work of a journalist. Virtually all of our journalists are trained and do shorthand, but not all. I think for our purposes, as professional journalists working for established news publishers, the question of whether journalism qualifies as journalism for the purposes of exemption from this Bill is actually quite well addressed by the recognised news publisher definition. It is about whether you follow the code of conduct, whether there are mechanisms in place for people to—
John Nicolson: Which code of conduct?
Peter Wright: It does not have to be a particular code of conduct.
John Nicolson: Or you can make up your own code of conduct and say you are a journalist. That is what happened to Ambassador Craig Murray recently in Scotland. He decided that he was a journalist. Unfortunately, he had not trained in media law and, sadly, now finds himself detained at Her Majesty’s pleasure because he did not understand the rules of court reporting. He described himself as a journalist and lots of his supporters call him a journalist.
Peter Wright: People who have had training as a journalist and follow recognised codes of conduct also get contempt of court wrong sometimes. In my long career it happened to me once, many years ago, but I did not go to jail.
John Nicolson: You did not go to jail. I should say that for your reputation.
Peter Wright: The problem with journalism is that when you look at it people know what it is, but you cannot define it in a closed and tight way without introducing a system of licensing.
John Nicolson: So if it walks like a journalist and talks like a journalist, it is a journalist.
Peter Wright: And it holds itself responsible for what it does.
John Nicolson: Matt, suppose somebody like Tommy Robinson decides to describe himself as a journalist: do we have a problem with this Bill? Could he be protected by the exemptions that apply to journalists in this Bill?
Matt Rogerson: I think Gavin will probably be best to answer the Tommy Robinson question. I was going to bring up the Tommy Robinson case because I think it is important in the context of legal but harmful, and the lack of clarity in the Bill. What that case was used to do was to suggest that the mainstream media was not reporting on issues it should have been reporting on. In fact, what the mainstream media was doing, because its journalists were trained and were following the rules, was following court rules, and it was not reporting in the way that he did and was found in contempt of court for. The potential for journalism to be captured and blocked from platforms on the basis of very broad definitions will potentially lead to more conspiracy theories about why things have not been covered, because there is lack of clarity about why stuff has been taken down. On the broader Tommy Robinson question, I turn to Gavin.
Gavin Millar: You sometimes have to define journalism in legislation and in case law to give it the extra protection in the law that it merits, and that I spoke about. You have to decide whether it is that thing in order to give it the extra protection.
There are working definitions. If you want me to venture one that is a kind of hybrid from what we have in the common law and the convention law, it is somebody who gathers together information and ideas and disseminates them to the public. The last bit of it is always the contentious bit. Do you then say “in the public interest”, or do you give it a broader ambit and say something like—this is in the legal definition—“in the reasonable belief of that person that it is in the public interest”? You would cover citizen journalists who believe what they are doing is important in the public interest, but you, or the judge, might not agree. You can get a working definition of what it is. It is very important in the context of this Bill that it covers citizen journalists as well as—
John Nicolson: In his particular case, could he be protected?
Gavin Millar: You have got the point. What you then have to say is, “This is about protection”. It is not about his status as a journalist. That is not the issue. The issue is that it is a type of speech. Whether it is or is not journalism, or whether he is claiming that it is journalism, is it the sort of speech that should be strongly protected in the way you would with legitimate journalism? That is where it falls apart. That is where a society or a court or a judge would have to say no, it does not, because it does not serve any public interest. His belief that it is in the public interest is completely unreasonable looked at objectively. It does not matter a fig whether you categorise it as journalism or not; it is just not speech that is deserving of protection. It does not get home on that score just by styling itself as journalism. It is not a problem if you look at it in that way.
John Nicolson: Alison, are you content with that reassurance?
Alison Gow: Yes, I think—
John Nicolson: Yes or no?
Alison Gow: Yes.
Peter Wright: Perhaps I could add one thing. We discussed this with DCMS officials. We would not want Tommy Robinson to be regarded as a journalist. One of the elements of the definition of a recognised news publisher is that it is not an organisation that is proscribed under the Terrorism Act. I am not sure that Tommy Robinson has been, but certainly the definition of a journalist or a news publisher should not include people who are using journalism as a vehicle to other ends that are against the law.
John Nicolson: Gavin, what happens if he stands for election during that period? Does he get additional protections as a candidate?
Gavin Millar: No, on the contrary. If you stand as a candidate in an election, there are election laws regulating freedom of speech, aimed at ensuring fairness in elections, that would restrict some of the things he would want to say during a campaign.
John Nicolson: Really? He would presumably also be given certain freedoms during the course of an election campaign to say things as well.
Gavin Millar: In general terms, it is a heightened period of public debate. If there was an issue about defamation, public interest speech or something like that in the context of a campaign, the courts would always recognise it as a form of particularly high-value political speech because of its context as a debate during election time. As I said, there are prohibitions on what candidates can say, particularly about other candidates, during an election. There is the whole panoply of other criminal laws—incitement to racial hatred and so on—that may be a problem for somebody in that position. You do not escape those just because you are a candidate in an election—on the contrary.
Q145 Dean Russell: For the register of interests, and for transparency, I have been involved with Express Newspapers, which I know is owned by Reach, with regard to a Covid memorial.
Mr Wright, would you ever foresee a time when a newspaper in the UK would publish a challenge on the front page to encourage children to swallow detergents? I am hoping it is a simple answer. A yes or no would be good.
Peter Wright: I am trying to fathom where you are coming from.
Dean Russell: I will explain my question. There are platforms like TikTok that have had challenges, one of which, I believe, was to encourage children to swallow detergents. They have had all sorts of those challenges. What I am trying to get to—sorry, it was probably unfair of me to ask you the question without the context—is the difference between a publisher, a journalist, an editorial platform and a social network. It seems to me that a platform like TikTok, through the fact that it is spreading a challenge such as that to a large number of people, is equivalent in many ways to putting it on a front page of a newspaper in terms of reach and visibility.
It was an unfair question—my apologies. I would think that a British publisher or newspaper would not put that on the front of a paper, but a social platform could do that and reach many. Perhaps, Matt, you would like to come in because I notice you are nodding.
Matt Rogerson: Don’t expect a special front page next week.
Peter Wright: If I had done that during my career as an editor, it would have ended the next day.
Dean Russell: Exactly.
Peter Wright: Because of the profile and the reach of established news publications, you are held to account in the court of public opinion anyway, and you give a lot of thought to that.
Matt Rogerson: It is a good question. It is even worse than just the front page of a newspaper: it is a newspaper personalised to the user, being delivered to their post-box without, potentially, their parents knowing or the rest of the public knowing. That is what is happening with those platforms in terms of the algorithms.
There is a serious point about how that stuff is being funded, and that tracks through to something that is a slight hobby horse of mine, which is the online advertising market and the fact that platforms, at the moment, are not accountable to advertisers for the activity on those platforms. There is no transparent link between the people who are funding that content and the platforms that are making money from it. A lot of the questions about interoperability and competition would be resolved if we had the establishment of the digital markets unit running in parallel to this Bill.
I have said before that I think they are two parts of trying to solve the same problem, which is bringing order, competition and accountability to the platforms. There are two elements of the public outrage narrative: the users who might, if there was greater transparency about the sort of activity that the platforms were facilitating, choose to go to a different platform; and, at the other end, the advertisers who, if they knew what their advertising was funding, might choose not to fund it. As publishers whose content is out in the open, we always run that risk. In our history, when we have got things wrong, advertisers have left. You need to make some reconnections between the way in which the media has worked in the past and the way it could work in the future.
Dean Russell: Thank you. If I may, Alison, I will come to you with a similarly unfair question. Could you ever see anyone in Reach plc putting that sort of challenge or campaign in their newspaper?
Alison Gow: No, we would never encourage people to eat a Tide pod. I know Facebook has terms of service about ingesting harmful substances, and what have you, and promoting it. I am not a social media expert, but I am assuming that they have their own ToS in place to stop that sort of thing happening. What you always find on social networks is that people work around them and come up with new ways of doing it. That is just one of the cautions I have whenever I think about this sort of thing—the evolution of how things move forward.
It is hard to quantify what is a harm because it sometimes looks really innocent. A completely innocent-looking hashtag, which you might see on TikTok or Twitter, is actually, to those who know, promoting incredibly harmful content. Who is finding out what these things are? For example, where are young 15 year-old girls who talk about self-harm using “rainbow” and “sparkle” hashtags going to talk about it? We talk about the big platforms, but there is a dark rabbit hole. They surface there, move down and quite often go into members-only groups and WhatsApps and things like that. I have wandered far from your question—apologies for that—but it is something that concerns me.
Dean Russell: Absolutely. To be honest, that is where I was hoping you would explore the question further. It was definitely not intended as purely binary, because I was hoping the answer would be no from everyone. It seems to me that one of the themes that has come through in my interpretation of the evidence we have had over the previous weeks is that newspaper publishers are treated very differently from online platforms, yet serve a similar purpose in some ways. There is the definition of journalism and all of those things.
One of the get out of jail free clauses in this—I would like to come to all of you again on this, if I may, including you, Mr Millar—seems to be that they are relegating the responsibility for the content to algorithms. They say, “We don’t have an editor, a person, who is watching all of this content, therefore we cannot be held responsible”. The implication is that it is fine because it is an algorithm doing it, not a human, so they cannot see all the content; but they are still publishing it. They are still using hashtags and they are still promoting it. Matt, are they using that as a way to say, “Well, actually, it’s not our fault, guv”, but it is still doing harm?
Matt Rogerson: There are a couple of things. I do not think algorithms will solve everything. From the Wall Street Journal reporting, we can see that the public are told one thing about the role of AI in taking content down from Facebook, but the reality, I fear, internally in their research mechanisms, is somewhat different. We cannot rely on algorithms to do everything and be as clever as we hope they will be.
We are content businesses. You can tell we are a content business by the fact that, in the company I work for, 60% of our staff work in editorial or production. If you look at the Wall Street Journal numbers, 4% of Facebook’s revenue is spent on moderating content. I do not know what 96% of their revenue is spent on, but they do not spend a great deal of money on moderation.
Dean Russell: But in a newspaper or a magazine, there would be an editorial decision by a person to say, “This is a relevant news story and therefore it is going to go in”, whereas a platform would say, “It is a free-for-all and anyone can do it, but actually we are going to promote this bit, effectively, to the front page by promoting a hashtag”. Should we be looking at the algorithm and the human aspect and whether they have people looking at content to help make decisions on what should be shown?
Matt Rogerson: There is definitely an evolving space in the curation of news on those platforms. I suppose there are two separate things. The platforms now have dedicated areas for news. A news tab is now curated by people at Facebook or outsourced by Facebook. What we do not know is what incentives are built into the algorithm itself in terms of what people see. We know that they have made changes in the past that have taken autonomy away from the user, with the algorithm instead serving the user content that it thinks will keep them on the platform for longer.
There was a big algorithm change in 2018, which is the subject of a Wall Street Journal article. It was a friends and family change, and a lot of publishers, including us, warned about the potential for that to drive viral content, rather than the users’ own choices being the primary source of content they receive. We definitely need to understand how those algorithms work. There is an editorial bit and a competition and commercial bit in the discipline of understanding how algorithms work in deciding where the money goes in the online advertising market. This is an issue where you need a massive amount of specialism to look at both the safety and protection piece and the commercial piece to make sure that the companies are acting fairly.
Dean Russell: Mr Wright, what is your take on that?
Peter Wright: One of the things that concerns me with this Bill is that the only way the platforms will be able to fulfil their obligations under the duty of care is by massive use of algorithmic decision-making. In our business, we deal with the consequences of algorithms every day of the week. They are massively overrated. They are very simple in their essence. To a large extent, they work off key words, but they are very prone to error. We had an appalling example about a month ago, when a video news story we had run in America about a black man being harassed in the street by a white man and then the black man being arrested was flagged by Facebook with a line saying, “Watch more primate videos on Facebook”. That is an appalling racist trope.
Dean Russell: Yes.
Peter Wright: It was actually uncovered by the New York Times. We did not know about it. We did not know it had happened. Facebook apologised, but blithely said, “We’re very sorry, this was done by artificial intelligence. Our artificial intelligence is not perfect. We are working on it”. That was awful. They are going to be making errors of that nature on a massive scale. As far as we are concerned, as news publishers and journalists, the algorithms will have massive difficulty discerning a news story about anorexia from content that is very harmful—a news story that is trying to condemn anorexia websites and the anorexia websites themselves. That is why we think there must be a full exemption. We suggest in our submission a method by which the platforms can algorithmically identify news copy in order to leave it alone.
Dean Russell: Thank you. Alison, is there a risk of relegating responsibility to algorithms and that therefore real people in those companies do not take on the responsibility of solving them?
Alison Gow: Yes. Algorithms are written by people, who come with all the unconscious and conscious bias that goes into creating code that is then quite often rewritten by other people. With code, what you start out with may have been iterated, revised, broken and repaired within six months. It is a constantly changing thing that is looking for key words, sweeping sites to find potentially inappropriate content and then, as has been said, down it comes.
We all know Mark Zuckerberg hates a nipple. If you are writing stories about breast cancer and breast cancer survivors, with important information around checking yourself, that content may struggle for visibility. We certainly know that important court cases that might involve violence, for example, do not get visibility on platforms that are looking to attract advertising because that is not what advertisers want. Even now, visibility is an issue, given the takedown consequences of having more algorithms sent out in that hunter-killer way to find content that may lead to the platform being in difficulties. It is obviously going to be in their interests to be over-vigilant because the consequences could be severe.
Dean Russell: Thank you. Gavin, from a legal perspective, what is your take?
Gavin Millar: I agree with everything that has been said. The worrying thing about this Bill is that it encourages and facilitates the service providers to hide behind algorithms in that way. They have those systems in place. A most pessimistic view of the Bill is that they could just carry on doing what they are doing at the moment, and nobody will be able to challenge it.
There is nothing in the Bill that says that the state, the Government who have launched this whole ship, will later on be overseeing how you do the algorithms and how you apply them in practice. The service providers say that they just screen out the problematic stuff and then get the humans in to look at that, but there is nothing in the Bill to say there is any oversight of that. It is just leaving it to them to a large measure to continue what they are doing at the moment. It is just giving state credibility—government credibility—to that, and I do not understand it.
Dean Russell: Thank you very much.
Q146 Baroness Kidron: You just launched into my question, Gavin. I get the thing about error, and I get the point that Matt made about a positive duty, but surely we need to see the regulator with the duty and the ability to have oversight of the algorithms, because it seems to me that the level of error is minuscule compared with the power to just change the algorithm and bury the news altogether, and you do not know. Do you know what I mean?
Matt Rogerson: Yes.
Baroness Kidron: Should there be a duty on Ofcom to investigate where there may be economic unfairness or persistent harm of some description, as you described earlier, and a duty for companies to keep information in such a way that the regulator can meaningfully fulfil that oversight?
Matt Rogerson: There is a nice analogy my colleague gave me. You can be on Facebook without being completely blocked, yet never be seen. The way he described it to me was that you can be let into a bar, but if you are never served are you really in a bar? You need a regulator that is able to understand that. I do not know whether that sits in this Bill or whether that is part of the wider work that Ofcom is looking at on whether the platforms are showing a diverse enough range of voices in the news. What is their influence on news consumption? I do not know. I do not know whether it should be this Bill or that Bill.
Certainly, Ofcom needs access to and understanding of the algorithms. I was quite surprised when I read the Bill and some of the evidence from Ofcom about what it sees its role as being in relation to this potential Act. It seems very collaborative: “We want to invite them in”. It sounds like a jazz fusion band practice. It does not sound like a relationship between a regulator and a regulated entity. I cannot quite understand what its role is and what it will be trying to achieve. In thinking about how that has worked in other areas of law, in the data protection world, we can look at the relationship between Facebook and the Irish data protection authority. Facebook’s view of the Irish data protection authority is, “We don’t need consent in order to serve users with targeted advertising because it is all bound up in the contract”. The regulator says, “Okay, you are doing it completely differently to everyone else, but because you have lawyered up and told us that is the way you are doing it, you carry on in that way”. There is a question about what the relationship between the regulator and the regulated will be like. Will it genuinely have power to make them change course in how they operate?
Baroness Kidron: Gavin, I recognise you made a bigger point about the Bill, but if it was constructed as is, does it need this second piece?
Gavin Millar: We have lots of black letter law in this country that does not work. It is not enforced, and it is not practical. You could write into a piece of legislation a tight oversight regime for tech solutions involving an independent regulator, whether it is Ofcom or somebody else, but you have to face the reality that the big digital platforms are way ahead of us on this. They have lots of very clever, skilful people who can produce reports and give accounts of outcomes that make them look fine, successful and so forth.
You say, “We are going to put into the law provision for the regulator to get underneath all of that and actually look into it and see how those things work and what the outcomes are”. I do not think, even if you put that in, it will be practical to do it. I do not think the regulator is going to manage to do that. To go to Ofcom now and say, “You have to take that on and do that”, is a big ask for it, given the competences and resources available to the big tech platforms. I am not saying that it is impossible to regulate, but it is really difficult to regulate in that way. I am not saying do not try it. I am just warning that I have my doubts about it.
Baroness Kidron: For the record, it would not be my whole position. Thank you for that.
Peter Wright: I would like to take a different view, if I may. At the moment, platform algorithms are what tech people call a black box. You have no idea what goes on. They do not explain them. They do not even announce changes in advance. They do not give you a warning. If you complain, you get absolutely no redress at all. That has happened to us over the last decade with Google. It has made a series of changes to its algorithm that has meant that the graph for our search visibility, which is the industry-standard measure of how often and how prominently you get surfaced in Google search results, has gone right down. At the same time, you have to bear in mind that Google is the means by which nearly all of our revenue is delivered. It is a commercial monopoly with which we have a business relationship in which it holds every single card.
We do not know why it is degrading our search visibility, but for some very broad and big topics such as Covid or Brexit, if you put in the word “Covid”, the visibility of Mail Online—the share of search that we get—is close to zero. We have complained, but we get no answer. It is probably doing it for commercial reasons, which we think we can guess at. The point is that it needs regulation and it needs to be made to explain. You can measure what it is doing. We have teams of people doing metrics; you have to, all the time. You can see that, for whatever reason it is doing it, it is choosing to promote some providers of news and choosing to suppress others. That is very damaging for plurality and democracy.
Baroness Kidron: If you were willing to give a case study of one of those metrics, it would be very helpful to the Committee.
Peter Wright: Some of it is in our submission.
Baroness Kidron: Okay. I read it—I did not see that.
The Chair: A final question to this panel from Jim Knight, who is joining us remotely.
Q147 Lord Knight of Weymouth: Thanks very much, Chair. I have a question for Matt and Peter. I cannot see anyone, so I am just guessing by tone of voice who you are.
The Bill gives exemptions around news content and comments on that content. We have heard from the NUJ some concern about the abuse of journalists online. Currently, on your sites, how do you police comments on your own content to ensure that you are taking out harmful content and that we can feel confident about that exemption?
Matt Rogerson: We have been pretty open about how we moderate comments on our site. We have built in the open and talked about the problems we have had in the past with particular attempts to infiltrate our comments sections. We had an attack by pro-Kremlin trolls about five or six years ago, which our readers’ editor wrote a column about. We have also looked at the nature of comments through our “The web we want” project, which examined how negative comments particularly target women and journalists from black and ethnic minority backgrounds.
We have undertaken a project to both employ AI machine learning to triage comments that come on to our site and slim down the number of comment sections that we open up, so that we can dedicate the resource we have in order to better moderate and create better conversations around the content we have on our site. That has kept the interest and that is why we have comment threads—in order to have those good conversations on our site.
Ultimately, the regulation of that, in terms of whether there are complaints about our comments section, is by our internal ombudsman, the readers’ editor. We have not had a massive number of complaints because we have very clear community guidelines with no ambiguity, which means that if comments are taken off people should have a clear understanding why. In the rest of the industry, because I am also here with an NMA hat on, comments sections are also moderated in a proactive way. Ultimately, they are regulated by IPSO. When there are complaints about a comments section, they would be ultimately dealt with by IPSO.
Peter Wright: We operate in a similar way to the Guardian. We do not allow comments on all stories. When a story is expected to be problematic, we do not run comments. When a story is regarded as low risk, we pre-moderate comments. Teams of moderators look at them and check them against the rules, which, like the Guardian’s, are very clear. When a story is extremely low risk, we do not pre-moderate, but we operate a flagging system, and any comment that is flagged is immediately taken down and only reinstated if the moderators are happy with it. Once a comment has been moderated, it becomes subject to IPSO regulation.
Lord Knight of Weymouth: Peter, is your risk assessment done using AI, or is it, as I assume, done by the journalist as it is posted up?
Peter Wright: Yes, it is done by senior members of the journalistic team. We very rarely run comments on ongoing court cases. We do not usually allow them on stories with subject matter that is predominantly to do with issues around race, sexuality or gender, or where we have a concern that the story has potential to generate comments that would be against our guidelines.
Lord Knight of Weymouth: I am concerned about the lack of protection for the integrity of elections in the Bill. During an election period, for example, would that change your risk appetite on stories?
Peter Wright: I have not studied comments during election campaigns. You would generally hope to allow debate during election campaigns. Obviously, we all have the tragic death of Sir David Amess in mind. There might be circumstances when you would want to be very careful about comments about particular candidates, particularly if you know that candidates are receiving abuse or have been threatened.
Lord Knight of Weymouth: Thank you. It is very helpful to see the difference between an edited publisher site and a user-generated content site.
Matt Rogerson: Gavin can correct me if I am wrong. Newspapers have been found liable for comments left on their site. We are liable under the law. We have been found liable under the law for comments that have been left but have not been dealt with.
The Chair: Thank you very much, witnesses. We have to bring this panel to an end.