
Communications and Digital Committee

Corrected oral evidence: Freedom of expression online

Tuesday 11 May 2021

4 pm

 


Members present: Lord Stevenson of Balmacara (Acting Chair); Baroness Bull; Baroness Buscombe; Viscount Colville of Culross; Baroness Featherstone; Lord Gilbert of Panteg (The Chair); Baroness Grender; Lord Griffiths of Burry Port; Lord Lipsey; Baroness Rebuck; Lord Vaizey of Didcot; The Lord Bishop of Worcester.

Evidence Session No. 29              Virtual Proceeding              Questions 234 - 242

 

Witnesses

I: Katie O’Donovan, UK Director of Government Affairs and Public Policy, Google; Becky Foreman, UK Corporate Affairs Director, Microsoft.

 

USE OF THE TRANSCRIPT

This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.

 



 

Examination of Witnesses

Katie O’Donovan and Becky Foreman.

Q234         The Chair: Welcome to the second part of our meeting today. We are taking evidence for the Communications and Digital Committee in relation to our work on the internet and freedom of expression. We have evidence to be taken from Katie O’Donovan, UK director of government affairs and public policy for Google, and Becky Foreman, UK corporate affairs director for Microsoft.

Thank you both very much indeed for joining us. As you have heard, the transcript of this event will be taken, and there will also be a live broadcast going on as we speak. We will go through a series of questions that we have. If, when answering the first question, you have a few things that you want to say as context, that would be helpful. Other than that, we would expect to spend the hour we have with you going through a range of questions. We hope that we will come across some common themes and issues that we can debate and discuss, which will help us to influence what we put in our final report.

Q235         Baroness Grender: Welcome and thank you for joining us for this session. My first question is about your business model, the shape of its design and the search algorithms you use. One of the witnesses who came to us, Safiya Noble, an associate professor at UCLA, found that Google ranks racist and sexist content highly in search results. What are you doing to ameliorate that kind of issue? Are you open to independent external audits of how the algorithms work?

Let us face it: Google has significant levels of dominance, with Bing being the next possible rival on 5%. But witnesses have said the profit motive trumps all else and, therefore, the searches are dictated by that. What are you doing to ensure that issues such as racism and sexism are not prominent as a result of searches? Please also use this opportunity to set out any context you wish.

Katie O’Donovan: Thank you to the committee for inviting us here today, and for the wider work of the committee, which has been really interesting for us to follow and to help inform our thinking and considerations internally.

I will take an opportunity to take a step back, explain a little bit about Google search and our business model, and then go on to the substance of your question. That will help to frame my comments here today but also answer that question. I am sure that nearly all of us have used Google search, if not in the last hour or the last day. It is a search engine that began 20 years ago, when other search engines already existed. It changed the way content was ranked, and aimed to make the world’s information universally accessible and useful to users. It is different from a social media site, which is often discussed or thought about from an online harms or online safety perspective. In response to a search term, it takes information that is not hosted by Google and returns that information in a useful format.

Our business model is to run adverts alongside searches. Many searches are not monetised at all. For example, if you are thinking about a news search for the Queen’s Speech, there will be lots of content that we could return in response to that, but very little of it will be monetised because it is not a competitive monetary environment. If you are looking at buying a new car, for example, that search might be monetised. If you choose to purchase that car, it is a good return on investment for a car dealership to advertise with us. It is worth remembering as well that we charge only where people click on the link. An advertiser could serve an advert to me when I am looking for a car, but if I did not click on the link, it would not pay. It has been proven to be quite an effective way to advertise.

Your question is whether our business model trumps what we are returning in search, and whether that could in some way perpetuate content that is undesirable or unwanted. There is a very clear distinction in how our search results work. The ranking in the body of your search results is determined purely by our algorithm; it is not determined by advertisements. If you were searching for a new car, you might see adverts at the top that were clearly paid for and labelled, but the content below would be returned by our algorithm. The algorithm makes those decisions on a number of factors. If the search term is not monetised, for example “the Queen’s Speech”, you will not see adverts at the top, but you will see that content.

You will want to understand how that content is sorted by the algorithm. The algorithm will use lots of different signals to determine that. For instance, is your search term topical? Do I want information on the history of the Queen’s Speech and what that means, or do I want information on what has happened in Parliament today? It might be that my location setting is on and I want information about the UK Queen’s Speech and not things that have happened in other countries. Others who have used that search might want authoritative content. They might want it direct from Parliament itself because that is the source of the information, or they might want high-quality news. All those criteria are set and filtered through our algorithm. I might want a quick-to-read or quick-to-load page, or I might be on my mobile phone, so that signal is important.

Those signals are fundamental. The algorithm is fundamental to how we serve search results, and those search results cannot be bought. Yes, you can bid for advertising space on commercial search terms, but the body of our search results cannot be bought. That is really important and fundamental to our business model. That also means that it is not in our interest to show necessarily the most antagonistic or exciting content. We want to show the most relevant content.
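
[Illustration: the multi-signal, non-purchasable ranking described above can be sketched in a few lines of Python. The signal names, weights and scoring formula here are hypothetical assumptions for illustration only; they are not Google’s actual algorithm.]

    # Hypothetical sketch of multi-signal search ranking.
    # Signal names and weights are illustrative, not Google's real system.
    from dataclasses import dataclass

    @dataclass
    class Page:
        url: str
        topicality: float   # match between page and query, 0..1
        authority: float    # credibility of the source, 0..1
        locality: float     # relevance to the user's location, 0..1
        speed: float        # faster-loading pages score higher, 0..1

    WEIGHTS = {"topicality": 0.4, "authority": 0.3, "locality": 0.2, "speed": 0.1}

    def score(page: Page) -> float:
        """Blend the signals into one ranking score."""
        return (WEIGHTS["topicality"] * page.topicality
                + WEIGHTS["authority"] * page.authority
                + WEIGHTS["locality"] * page.locality
                + WEIGHTS["speed"] * page.speed)

    def rank(pages: list[Page]) -> list[Page]:
        """Order results by score alone; paid adverts are shown and
        labelled separately and never influence this ordering."""
        return sorted(pages, key=score, reverse=True)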

You asked questions about algorithmic transparency, so I will come on to that now, if that is helpful. Many different people want to understand how search works. As a user, I want to know: “Am I getting this because it is relevant to me? Am I getting it because I am in the southeast of England or because I am a woman?” Academics want to know how our systems work so they can study them, understand them and test some of the allegations that we see around our algorithms. Regulators and government want to understand how they work too. We think very carefully about how we can reassure different groups on the substance.

First, we think about users. We educate our users quite a lot on how search works. We have an interactive website that explains the criteria that we might look to use. It helps you to understand, if you have a Google account, what settings might be used. We see lots of people conducting research and experiments on Google search and testing different hypotheses. We see different allegations, whether they are commercial, about undesirable content or about political bias. Because we are not a walled garden, those academics have access to our systems to test and see what the results are.

We also make available, as we have talked about before, a document called our Search Quality Raters Guidelines. We chose to make this document available; we do not have to. It is a 200-page document which people we employ use to test whether our algorithm is acting as it should. We set the rules for our algorithm to determine: “Is it good-quality content? Is it relevant content? Is it recent content?” Our search raters then use the guidelines that we publish to evaluate the algorithm, and when we make updates to it, they check that those updates are working as intended.

Finally, we have conversations with regulators. That is a really important consideration for this committee. We might come on and talk about it later, but we are working very closely at the moment with the CMA on our Privacy Sandbox, which is a change to some of the advertising settings. That is a really good example of where we can get under the bonnet with a regulator and respond in real time to its questions or feedback on how a product might work or evolve.

Becky Foreman: Thank you very much for inviting me here today to give evidence. The focus of your inquiry is definitely one that Microsoft cares passionately about. We really want to help people use technology, connect with others around the world, and find and share information, knowledge and ideas.

I would like to quickly outline Microsoft’s position in the tech ecosystem. We have quite a unique position within the global digital landscape. We offer a very diverse range of tools, which are primarily aimed at allowing organisations and individuals to achieve more. We have cloud services. We have the productivity suites that you will all be familiar with, including Excel, Word, et cetera. We have developer platforms. We have personalised gaming through Xbox. We build products and services that help people work, live and play right across the board.

We also have a unique view of the world of the tech ecosystem, because our business model is to sell through partners. We have an inverse supply chain. In the UK, for example, we have tens of thousands of those partners. Some of them are very small businesses; some of them are much bigger. We do not rely on monetisation of content via advertising as our primary source of revenue, as I am sure you are aware. At the same time, we believe that search and user-generated content, when handled responsibly as part of a much larger tech ecosystem, can have a net positive impact on society. Social features and functions have really clear benefits. But there are also risks and potential for abuse, which the Minister talked about in the last session, and so we have to be very conscious of that.

Our business model[1] is to sell advertising against the search results. We can do that only if we provide users with the highest-quality, most relevant content available in response to their search queries. We design our algorithms and those features with that goal in mind. We do not remove URLs from search except in limited cases.

In order to sort through the trillions and trillions of pages of web content to provide those high-quality search results, we rely on algorithms. We want to be able to identify the most relevant content for a given query near-instantaneously. As Katie has outlined, algorithms use signals. We look at factors such as relevance, quality, credibility, user engagement, the freshness of the page, the location of the user and the page load time. We constantly work to revise our algorithms to give a much better search experience.

One key factor in ranking is authority. We use authority to indicate the quality of the discourse on a website, how often it is cited by other sites, the transparency of the author and whether the site clearly distinguishes fact from opinion. We generally assume that our users are looking for high-quality content, and so we work to return high-quality content in our results and to deprioritise low-authority content.
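
[Illustration: a minimal sketch, in the same spirit, of how an “authority” signal might be composed from the factors listed above. The inputs, weights and threshold are assumptions for illustration; Bing’s real signal is machine-learned rather than a hand-written formula.]

    # Hypothetical composition of an "authority" signal from citation
    # counts, author transparency and fact/opinion labelling.
    import math

    def authority(citations: int, author_transparent: bool,
                  separates_fact_from_opinion: bool) -> float:
        """Return a 0..1 authority estimate for a site."""
        # Citations saturate logarithmically, so a handful of extra
        # inbound links cannot buy a large jump in authority.
        citation_score = min(math.log1p(citations) / math.log1p(10_000), 1.0)
        score = 0.6 * citation_score
        score += 0.2 if author_transparent else 0.0
        score += 0.2 if separates_fact_from_opinion else 0.0
        return score

    def demote_low_authority(results: list[tuple[str, float]],
                             threshold: float = 0.3) -> list[tuple[str, float]]:
        """Stable-sort (url, authority) pairs so that low-authority
        content sinks rather than being removed outright."""
        return sorted(results, key=lambda r: r[1] < threshold)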

Turning to the audit of our algorithms, modern artificial intelligence algorithms involve machine learning. They learn from huge amounts of data to discern patterns in order to make predictions and recommendations when they encounter new data. As such, we do not really have a predetermined or pre-programmed rule that can easily be audited. We try to better explain how our algorithms work by setting out how search works, as I have already outlined, the types of data that are used in machine learning, and the outcome, i.e. the predictions and recommendations of the algorithm. We also have lots of mechanisms for users to provide feedback via the feedback tool or to report problems via the report a concern tool on our services. We have a variety of ways that users can understand how the Bing algorithm works in addition to the explanations that we give on our webmaster webpages.

Baroness Grender: Thank you very much for that very detailed response. Can I just double-check something? When Ofcom is up and running with regard to the forthcoming online harms legislation, part of its function will be to have black box, confidential access to algorithms. Will you be happy to co-operate with that?

Becky Foreman: At the current time, we are still waiting to see exactly what powers Ofcom will be given for regulating the online space. I would want to understand its rules better and how it was going to regulate. As an organisation, we always strive to comply with local law and regulators in the regions that we operate in.

Katie O’Donovan: I would very much agree with that. It is very important for us to engage with Ofcom. We have had confirmation that it will be the regulator for the Online Safety Bill. We want to help it understand our products, just as we do with other regulators in the UK.

Q236         Baroness Grender: I wonder if I can move on to user search, and in particular the sharp end of the search engines. I have read the investigation by Nicholas Kristof for the New York Times. It was pretty horrifying to read. If I were to put into Google right now “rape unconscious girl”, the probability is that I would be taken through to Xvideos, Pornhub, xHamster or one of these sites, where I could see some illegal content. One in eight videos on these sites is non-consensual or violent, according to studies cited in that investigation.

The number of visits made to these kinds of sites via Google is astonishing. Worldwide, something like 2 billion visits per day are made to just one of the sites. That is more than Amazon, Netflix and Yahoo receive. It is astonishing that roughly half the traffic reaching these sites appears to come from Google searches. Indeed, these sites look with great interest at Google searches and how they are reached by their audience.

I am wondering in particular, Katie, how you feel about that, given your background, where you come from and that you have been a board member of the Internet Watch Foundation. You have looked under the paving stones and seen some of these issues before. Given Google’s prominence, how concerned are you about this, and what on earth can be done to stop it?

Katie O’Donovan: It is a really important and challenging question. It is good to be able to address it. In my time at Google, I have worked alongside Becky on the board of the Internet Watch Foundation. I know you will know its work well. It is a UK-based organisation that works across the tech sector to challenge illegal content.

The issue of pornography is challenging from a personal, a legal and a corporate perspective. Google is very clear. As I mentioned earlier, we are not hosting any of this content ourselves. Our role as the search engine is to index what is on the internet. We very clearly will not return results that are illegal, and we have a very significant and dedicated workstream around what we call CSAM, which is child sexual abuse material. That is where we will work with others such as the Internet Watch Foundation, and indeed across industry, to make sure that we are doing all we can, developing the right technology, instituting it, enforcing it on our own platforms and sharing it across industry.

We will not return search results that lead to illegal content. We have developed algorithms to recognise searches that could lead to illegal content. That content will not be allowed. We will not allow CSAM content. We work with NGOs in the UK to put a warning on top of those search results and make sure that that content is not delivered. Legal content has a different categorisation. By making sure that we have the algorithms in place to discern searches that are looking for child sexual abuse material, we can prevent those results from showing and, instead, show warnings that highlight the dangers of that particular type of content and link through to external support. We can then develop associated technology to use across our systems. That is the best way forward.

Baroness Grender: One in eight of the videos on the sites I have described to you is in some way non-consensual or violent. In fact, cases involving minors are found again and again. Your searches allow people to reach those sites. Whether or not you say you are ruling out anything illegal, the sites that people get through to have repeatedly been found to be hosting illegal content. People are reaching those sites via a search on your search engine, and they are doing that on a huge scale.

Katie O’Donovan: One of the responsibilities there is for the sites that are hosting that content. Some of those sites want to use some of the technology developed by us and Microsoft to make sure that their content is industry standard. That is really important.

Baroness Grender: Why not say, “XNXX, we will not allow any searches to go through to your site any more until you remove all illegal content”?

Katie O’Donovan: Certainly, where a site is hosting illegal content, we will not lead through to that content.

Baroness Grender: But you will go through to the site that has been found guilty again and again of hosting this kind of content.

Katie O’Donovan: If a site is not legal in the UK, we have a legal removals process. We will not link to sites that are illegal in the UK. We will not link through to illegal content.

Baroness Grender: I hope that you take a look at this investigation by Nicholas Kristof in the New York Times. Are you familiar with the investigation?

Katie O’Donovan: I am familiar with some of the detail of the case, yes.

Baroness Grender: One example that he uses is a girl who was 14, who was persuaded to have sex. This was shown again and again. She asked for it to be taken down; it was still up. She asked for it to be taken down. She pretended to be a lawyer. The video of her was still not taken down. Are you genuinely comfortable that Google searches go through to a site that continues to do that sort of thing?

Katie O’Donovan: It is really important that the site takes responsibility for that content and action. By any of our standards, that is not acceptable behaviour. We have processes in place at Google that, for example, enable revenge porn images not to be delivered through our search engines. They are hosted by third-party sites. If that site is illegal by UK law, we will not link through to it. If the content is illegal, we will not link through to that.

This is an incredibly serious topic. There is a range of actors in a very contentious space. We have made a very clear commitment not to link through to content that is not legal. We do not run adverts for anything in this industry. We work with the Internet Watch Foundation, the Government and others. All those things can be true, while work still needs to be done consistently across the piece to make sure that we are all doing what we can in this space.

Baroness Grender: When he says that roughly half the traffic reaching Xvideos and XNXX appears to come from Google searches, is he wrong?

Katie O’Donovan: I do not know about that specific case, and I do not want to speculate. It is the case that you can find legal content through Google search. If I was a regulator or a legislator, I would make the distinction: “Does the country think that that content should be legal or illegal? Are the processes in place, if government and others have made that judgment, to ensure that content that the UK does not feel should be legal is not being reached here?” We certainly have a part to play by not linking through to known illegal sites.

Baroness Grender: If you do not have the information, can you write to us about that specific question?

Katie O’Donovan: Yes, certainly. That is no problem at all.

Becky Foreman: I would echo what Katie said. I have read the news article and I am aware of the case. The videos that the New York Times referred to are abhorrent and we do not want them linked to from our search engine. As soon as we were made aware of those URLs being available on our search index, we removed them. Where those URLs linked to child sexual abuse imagery, we reported that to the National Center for Missing & Exploited Children, as we are required to do. We also remove all URLs to non-consensual intimate images from our search index. That could be rape porn or revenge porn.

The Chair: Thank you very much. That has been a useful exchange, although a bit uncomfortable.

Q237         Lord Griffiths of Burry Port: I feel as if the territory I had carefully drawn out for myself has been well trampled on by the previous exchanges. It is clear that the search engines will be included in the scope of the regulatory framework with the new Bill that we are all anticipating. As we look at the way that will happen, laypeople such as me are constantly bemused by the way platforms remind us that they are not publishers and search engines keep telling us that they do not host user-generated content. It is all smoke and mirrors for the ordinary person.

The two things that we have been confronted by again and again are the questions of how you deal with what is legal but harmful—we have heard the Law Society come up with a proposal that might address some of that, but here you are, representing search engines—and how to protect children from harm, and how that would apply to search engines. It is well-trodden ground, but it would be useful for you to share with us, in respect of those two questions, how you would anticipate being regulated under the Online Safety Bill.

It is all very frustrating, because there is very little detail around and the Bill has not been published yet. We are a little bit speculative and hypothetical in our questions, but they are well honed. It would be useful to hear your thinking as you anticipate the next phase of the work that you do with these search engines. I hope I have been coherent, by the way. Baroness Grender and I will have to sort each other out later, because she was spot on with so much that I would have wanted to mention, too.

Becky Foreman: As we have already talked about a little bit, search engines are very different from the other services that are impacted by the forthcoming Bill. They do not host content, share content or facilitate online communication between users. They index URLs on the web, and then serve up the URLs and results that they think people want to see in order to answer their queries. Of course, where URLs are illegal under UK law and we are made aware of them, we remove them from our search index. That includes URLs that lead to content that infringes copyright or to child sexual abuse material. We also remove URLs to non-consensual intimate imagery and URLs that we have received a court order or warrant asking us to take down.

There are also a number of valid reasons why users might want to find content via search that might be considered harmful in social or communications contexts. They might be academics or journalists. Content removal obligations that we might consider to be very appropriate and proportionate when applied to certain other online services really heighten censorship concerns when they are applied to search.

Search engines simply help people find information hosted elsewhere, as Katie has already said, so regulators in the first instance should target content removal on the entity that is hosting the content. We are hopeful that the codes of practice that are put in place for search engines by the regulator will recognise that and understand that proactive content monitoring on the web is not technically feasible or scalable. We can do other things, such as public service announcement boxes warning about the content of URLs and trying to direct people towards other sources of information. We can do that kind of thing, but search engines are fundamentally different from user-generated content on the internet.

We have a very different type of social network attached to our Xbox gaming platform called Xbox Live. We are much more restrictive about the type of content that individuals can share there, because it is a closed social network. It has a focus on gaming, and so we are a lot more restrictive about what people can say, how they behave and the type of content that they can share. In that type of environment where there are children and young people using the service, we will pre-moderate images and videos that are shared before they are shared on that sort of network. As I say, search engines are quite different.

Lord Griffiths of Burry Port: We understand that they are quite different, but they allow activities to occur and interaction to take place that definitely is harmful. That is what Baroness Grender, it seems to me, was trying to put across. We understand the identity and integrity of a search engine. But it cannot just content itself with knowing what it is supposed to be doing when it is clear to quite a lot of people that it is enabling other people to do things that lie beyond its remit, for which it carries some degree of responsibility because it allows those activities to happen.

Can you understand the bewilderment of people who, understanding what you said about what a search engine is, are still a bit mystified? As far as we are concerned, you are all in it together in terms of the product that comes out of all these organisations, structures and so on. We need to be a bit more reassured, rather than feeling that you are hiding behind your self-definition.

Becky Foreman: I absolutely understand the concern that regulators and parliamentarians have about this. We make every effort to ensure that people are served with the search results that they want, and are not surprised by receiving URLs that they do not want to see. We offer a safe search facility on Bing. You can choose to have safe search set to high, in which case you will not be served any URLs to any type of adult material at all. That is one option if you want to make sure that you limit your exposure to those sorts of URLs.

There is a real challenge for all of us and for regulators as to where you put that line between freedom of expression and not wanting to unnecessarily censor content, and protecting people when they want to be protected. It is a real debate that legislators and parliamentarians should have about where that line is placed.

Lord Griffiths of Burry Port: Thank you very much, Becky. Sorry, I did not mean to harass you, by the way, with my question. Katie, do you have anything to add?

Katie O’Donovan: Becky made lots of important points. I share your frustration with the terminology. The term “publishers” is not helpful, because we are a different type of entity. Our platform YouTube, which hosts video content, will clearly be in scope of the regulation. It is not a publisher, but we have not waited for that regulation to act and set our own community guidelines on what content is allowed there. The terminology need not distract from progress in this area.

To echo the points Becky made, we also have safe search functionality here. Thinking back to the premise of this inquiry and the tensions we are trying to unpick, a search engine works in response to the clear intent of users who put in the search terminology. We have a safe search function, which is in place for younger users. We also have tools such as Family Link that help parents manage what their children do and have access to. We also have rules on illegal content. If you are an adult with a clear intent to find content that is not illegal, some really careful calibration needs to be thought through when it comes to the legislation.

Lord Griffiths of Burry Port: It is a question of balance again, is it not? Thank you both very much.

Q238         Baroness Rebuck: My question is on a different subject. It is about news bargaining codes. It is really directed at Google, because I believe Lord Vaizey will ask a similar question of Microsoft. Google has argued that a mandatory bargaining code between newspaper publishers and platforms undermines the foundations of a democratic internet. I have a little question mark there. But from the newspapers’ perspective, having lost the majority of their advertising revenue to you and Facebook, they are now forced to seek payment for their very carefully researched and curated content elsewhere.

This committee, in its report on the future of UK journalism, recommended that the UK Government follow the Australian example and introduce a mandatory bargaining code. Quite honestly, all our witnesses from the press claimed that they found it difficult to reach a financial settlement that they believed reflected the value of their journalism. Katie, given that you have a 93% share of the UK search engine market, is this not really an imbalance of power?

Katie O’Donovan: There is lots in there, so I hope you do not mind if I give a multifaceted answer to that. The first answer is in general support of the news ecosystem and just to explain a little bit about our role within that. You reflected on some of the challenges and changes that traditional media have been through over the last 10 and 20 years. A lot of that is precipitated by the shift online and users doing things differently online. A lot of that is fantastically positive. The access that I, as someone sat in southeast London, can have to an Indian newspaper or an east African newspaper is completely different from when I was a child growing up. I was reliant on whether my parents had happened to buy a newspaper that day, that week or even that month.

The opportunities that have come with that have come with challenges. We have seen particular functions of newspapers move to different specialised sites. For example, local newspapers often thrive because of classifieds, which are now on Gumtree, or, if you are buying a new car, you might go to Auto Trader. We have seen a shift in advertising in national newspapers too. One thing to note is that national newspapers now have access to bigger, perhaps global, audiences. They have responded really well to that and are taking time to get business models right.

We very much see that we are part of that ecosystem and want that ecosystem to thrive. We do that in several ways. The first is driving traffic to news organisations. It is worth stating at the outset that nobody needs to be on Google search if they do not want to be. If you host your own website, you can put in a little bit of code that means you will not be on Google search. If a news publisher would rather they were not, they can do that.
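[Illustration: the “little bit of code” referred to here is the long-standing robots exclusion convention. A minimal example, assuming a publisher who wants to keep Google’s crawler away from an entire site, is a robots.txt file at the site root:]

    # /robots.txt — asks Google's crawler not to fetch any page on this site
    User-agent: Googlebot
    Disallow: /

[An individual page can also be kept out of search results with a noindex directive in its HTML head: <meta name="robots" content="noindex">.]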

Secondly, we drive a huge amount of traffic, if you think of those people searching for the Queen’s Speech or other terms, to news publishers in the UK. That traffic is worth roughly £600 million each year. We also work in the news advertising market. We help them monetise content, particularly big front-page news stories, but also that longer tail of content that would not necessarily have carried adverts in print. Newspapers keep from 70% up to 95%, in some circumstances, of that advertising revenue. We are a really important part of that news ecosystem and want to be a positive player in it. We work to help news organisations develop the skills that they need, at a local level, to understand how they market and find users digitally, but also how they get a website that loads quickly and can use really innovative journalistic techniques.

The final thing that is worth considering is a recent news announcement that we made, which was probably after your most recent inquiry, of our £1 billion investment into Google News Showcase. It is a way for newspapers to choose exactly what is shown to our users and audience. They can do that in a way that helps them recruit new subscribers or supporters. It perhaps shares a bit of premium content, but it encourages people to click through for more. That £1 billion is a global investment figure, but we have been able to partner with about 125 publishers in the UK, ranging from the FT and the Telegraph at a national level to many local publications.

The internet has enabled, often through Google search, enormous benefits for users who can uncover a wealth of new information. Sometimes, that is a timely news story that everyone has about the Queen’s Speech. Sometimes, you as a news publisher have spent two years investing in your staff to go out and do really difficult journalism. In that latter case, we updated our algorithms to make sure that that is reflected. We are hugely supportive of that wider ecosystem and news publishers directly with the £1 billion investment in Google News Showcase.

We also understand that the Government see themselves as having a role in this space. We will engage carefully in the work that the Government, since the Cairncross Review, have asked the CMA and the DMU to do on this issue. This is a good opportunity for us to work with government, the regulator and publishers to make progress in this area.

Q239         Baroness Rebuck: If it is as harmonious and full of great initiatives in the way that you have explained them, why did Google tell an Australian Senate committee that you might have to stop making Google Search available if the code became law in Australia? That no doubt would have been a nice opportunity for Microsoft, being the much smaller player there. Thankfully, you did not carry out that threat. I am interested in why you made the threat in the first place and what changed that made you change your mind. Was it because you were successful in lobbying for certain amendments? There were quite a few amendments to the Bill.

Katie O’Donovan: You might disagree here, and it is probably not worth us dwelling on it, but I would not characterise it as a threat. It was really important for us to explain what we thought would and would not work in Australia, and to seek good resolution for Australian publishers and all our users in Australia.

The original proposal from the Australian Government included a process of paying for a link to an Australian publisher, in a way that we felt, along with Tim Berners-Lee and others, changed the fundamental operation of the open and free internet. If you start with that premise, that can be applied to every link that we include in Google search. We felt that was an unworkable solution. It is not about whether we see partnership between us and the publishers as unworkable or undesirable; we do not at all. We have long invested in it and will continue to do so through the new CMA process. But that particular proposal was very challenging.

Baroness Rebuck: I will hand over now to Lord Vaizey, because he has a similar question for you, Becky.

Q240         Lord Vaizey of Didcot: Thanks, Baroness Rebuck. Your powers of telepathy are extraordinary, because I do have a similar question and it is directed at Becky at Microsoft. I would be very interested to hear why Microsoft thought the Australian law was a good idea. Some people might have accused Microsoft of complete opportunism in seeing the opportunity, having such a tiny market share in search, to see off its rival through the power of legislation. Maybe Microsoft genuinely believed it was a very good idea.

Becky Foreman: Microsoft really values democracy and the institution of the media as being very important to the health of democracy. As a technology company, we are able to thrive and run because we exist in many democracies around the world. When we see technology undermining the health of the free press, as it has, we want to pursue solutions to restore healthy journalism. We believe that democracy depends on that. We see in the Economist Intelligence Unit’s global democracy index, which is released every year, that the global average is at an all-time low.

We wanted to look for opportunities to support the press. When Google threatened to pull its services out of Australia, we said that we were comfortable paying news publishers for their work and, as a result, running our search service at a lower economic margin, but with more economic returns for the press. We were prepared to do that. We are, however, really pleased that Google and the publishers came to an agreement.

Governments in other democracies around the world should definitely look at what Australia did in passing its new regulatory code, and whether regulations or similar codes could be established. We recognise that a one-size-fits-all approach does not work and that countries need to develop their own solutions, but the Australian approach has some elements that could work well in other places. We like the fact that it relies on market-based solutions and applies only when a dominant digital platform fails to reach a compensation agreement with a news organisation for use of its content. There is a lot to learn from the Australian approach.

Lord Vaizey of Didcot: Could Microsoft now lead the way in other democracies by introducing its own bargaining code without waiting for legislators who are not as quick off the mark as our Australian colleagues?

Becky Foreman: We are having conversations with news organisations about how we can support them, but they should lead and decide on this. We want to see how they approach this, but we are very happy to continue having conversations.

Lord Vaizey of Didcot: Are you lobbying the British Government to introduce something similar to the Australians? I am not saying “lobbying” in a pejorative way. I am saying it in a save democracy/good deed kind of way.

Becky Foreman: I have not had any conversations with the UK Government about this, but we are having conversations with lots of other people about it. It is an interesting debate.

Q241         Baroness Featherstone: What I want to look at is really only for Katie, because it concerns YouTube—apologies, Becky. In April, YouTube removed a video of a round-table discussion with Ron DeSantis, who is the Governor of Florida, and a number of public health experts. You stated when you removed it that you had clear policies on Covid and medical misinformation to support the health and safety of your users. You say that you removed it because it included content that contradicted the consensus of local and global health authorities regarding the efficacy of masks to prevent the spread of Covid.

The issue is that we understand, and correct me if I am wrong, that YouTube objected to the opinion of some of the panellists that children should not, or should not be required to, wear facemasks. That flew in the face of the actuality, because the CDC said that only children under two should not wear masks. Over here in Britain, it was children under three, with no requirement for those up to 11.

We want to understand why YouTube feels that it was justified in taking down the speech of experts. Are you saying that your market power of 95% makes you more expert, so you could make that judgment yourself? It gives you particular responsibilities to protect freedom of expression. Why did the company feel that removing the information was more proportionate than fact-checking in that case? You put yourselves first, or that is how it appears.

Katie O’Donovan: This touches on the policies that we put in place ourselves where legislation perhaps does not exist, where we decide how we want YouTube to be governed, the community guidelines that do that and how we implement them. We heard Minister Dinenage talk about the Government’s work and efforts on Covid misinformation and disinformation. Without wanting to state the obvious, at the beginning of last year, as Covid cases increased around the world, we did not have a Covid-specific policy on YouTube. We then saw lots of different cases of misinformation spreading, some of them on YouTube.

In a public health pandemic, we have to consider whether the information we provide or enable to be shared would endanger people’s own health or the health of the community. For those reasons, we put in place very clear community guidelines around Covid misinformation. We based them on the World Health Organization guidelines and the guidelines of the NHS. We were saying not that we as YouTube had all the health knowledge and information needed, but that these were universally considered the authoritative sources of that content. We would very clearly look at all the content hosted on YouTube to see whether it met those standards.

Within that, you get people making different assertions. For example, for a while we had quite a lot of content about 5G being a cause of Covid. It was very important for us that we had the policies in place to remove that. In this particular instance, we saw a broad and general discussion, as you reflect, but also some commentary that looked at whether children could be at risk of Covid and whether they could pass that on.

It is important for us in those circumstances, regardless of who the speaker may be, that we can assess the content independently and internally through our trust and safety teams. This is not done by somebody like me or somebody who understands the politics of that person, but by somebody who understands our Covid-19 guidelines. They assess it against those and make a judgment. It might be that there is a longer-term discussion, corrections are made and there is very clearly no risk to people. If a particular piece of content goes against the Covid-19 guidelines informed by the NHS and WHO, we will remove it because of the very real risk to public health.

Baroness Featherstone: Did you reinstate it when the WHO changed its view?

Katie O’Donovan: I have the transcript of the conversation and I am happy to write to you with the detail there. As well as the conversation about masks and children, there was some commentary about whether children could catch and pass on Covid. We also had to look at that. It was not about just whether children needed or were required to wear masks.

Baroness Featherstone: It is a bit like saying, “We’ll choose these experts over those experts”. The CDC and Public Health England were on the other side of the argument. I am just wondering.

The Chair: It would be helpful if you could write to us on that, because we are running a bit short of time.

Katie O’Donovan: Yes, certainly. We would look for consistent medical advice on that. It was about the broader issue, not just the wearing of masks.

Q242         Baroness Bull: I will bring in Becky first on this one to save Katie’s voice. I want to ask you both about digital citizenship. We have heard from all our witnesses about the importance of digital etiquette or citizenship—call it what you will. We have heard about projects with young people and silver surfers. We have heard about projects from all manner of organisations, including your own and the Church of England. Some witnesses felt that it was an issue for education. Others thought that it was more about platform design discouraging or encouraging civil behaviours. Thinking specifically about adults, whom we cannot capture in the classroom, what is the best way to promote digital citizenship online?

Becky Foreman: We agree that promoting digital citizenship is a very important task. Both tech companies and Governments have a role to play. Tech companies have a role to play in setting appropriate terms and conditions or community standards on their platforms, and then educating users about them. I agree that reaching adults is a really important aspect of this, because they are not covered by the education system in quite the same way.

This was underlined by our digital civility research, which we do every year globally. Our 2020 research, which was released earlier this year, yielded a really important data point in this regard. Increasingly, teens across the globe are turning to their parents and other adults for help with online issues. The data showed that 49% of teenagers who had experienced an online risk in our latest research said that they turned to a parent for help, while 31% sought guidance from another trusted adult such as a teacher. Back in 2017, those percentages stood at just 10% and 9% respectively. It is very clear that helping adults to improve their digital literacy helps teenagers, young people and children too. We reach both groups.

People need to use and practise their digital and media literacy skill sets in order to keep them sharp. It is like any other skill; you need to do it regularly in order to maintain it at a higher level. In Microsoft, we partner with a number of entities to help people try to build those sorts of skill sets. One example of a partnership that we have is with NewsGuard. NewsGuard is an organisation that rates news sites based on journalistic criteria and gives them a nutrition rating, green, amber or red, based on the score they receive. We do that through a browser extension to Edge. Users can see the rating that is displayed and the details of the rating. That can help them evaluate the content that they are consuming.

In the US, we have also partnered with the Journalism Trust Initiative, which has supported a public service campaign focused on educating older adults on the importance of reviewing and checking the sources of information that they share with others. The practices that were really effective in this campaign included using educational and encouraging messages rather than emotionally charged messages, focusing the user’s attention on a tool that they could use to tackle the issue rather than on the problem of untrustworthy news, and using metrics and analytics efficiently to find out whether the method that had been used to reach people had had the desired impact on the audience. Tech companies can do a number of things, but this needs to be a partnership between the tech sector and the Government in order to reach adults and help them develop the digital literacy skills they need.

Baroness Bull: I wish we had more time, because I would be very interested to explore the link between digital literacy and digital citizenship. As these are not universally understood concepts, we conflate the two. They may indeed overlap.

Katie O’Donovan: This is so important. We often frame this solely through children. There is technology that we can and should put in place. We have tools such as Family Link that help parents manage what their children do and do not see online. We obviously need the right community guidelines and policies to enforce those, but there will always be people who look from a different perspective or try to circumvent them. That is where this digital citizenship is so important.

We invest a lot in secondary and primary schools. That is important in the immediate term, but it pays off in the future too. We have reached about 70% of primary schools through our Be Internet Legends programme, which is now five years in. It is surprising how few programmes have that longevity and scale. We have partnered with Parent Zone and made sure that the programme was approved by the PSHE Association, so we knew it worked and was relevant.

As adults, we are all becoming cannier online, which is a good thing. We also become slightly more mature online by experiencing different things and thinking about things from different perspectives. We make sure that our users have the tools, in terms of their account settings and, on a site such as YouTube, what content and comments they can restrict on their account. Those initiatives are super helpful too.

The Chair: Thank you very much indeed. I will have to draw things to a conclusion. I want to thank our witnesses very much indeed for very good answers to the questions we have asked. We have covered a lot of ground. I am very grateful to you. It has been recorded and there is a transcript if you would like to see that as well.


[1] Amended by witness: On our search engine Bing