
Communications and Digital Committee 

Corrected oral evidence: Media literacy

Tuesday 6 May 2025

2.30 pm

 


Members present: Baroness Keeley (The Chair); Viscount Colville of Culross; Lord Dunlop; Baroness Fleet; Baroness Healy of Primrose Hill; Lord Holmes of Richmond; Lord Knight of Weymouth; The Bishop of Leeds; Lord McNally; Lord Mitchell; Baroness Owen of Alderley Edge; Lord Storey; Baroness Wheatcroft.

Evidence Session No. 8              Heard in Public              Questions 127 - 162

 

Witnesses

I: Zoe Darmé, Director of Trust Strategy, Google Search; Ben Bradley, Senior Government Relations and Public Policy Manager, TikTok; Laura Higgins, Senior Director of Community Safety and Civility, Roblox.

 

USE OF THE TRANSCRIPT

This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.

 




Examination of witnesses

Zoe Darmé, Ben Bradley and Laura Higgins.

Q127         The Chair: Good afternoon and welcome to this meeting of the Communications and Digital Committee. My name is Baroness Barbara Keeley, and I am the chair of the committee. Today we are continuing our inquiry into media literacy by hearing from representatives of three online platforms about their work in this area. The session is being broadcast live, and a transcript will be taken. Our witnesses will have the opportunity to make corrections to that transcript when necessary.

Welcome to our witnesses and thank you for joining us. I start by asking you to introduce yourselves, starting on my left.

Zoe Darmé: Thank you so much. I really appreciate the opportunity to be here. I am a director of trust strategy at Google. Specifically, I work on products such as Google Search and Google Maps: what we call our knowledge and information products. I am particularly pleased to be talking to you today about media literacy, which is something that we have really invested in and which is near and dear to my heart. It is something that my team focuses on, along with other online safety issues.

Ben Bradley: Thank you, Chair, and thank you to the committee for having me here today. I lead on online safety for TikTok’s public policy team in the UK. I have sat on DSIT’s Media Literacy Taskforce, as well as Ofcom’s Making Sense of Media advisory board and Establish Working Group.

If you are not familiar with TikTok, we are an entertainment platform. Our mission is to inspire creativity and to bring joy. Key to understanding TikTok is that your recommendations are based on a content graph rather than a social graph. To explain that briefly, with social media that you might be familiar with, traditionally what you would see in your feed is based on what your friends have posted, what your friends have liked and what your friends have shared. What you would see on TikTok is based on your interactions, so whether you have liked a piece of content, said you are not interested, or watched it in its entirety.

Laura Higgins: Thank you very much for having me today. I am the senior director of community safety and civility at Roblox. I have been at Roblox for about six and a half years, but prior to that my background was all in online safety and digital safeguarding. I was the operations manager at the UK Safer Internet Centre, where I ran all of our helpline services supporting victims of online harm. That is still an area of passion for me.

My work at Roblox is very much around partnership working, creating helpful resources for all of our different audiences, and very much embedded in media literacy. I am also on the board of the Ofcom Making Sense of Media panel and sit on numerous other working groups in that field, both in the UK and overseas.

I will give you a little background on Roblox, if you are not familiar with the platform. Likewise, we are not a social media platform; we are a game-creation platform. Every day, millions of people come on to the platform to create content where they can build experiences on a range of things; some are just fun play experiences, and some of them might be role play: for example, pretending to be at high school or running a pizza cafe.

Some of them are really creative around fashion. We have a very popular experience at the moment called “Dress to Impress”, where you are able to pick items out of your virtual wardrobe and dress up in different themes and then people vote on the winner.

There is a really wide variety of different experiences and they are suitable for different ages. That is the content in there, and then other people come and they can create educational experiences and learn and play together in those spaces too.

Q128         The Chair: We have a number of detailed questions. My question is perhaps one that I will ask you to answer at the higher level, so that we do not get into the detail that my colleagues are going to ask about later.

Ofcom describes media literacy as “everyone’s business”. How would you describe the role and responsibilities of your platform in relation to media literacy?

Zoe Darmé: I am happy to start. I would generally agree that media literacy is everyone’s business. We are entering a time now where all of us are both creators and consumers of the information ecosystem, and so it is quite important that we all have a baseline understanding of media literacy.

My role at Google is to work on not only information literacy but information quality and information integrity. I will take a quick step back to say that the way we think about this issue at Google is that there are lots of interrelated concepts. There is media literacy, there is information literacy and there is digital citizenship, which I am sure Roblox will talk about as well. A lot of these concepts can be interrelated.

At Google, especially for products such as Search and Maps, which are not social media products, we think about this in three ways. One is the off-platform programming that you can do through partnerships. I think you heard from ParentZone earlier last month. The second is through on-platform products, features and tools. We have things such as “about this result”, “about this image”, double-check and Gemini that can help people find more information and context about a source. Then the third way is through awareness-raising campaigns, which we have done here in the UK through programmes such as Hit Pause. Roughly speaking, that is how we think about it. I work on all of those issues specifically for Search and Maps.

Ben Bradley: For us, it is really tied back to that mission that I mentioned around creativity and joy. We have eight community principles that guide all of the work that we do around trust and safety, as well as media literacy. I will not name them all but there are probably three that are really relevant for this discussion: those around fostering civility, enabling free expression and protecting users from harm.

The objective for us in this space is really making sure that our community has the tools and knowledge that they need to discover, learn, engage, and create information and content.

As for Ofcom’s statement, I would agree that it is everyone’s business. I think we need a whole-of-society response if we are to have the biggest impact that we can have here. Yes, that includes online, but it includes at home and in schools as well.

We focus our efforts on where we think we can have the biggest impact and where we think we have the most authority to speak, but we also try to support those conversations that might be happening off-platform. For example, we have a family pairing tool, which I know many other platforms have. This allows a parent to connect their device to their teen’s account and tailor around 15 or 16 different settings. The teen can actually override those, but it notifies the parent. The aim there is to start a conversation in that household around what the right digital guard-rails are. We provide conversation guides on the best ways to go about that as well.

Laura Higgins: Similar to what Ben and Zoe said, we very much believe that, yes, it is everyone’s business. First, the safety element is our responsibility, but helping people to understand how to use those safety tools definitely sits with us. We need to make sure that they are easy to use and easy to understand, and that we are regularly keeping people updated, whether that is the parents, the people using the platform, the creators who are building on the platform, or wider stakeholders. Discussions like this are really helpful, along with working with educators and other people, and caregivers who are supporting families, to really make sure that people feel confident in using the tools that we provide on Roblox.

There is also a second layer. At Roblox, we quite often have a younger demographic starting on the platform. We see that as an opportunity to really give them strong confidence in how to navigate online spaces and then take those positive behaviours into the other platforms that they are using as well. It does not just stop with Roblox. It is about building the soft skills around things such as relationships and managing conflict: things that do not necessarily seem like digital issues but are actually very present in online spaces. We want young people and their families to thrive in all the online spaces that they are on. That involves some safety education, as well as the media literacy piece. As I say, it is about engaging with all the stakeholders who have those touch points.

It is also thinking about how we reach those audiences. We have a family centre, and we have all of our tools and guides in one place, but not everyone may be digitally literate. The parents might not feel comfortable going on to sites. We also appreciate that parents and caregivers are very busy and sometimes may not have that time, so it is about thinking about how we engage in the spaces where the platform users and the parents are.

We do workshops, physically working with schools to get physical resources into the hands of parents. For example, with South West Grid for Learning, which is part of the UK Safer Internet Centre, we have a little pocket checklist. It is really easy to use and parents love it. It is a physical thing that they can have with them and read when they are on the Tube. We find that those sorts of things are really appreciated.

It is about having that multilayered approach to how we think about media literacy and how we get it into the hands of the right people.

Q129         The Chair: You mentioned running workshops and working with schools. Just so we have an idea of the scale, how many would you run in the course of a year?

Laura Higgins: We work with a range of partners, here in the UK and overseas. For example, Internet Matters in the UK is one of our key partners, and we have done online sessions for parents on particular topics. One of the areas of focus we are working on with it is neurodivergent kids and their families, because we appreciate that there might be some nuance around the support that they need and what their experience is when they are online.

We ran some really big webinars that were very well attended, with hundreds of parents attending those. For Safer Internet Day, which I know many of us support each February, we always make sure that we do some sessions. I have done some with local schools. We also support some of the organisations that we work with, which then go out and deliver sessions in the schools that they support, working with, for example, the UK Safer Internet Centre and others.

Q130         The Chair: Do you track your reach with schools? The specific question was about how many schools you are reaching.

Laura Higgins: I cannot give you that number, I am afraid.

The Chair: You mentioned hundreds of parents. It is hundreds, is it?

Laura Higgins: Yes.

The Chair: You do not know how many.

Laura Higgins: There is just myself in the team, so I am not able to scale it as an ongoing programme. It is more that we provide resources that then become a curriculum that is used by online safety organisations, for example.

Q131         Lord Knight of Weymouth: I want to explore a little further the tools, resources and support that you offer users in relation to media literacy. In particular, how do you ensure that that support is easily accessible to users, including children and those with limited digital skills, including parents?

I have a particular interest in this. You all have the capability to profile users—you profile users as part of what you do. Are you able to use that to see higher-risk behaviours and then push support towards those who might particularly need it, or are you fairly passive and just publish a model that is there if people want to find it, and then off you go?

Zoe Darmé: I will tackle this in two ways. I will talk about our off-platform programming and partnerships and then I will talk about what we do on-product, because I think your second question has to do with on-product user behaviour.

Off-platform, we are tailoring our specific programming to specific markets and audiences. For example, we have our Be Internet Legends programme; it is called Be Internet Awesome in the US. We have worked with local UK partners, including ParentZone and PSHE, to make that locally relevant to UK audience members. We have been able to scale this to 80% of UK primary schools through multiyear investment in that kind of programming.

Likewise, my team developed the Super Searchers training. That is information literacy training based on the SIFT method developed by Mike Caulfield at the University of Washington. We take that and locally tailor it to use examples that might be relevant in a specific market.

To your question about profiling, I would say that one important thing to note is that we have turned off personalised ads for under-18 users. That is an important fact to remember in how we are understanding user behaviour. We do not personalise ads to minors.

As it relates to organic content, I think a lot of times people think that we have tons of information that we are using to tailor to individuals to give them certain search results. Often what we have is in the search box itself, including what you put in, the language you are searching in and your location. It often seems so relevant to you because we have just got better and better over time at serving relevant results, but it is not always based on personalisation.

At the bottom of the search results page it will say that the results are personalised for you but that you can try without, so you can see what that looks like without personalisation.

Q132         Lord Knight of Weymouth: You have announced that the Gemini chatbot will be available to under-13s. Through conversation, that is going to understand more about a user’s behaviour, and it could potentially understand some risky attitudes or behaviours, and therefore would trigger you to want to suggest some of the media literacy tools that you have. Is that fair?

Zoe Darmé: One thing I would say—though I do not work on the Gemini chatbot specifically—is that I think you would have similar questions about Search and follow-up questions, and it is useful sometimes to understand past searches in order to customise results for you. It makes a certain level of sense that, if I am searching for football and I am here in the UK, I am probably not searching for American football, for example.

The information literacy tools are available to you no matter what you are putting in the Gemini prompt or in the search results box. One of the tools that I mentioned on Gemini is our double-check feature. That is available no matter what you put in; it is available for you to quickly check a result or an output, and get sent to Google Search to learn more about it.

Lord Knight of Weymouth: It sounds like the stance is to make things available, not necessarily to push them as a response to behaviour.

Zoe Darmé: I would say that is right, especially for our information literacy tools. We want to make that freely available to everyone, no matter what you are searching for. These tools can be important for all variety of searches or for all variety of information needs.

Lord Knight of Weymouth: What is the TikTok story?

Ben Bradley: Obviously, often social media platforms reflect society in size and scope. There is a very broad range of issues where we have media literacy campaigns or interventions. I will not go through and list them all, but I will explain the approach that we take and give some examples that might be useful.

We try to work with our internal experts but also external experts, to take a holistic approach. When we are thinking about what we want to do in media literacy, it is a process of identifying a risk where there might be a particular challenge, assessing the degree to which that risk is present on the platform and then seeking the best way to mitigate it. That can be through many different ways.

Looking at some of the interventions that we have, we have many different prompts that arrive at different moments in an experience. Some of that is personalised based on the user. If you are over 35 on TikTok, you get a prompt around the family pairing tool on the assumption that you are a parent. Obviously we do not know, because we do not request that data, but on the assumption that you are, you are prompted to enable the family pairing tools in your household.

Sometimes, it is dependent on the activity that you are undertaking as well. We rolled out a feature a few years ago called the kindness prompt. If you are writing a comment and we assess that that comment may violate our community guidelines (our rules of the road for what is allowed on TikTok), in that moment we prompt you to reconsider it because we think there is a strong likelihood it breaks those rules. In four out of 10 cases, people do not post those comments.

There are the prompts that we do and then there are labels, depending on what people are engaging with. We worked with MoneyHelper and Citizens Advice a little while ago on our financial literacy public service announcement. Depending on the type of content you are engaging with, based on a variety of hashtags and keywords, if that video relates to financial information, we have a label at the bottom of it that says, “Get the facts about financial literacy”. You go into that hub, and it has tips from Citizens Advice and from MoneyHelper around spotting scams, Action Fraud and all of those things.

It is dependent on who the user is and their age range. If you are under 18 and on TikTok, one of the first things you see when you create an account are privacy highlights for teens, which is privacy and safety tips. But it is also dependent on what they are doing on the platform and what type of information they are seeking to engage with.

Q133         Lord Knight of Weymouth: As the online safety legislation is implemented, you will have to more closely ensure that your terms and conditions are followed: for example, the minimum age at which people sign up and take part. Does user behaviour help you understand the age of a user? Does that then mean that that might trigger you to surface tools for children in the way that you have described for adults?

Ben Bradley: Some of those interventions I mentioned—the privacy highlights—are for teens rather than for adults. That is based on your declared age on the platform. There are other things that we do as well. We are constantly searching for potential underage users. If we remove you because we believe that you are underage, you are then presented with a different option to verify your identity, such as a selfie with an adult, for example. You are right that there are those moments where there is an additional opportunity for intervention based on age.

On the wider piece with Ofcom, we are reviewing the guidance from Ofcom in relation to the child safety codes that were published, I think just over a week ago, and working through those 1,600 pages. We have been regulated by Ofcom since 2020 as a video-sharing platform. We have a longstanding relationship with Ofcom as a regulated entity. I think that will continue to be positive in the future.

Laura Higgins: We are very different in the way that the platform works. We do not have a feed of content that comes up as on more traditional social media platforms. We also take very limited data about the users at the point of sign-up, because many of them are younger children under 13. Older users coming on to the platform who wish to have a little more freedom in the types of experiences that they access, and access to things such as voice chat, are ID age-verified to be able to access those additional things.

We do not advertise to minors on the platform and so, while we can pick up certain signals, ours mainly come from direct chat. There is a text chat function across the platform, which, when kids are co-ordinating in games, is how they communicate with one another. We have extremely strict filtering. Any sort of poor language or bullying behaviours—anything like that—or any sharing of personally identifiable information is hashed out so that the other user will not see it. But we may get a signal from that that the child has tried to say something or has tried to share personal information, for example.

One of the things that we have in place is called helpful letters. We have just updated those. These go out to young people where we pick up enough of a signal that they are in danger. We would then report that proactively to NCMEC, which is our reporting platform for grooming and so on. We send a letter to that child explaining that we have seen some worrying behaviour on the platform and pointing them directly to support services. We do the same if we believe that a child has been a victim of serious bullying and so on, and also for self-injury and suicide threats or concerns.

We partner with an organisation called Find A Helpline, which is a global network of helplines that we can then surface. We have them both on the website but also included in these letters. Depending on where the child is in the world, it would direct them straight to helplines in their region, so that they do not have to go searching.

Q134         Lord Knight of Weymouth: Forgive me, in 2023, you reported a significant increase in child exploitation cases on the platform: up to 13,000 in the year. Has that come down again? Have the things that you have been talking about made a difference in respect of the levels of activity that you do not like on your platform?

Laura Higgins: I am sorry, but I do not have the numbers to share with you. I believe the numbers will likely have gone up, which, when you think about it, is a positive thing, because it is being recognised and it is being reported. We have also introduced additional safeguarding teams internally. I cannot share too much, because how we operate to prevent harm on the platform is confidential, but we have certain things in place where, both in our community and in some of the adjacent communities that our young people are using, we are proactively detecting and preventing harm before it happens. As I say, we are one of the few platforms that proactively make the reports, rather than waiting for a request from law enforcement.

There are many initiatives. As well as the helpful letters, I want to flag a wonderful experience that we did in partnership with Google, as part of the Be Internet Awesome programme, where we co-created a game all about online safety. It has had over a million plays in just over six months, which is absolutely incredible when you think that it is about online safety and most kids do not want to get involved in those discussions. We managed to make it engaging. It is a great way to get those messages out in a subtle way.

Q135         Lord Knight of Weymouth: Finally, I want to come back to Google and TikTok around issues of children’s data. Google is subject to a case in the US on children’s data harvesting, which is being contested. I know TikTok has an ICO investigation currently in respect of data collection. Do you see it as your responsibility to educate your users around data being collected, why it is being collected and the privacy of their data?

Ben Bradley: We take data security and privacy extremely seriously. I mentioned the tool at the start. When you are an under-18 user, one of the first things that you see on TikTok are the privacy highlights talking about the features that we have and features that you can enable.

In relation to the ICO investigation, I believe the commissioner has said that it is looking at the whole sector but using TikTok as an example, rather than looking at something specific about our practices, which I think are broadly consistent with industry practices. We invest very seriously in this. We just ran a cybersecurity campaign, I think it was last month, as part of a national day of action, looking at things such as two-factor authentication and what more we can do there. We saw some really positive results as part of that campaign.

Zoe Darmé: I have already mentioned what we have done on the advertising side, which is to turn off personalised ads for minors. In addition to that, we obviously have very strong privacy protections and a full range of options for all users to set their own privacy settings, including auto deletion at three months as one option. It is important to have those tools very easily available in your account settings.

In addition to that, in our off-platform training we have included modules on data privacy and personal information sharing online. We have tried to increase awareness of what is available to all users, including under-18 users, through that programming.

Q136         Lord Knight of Weymouth: Do you collect the data around how many users go and look at those settings to change privacy and how much curiosity there is among users?

Zoe Darmé: I am sure that we have that. I do not personally have that at my fingertips right now, primarily because our account settings are centralised.

Lord Knight of Weymouth: If you were able to supply us with a sense of the proportion of your users generally, but especially child users, who explore those settings, that would be helpful.

Zoe Darmé: At account creation, everyone goes through a set-up flow that includes their privacy settings. All users will go through that. For child users, if you are an adult setting up an account for a minor—as I did for my stepson, together with my husband—you go through a privacy set-up and privacy check-up to ensure that your settings are where you want them to be.

In addition, as I mentioned, we have all the off-platform training, as well as certain policies that are bespoke to younger users. For example, a lot of people do not necessarily think about this, but many people have shared photos; I have been guilty of this myself, with shared photos of my family and my stepson, for example. We also have a policy called the minor images removal policy. Through that policy you can request removal of photos of people under 18 that are available in Google Search.

We have other privacy protective services, including our Privacy Plus or Results about You product, which is a monitoring programme that we have available to anyone. You can check and see whether your address is appearing in search results, your email address, or your confidential banking information. We make it really easy for you to get alerts and also to have those results reviewed and removed.

Q137         The Chair: Just to clarify on Roblox, the note I had before the meeting is that the Be Internet Awesome programme was work you did in the United States.

Laura Higgins: It is the US programme, but it is obviously available globally to all users.

The Chair: Is that the case too with the digital civility curriculum, which is in the United States?

Laura Higgins: The team that I am in is the civility programme. Within the whole range of our media literacy resources some are UK-specific and some we do in Latin America, South Korea and the US.

The Chair: We are obviously interested in reaching the UK with this.

Laura Higgins: All of our resources are in English and available to UK users, regardless of the partners who they have been created with.

The Chair: You are not able to say how many you are reaching.

Laura Higgins: No, sorry, because the curriculum that we deliver is not a schools programme specifically; it is a suite of resources as part of that. In South Korea, we have a whole education programme in schools, but that is in Korea specifically and not in the UK.

The Chair: If you have the updated figures that Lord Knight was asking about, it would be helpful to have those later.

We have a couple of follow-on questions.

Q138         Lord McNally: Just one point. In an answer Mr Bradley gave about a request for help or clarification, he said he received a 1,400-page response from Google. We all know that that is one way of answering a question without answering the question. Do you think that is a style that is adopted by the—

Lord Knight of Weymouth: Do you mean the 1,200-page codes from Ofcom? That is what was mentioned.

Ben Bradley: I am sorry if that was unclear. That is Ofcom’s code of practice. That is the length of Ofcom’s documents published in the child safety codes a few weeks ago.

Lord McNally: Do you think that is helpful? When I ask a question, even if it is a very good question or a correct question, I am always suspicious of 1,200-page responses.

Ben Bradley: I think it was in relation to what steps we might take further under the Online Safety Act. There are some things I can point to, based on our relationship with Ofcom, but there are some specific measures that are inside Ofcom’s final child safety codes. I cannot recall them now, because I have not had a chance to read through the full breadth of Ofcom’s work. Once I have, I am happy to write to the committee with more information on the types of changes that we have made, once I am able to digest that and speak to our teams.

The Chair: We are trying to digest it, I think. It is 1,200 pages.

Q139         The Lord Bishop of Leeds: You seem very confident about age-verification systems, and yet it is clear—and not just from anecdotal evidence—that adults can pose as children and children can pose as adults. How robust are the age-verification systems?

I have a 14 year-old grandson. He just laughs when I say that there are all these systems in place. I am over the age of 10, so I do not understand how you do it, but he is very adept. I would like some reassurance that the systems are robust, and that we are not just convincing ourselves that everything is hunky-dory when actually there are great gaping holes in the system.

Ben Bradley: I am happy to pick that up to start with. In order to download TikTok, you need to be 13. At the very starting point, the device that a child may use to download TikTok is registered at the right age. We have age ratings in the App Store and the Google Play Store, and you need to be the right age to download the app in the first place. You are then presented with what we call a neutral age gate. We do not prompt you to confirm you are over 13 or preload a box that says you are over 13. We ask you your date of birth and, if you publish a date that shows you are under 13, you are blocked from re-entering it. You cannot keep trying again and again. That is the very start of the process.

The Lord Bishop of Leeds: But what if you lie?

Ben Bradley: That is the very start of the process. Once you are on the platform, we have a range of measures to surface and remove under-13 users. We publish every quarter how many suspected under-13s we have removed from the platform. It is around 20 million a quarter, to give you a sense of the amount of investment that we make in this space.

Transparency is really important here to assess different platforms and the investments they are making in order to live up to the policies that they create. For us, that is the data that we have published, and we have published that every quarter, I think for three years now, to hold ourselves to account and to engage in these conversations.

Under the Online Safety Act, Ofcom has a requirement for highly effective age assurance. I cannot give a specific answer in relation to that right now, because it is detailed work and we are reviewing it. We will be engaging with Ofcom as part of that and have been already.

Q140         The Lord Bishop of Leeds: Sorry to push you, but what if you lie on that very first thing? If you are 12 and you put in a date of birth that implies you are 18, does the system pick that up, and, if so, how?

Ben Bradley: That is the second half. Once you are on the platform, we have a range of tools to detect and remove those users. That is the data that I mentioned: 20 million a quarter is how many people we are removing who we believe to be under the age that they said they are. We then opt them into a different experience. They have a chance to appeal in those situations, and there is a range of different methods you can use to do that. I mentioned the selfie with a parent, for example. That is the position.

Q141         Baroness Owen of Alderley Edge: Can I jump in on that question? Is this age verification for accessing TikTok for every country in the world? What would happen if a child rerouted their VPN?

Ben Bradley: That is the process that we have. That is the sign-up flow that every user goes through, everywhere in the world. There are some jurisdictions where there are different age limits or different TikTok products available. In the UK, if you are registered with the UK App Store, that is the age.

Baroness Owen of Alderley Edge: So rerouting the VPN is not a workaround?

Ben Bradley: I will have to come back to you on the technicals. I think it is based on your App Store location, which you have registered the device with, rather than maybe where you might be at that time.

The Chair: Laura and Zoe, do you have points to add?

Zoe Darmé: I think Ben answered very well. I add that we do not just rely on declared age, to your point, Lord Bishop. We also have age-assurance or age-inference technology to help confirm whether a user’s declared age is accurate. If we find that there is a discrepancy between the declared age and the age we infer, and we are inferring someone to be a minor, we will likewise send them through a verification flow that helps double-check their age.

Laura Higgins: Ours is a very slightly different set-up. As I say, we are a platform that does allow under-13s because we are not collecting that data on them. Our first point is that we make sure the platform is safe for all people. We assume that they may well be a child, and therefore those safeguards are automatically applied.

The age verification that we do have is to access our 17-plus experiences and additional things such as voice chat, for example. We use systems similar to those used in the banking system. They are very, very secure. We do not store any data, but official ID is required to access those.

Q142         Baroness Wheatcroft: A very quick question, if I may, to TikTok in particular. If you are having to remove 20 million people a quarter, does it not imply that maybe you should be doing a bit more in the first instance to stop them getting on?

Ben Bradley: That is the worldwide figure. The process that we have is industry standard, I think. Ofcom has done some work testing different age gates as part of the VSP regime. Ofcom found ours was one of the strictest, in not pre-populating a particular age and not asking people to tick a box that they are over 13. Highly effective age assurance is part of the Online Safety Act, and we are reviewing those measures and seeing if there are particular changes that might need to be made.

Baroness Wheatcroft: So you are not necessarily confident that you are doing as much as you could.

Ben Bradley: I think we are doing an enormous amount on our age-assurance system. We have age-inference models, as Google mentioned, and where there is discrepancy we take action. The reason I share that figure is that transparency is an important part of this conversation, to make sure that we are living up to not just what we are saying but what we are doing. It prompts these types of conversations, where we can look at the investments that companies have made, whether you think that is too high or too low, and compare that across different platforms. For us, transparency is the first step. That is why we publish that figure every quarter.

Baroness Wheatcroft: Lord McNally is concerned by 1,200 pages. I find 20 million a quarter quite a disturbing number.

Q143         Baroness Owen of Alderley Edge: I declare my interest again as a guest of Google at its Future forum and AI policy conference. I want to start by asking how you collaborate with and support UK media literacy organisations, including by providing funding.

Zoe Darmé: Our Be Internet Legends programme has been funded since 2018. I think we have shown our commitment there to multiyear funding. As a former government grant-maker and someone in civil society, I know that it is really important to have sustained support that you can plan for. We have taken that into consideration when we have seen what our partners can do. We try to invest in partnerships that we can carry over. I think you heard from Vicki Shotbolt that our work with ParentZone has been very strong over the past years.

In addition to that, I am happy to relay that my team right now is exploring a partnership with CILIP, which is the major library association here in the UK. We have already piloted our Super Searchers programme through Public Libraries 2030 here in the UK. I remember, in particular, there was training in Lancashire; I do not know if anyone here is from Lancashire. We are looking to scale that more broadly.

Globally, we have reached about 2,500 trainers, who have turned around and trained a million end users. But I think we could do more in the UK. It is one of the reasons I flew over here to be with my colleague, Rosie, to talk about media literacy. We were very pleased to be in conversation with CILIP to bring that here to the UK as well.

Ben Bradley: The work that we do is tailored to the needs of the users and to our platform, and that includes region as well. It follows the process I outlined before of risk identification, assessment and mitigation.

To give a few examples of how we have applied that in the UK, we carried out focus groups with teachers on their understanding of online safety provision in the UK but also of TikTok. That found that two-thirds of teachers had a low awareness of TikTok specifically, because we were a newer platform and they were not familiar with using us. But a similar number also wanted to have resources provided to them by platforms such as TikTok.

In that case, we partnered with Internet Matters on an interactive teacher playbook. That was everything that you might want to know about TikTok: the features that we have, and the safety tools that we have available to assist teachers in conversations that they would have with students or with parents. Then we partnered with a different organisation called Ditch the Label, which you might be familiar with, on specific lesson plans around online safety issues. That is an example of how we have applied it to teachers.

We did the same for parents. Last November, we carried out polling of 1,000 parents. Encouragingly, around 60% felt more confident to engage in online safety conversations than five years ago, which I think is something we should all be proud of. But 50% still did not feel like they had the technical knowledge to engage in conversations. We created the Digital Safety Partnership for Families, which was a specific conversation guide to allow parents to engage in these types of discussions.

That is the process that we follow and some of the partners that we have worked with.

Laura Higgins: As I mentioned, we also partner with Internet Matters. We have been working with it since 2019. Some of that has been research and focus groups with young people and parents, to gain an understanding of young people’s experiences online and to hear more from them about where they need additional support—really digging into where the need is, so that we can then try to support them with those issues.

More recently, as I mentioned, we have been focused on supporting neurodivergent families here in the UK. We are not experts in that field, and neither is Internet Matters; it is the expert in media literacy and digital education. We partnered with Autistica and Ambitious about Autism, specialist charities here in the UK, which helped to support us with, again, focus groups with tweens, teens and parents, as well as with qualitative research that we have done with them on identifying where the support needs were and what resources they wanted.

Last year, we launched a suite of resources based on that. We have now reviewed that with the groups, to look at the effectiveness and the impact, and what changes families recommended for us to make. We have now updated those based on that direct feedback, which was a first for us. It was really interesting to be able to have that full cycle and learn.

Last week, we launched some new resources based on that work, around screen time, which we know is a concern for parents and can be something particularly challenging for those kids with additional support needs. That work is going to be continuing. It is a long-term partnership.

We also support the UK Safer Internet Centre and all the partners within that. We are members of the Internet Watch Foundation. We work with South West Grid for Learning on resource creation and supporting its broader educational work in schools. We work with Childnet. We support Safer Internet Day each year. This year, we co-created a quiz with Childnet’s youth board, which we launched for Safer Internet Day. That was all around spotting misinformation and scams. Anyone who was at the event at BT Tower probably got to play with it this year—it was fantastic. There is a lot of ongoing work around those and planning for next year already, so more to come.

Q144         Baroness Owen of Alderley Edge: How are you making sure that this support has impact and is genuinely meeting users’ needs? Are you measuring it?

Laura Higgins: That is the tricky thing, and I am sure that all of us would say that it is really hard to measure the impact of media literacy.

We have a data science team internally at Roblox. It is based in California, but it works with UK families and young people. We do pulse surveys. We ask communities, “Did it help? Do you feel more confident in your skills?” We have started looking at the hard numbers around downloads and people accessing, but very soon—these are things that are coming—we will look at identifying specific times for when we launch a new resource, making sure that it is targeted. We will then monitor whether there has been a reduction in the number of reports of certain things that that would have impacted.

We have a suite of parental controls that we are regularly updating with new features. Again, we will look longer term at uptake and doing surveys with parents to ask how helpful this was for them. It is an ongoing piece of work.

Ben Bradley: There are several principles that we follow to make sure that what we are doing is meeting user needs. There are three that I will mention now. The first is meeting users where they are and making sure that the interventions or the campaigns that you have are timely and relevant to what they are expecting to receive. We have a default one-hour screen-time limit for under-18s on TikTok. When the one hour is up, that is when we prompt users to review their screen-time settings. In that case, we saw a 230% increase in use of that screen-time tool, because it was relevant and timely to what they were trying to engage with.

I would also stress the importance of being engaging. It has to be consistent with the type of user experience that they are expecting where they are. TikTok is a video-first platform. You are most likely to engage with video content on TikTok. When you are receiving online safety information, you do not want to be steered off to a different website, where you have to log into an e-learning hub to get additional information; you want video-first content, ideally within your experience. When we first designed our youth portal, it was text based, but we got feedback from teens and from our safety councils that they were on TikTok and they wanted to see this information in a video-first format. We redesigned it around that principle.

The last way that we make it relevant to user needs is through collaboration. I mentioned some of the partnerships before, where we work in collaboration with experts and academics on issues that are of relevance to our users and the things that they want to see and hear about. We then make sure that we implement it. The one-hour screen-time limit that I talked about was designed in conjunction with Boston Children’s Hospital’s Digital Wellness Lab, based on what exactly the right limit would be for under-18s on the platform.

Q145         Baroness Owen of Alderley Edge: Does TikTok make an attempt to measure the success of its strategy?

Ben Bradley: Definitely. Measurement is one of the biggest challenges. There are different methodologies that you can employ: you can look at the recall rate of a specific campaign and how many people remember the tips; you can compare people who have been exposed to a campaign and people not exposed to a campaign and measure whether there is a difference in perspectives and attitudes. Ultimately, there is not a one-size-fits-all solution when it comes to measuring the impact.

It is really important for us, before we launch an intervention or a campaign, to think about what it is that we are trying to achieve, and then what a good metric for achieving that is, and then assessing it. With the kindness prompt, we wanted to reduce the rate of comments that violated our guidelines. We designed the kindness prompt, and now four out of 10 people do not post those comments.

We did something similar with community guideline violations. We saw that 75% of people who violated our rules a second time did so under the same policy area, which might indicate that they had a low awareness of what the rules were. We redesigned the notifications that you get when you breach our rules. When you breach our rules, we send you the fact that you have broken them, what the specific policy was and further information on that. We see now that 60% of people who violate our rules the first time do not go on to violate them a second time.

Zoe Darmé: I am very glad that you asked this question, because measurement is really important. I agree with my other co-presenters here that it can be difficult.

One of the ways that we have tried to get at this is to work with third parties to evaluate our programme. Ipsos ran a study on Be Internet Legends and found that children who went through the programme were twice as likely to have an improved understanding of online safety concepts and three times as likely to be able to spot something suspicious, like a scam. There has also been a third-party independent randomised controlled trial. Any of you who have worked in social sciences will know that the gold standard in research is an RCT. That found an improved understanding of online safety, as well as improved online interactions relative to their peers.

I am a bit timid to say that we have a 106-page report, because I know that Lord McNally said that that might be too long. But we do have a 106-page report that we worked on with a research organisation called Ecorys. This outlines not only our Be Internet Legends programme but Super Searchers, which I mentioned, as well as some of our other media literacy investments.

Separate from the partnerships where we have someone come in to evaluate our own programmes, we also know that third-party evaluation is important. My team is responsible for researcher access to data. For Search, we just opened our researcher access API globally.

The Chair: We will come to Baroness Fleet in a minute, but before we leave your point about measurement, we have had a number of contributors to our inquiry suggesting that tech platforms should fund media literacy initiatives more, including one who suggested that online platforms and social media companies should be compelled to fund media literacy programmes. I understand your point about it being difficult to measure, but can you let us know—it does not have to be now—whether you can measure the current amount of funding that you are putting in? That would help answer that question.

We will go back to Baroness Owen and then I will come to Baroness Fleet in a moment.

Q146         Baroness Owen of Alderley Edge: You have spoken about your partnerships, and we have taken some written evidence. I want to put to you a quote from the LSE’s written evidence, which said that practitioners who draw on funding from platforms, “feel conflicted about relying on them when nothing is being done to address the root cause of harm, which lies in platform business models. Their funding can feel like a PR exercise rather than a genuine commitment”. It would be helpful to get a response to that and an outline of how you preserve the independence of funded partners.

Zoe Darmé: I start by saying that, first and foremost, we have to build safe products and services. I am sure that you have heard from many other Googlers that our mission is to organise the world’s information and make it universally accessible and helpful. The reason that you have heard that so many times is that we really do believe that. Users would not come back time after time to find helpful and trusted information if they did not find it to be safe and trustworthy. First and foremost, we have to invest in our own platforms. If you take away only one thing, it is that we must invest in our own products and services, and make them as safe as they can be.

Separate from that, I can understand the sentiment that was expressed in that written evidence. It is important to note that we do not just give a slide deck and some talking points to our partners and say, “Here you go”, almost like an advertisement for our products and services. Often what we do is a pilot and testing process with our implementing partners. They give us critical feedback and sometimes I say, “Oh my gosh, another round of review to make this better”. But in the end, it is the right thing to do. They are experts in things that we are not necessarily experts in, and it only makes our programming stronger and more effective if we take on their feedback, iterate, make it locally relevant and make it age appropriate.

We have one programme that I run to teach parents and caregivers about parental tools. We have a general set of that for parents and educators that we run through our parent and teacher associations. We have another set that is geared towards families who are lower income. Some of the feedback that we got from our implementing partners was, “You are talking about this device and that device. You are assuming that they have all these devices in the home, and many people cannot afford that”. It would be a shame if all we did was give money and a slide deck and say, “Here you go”. That would be a wasted opportunity in my view.

Ben Bradley: I agree with that. Similar to what Zoe said, our mission is around creativity and joy. If users are having a bad experience, I do not think that that would be consistent with the long-term business growth that we want. When we design the interventions that we have, we design them to be timely, engaging and relevant. I hope that the examples that I have provided are a good demonstration of that. The sum of the measures that we have is often greater than the parts.

Screen time was mentioned before. If you look at the screen-time experience of under-16s, you find that you have the default one-hour screen-time limit on TikTok, and that some of the first videos you see are privacy highlights and safety tips for teens. We send “take a break” reminders to you. We send weekly dashboard notifications about how much time you have spent, comparing that against different days; you have access to that data. We do not send you notifications after 9 pm. Last month, we launched a full-screen meditation video that, at 10 pm, you have to sit through, encouraging you to take a break, breathe and reflect on your online usage. We then have additional tools for parents that can block use of the app in particular moments. We run media literacy campaigns to raise awareness of all these tools.

When you look at all of these things in combination, they are really powerful. I can understand some of the scepticism that you described in the evidence if you were to look at one or two of those features in isolation. But when you stack them on top of each other and look at the holistic experience that we are designing, the sum is greater than the parts. You can think about it like a Swiss cheese. One individual piece might have a couple of gaps, but when you stack them they are really powerful.

Q147         Baroness Owen of Alderley Edge: How are you preserving the independence of the partners that you are funding?

Ben Bradley: Fundamentally, we work with them because they are experts and they are independent. They have insights that we need and want in the design of our products or in the design of the work that we are doing. It is not in our interests to do anything that would undermine their independence, and it is just as important to them that they are independent, because they will want to engage with a large range of companies. We would not put them in that type of position, and I do not think that they would be happy with it.

Laura Higgins: Similar to what Ben said, we really value the arm’s-length relationship that we have. They are very much critical friends. We also have a safety and civility advisory board. That is made up of independent experts in the US, the UK and Europe. We meet regularly on product policy issues and things that are coming out soon. We value that independent voice and critical feedback, and we do not pay our board members to be on that board. We appreciate that the expertise that our partners bring could be challenged. We really value that.

Parents can rightly be critical, doubtful or a bit suspicious sometimes, particularly about Silicon Valley tech companies. Asking what our intention is was a good point to raise. The organisations that we work with are trusted partners—they are trusted organisations with families and parents and educators. We would not want to jeopardise that in any way.

Q148         Baroness Fleet: I am so pleased that we have had a chance to talk about age verification, which, Mr Bradley, you have highlighted is not working very well: 20 million is a lot, and those who have slipped through the net are probably even more. Clearly, you are taking your responsibilities seriously.

I would like to come on to the question of responsibilities. You have talked a bit about screen time and so on. I am interested in addiction and how you recognise addiction at different ages, what you consider to be a level of addiction at different ages and how you take the responsibility to help to prevent it. You do not just tell parents that their five year-old is spending six hours on TikTok at 10 pm or whatever it is. Could you talk a bit more about how you are taking those responsibilities to the next step?

Ben Bradley: Some of the features and products that I mentioned are important. When we have carried out research in this space and looked at third-party research, one of the things that comes through is the strength of feeling on agency. People want to know about and have data on their usage, and have greater tools that can help them manage that experience. Some of the features that I walked through before on screen time include, if you are under 18 on TikTok, a default one-hour screen-time limit.

Baroness Fleet: Is that per day?

Ben Bradley: Per day, yes. That was built in conjunction with Boston Children’s Hospital. There was a debate about whether you should set hard limits or if that would be counterintuitive. In conjunction with Boston Children’s Hospital we designed that one-hour limit. We are the only platform, I think, to have that default setting for under-18s.

We also looked at the question of agency. Lots of that research showed the need for agency, so we have our dashboard, which provides greater information, and in addition we have the tools for parents. There is all the work that we do as a default setting, and then we have additional media literacy tools for parents to set specific guard-rails that might be appropriate in their household, so that they can say, “Absolutely no usage during these hours or after these times”. The things that we do by default include no notifications after 9 pm for under-16s and no notifications after 10 pm for 16 and 17 year-olds.

Q149         Baroness Fleet: You mentioned earlier that parents can intervene and that you expect them to have a discussion, but the teenagers can overrule their parents. What responsibility are you taking for that? Should the responsibility not rest with the parents rather than with the teenager, who may say, “I disagree, and I am going to go on using TikTok for six hours a day”?

Ben Bradley: Let me clarify that. As we build the product, there are three layers to it. First, we build in safety by design. If you are 13, you have a very different experience from a 16 year-old or an 18 year-old. If you are 13, you do not have access to many features, regardless of what might be allowed in your family pairing tools. You do not have access to direct messaging, for example. If you are 17, you might have access to some features, but they are disabled by default. That is where the family pairing tool can come in. When you are 18, you can access them. In relation to what we call the time-away feature, which allows parents to set hours, that is a hard block set by the parent, and the teen cannot override it.

Laura Higgins: We also have a suite of tools for parents. We have screen controls, so that they can limit the amount of time that their children are spending on the platform, as well as the types of experiences that they are using. Parents can sync their own account so that they can manage those controls remotely.

With the slightly younger demographic who we have on the platform, a lot of our education work with parents is about getting them involved in their kids’ lives. We do the safety bit—the basis of what we do is to provide a safe platform—but it is important for parents to understand what their kids are doing when they are online, particularly walking with them as they first start out on this journey.

Not all screen time is the same, as we know. The advice that we give is very much about whether your child is spending a long time on one experience, playing for hours and hours. We do not want them to be doing that. They must remember to take breaks. We want to see a healthy balance of online and offline activities.

With some of the older people on the platform, and those getting into creation, having conversations with their parents has been interesting. Sometimes the parents assumed that it was just a game, that the kids were just playing and that it was a bit of a waste of their time, when what they were actually doing was creating and building experiences, learning coding skills and all of the different things that come with that. Once the parents understand that, they perhaps would then work with their kids, saying, “Okay, that is fine. I do not want you to be on there all day, but now I understand what you are doing and the fact that it is educational and you are creating something”. They might be able to put different rules around that than if the child were just playing.

Q150         Baroness Fleet: Can you identify addiction among your users?

Laura Higgins: The term is a difficult one. We also work with the Digital Wellness Lab, which talks about problematic media use generally, as opposed to addiction. We realise that that term has been used, particularly around social media. We do not monitor a child’s use specifically at our end; that is one of the tools that we have offered to parents.

The Chair: We are running out of time, so can we leave that question now, if there is nothing else to add from Google? We need to move on if we are going to finish our questions today.

Q151         Viscount Colville of Culross: You have talked a lot about the separation between your safety partners, which are doing the literacy work, and your own companies. However, there still seems to be an extraordinary level of ignorance among so many users about the way that the platforms work. A fifth of adults do not understand how the algorithms customise the information that they are fed. Apparently half of adults are not confident that they can identify adverts, particularly when it comes to search engines.

You have talked now for an hour about all the work that you are doing to try to make sure that people have effective media literacy and understand how your platforms work. Why do you think there is still so much ignorance among users about the way that your platforms work, the way that they try to engage our eyeballs and the way that they try to sell us advertising?

Zoe Darmé: When it comes to media literacy, one way to think about it is that people are engaging in media literacy best practices all the time, when they may not know that that is what they are doing. In some cases, that is okay. We would not want everyone to feel that they needed to have a master’s degree or a PhD in information science to do something like lateral reading, for example. That is a best practice where you take a piece of information and try to find out more about the source or about what other people are saying about the topic. That is why we have built that into the “About this result” feature. That tells you more about the page and about the topic that you searched for, to make it really easy. People do not need to know all of the technical details about information retrieval to be good consumers of information in the information ecosystem.

At the same time, it is important for us to do our utmost to provide transparency and tools so that users can understand why they are seeing a certain result or a certain ad. We had user research that confirmed that people have their own mental models for why they are seeing a certain search result or a certain ad. It was helpful for us to build “About this result” and “About this ad” directly into the product so that they could understand why they are seeing what they are seeing. We invested in those types of in-product interventions. You do not need to know about all of the algorithmic ranking systems that we use to create a product like Google Search; we can give a layman’s explanation that you are seeing this result because it matches your keywords and you are searching in English, and factors like that, so that people have a strong mental model of how Search works without needing an advanced technical understanding of the product.

Q152         Viscount Colville of Culross: Is the truth of the matter that some people do not want to know how the algorithms work and that you can give them all this information until the cows come home, without effect?

Zoe Darmé: I think people are naturally curious: 15% of all searches are new each day. That is a stat that blows my mind every time I think about it; it speaks to the limitless nature of human curiosity.

Lord Mitchell: Are you saying that 15% of all searches have not been searched before?

Zoe Darmé: That is correct—although I will say that many of them are misspellings. This is all to say that some people do want to know but do not feel that they have the technical capacity. Some people just want it to work.

Separately, I give a “How Search Works” presentation where I have everybody draw their mental model of how Search works. Oftentimes, from people I would expect to have a good command of the basics of information retrieval, I will get little pictures—this may be because people are not great at drawing—showing Search, a thought bubble and “results”, or something like that. It shows me that, in order to know that Search works, you just have to have the experience of putting something in the search bar and getting results that you think are trustworthy, authoritative, helpful and relevant.

Many people do not want to know, and do not need to. They just want to know that it is there and that it works for them. For those who do want to go deeper, we have built some tools so that they can understand their results, and how advertising works, a bit better. It is a mix, depending on people’s interest level.

Q153         Viscount Colville of Culross: Ben, what is your response to the level of ignorance there is about the way that people have access to TikTok and its ads?

Ben Bradley: I go back to our objective of making sure that users have the tools and knowledge to engage and to create content. That is important to us, and it is a business priority as much as anything else. The posts on TikTok are made by individual citizens in the UK, and obviously we want to make sure that they have the tools to express themselves in that way.

When we look at the types of tools and interventions, we invest very heavily in making sure that they are engaging and that they are meeting people where they are—the principles that I mentioned before. Of course, their usefulness is related to how often they are used. We have a “why this video” feature, similar to Google’s, which explains why you might be seeing a particular video—it is popular in your area or is by a creator that you follow, for example. We also allow you to refresh your For You page on TikTok, as if you were a new user, if you want to try something new. We also invest very heavily in raising awareness of tools like refresh.

We ran a campaign called #SaferTogether, which looked at this and at how we can raise awareness of the ability to refresh your algorithm, for example. We worked with leading UK creators—the type of people you would expect to see every day in your typical TikTok experience—on many different tips. Some of those were how to enable comment filters or how to report content, but others were on the underlying features—how videos get recommended or how to refresh your algorithm. We saw in that one campaign a 10% increase in familiarity with and awareness of those features.

Laura Higgins: Ours is slightly different, because we do not surface content. While we do have algorithms helping to support the platform, they are mainly used for our safety features. They are used to help us to scale taking down spam, detecting harmful content and those sorts of things. That said, we can all do a lot more to help people, particularly parents and educators, understand what the platform is and how it works.

Many people still believe that Roblox is one game and that kids are spending hours playing one game, whereas it is millions of different games and experiences across a range of different topics. We know that we need to do more in that space, and we are always striving to have those conversations to help get that out there.

Where I think it gets quite interesting is the creator community within Roblox. If you were to use the platform—please do go and have a look—you would find that we have charts that show the most popular games, and things like that, which can help people. It is quite obvious why those games are there: they are the most popular being played at that time. But the creator community might have more questions about how they can boost their game or get on the leaderboard, or about why their game is not being picked up. We have something called the creator hub, and a developer relations team that works with the developer community. They have FAQs, and they listen to the community to find out what those questions are. They regularly update content so that it is available for that particular community.

We know that different people need to receive information in different ways to hear it best, so we have onboarded our first ever global parent advocacy lead. They will be doing a lot of the work on how we amplify that message and help parents to understand it deeply. We have just put out new parental controls—we had a launch in November last year. We did over 30 launches of different safety features last year. We have a lot more coming, but we know that we are still not getting that information into the right hands in a timely way. We will be focusing on that and on having two-way parent engagement.

We also have a Teen Council, the first phase of which we started this year. We will be onboarding a new one in the summer. It meets with cross-functional teams within Roblox, with pretty much an “ask me anything” approach. It will talk about topics such as the new controls directed at teens that we will be bringing out later this year, and community standards, really getting into the bones of that.

Q154         Viscount Colville of Culross: We are running out of time, but I want to ask you one question about the role of Ofcom. Many people say that Ofcom should be focusing on minimum standards for media literacy by design. At the moment, what feedback do you have to give, as a platform, to Ofcom about the quality of your media literacy programmes?

The Chair: Could I ask that you keep your answers concise? We are rapidly running out of time, and we have a couple more questions.

Ben Bradley: We work with Ofcom as part of our regulatory relationship. We are a regulated video-sharing platform, and we will be until the Online Safety Act comes in. Media literacy has been part of our discussions, including the efficacy of interventions. The framework that we have at the minute with Ofcom and the Government, where they can look at best practice or at case studies on hard-to-reach audiences—whether that is neurodivergent communities or others—is a good one.

Zoe Darmé: We have adapted Ofcom’s best practices in media literacy. We were quite pleased to see that those included things such as being proactive, implementing user-centric design, and implementing monitoring and evaluation, which all seem broadly sensible things to do. I am pleased that Ofcom has a mandate to promote media literacy. As you can see, it is something that I am personally quite passionate about, and something that Google invests a lot in.

Laura Higgins: We were part of the working group that helped come up with the best practice guidelines, so we are very supportive and we are integrating those. We also signed the pledge that Ofcom put out on media literacy last year, which is helpful as a way of keeping us laser-focused on how we can keep improving and building in those milestones, to make sure that we are not just meeting the principles but perhaps going above and beyond.

Q155         Lord Mitchell: I have been knocked sideways by your statement about the 15% of searches every day being new, even allowing for misspelling and everything else. That probably means that every 10 days there is a complete refresh. I struggle with that number but I would be interested if you could come back to us on it, because it is a very interesting number, if it is correct. We can leave it at that.

Could we talk about other metrics, please? You have said that it is difficult, but, difficult or not, let us try it again. How do we measure the reach, impact and effectiveness of your interventions on user understanding and behaviour?

Zoe Darmé: I will try to be brief, even though there is a lot to unpack in that question. You have heard thematically from some of our answers that you can think of it through outputs and through outcomes. I have given you a couple of stats on reach outputs: for example, 80% of UK primary schools and 8.5 million UK children. That is how we would think of output-based measurement.

Outcome-based measurement is more like the Ecorys study, the RCTs or the Ipsos study, which I mentioned, where we are trying to measure whether these programmes are effective and whether they are increasing the understanding in young people’s minds of key online safety concepts and media literacy concepts. For that, it is important to have third-party partnerships with research organisations.

Ben Bradley: I would add that it is about thinking, from the outset, about what you are designing the intervention to do. What is it that you want to achieve? It is about thinking about those metrics and then assessing yourself against them.

We have an unverified-content sharing prompt. We have a fact-checking partner in the UK and, if something cannot be verified as true or false by that fact-checker, we do not recommend it. If you have searched for that information, found it and try to share it, we add an additional warning saying, “Are you sure you want to post that? It’s unverified content and the fact-checker can’t reach a conclusion”. After seeing that, about 25% of people do not share it. There is obviously room for improvement, but it is a good example.

We work with different experts and partners. That was designed in conjunction with a company called Irrational Labs in the US, which is a behavioural science organisation. It is about thinking in advance about what you are trying to achieve.

Laura Higgins: Internal and external research feedback loops are very important. Later this year we will be working with our creator community and an organisation called the Prosocial Design Network to A/B test features within the platform that will, I hope, include media literacy and behaviour change, using those results to inform change both directly in the platform and externally.

Q156         Lord Mitchell: Is the information about your interventions or initiatives made publicly available or shared with third parties?

Ben Bradley: I can answer briefly. Where possible, we publish information from our work with different partners. I have mentioned a few interventions and partnerships with Internet Matters. That research is public, but we also have a research API that allows researchers to conduct their own studies on TikTok and see the results that they are seeking. Between the US and Europe, we have around 750 active research projects using that API at the minute.

Zoe Darmé: I hate to point to our big report again, but we are committed to publishing findings, and we are happy to send that along as well.

We make public other types of evaluations. There is a wealth of resources that we could provide to this committee if you are interested in follow-up. It is very important to support the research community. That is why, in part, my team is the team working on researcher access to data. As part of that researcher access to data API, we say that, as long as you are an academic or a researcher with a need, you can have access to that, subject to certain qualifications. What we expect in return is that you make your findings publicly available, as part of the growing body of evidence on what works and what does not.

Laura Higgins: Wherever possible we work with researchers. We have published white papers about some of the initiatives that we have led, and we work with Ofcom, of course, with the Inspired Internet Pledge that we have there. That gives regular updates on this work, and there is more to come.

Q157         Lord Dunlop: We have received written evidence from DSIT setting out what it regards as the Government’s responsibilities on media literacy, which it describes as providing a clear vision, promoting cross-government collaboration and collaboration with stakeholders. Given your experience of working with Government, how are they discharging these responsibilities? Are there any gaps in the policy and regulatory framework that you feel need addressing?

Laura Higgins: I would defer to my colleague Tim, who is our head of policy and probably would be the best person to answer that. I think that we have a good relationship with them. We meet regularly on different topics. This one is a relationship more with Ofcom, and it is interesting to see how the dynamic works between the Government and Ofcom. I have not seen any gaps that I can speak of.

Ben Bradley: The framework that we have is a strong one, with the Government and the regulator acting as that convening body and looking into specific issues that can help inform the work that everyone else is doing. Looking at whether gamification is an effective media literacy intervention can help—we can use that as a case study for the applicability of our own services as well.

Co-ordination is important here. With the new media unit in government, you can see the importance of co-ordination. That should be reflected across media literacy as well, while making sure that it is everyone’s business—to borrow Ofcom’s phrase. The DCMS Committee published a report last year, just before the election, looking at the importance of early adoption in government communications and making sure that you are in the spaces that people are engaging with in order for that work to be effective.

Q158         Lord Dunlop: Just before we move to Zoe, Ben, you are on the Media Literacy Taskforce steering board, which was disbanded. Why was that? Has it been replaced by other forums for the Government to engage with stakeholders?

Ben Bradley: I believe DSIT has said that that work is being led now by the Digital Inclusion Task Force, and media literacy is included within that. There was some good work produced as part of that task force. Looking at case studies on gamification and hard-to-reach audiences can help inform the work that we are doing.

To take a step back here: it is important that the very largest platforms are talking about what they are doing, but what the task force has done, and what the Ofcom best practice principles allow, is this. TikTok started on its online safety journey in 2018, when we launched, and learned a lot from the platforms that came before us. If you are starting on your media literacy journey today, launching a new intervention in the UK in 2025 for a new platform, there is a wealth of evidence that you can now use to inform your work. You are not starting from a blank slate, as many would have been before.

Q159         Lord Dunlop: Zoe, perhaps you could add something and address the issue of who within Government is driving and leading work on media literacy, to make sure that it has the priority that it requires?

Zoe Darmé: I will humbly admit that I am not the most expert on the different departments in the UK Government that have a mandate here. I know that my colleague Rosie and my other UK-based colleagues work closely with DSIT on its online media literacy strategy, and with the DfE on guidance that it sets out to promote media literacy in schools.

Your original question, Lord Dunlop, was about gaps that we might see. If you will excuse me, I will give one American example. One thing that I have noticed in a neighbouring state to mine—I live in New York—is that New Jersey has looked at possible legislation to mandate media literacy in the school curriculum. I think that that is one way to maintain sustained policy interest over the years in providing this as a baseline to all citizens, just as you would have a civics or geography class. That would be one area that Government could look at. I do not want to prescribe something for our UK colleagues.

Q160         Lord Dunlop: On the issue of things that are mandated, you all have obligations under the Online Safety Act to mitigate online harms. Where does media literacy sit in relation to compliance with these obligations?

Ben Bradley: Media literacy is important, regardless of the regulatory framework that we are operating in. It was important before, when we were regulated under the VSP regime, and it will be under the Online Safety Act.

Where it would fit most closely in the Online Safety Act is probably the user empowerment duties. We have not seen the draft code from Ofcom yet, but we have seen, in the illegal harms statement and the child safety statement, measures included that you might describe as media literacy—making sure that when you block a user they are presented with particular information. While it might not have a dedicated chapter currently, as it did in the initial Green Paper, it has clearly informed part of Ofcom’s thinking.

Laura Higgins: We have already been working towards this, having set up the civility initiative and all of our education programmes in early 2019. Having come from the other side, where I was very much involved in helping to create the Online Safety Act, we knew in the early days what was likely to be coming down the line, and that was in mind when we were thinking about our programme. As the Online Safety Act comes into force, I think we are already slightly ahead of the game.

Zoe Darmé: I find the Online Safety Act’s focus on our safety duties and on Ofcom’s mandate for media literacy to be very complementary. If you can think about the safety duties as protecting users, you can think about the media literacy elements as empowering users, and these things work together.

For us, media literacy is not just about regulatory compliance. You have heard me reference a bunch of programmes and initiatives that we have run for many, many years. One of the unique things about Google is that you can have an idea and it will give you a lot of autonomy to run with it. The origin story of Super Searchers was not regulatory need. A few of us at Google were regularly being asked by our local library—we are big fans of libraries, I hope you can tell—to come and give information literacy training, and we would do this because that is the Googly thing to do. We thought to ourselves, “Why don’t we scale it, make it evidence-based and create something that is not just for the localities that happen to have a willing Googler, but a proper programme?” When I saw that Ofcom had a mandate to promote media literacy, it felt like validation of a lot of what we were investing in already.

Q161         Lord Dunlop: A final quick question from me. Is media literacy something that would ever feature on a company board agenda?

Zoe Darmé: I think they are quite interested lately in CSR programmes. I have to admit that I am in the trust strategy part of a product team, so I am a few levels down from talking to our board regularly—let me just put it that way.

Lord Dunlop: You are never asked to present to the board.

Zoe Darmé: We have had more questions lately from the board and from our internal processes that support boards of directors about online safety, generally speaking. I think that they are seeing this as part and parcel of how our brand is perceived and how trusted we are, because that is all part of users wanting to come back and use our products. There is an interest there. I will not say that I have gotten a specific question about media literacy, but we have definitely helped provide responses to questions that the board might have about trust and safety practices.

Ben Bradley: Trust and safety is important to our leadership team. It is part of our CEO’s OKRs—objectives and key results. Trust and safety sits with those, so I think there is an interest, in a holistic sense, in how media literacy can complement the trust and safety work that we do.

Laura Higgins: I would like to see it happening a lot more, because that is my whole job.

We regularly give updates every quarter to our senior management team. I feel that our founder would be quite well-equipped to speak on this if that did come up, so that is something. Certainly, a lot of the people who are on the board at Roblox are really keen on this because of the nature of the platform and the educational piece around it as well. I would not be surprised, but more of it, please.

The Chair: We are running into our last few minutes—in fact, we are running slightly over time. Are you all okay to stay for a few minutes?

Q162         Lord Storey: Your companies are worldwide, international companies, making billions of pounds. They are good, successful companies. Those pesky Governments around the world require different things from you. Some are more hands-off and some are more hands-on. You have to respond, or try to respond, to what those Governments want.

Perhaps you could speak not as representatives of the particular standpoint of your company but from your personal observations. As an aside, I differentiate online safety from media literacy, although I get the point that online safety has to be part of media literacy. From your own perspective, is there a particular country you can point to where they have cracked the whole business of online safety and media literacy? Maybe you can point to two separate countries doing it in different ways, so that we could look at what they do.

Talking about safety, I went into a driverless Google car in San Francisco, and you have safety absolutely spot-on there.

The Chair: Could I add to that? There is the point being made about a state mandating media literacy, but a number of contributors to the inquiry have said that it should be for platforms—that our Government should mandate you, the platforms, to fund and work with these initiatives. Although you have given us lots of examples, you do not have to do it. You could stop doing it tomorrow, because there is no legal requirement. If you could answer both those points, that would be helpful.

Laura Higgins: One country that always amazes me with how advanced it is compared to the rest of the world is South Korea. I mentioned that we are doing a huge amount of work there. Some of it is with corporate companies that are sponsoring efforts. We have been working with teacher training colleges; a whole programme specifically about Roblox and online safety has now been adopted and is being used as part of teacher training. The Government fully support educators and parents. That has been really interesting, and I would be more than happy to share some more information on that outside of this meeting.

Lord Storey: Yes, please.

Laura Higgins: I agree with you. Obviously, we aim to be compliant in all the countries that we serve, and we work very closely with all of the Governments, but it is always more about the regulatory piece. We take our responsibility very seriously when it comes to some of these things and do not wait to be told. We want to have these things in place already, whether it be the trust and safety elements, or the education piece.

The Chair: Should it be mandated, then, by Governments?

Laura Higgins: I feel that it should. From our standpoint, it would not necessarily force us to do more, because that is already the way that we operate. I would like to see everyone taking more responsibility for these things.

Ben Bradley: On the worldwide question, it is worth reflecting on the strength of the UK’s position. If you look at the work that Ofcom has produced—the three-year strategy and the best practice principles—very few countries worldwide have done that level of work, so I would champion the UK’s approach.

On mandating, our focus at TikTok is on what the particular risks or challenges are for us, how we can best address those and how we can have the biggest impact possible. It is not something that we would pause because it is not mandatory. I outlined the importance from a business perspective as well, in making sure that users feel able and safe to express themselves and have the tools to do so. This is not just about the trust and safety side; there is a business imperative as well.

On the point around impact, if you were looking at this purely through a third-party provision lens, some of the steps that we have taken might be hard to produce otherwise. With the screen-time limit tools that we have, when we prompted users to use them we saw a 234% increase in their use. That would be very hard to replicate through a third-party measure. For us, it is about looking at the unique challenges that we might have and how we can address those specifically.

Zoe Darmé: I agree with Ben. It has been clear from some of my other answers that Ofcom’s best practices and DSIT’s media literacy strategy show how much attention the UK has paid to media literacy and information literacy. I think it is a good model. We honestly get quite a bit of interest from the UK on this topic—perhaps more than from other countries. To answer the question, I would point to the UK and the interest that we have received in developing things like a three-year strategy and best practices.

On whether I think that it should be mandated, I echo the comments that we would be doing this whether it is mandated or not. If you think about the nature of Google Search, we are a product that people use for their information literacy journeys. It is often the thing that you use to get more information about something that you just heard about. Quantum mechanics is something that I am not particularly literate in, so I will turn to Search to try to learn more. For us, it is at the core of our product to have strong consumers of the information ecosystem. I am very supportive of information literacy and media literacy.

The Chair: Thank you. We have run over a little, so I would like to thank you very much for staying with us and answering all our questions. Thank you for your contributions today. They have been really helpful.