Digital, Culture, Media and Sport Sub-committee on Online Harms and Disinformation
Oral evidence: Online harms and the ethics of data, HC 646
Tuesday 22 September 2020
Ordered by the House of Commons to be published on 22 September 2020.
Members present: Julian Knight (Chair); Kevin Brennan; Steve Brine; Philip Davies; Alex Davies-Jones; Clive Efford; Julie Elliott; Damian Green; Damian Hinds; John Nicolson; Giles Watling.
Questions 1 - 175
Witnesses
I: Theo Bertram, Director, Government Relations and Public Policy EMEA, TikTok
II: Yuan Yang, Beijing Deputy Bureau Chief and Technology Correspondent, Financial Times, and Rui Ma, Creator and Co-Host, Tech Buzz China.
Witness: Theo Bertram.
Q1 Chair: Good morning and welcome to the Digital, Culture, Media and Sport Committee. This is our first hearing into online harms and the ethics of data. Today we are going to be joined by Theo Bertram, the director of public policy at TikTok. For our second panel, we will be joined by Rui Ma, creator and co-host of Tech Buzz China, and Yuan Yang, the Beijing deputy bureau chief and former China tech correspondent for the Financial Times.
Before I begin, I will ask any Members if they have any interests to declare. No? Okay. Thank you.
I will start. Our first witness is Theo Bertram, director of public policy at TikTok. Good morning, Theo.
Theo Bertram: Good morning. Thank you very much for inviting me.
Q2 Chair: Thank you for joining us today. You will probably have noticed that prior to the summer recess this Committee issued a report on Covid-19 and disinformation. How were you able to deal so quickly with users posting disinformation?
Theo Bertram: I can give you a lot of information. I will give you something short and then if there is any area that you want me to dig in on, I will be happy to do so.
Let me start with some numbers. We saw that 700,000 videos created in the UK on TikTok during this period so far were related to Covid. Of those 700,000, we removed around 13,500, and of those 13,500, we removed around 1,500 for medical misinformation—Covid misinformation, if you like. Of that 1,500, 40 were videos related to 5G, which I know is something you picked up on in your report.
If you step back and look at those numbers, by and large we saw a community that was acting very responsibly. We saw people coming to our platform broadly not looking for news about the pandemic but probably to escape it. People came to celebrate and dance, as I am sure you saw, and escape. I am happy to describe in more detail how our processes work, how we detect and so on.
Q3 Chair: Theo, I am interested to know how you came out with this. Did you have staff going through TikTok, or was it with algorithms? How did you identify what was potentially harmful disinformation?
Theo Bertram: There are two key pillars in anything to do with harmful content. One is policy and the other is enforcement, and that is true with this as well. We started to change our policies two weeks before the UK went into lockdown. We anticipated that there would likely be issues around Covid misinformation and we significantly increased the bar for anyone posting medical claims about Covid. Rather than it being something that we needed to disprove, if you were going to post a medical claim, you needed to have verifiable proof that we could objectively see that the claim was true, so we made that change.
On the algorithm and enforcement side of things, we started identifying the key words associated with Covid, and whenever anyone uploads a video we immediately scan it to see whether it contains any of those signals that we can take to think that this is a Covid-related video. Then, as we started to remove videos that we identified as being harmful, we could supplement that original list with, “These are the signals that we think are particularly harmful,” so they could be cued up for moderators to review. That is the process by which we identified the problem and removed it.
Q4 Chair: The algorithms identified the potential problem and then the moderators oversaw the videos, so it was a pair of eyes, effectively. I imagine that takes a lot of staff time.
Theo Bertram: It does. One of the things that I will probably talk about a bit is how I think we are a second generation tech company. What I mean by that is that most companies, I think, had 10 or 15 years to build the trust and safety apparatus that we now expect from a modern social media company. We are two years old but we already have more than 10,000 moderators globally, including thousands in Europe. If you think of those numbers, it took the competitors you have been speaking to in previous hearings 10 or 15 years and a lot of pressure from your Committee before they reached those numbers.
Q5 Chair: How much freedom do local TikTok teams have—for example, TikTok US or TikTok UK? How much influence do you have over policies and community standards about content and behaviour, and also on what data is collected?
Theo Bertram: There are a few points there. Let me start with the question of autonomy at a regional level. One of the reasons I joined this company, and one of the reasons why I think many of my colleagues joined, is that we do have a senior European management team and we have a lot more autonomy than I think is perhaps the norm for companies like ours. There was previously an idea that if you were a tech company, everything was global. For us, it is a much more regional approach. Our Trust and Safety team is led from Dublin, and they are the ones who work on the policies and enforcement, making sure that you get the nuance right. They bring in the NGOs to help advise them and we do that at a regional level.
Q6 Chair: What about data?
Theo Bertram: We do have a data privacy policy but clearly Europe has its own privacy regulations. We apply GDPR. That will mean that is the approach for UK citizens, but essentially our protections globally are the same and they are very high.
Q7 Chair: Okay. Effectively, your data protection is mirroring GDPR; you do not go beyond that?
Theo Bertram: We go significantly beyond GDPR in terms of data security and data transparency. I think we go beyond any other company that you have spoken to in terms of accountability. Data access is obviously controlled and strictly regulated within our company. The others will also have the same kinds of processes, but what we do that I do not think anyone else does is we say that third parties can come in and verify that. We would have a partner that can come in and verify that our data security, our data practices, are what we say they are. We have established a transparency centre in the US. I would invite you to it, but I don’t think you can travel. We are trying to set up a virtual transparency centre and perhaps you would be able to come and see that. That would enable you to not only see how our content moderation works but also see the code itself. In terms of the accountability and transparency of our data practices—
Q8 Chair: Yes, but what about true independent academic research into your organisation? For example, you talk about partners having oversight, partner companies, and you talked about the transparency centre; that sounds very in-house.
Theo Bertram: Yes, and we have said that we would be happy for people to come and do that. There is already quite a long queue, but what we have in mind is Committees like yours, regulators, and academics—we are very happy to be judged on the facts rather than the politics, and wherever we can have a discussion on the facts we are on much safer ground.
Q9 Chair: To be absolutely clear here, unlike the likes of Facebook, for instance, who like to pick and choose, effectively, who their academic partners are, you are open to academic institutions having free rein to look at your operations?
Theo Bertram: As much as our capacity can allow, the transparency centre is open to everyone. We have made it open. I cannot say right now that every university academic in the country can come into TikTok tomorrow, but in principle, yes.
Q10 Chair: The Information Commissioner told our predecessor Committee that it is looking into your company in terms of content and the type of videos being shared by children. Have you made any changes to your app in light of this investigation and what do you think is the context of this investigation?
Theo Bertram: Some of those questions of the ICO in terms of what is their context—we are very happy to talk about all the changes that we have made. We are having a good conversation with the ICO and I think that is a good thing, and a normal thing. We are a regulated company, the ICO is the regulator and, of course, we are in conversation.
In terms of child safety, even in the time that I have been at TikTok—and I joined last year—the changes that have been made have been quite transformational. There is family pairing, which you will see in similar types of products with the other companies, but one of the things that I would call out that we have done is around direct messaging. We know that there is a risk, that that is a vector of attack, and that if you have private messaging it is a risk, so we have ensured that there is no direct messaging for anyone under 16. More than that, we have also said there will be no direct messaging for anyone in an unsolicited way and, one step beyond that, we have also said, “You will not be able to share images or documents within direct messaging.” Those are all steps to help protect our users and especially our younger ones.
Q11 Chair: Was that effectively brought about because of the investigation by the Information Commissioner?
Theo Bertram: I know direct messaging was something that the Information Commissioner raised at the end of 2019.
Q12 Chair: How do you see the UK online harms legislation impacting on your business? What changes do you think will happen?
Theo Bertram: Obviously, we will comply with the Bill. If I take a step back, with my European hat on, we are already seeing, in Germany, NetzDG, which I am sure you are familiar with, which is the German version of this law. We have the Digital Services Act in Europe, which will also do this at a European level. We are happy to implement across all those areas. Regulation is inevitable and it strengthens this area.
The one place I would point out to the Committee is France, where the Avia law, which was going to be the French version of this, was struck down by the courts. The reason it was struck down was concern over how to get a balance between protection of users and freedom of speech. That is the big question that we still need to work through, and that will come through the detail.
Q13 Damian Hinds: Following on from the discussion with the Chair about the international devolution of TikTok, can you tell us a bit about TikTok Global? What is its geographical split by users and what are the implications for your sites in future?
Theo Bertram: The deal in the US is one of the most interesting political, commercial tech stories at the moment, but unfortunately I am not going to be able to give you more insight on that than there is in our statement. I cannot comment on that because of the ongoing commercial situation. I can talk to you about user data and where it is stored, if that is where you want me to go.
Q14 Damian Hinds: I am not asking you to go beyond what is in your statement; maybe just help us to understand what is in your statement. TikTok Global, what does it mean? Does that mean the world outside China? Does it mean the Americas? What does it mean?
Theo Bertram: To explain where we are now, TikTok does not operate in China. TikTok only operates outside China. TikTok data, globally, is stored in the US and Singapore. For TikTok users in the UK, the app is provided by TikTok Information Technologies UK. The data is also taken care of by the UK. The management team for the UK is based here in London, and it is the European management team. TikTok Information Technologies UK, the company I work for, the company that provides the UK services, is ultimately owned by ByteDance Limited, which is a company outside China. ByteDance Limited has a number of investors. They are the big American investors. They are all listed on our website, bytedance.com, on the front page—they are the same investors that are part of the ongoing discussion—and the board of that company is five directors: Yiming Zhang, our CEO and founder; and the four heads of the big investors.
Q15 Damian Hinds: To be clear, you are saying that despite the name, TikTok Global will not be global; it will be the Americas, or North America, or the continental United States—or what are you saying?
Theo Bertram: TikTok Global is the global business. It is TikTok. But as I said, I am not going to provide a running commentary on the commercial deal in the US. I hope you can understand why that is difficult.
Q16 Damian Hinds: With respect, if it is global, then it is not just a commercial deal in the US; it is a deal that affects the United Kingdom, Europe and elsewhere in the world as well, presumably.
Theo Bertram: I am not going to comment because it is an ongoing, commercial deal, it is not because it is specific to the US, but I think you can understand why I cannot discuss an ongoing commercial deal.
Q17 Damian Hinds: I can understand why you cannot discuss the details of your negotiations and price setting, but I think there is legitimate public interest, not only in the United States but around the world, about what is being proposed here and the implications for all sorts of things such as the ownership of intellectual property, the location of data, the payment of taxes, and what happens with our children and their activities online. These are all legitimate items of public interest, which should not be subject to the confidentiality of price setting.
Theo Bertram: Absolutely, and sorry, I do not want to be misunderstood. I am happy to answer each of those and I can go through them in detail. It is just on the specific deal that I cannot answer. Each of your points, I can go through. Data security—
Q18 Damian Hinds: Can we start with who will own the intellectual property?
Theo Bertram: The intellectual property—TikTok is owned by ByteDance Limited.
Q19 Damian Hinds: ByteDance Limited is a wholly owned subsidiary of who?
Theo Bertram: ByteDance Limited is the parent company.
Damian Hinds: Okay.
Theo Bertram: Just to explain the structure—
Q20 Damian Hinds: What might be helpful would be if you could write to us with a diagrammatic explanation of the structure, because otherwise I think it might take up a lot of our time today.
Theo Bertram: It is on the front page of bytedance.com; you can see it there.
Q21 Damian Hinds: Okay, but just to be clear—I know I have used that phrase a number of times already—what most people would call the Chinese-owned parent company is TikTok Limited, they will own the intellectual property—
Theo Bertram: That is not quite right. Can I just explain? Let me just very briefly explain. If you give me a moment, I think I can help you to understand.
Damian Hinds: Okay.
Theo Bertram: There is ByteDance China—this hand. Then there is TikTok and all the TikTok entities. Those report to ByteDance Limited, which is outside China. ByteDance Limited is made up of the board of directors of the four investors and Yiming Zhang, the CEO of the company. So it is not the case that TikTok reports into China. It is the case that the international business and the Chinese business both report into ByteDance Limited, which is outside of China. I know it is the economists’ perception that TikTok somehow reports into China, but that is not quite the case.
Q22 Damian Hinds: The intellectual property is owned by a common parent company, which provides the technology, the algorithms and the systems and processes both to a Chinese subsidiary and to an ex-China subsidiary. Is that correct?
Theo Bertram: It is not quite accurate. If the question you are getting at is around whether China is getting access to data, I can help answer that, if that is your concern.
Q23 Damian Hinds: That was not what my question was about. We are talking about the creation of a new entity called TikTok Global. What I am trying to understand is how separate it is from the existing entity. You may say that people’s concerns are not legitimate, or they are nothing to worry about, but these concerns exist about the involvement with the Chinese Government and who ultimately sees and controls data. Although there are many complicated convolutions we could go through around corporate structures, the basic point is that the IP is not being separated. Is that correct?
Theo Bertram: I completely understand your concern and I think it is entirely legitimate to ask about this. The reason we have set up the company in this way, this international business separated from China, owned by a parent company outside China, is in order to address these kinds of concerns.
Beyond that, we have also worked with national security assessments in Australia and Germany, both of which have cleared us. We have also made the commitments around transparency, in that we have allowed people to come in and inspect the code—the algorithm—so that they can see there is nothing untoward happening. We also have strict data controls in place. What we have pledged globally in our business outside China is that instead of it just being me saying, “Trust us, there is secure access, there is no access to individual user data from China,” what we have also said is that we will allow a third party company, or indeed the regulator or national security investigators, to come in and inspect that code, or inspect our processes. The reason why we have done these things is because we do understand these concerns and we want to address them.
Q24 Damian Hinds: Who owns this new entity?
Theo Bertram: With the one in the US, you will have seen the statement of what is proposed, but I do not want to go into that at the moment.
Q25 Damian Hinds: The claim is that it is 80% owned by ByteDance, 10% by Oracle, and 10% by Walmart.
Theo Bertram: It is 12.5% and 7.5%, but broadly, yes.
Q26 Damian Hinds: It sounds like there have been some discussions at least—maybe not a final agreement but some discussions—with the US authorities about the payment of tax and this figure of £5 billion has been floating around, which may just be an estimate of how much tax the company would be liable for in the US anyway, over a number of years. There has also been talk about this patriotic education foundation. What are the implications of that for tax that you pay in other jurisdictions?
Theo Bertram: Let me talk about the tax structure in the UK, because I think it is again something we do slightly differently to the other companies you have spoken to.
UK tax will be paid fully in the UK. We are permanently established in the UK and that is our starting place. That is where we will remain, regardless of what happens. Whatever changes there are, the UK business will pay full corporation tax on the basis of being permanently established in this country.
Q27 Damian Hinds: That is based on the allocation of IP rights and royalties in proportion to users, presumably—as opposed to saying the IP is located in some other jurisdiction and the UK subsidiary has to pay them to use the brand, use the algorithms, use the technology, and so therefore the economic activity is in that other jurisdiction.
Theo Bertram: I think permanent establishment means that we have to pay corporation tax at the full rate.
Q28 Damian Hinds: It certainly will mean that, but the question is corporation tax on what amount. That is what these policy questions turn on. Is it the way the system works that the UK subsidiary is paying a fee to some other part of the corporate structure, and that is, therefore, where you count that the economic activity is happening and that reduces your tax liability in the UK?
Theo Bertram: Currently not, as I am aware, but I cannot speculate on what the future model is. What I can promise is we will pay full corporation tax, that we are permanently established here, and I think that is a step ahead of the other companies you have had in front of you.
Q29 Damian Hinds: Can I ask you briefly about something else? This month, the age appropriate design code comes into force. I say it comes into force; there is a 12-month implementation period, but it comes in this month and from September 2021 will be mandatory. How will that change your business operations?
Theo Bertram: It is already changing our business operations. I think it is one of the most forward looking, interesting pieces of legislation. It is very interesting to see child safety being driven by a data protection authority in those terms.
You do have to put thinking about younger users at the centre of things. One of the things that we have started to do with partners is we have been running what we call family focus groups. You would be welcome to attend one of these. We have run nine of them over the last year. What we try to do is better understand what those parents and teenagers want, and what they expect. One of the things that we have learned is that people are not comfortable with the idea of us taking lots of data off them to prove their age; they do want privacy and safety settings in video form rather than—
Q30 Damian Hinds: Did you say they are not comfortable with you taking data to prove their age? Of course, they are not. What young person has ever been comfortable with the idea of being asked to prove they are 16, 18, or what have you?
Theo Bertram: Yes, but it was the parents as well.
Damian Hinds: Did you issue a press release about that? Sorry, go on.
Theo Bertram: It was the parents who were saying they do not want every app on their devices getting some passport, whatever proof of age you might have for a 13-year-old. They were much more happy with the idea of it being Government or even schools that acted as the verifier of a child’s age rather than it being data given to this plethora of apps. The parents were much happier thinking, “Is there a way that we can ensure that if we have to verify a child’s age, it is only one or two times that we do that, which can either be a Government thing or the other way would be for it to be at the app store level?” There are a lot of interesting things coming through from that. We are listening, we are learning, we are adjusting, and we already have an internal team that is working on that, headed up by our head of child safety.
Q31 Damian Hinds: Finally from me, principle 1 in the new code says, “The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.” What do you think is in the best interests of the child in terms of the amount of time that it is helpful for a child or young person to spend online each day?
Theo Bertram: I am going to say, both as a parent and in this role, that the question is what they are doing online rather than just the time spent. One of the things that differentiates TikTok from the other apps is that TikTok is somewhere people come to create. A much higher ratio of our users are creating things. When I tweeted that I was coming to this hearing, someone, one of the journalists, tweeted, “Are you going to teach everyone to dance?”, which was a joke, obviously, but I think there is a sense that TikTok makes you do things rather than just passively sit in front of a screen.
That said, I do think it is important to have time limits, so with family pairing the parent can set a time limit. What we also do beyond that is that for every user, if you have been using a device for more than 60 minutes in the evening, or 80 minutes in the daytime, we have created videos that will pop up that say, “Maybe you should think about doing something else.” We are thinking about screen time. It is a concern, but I would not put screen time above quality time, and what you are doing on the app.
Q32 Steve Brine: Can you tell us how TikTok creates ads, please, and how you target ads?
Theo Bertram: We are still at the early stage of our ads business. I know this Committee will be much more familiar with the older companies, both as users and in the work you are doing. There you are used to the idea of an ad attached to a video, just before, coming during, or around it. At TikTok, you have a stream before your feed and ads will appear in there, so in that sense they are not necessarily attached to a particular video.
We also have different ways that brands can use our platforms. They can have a takeover, where there is an ad that appears at the top and fully covers the screen. We have done that three times with the NHS and Ministry of Health to promote Covid.
On targeting ads, briefly, targeted ads are by default off. You have to consent to switch them on and at any time you can toggle them on and off. I can dive into any of that in more detail, but let me know if that is enough.
Q33 Steve Brine: If they are not attached to particular types of content, assuming that they are toggled on, which users can do, how are they targeted? What do they attach to? A YouTube ad would play at the start of a video, and you can skip the ad in some cases, in others not. How are TikTok ads targeted? What does the algorithm do? What I am trying to ask you is, do you target ads at your under-18 users?
Theo Bertram: We would only target ads at anyone over the age of 13, where they have given us consent to do so, and the default setting is off so you do have to switch it on.
Q34 Steve Brine: Do you differentiate between the adverts that you push to under-18—over-13 users—and those who are over 18?
Theo Bertram: For all our users, it is default off and they have to consent. Then, obviously, we ban ads from products or services that are specifically intended for, or would appeal to, children—toys, games and that kind of thing.
Q35 Steve Brine: Tell me about the user experience with regard to age recognition. How do you verify their age? What process do I have to go through to say that I am over 13?
Theo Bertram: When a new user comes to the app, they are prompted to give us a date of birth and that date of birth has to be over 13. If it is not, they cannot get access and the device is blocked for 24 hours. That is higher than the industry standard. You will see other apps that say, “Are you over 13, yes or no?”, which is a much more leading question, and then they will let you try again. So, we are higher on that side but that is not enough. We know that having someone on the entry gate is not enough. We continually monitor to see whether we have users who are under 13.
Q36 Steve Brine: How do you know? If I say yes to that, then I am. Who would say no?
Theo Bertram: I just said that that is not the process; you have to put in your date of birth.
Q37 Steve Brine: Yes, but I am familiar with the concept—as, I fear, will my children be one day—of altering one’s date of birth to give a different age that suits what I want to do.
Theo Bertram: Let me just finish the second bit of what I was saying. It is not just about the entry gate. Once you are on the app, we continue to monitor. We do that in two ways. First, any user can flag another user as being under age. The second thing is that every time we moderate a video—and for the UK I think we have removed just under 3 million videos—with a human reviewer, that reviewer, whatever else they are reviewing that video for, is also looking to see whether that account belongs to someone under the age of 13. If it does, we remove them. We continue to remove people from our platform, even after the age check.
Q38 Steve Brine: What would you like us to do? If you were writing the online harms legislation, what would you like us to do to verify the age of children using this platform?
Theo Bertram: This is a hard question.
Steve Brine: That is why I have asked it.
Theo Bertram: I have two solutions. One would be, can you do it through the Government? The reason I am saying that is not because I am saying we are not going to do it, but I do not think parents want to give every single app a proof of age. I also think it is bad for competition if there is a high threshold to entry. I think you will just end up with parents and kids sticking with the apps they already know. I don’t think that is a good, long-term trend for parents and teenagers.
How do we get age verification and get it right? One way is a Government ID of a 13-year-old. There are risks inherent to that in terms of civil liberties and because the Government’s record on IT tech is not perfect. An alternative would be that there are existing bottlenecks in the system and there are only two app stores, and if you said to the two app stores, “This is the one place where we are going to ask parents to give proof of age,” then you are not increasing the risk of data loss by asking parents to give out data to every app, but you can ensure that every app needs to verify the age with the app store. It seems to me that that would be a solution to a tougher age verification process.
Q39 Steve Brine: Okay. Just a couple of things more generally. Does the TikTok For Business webpage urge marketers, people who want to spend money with you, to, “Turn culture into a cult-like following”?
Theo Bertram: I am guessing it does, if that is what you are reading to me.
Q40 Steve Brine: Yes. Do you think that is appropriate for a 13-year-old, let alone someone of my age? Cults are not good, are they?
Theo Bertram: No, I can see your concern with the wording there.
Q41 Steve Brine: It is your wording. Do you think it is appropriate wording?
Theo Bertram: I would not write it like that.
Q42 Steve Brine: Well, you did, because you are the public relations guy for TikTok. Although you might not have physically written it, you own it all. I own every policy of my Government; I didn’t write them all.
Theo Bertram: Yes, I know that feeling. What I am saying is that I am listening, and I will take that back.
Q43 Steve Brine: Please do. Finally, I appreciate that it is a big business and the owner of TikTok is a very wealthy person, but if TikTok disappeared tomorrow, do you think we would have a more or less healthy society and healthy minds among young children?
Theo Bertram: I think less, and the reason why I say that is TikTok users exist on the planet. They are your constituents, there are very many of them, and they come to the app and not to the others because this is their place to express themselves. They love it and it is something that makes them happy. I think if you want to see what contribution we make, just look at the way we helped people during the pandemic. People came to us looking for fun and relief. I know you are serious people—you care about news, you care about politics, and music and humour are maybe not top of your agendas—but I think everyone needs a bit of fun every now and then.
Q44 Steve Brine: I feel insulted on behalf of my profession. I am just saying, from just flicking through a TikTok feed this morning before the session, it is what my mother would have called trashy. There is lots of content where there is a mother and daughter pushing their tushies, if you know what I mean. In a clever way, there is lots of sexualised content. There are people jumping off piers into the water. Are we not better than that, as a society? There are lots of ways to have fun but is that really it? Is that really what it has come to? Is that a healthy society, Theo?
Theo Bertram: I think there are two things there. One thing I would say is that the more you use the app, the more that “For You” feed will anticipate what it is you like, so you will see less of that and more, say, Andrew Lloyd Webber on the platform.
Q45 Steve Brine: Does he push his tushy?
Theo Bertram: He doesn’t, but he is quite entertaining.
Steve Brine: Right. Excellent.
Theo Bertram: Andrew Lloyd Webber pushing his tushy was not what I was anticipating discussing.
Steve Brine: No.
Theo Bertram: So you will find something that you want. Then people jumping off piers, people dancing—we have strict community guidelines to make sure that they do not go over those rules, so they are not harmful. We would remove dangerous challenges, sexualised videos. We do have strict policies in place to protect from those things, but I do think you can let people have fun in many different ways and our platform is where they can do that.
Q46 Chair: One point about algorithms and presenting something that users like: isn’t there a danger of a rabbit hole—that people will see the same thing time after time after time? How do you combat that? Or do you welcome it? Is that the addictiveness of the app?
Theo Bertram: No, no. Again, there are some disadvantages to coming in 10 years after the other big tech companies, but there are some advantages and one is in how we think about the filter bubble, because we know that it exists and we, therefore, know that it is something we need to be careful about. Our algorithm does not recommend to you the same thing again and again. What it is trying to do is identify the full diversity of your interests. The algorithm is always trying to say, “You liked this video, but what about this one over here? Would you like that? What about this one over here?” My experience of the “For You” feed, having spent a long time on the app, is I find I am interested in things that I didn’t know I was interested in. It stretches the diversity of your interests.
The other reason we do this is that it is in our commercial interest. If you think about it, in a half hour session on YouTube, you might watch five videos and the idea is I will watch this one, then a second one, then another one. In half an hour on TikTok, you are going to watch hundreds of videos. You would be bored out of your mind if they were repetitive, so it is in our interest to make it a diversity of different types of video. I think that guards quite well against the filter bubble, but it is certainly something that we continue to monitor and watch out for.
Q47 Chair: How do you do that? How do you have this diversity that you talk about? What is it about the algorithm that is different from, for example, YouTube, which you just mentioned?
Theo Bertram: In some ways it is quite simple. When you watch a video, what we are looking for are signals about, “Did you flick past it? How long did you spend on it? Did you watch it again?”, and gradually we can build up from that. Each time you like a video, who you are or any data about you doesn’t really matter; it is just your interaction with that video. A content graph, rather than a social graph, is the technical term. On other social media, maybe you get stuff retweeted into your timeline or shares and likes into your feed. For us, it is just about what content you liked and that is what shapes what is next. It is a bit more like the Netflix or Spotify recommendation engines. We are giving you the content that we think you like.
Q48 Chair: That sounds exactly like what I described at the start of this question. It sounds like you do end up with very similar content. You talked about diversity, but that just sounds like you do end up with the same content.
Theo Bertram: The diversity comes from the idea of clusters. We are constantly trying to grab different clusters to see what you like. The way I would describe it is, let’s say you have this thing in front you—you like carrots. Okay, we will always give you carrots. What we are trying to do is say, “You can’t have a whole meal based on carrots; we are going to try to give you what complements that.” When other people liked this flavour, what flavour went with it? Another way of looking at it is a kind of colour chart. Okay, red; what matches red? What doesn’t match red? It is choosing a difference. That is how the algorithm works. It is not just, “You like this one thing, so we will always give you this one thing.” It is, “You like this one thing. What are all the other things that complement that and can we find what the other clusters are where your interest lies?” Then we will push those videos to you as well.
Q49 Chair: Finally, does the user data that you collect dictate that algorithm, or at least part of that algorithm? Or is it basically that you span people who are quite similar, maybe, and say, “They like this. You may like this.”?
Theo Bertram: It is a combination. We are collecting data on what your behavioural patterns are: “Did you skip past this? Did you like this? Did you engage with this? Did you click on the profile? Did you click on that piece of music?” Music is a big driver of the content on the platform. We can take all that and that helps us understand what we think you like, and then we match that in quite simple ways, saying, “Other people who liked this also liked this video, so do you like this? No? Okay. Do you like this one? Okay, you do,” and then we can start to refine so that the algorithm is based on what you like.
Q50 Alex Davies-Jones: We have mentioned the dances and the fun that TikTok can bring, but earlier this month it was widely reported by news organisations and the trade press that TikTok struggled to remove a video that graphically depicted suicide. Why did TikTok have such difficulties in removing this particular video?
Theo Bertram: This is a hard one. I am going to give you some information that we have not shared publicly before, so forgive me if I refer to my notes as I talk about this.
First, I should say how saddened we are by this and obviously our condolences go to the family and friends of the victim involved.
I am going to talk through the timeline of what happened, the patterns that we saw in the uploading and viewing of the content, and what we are doing about it. Forgive me if I take a little bit of time to go through this. I think it is important. This is news that we have not shared anywhere else.
The timeline was that on 31 August, a man livestreamed his own death by suicide on Facebook Live. A small number of clips were uploaded to our platform in the immediate days after. Then on the evening of 6 September, we saw a huge spike in the volume of clips being uploaded. Let me talk about how that content was uploaded. There was evidence of a co-ordinated attack. Through our investigations, we learned that groups operating on the dark web made plans to raid social media platforms, including TikTok, in order to spread the video across the internet. What we saw was a group of users who were repeatedly attempting to upload the video to our platform, splicing it, editing it, cutting it in different ways, joining the platform in order to drive that. I don’t want to say too much publicly in this forum about how we detect and manage that, but our emergency machine learning services kicked in and they detected the videos and we quickly removed them.
Let me briefly say that normally what you would see on our platform is people view content in the “For You” feed or through hashtags. What we saw in this instance was people searching for content in a very specific way, frequently clicking on the profile of people as if they were anticipating that those people had uploaded the video. It was quite an unusual pattern of how it was viewed as well. Obviously, even one view of this type of content on our platform is one too many.
So how are we going to improve and stop this in the future? The first thing is a bunch of internal things we need to do around machine learning and emergency systems. I don’t want to say too much about that, but it is essentially about how we can capture them more quickly and recognise them for our algorithms to detect and remove those videos, and also how to support the content moderator teams that have to do that review.
Also, last night we wrote to the CEOs of Facebook, Insta, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit. What we are proposing is that in the same way these companies already work together around CSAM—child sexual abuse imagery—with organisations like the Internet Watch Foundation and NCMEC in the US, and the way we already work together around terrorist-related content, we should now establish a partnership around dealing with this type of content.
We know we have to do better, and our hearts go out to the victim in this case, but we do believe that we can do even better in future.
Q51 Alex Davies-Jones: Thank you for that. I appreciate your answer. You mentioned that you are now collaborating with the other social media networks to combat this for the future, but did you actually have any discussion with Facebook at the time when it started to become an issue?
Theo Bertram: Not as much as we will on the new footing. I think that is what would be better for everyone. We need to all work together much more. We have systems in place for taking hashed videos from NCMEC and IWF. We are all very good in the industry at doing that and that is what we need to put on the same footing for this type of content.
Q52 Alex Davies-Jones: You mentioned accounts that were splicing and sharing this content. You were banning them as they repeatedly tried to upload the clips. Surely you would agree that any account that tries to do it even once should merit a ban.
Theo Bertram: Yes, and every account that did it, we banned. I fully agree with you, yes.
Q53 Alex Davies-Jones: I would like to move on to body positivity. I also sit on the Women and Equalities Committee. We are currently inquiring into body image. The Guardian reported last year that TikTok was using an algorithm—and we all know about algorithms this year and how great they can be—to demote LGBTQ, disabled, body positive, plus-sized content based on local policies on the grounds of an anti-bullying initiative. What steps have you taken to correct this and to ensure that people of all shapes and sizes can access TikTok fairly?
Theo Bertram: You are right, and I am really sorry. We really got that wrong. This was in the early days of TikTok. It was well intentioned, but it was completely wrong. The idea was that to avoid bullying, we would not help those types of videos to go viral, but I am pleased to say that that is certainly a thing of the past. I do think TikTok is somewhere where body positivity and the LGBTQ community don’t just feel protected now but also celebrated and lifted up. I think we are a terrific platform for body positivity; #bodypositive is a 2 billion hashtag. I think our work around Pride and LGBT is also very positive.
Q54 Alex Davies-Jones: If I, as a UK user, now were to visit a country where these local rules were still in place and that content is demoted, would I still be able to view such content? How would the policy work in that instance?
Theo Bertram: To be clear, in TikTok globally, our policies on LGBT and on this type of content—we have taken all those policies away. If you travel to countries where unfortunately there are laws against LGBT, then in those places we would only remove content in the event that we were asked to do so by law enforcement. You could post and you could view. It would only be when we were legally required to do so by a request from law enforcement that we would take that down.
Q55 Alex Davies-Jones: Finally, I would like to talk about the “365 Days” film. I know this is a Netflix film, but it has inspired a disturbing TikTok trend where young users share memes about violent sex. I would like to raise concerns about #365days, which has amassed more than 2 billion views on TikTok to date, while #365dayschallenge has over 20 million views. Some of these videos show the users sharing footage of their own injuries, and some male users have been filmed putting their partners into fake chokeholds in posts mirroring one of the film’s most infamous scenes. According to your own guidelines, content that “depicts, commits or incites non-consensual sexual acts” or that “commits, promotes or glorifies sexual solicitation or sexual objectification” is prohibited. The film “365 Days” is deemed explicit enough to have warranted an 18 certificate. Have you seen the film?
Theo Bertram: I haven’t seen the film, but I am familiar with the issue on our platform and I will very briefly talk about what we did, because I think it is a good example of how we work with NGOs to make sure that we are getting things right.
When we first saw this content—already a big meme on TikTok is cosplay or make-up—what we saw in this instance was users creating videos where they were kind of jokey, where there clearly wasn’t evidence of actual violence but where they were riffing off this “365 Days” film. Initially, our view on the content moderation was this does not breach our guidelines; it is clearly not actual harm or actual violence. But we were contacted by a charity called EmilyTest, which is an organisation set up to protect students from gender-based violence, and they said to us, “This may be not technically breaching the rules but there is a risk here that it is normalising a kind of sexual violence, even if it is not clear.” We took that on board, and we changed the way that we responded. We removed those hashtags and we started to remove those videos.
Q56 Alex Davies-Jones: Thank you. I would not say it normalises sexual violence or that it is even jokey. I would say it glorifies domestic and sexual violence in some cases. Do you think—yes or no—that it is appropriate for a 13 year-old to be able to generate and view content of this nature?
Theo Bertram: No.
Q57 Alex Davies-Jones: You have mentioned now that you are taking down this hashtag, but do you think you, as TikTok, have a responsibility to protect users from sexual violence?
Theo Bertram: Yes, absolutely, I 100% agree with you.
Q58 Alex Davies-Jones: What recent conversations have you had with the Department for Digital, Culture, Media and Sport and the Home Office about the role of social media companies in regulating their content, especially in light of the Domestic Abuse Bill that is currently making its way through Parliament?
Theo Bertram: I don’t know about specific conversations that we have had with those organisations—I can check with my team—but I can tell you that we have been in touch with many organisations and charities about this particular issue. Our door is open. We are listening and we will keep updating our policies to make sure that we get them right.
Q59 Alex Davies-Jones: Thank you. I think the Committee would welcome any evidence you have or any news of the conversations you have had with other Departments.
Theo Bertram: We will provide that.
Alex Davies-Jones: Thank you. No further questions from me, Chair.
Q60 Chair: Thank you. Theo, would you also be able to share the letter that you wrote to the other social media platforms and referenced earlier?
Theo Bertram: Yes, I will do that straight after this meeting.
Q61 Julie Elliott: TikTok has been criticised for its policies that reportedly have censored issues politically sensitive to the CCP, including the repression of Uighurs, Taiwan, and Tibet. The Guardian and The Intercept have reported on leaked documents that show that people may be banned for criticising the military or China’s repression of the Uighurs and discussing incidents such as the Tiananmen Square incident, among other things. Do you accept that such policies are corrosive and unacceptable?
Theo Bertram: Yes, I totally agree with you. Those are not our policies.
Q62 Julie Elliott: Why do you think it has been reported that they are your policies if they are not your policies, as you have said?
Theo Bertram: I think there are deep concerns about Xinjiang, about what is happening there. I think that there are broader concerns around China and China’s role in the world, and I think that those concerns are projected on to TikTok, but I don’t think they are always fairly projected on to TikTok. TikTok is not the same thing as China. Those are not our policies. You can go on to the app and search for any of those terms and you will find content.
Q63 Julie Elliott: Going on to something completely different, during the pandemic you made changes to your platform policies to tackle health misinformation. What trends regarding health misinformation did you see on your platform and how did you ensure these policies were enforced robustly?
Theo Bertram: The pattern that we saw in March and April was a huge spike in the number of videos that were being created and, corresponding with that, an increase in the number of videos that were spreading misinformation, but it was very low. Our users generally acted responsibly. There were 700,000 videos created over the period and we only removed 1,500 of those 700,000 videos for medical misinformation.
After April, there was a drop and since then we have seen it really low. I can give you the figures for August. I will have to look at my notes. I think we are down to a very low number of videos now being created. Yes, 305 videos were removed this month that were Covid related—this was for August—and only 47 of those were removed for misinformation. So, we are down to quite low numbers now.
Q64 Julie Elliott: What does that contrast to at the peak of this, earlier in the year?
Theo Bertram: I think the peak was more like three-figure numbers. We saw it peak and it dropped away. Broadly, we have seen people coming to our platform not to seek news about Covid but to escape it, I think.
Q65 Julie Elliott: Do you work with other tech companies and health organisations during the pandemic to identify harmful misinformation being shared across platforms?
Theo Bertram: Yes, absolutely. There is a European code of conduct on tackling disinformation. We signed up to that. That requires us to report monthly. We work with other companies to do that. We also work with the WHO. We work with the Department of Health. On three occasions, we basically took over the whole screen of the app. Every time you opened the app, for the first five times you opened it, you either saw the NHS or Matt Hancock’s face, inescapably, and we did that on three occasions with a total of, I think, 46 million views. That was about making sure that we got the right information to our users.
Q66 Julie Elliott: Do you currently use any third party fact checkers to help you to identify misinformation? If so, which organisations do you use?
Theo Bertram: Yes, great question. Yes, we do. We use different fact checkers, depending on what the issue is, and try to cue it up to make sure. Lead Stories handles general misinformation. Science Feedback is a fact-checking organisation that we use to handle medical misinformation. I think they are based in Paris.
Q67 Julie Elliott: Finally, what engagement have you had with the Government regarding misinformation and online harms?
Theo Bertram: A lot of engagement. We were a platform that not many people knew at the start of this year. One of the things that happened in the last seven months or so is that we had to engage a lot more with Government and play our part in helping Government reach those users that perhaps they cannot reach via other platforms.
Q68 Julie Elliott: Do you think that that is working—that you are reaching people and making sure that what they are reading and what they are seeing on your platform is real and not made up?
Theo Bertram: I think those figures are pretty good, but I don’t want to blow our own trumpet because I think there is a constant challenge. Whenever you upload a video, we are looking to see whether it is Covid related, and we put a little sticker on it that says, “Find out more about Covid here.” That will take you to a page that we work on with the Department of Health and the WHO to provide information. We are trying to steer people towards trusted information. We also work with the WHO to put a lot of effort into getting trusted content on the platform. We made a donation of £5 million to the Royal College of Nursing to support the frontline workers themselves during the crisis.
Q69 John Nicolson: Can I refer to a couple of points that have been raised by some of my colleagues? The Guardian has been cited a number of times by various members of the Committee. The Guardian reported that TikTok content showing same sex couples, gay couples, holding hands and kissing had been removed. Can you explain why you did that?
Theo Bertram: I don’t know the specific story, but I can say that there was—
Q70 John Nicolson: I am sure you read The Guardian investigation into TikTok. It has been referenced several times in the course of today’s hearing. I am sure you read it carefully at the time and probably before you appeared before the Committee. It specifically spoke about gay couples holding hands and kissing and that content being removed. Why did TikTok do that?
Theo Bertram: Unfortunately, there was a policy—I am pleased to say it is not a policy anymore—that TikTok took in its early days. It viewed the idea that promoting content that might encourage bullying—they thought about people with disabilities, they talked about LGBTQ users, and they did not promote that. That was a terrible idea. It might have been well intentioned, but it was the wrong thing to do.
Q71 John Nicolson: Obviously, if you yourself put up content of you kissing or dancing with your boyfriend and you are a man, you are not worried about bullying, are you? You are proud of who you are. They are not furtively filmed and put up without the permission of the people concerned; they are a celebration of same-sex love.
Theo Bertram: Yes, I absolutely agree with you and that is thankfully no longer our policy.
Q72 John Nicolson: Do you still restrict the prevalence of LGBT hashtags in various countries; for example, Russia?
Theo Bertram: Not as far as I am aware. You can create any content on our platform. There are no restrictions. The only place where we would restrict—as I said to your colleague earlier—is where we received a legal request to do so.
Q73 John Nicolson: I notice you say as far as you are aware. You might not be aware of it, so it is possible that your company restricts the prevalence of LGBT hashtags in Russia?
Theo Bertram: No, I do not think we do, but I do know of some instances where we did and I can explain briefly why some of those happened recently, which was a technical problem. We were trying to ban some terms like “sex”, and we ended up banning terms like “sex change”. We had a rather rudimentary filter system and that over-blocked, but we have corrected that. Our intention is not just that we protect the LGBT community but that we celebrate them and lift them up. That is true every day on the platform. The only time we would remove that content is where there is a legal requirement for us to do so.
Q74 John Nicolson: How often does that happen?
Theo Bertram: It is published in our transparency report, which is just out today. I do not have the figures off the top of my head, but it will all be there, how many in each country.
Q75 John Nicolson: In Russia, for example, are we talking about scores of times, hundreds of times, thousands of times that you do what the homophobic regime tells you to do?
Theo Bertram: I do not have the numbers off the top of my head, but they will be on our website. I think they will be quite low. I do not want to guess at it, but it will be there. We have to apply local law when it is applied to us by any country.
Q76 John Nicolson: Regardless of how oppressive that law is?
Theo Bertram: When we get a legal request and it is a valid legal request, we will comply with that legal request, but our platform is—
Q77 John Nicolson: When you say request, do you mean request or requirement?
Theo Bertram: Request. It would need to be if we were contacted by law enforcement to say, “Under this law you must remove this.” It is not that we will voluntarily police it ourselves; we would wait for the police to instruct us.
Q78 John Nicolson: That is a requirement, though, not a request.
Theo Bertram: We are not removing content on the basis that—we are not voluntarily policing it ourselves. It would only be where law enforcement came to us and said, “This specific video you must remove.”
Q79 John Nicolson: Obviously, a requirement is, “The law says this,” and you must comply with a parliamentary ruling. A request is, “We are homophobic, we don’t like this stuff, take it down, otherwise we’ll make life difficult for you.” That is obviously the difference between the two words. Perhaps you could write to the Committee and tell us how often you are taking down videos in Russia, for example. I will not ask you to do it for every country, but Russia, for instance, because it rather goes against your ethos, does it not, of celebrating youth and openness if you comply with regressive, homophobic regimes? That is very far from the joyous image you seek to portray.
Theo Bertram: I can give you that straight after the Committee because we publish legal requests from all countries. Yes, I agree with you. I think the Russian law is terrible and our community does, too, and they strongly voice that on the platform. However, we unfortunately have to comply with legal requests in the countries that we operate in, whether that is the UK or whether that is Russia.
Q80 John Nicolson: Moving on, can I ask you whether or not employees and directors of TikTok are free to express their own views on issues?
Theo Bertram: Yes.
Q81 John Nicolson: Okay, good. Running through a few contentious issues, what is your view about the treatment of Uighur Muslims by the Chinese authorities?
Theo Bertram: I am sure I am reading the same news reports that are dribbling out of increasing numbers in China. Personally, I am deeply concerned about that, and I think this House, this Government and western countries are right to be demanding answers from China on the issue of human rights.
Q82 John Nicolson: For instance, you will have seen the very powerful news reports from well-regarded news agencies. Do you yourself believe, when you read it, that women are being forcibly sterilised, that there are concentration camps and that people are being tortured? Do you believe that?
Theo Bertram: I see all the same stuff that you are seeing and I also see the somewhat less convincing answers that the Government are giving on the other side, so, yes, I absolutely share the concern that you have on this issue.
Q83 John Nicolson: We are all concerned, but do you believe the reports about the torture of the Uighur Muslims?
Theo Bertram: I think the evidence is increasingly strong, if you are asking me personally.
Q84 John Nicolson: Do the people of Tibet have the right to self-determination and, if they choose it, the right to independence?
Theo Bertram: I do not have—I guess so. If you are going to keep asking me questions about my view on world politics, I am happy to answer them, but I am not sure—
John Nicolson: I want an answer to the question I asked you.
Theo Bertram: Yes, I agree with you on that personally as well. If you are asking me my personal views on these issues, yes.
John Nicolson: The people of Tibet do have the right to self-determination.
Theo Bertram: Yes.
Q85 John Nicolson: What about the security law in Hong Kong? Do you regard that as oppressive?
Theo Bertram: If you are asking me personally—
John Nicolson: All these questions I am asking for your opinion.
Theo Bertram: Yes, absolutely, I do think so, too, yes.
Q86 John Nicolson: Can you tell us what happened in Tiananmen Square?
Theo Bertram: Yes. I was younger then as well and we have all seen the video of that brave man protesting and then the massacre that followed. I am not a historian.
Q87 John Nicolson: No, but you are an educated man. Do you believe and can you confirm that Chinese students were massacred by the Chinese authorities?
Theo Bertram: I believe so, yes.
Q88 John Nicolson: I am interested in your answers because your company apologised to Feroza Aziz last year. She was an American teenager and her account was banned by TikTok after she posted videos highlighting Chinese oppression of the Uighur Muslims. Does that say something about the corporate culture of TikTok?
Theo Bertram: No. Let me explain what happened there. In this instance Feroza Aziz posted a video that contained an image of Osama bin Laden. We then blocked her account in terms of our mechanisms. It took some time for that account to be blocked. She created another account where she posted this video. Because of our systems attempting to take people down when they have posted content that is terrorist-related, we then took down that video that she had posted about Uighur Muslims. We should not have taken down that video about Uighur Muslims and we allowed her to keep that back up and we apologised to her.
But I do not think it is fair to take that single incident and to say that that is reflective of TikTok as a whole. I encourage you to go on TikTok and search for Uighur Muslims. I can understand your line of questioning but what I am saying to you is that TikTok is a business outside of China. TikTok in the UK is led by a European management team that have the same concerns and the same world view that you do. We care about our users and our community cares about TikTok.
Q89 John Nicolson: Once again to return to The Guardian report from last year, it leaked a document from TikTok. The document showed that moderators were being instructed to take down videos that were critical of China. I will quote exactly; this is a TikTok document. It referred to, “The distortion of other countries’ histories”—distortion. That is a Chinese propaganda term. It referred to the Tiananmen Square massacre, which you have agreed was a massacre, as an “incident”. That is Orwellian to call it an incident. That is a TikTok document giving instructions to your moderators.
Theo Bertram: That is not our moderation policy. You can go on TikTok and check it for yourself.
John Nicolson: It is a TikTok document.
Theo Bertram: I also invite you to come and meet our content moderators. You can see our content moderation for yourself. You can meet that team. It is headed up by Cormac Keenan, who is my head of trust and safety in Dublin. You can meet the moderators that we have in London and I can promise you that our policies allow all these things. There is no political censorship of this kind. The management team in Europe would not allow it, nor would they allow it in the US. This does not happen on our platform.
Q90 John Nicolson: It may happen elsewhere, and I can tell you what your official TikTok response was to this leak. You did not deny that these were instructions. In fact, you confirmed that these were instructions, but what you said was that the company had changed its policy in May 2019. Previously, you instructed your moderators to take down videos critical of China, specifically talking about incidents in Tiananmen Square, separatism in Tibet, all straight out of the Chinese Communist Party playbook. You confirmed that is what your moderators did, but your defence was that you had changed your policy in May 2019.
Theo Bertram: It is highly regrettable that that is what it was, but it is not our policy today, nor has it been for a long time.
Q91 John Nicolson: You can understand why we are all so suspicious, though, can you not? Throughout this session we have seen a whole series of very disturbing concessions from you about things that your company has done, that you have been caught doing, whether it comes to children, paedophilia, oppression of political views, suppression of political views, and your response always is and has been throughout that, “We’re sorry about that but it’s better now.”
Theo Bertram: I do not think that is a fair characterisation of this—
Q92 John Nicolson: I think that is what the write-up will show your answers have said. Finally, is it not the case that while you may have, under pressure from The Guardian and others, taken down content in the past and now revised your policy, on issues like Tiananmen Square and Tibet, the content may be kept up but your algorithms ensure that that content gets much more limited distribution than it should get?
Theo Bertram: That is completely untrue. I would say to you as well not only can you come and meet our content moderation team, but we allow people to come and inspect our algorithm. I worked for eight years at Google. I never would have been able to say that to you when I worked there. Now that I am here at TikTok we know that people have these concerns and we know it is because we have this question about China hanging over us. That is why, as I have consistently said through this hearing, we are committed to a higher level of transparency than anyone else because we want to prove that our platform does not have any influence from China and that this is a place where the LGBT community and where people with body positivity are welcome, they are protected, they are celebrated and they are lifted up.
Q93 Chair: One further question to clarify a point that was raised by John. When you are asked by local law enforcement to take down a particular piece of content, is it related to just a piece of content or is it a generic instruction related to the type of content?
Theo Bertram: Usually a mixture of things. It would be a specific video, it would be a specific account. That would be typically the type of thing.
Q94 Chair: You said “usually”, so that means sometimes it is not?
Theo Bertram: That is typically what would be requested.
Chair: Typically.
Theo Bertram: Sometimes they have asked for data on who the account is, “What information can you provide us?” I can give you the figures for the UK if that is helpful.
Q95 Chair: We are not pursuing it in that respect because John has raised a point about Putin and Russia and homophobia. Can you categorically say to this Committee that in no circumstances do you take a generic instruction from local law enforcement in order to take down particular types of content, yes or no?
Theo Bertram: It would be a legally valid request in accordance with the law. It depends on the law, but it would be a video or an account. That is what we would typically be removing.
Q96 Chair: Typically again. You are saying “typically”. That just suggests to me that that is not the case but that you are effectively taking instructions from states saying to you, for example, “We don’t want to see gay people on your platform.”
Theo Bertram: No, I do think it is videos, it is accounts. I do not think it is that widespread censoring that you are suggesting. I do not think we do that.
Q97 Chair: It is not suggesting; it has been widely reported and you have acknowledged the fact there have been failings in the past.
I still have not got to the bottom of this issue of whether or not law enforcement is just a particular incidence, or whether or not it is generic over a matter of content.
Theo Bertram: I think it is a particular incidence rather than generic content. Russia does not say to us, “Take down all gay content,” and we comply with that, no. Russia would say to us, “This particular video you need to remove because it breaches our law.”
Q98 Chair: That happens in all instances, so at no stage is there a uniform approach from a state such as Russia on an issue like this?
Theo Bertram: We would not consider that to be legally valid. That is not how legal requests work. They do not say, “You must take down this vast blanket of content.” Many countries would like that but that is not how tech companies work. That is true of all of the industry, not just us.
Q99 Damian Green: I confess I am slightly puzzled by that last exchange. If the British Government said, “Please take down all racist content,” presumably you would, and that would seem to me to be perfectly reasonable.
Theo Bertram: I missed your question.
Damian Green: I am puzzled by the previous exchanges in that if the British Government said, “Take down all racist content, please,” that would seem perfectly reasonable to me and I assume that you would.
Theo Bertram: Yes. Let’s just separate out two things. One is we obviously have policies on our platform against racist content and against all sorts of harmful content, and obviously we remove that already. As I said, for the UK there were 300 million videos created in the first six months of this year and we removed just under 3 million of those videos, 96% of those proactively, 90% for a single view. That would be across all of those different areas: harassment, bullying, hate speech—all of that kind of stuff we remove.
Then there is a separate thing, which is what we are talking about now, which is much lower numbers, which is legal requests. Let me give you an idea of what those numbers are and tell me if we have the right context, because it is much, much rarer. For the UK, we had total requests of 31 in that period of time. Nineteen of those were emergency requests where they asked us for information or to remove a video in an emergency situation, and 12 were normal legal requests. That volume of information that you get through those legal channels in that way is far smaller, but the work of taking off our platform harmful content is one that we do ourselves all the time.
Q100 Damian Green: Thank you. You will be pleased to hear I am not going to quote The Guardian at you. We have heard enough from The Guardian this morning. However, I was struck by the US Secretary of State, Mike Pompeo, saying that Americans should only use TikTok, “If you want your private information in the hands of the Chinese Communist Party.” I assume you will say there is nothing in that.
Theo Bertram: Yes, I would respectfully disagree with Mike Pompeo’s statement. I have explained several times that we have systems in place to protect our users’ data from access from overseas, in China specifically.
Q101 Damian Green: From nowhere in the world can data be fed back to China, so even though your parent company is ultimately Chinese, the notorious Chinese information law does not apply to any data outside China?
Theo Bertram: Two things there. First of all, the parent company, as I explained to your colleague, is not ultimately Chinese, it is outside of China. The key board of directors, the key investors, are largely American. Secondly, yes, you are right: no employee in China can access TikTok data in the way that you are suggesting on behalf of the CCP to carry out mass surveillance. That is not possible.
Q102 Damian Green: Where is British data held? Is it held in Britain?
Theo Bertram: British data is currently held on servers in the US and Singapore. We have announced an investment in a data centre in Ireland that will be built next year and then we will hold the data there.
Q103 Damian Green: Are there not EU rules about the storage of data that do not recognise the US as a safe place?
Theo Bertram: I would not characterise it in that way, but I would certainly say Europe has a problem with US companies and data.
Q104 Damian Green: Indeed, but for some years you must have been breaking EU law, then, if you have been storing British data in the US.
Theo Bertram: It is a little bit more complicated than the way you describe it.
Damian Green: You are only breaking the law in a limited and specific way.
Theo Bertram: I think that is your Government, not my company.
Q105 Damian Green: What is your company doing to break the law? You say it is a bit complicated. In what way is it in a complicated way breaking the law?
Theo Bertram: I told you we are not breaking the law. You said that about your Government. I am saying that is not what I am saying about my company. We entirely comply with GDPR and with its principles.
Q106 Damian Green: Hold on. You just said that you are storing data in America. It is my understanding that under EU law there will be legal problems with that, and you said it is a bit more complicated than that.
Theo Bertram: Yes, I think your understanding of the complexity of EU law is not quite right. It is possible to store data in the US. The Schrems case overall, the idea of the Privacy Shield, which is what Facebook depends upon, is not the legal process that we use. It is complicated. We could go into it, but it will quickly exceed certainly my knowledge of the specific legal expertise required in this case.
Q107 Damian Green: When are you going to move over to a new storage system?
Theo Bertram: Probably 2021. We are building a database as we speak.
Q108 Damian Green: In Ireland?
Theo Bertram: Yes.
Q109 Damian Green: So, everything will come back from America and Singapore. Will that just be for new data or the existing data that you already have stored?
Theo Bertram: We will store all Europeans’ data in Ireland from then going forward. Yes, that is the plan.
Q110 Damian Green: You talked a lot about your moderators. How many UK content moderators do you have?
Theo Bertram: We have 363.
Q111 Damian Green: Is that growing? Is that enough? What is the daily grind of each one of them?
Theo Bertram: Is that growing? Yes. We are at 800 total employees in the UK and more people have joined the company since lockdown than were part of it before. We are probably one of the fastest growing businesses in the country right now.
Q112 Damian Green: How does the work they do personally interact with automatic ways of doing it? Presumably, a lot of it does have to rely on algorithms and automatic checks. Clearly, 363 people cannot look at the amount of content that you said goes through. What is the interaction between the automatic and the human?
Theo Bertram: Yes, we have more than 10,000 people working in Trust and Safety globally across the TikTok part of the world. The way it works is an algorithm is only as good as the human moderation that is trained as well, so they are constantly refining the algorithm and looking at how much content it is removing; is that the right amount, do we need to adjust the policies and enforcement in order to review that?
Some issues you absolutely have to have a moderator look at. Some issues are more cut and dried, and you can say, “That is definitely coming off,” but some things require human resource.
Q113 Damian Green: A final one on a different subject. There are a lot of reports about your global headquarters coming to London. Do you have any update on that?
Theo Bertram: We still need to deal with the issue in the US. We continue to grow very quickly in the UK, but we are focused on the challenge we have in the US before we have a look at the international HQ.
Damian Green: That sounded like a slightly complicated no.
Theo Bertram: It is not a complicated no, it is just—
Damian Green: It is dependent on the election.
Theo Bertram: It is a commercial decision and we are not ready to take that yet.
Q114 Clive Efford: You moved to TikTok in December last year, 2019. Were you shocked with what you found when you got there?
Theo Bertram: No. Every day I work here I am continually impressed by the quality of my colleagues and by the commitment of what we do.
Q115 Clive Efford: Did you not find it lax, in terms of protecting particularly young people from online harms and preventing false information?
Theo Bertram: No. This company is different. We had to build in trust and safety from day one. We are two years-old. At the point where this company started, we knew that Committees like yours are going to grill us on these issues. We have Trust and Safety and over 10,000 people in a company that is just two years-old. It took the other companies a decade or more to get there. This company has trust and safety built into it right at the heart of things.
Q116 Clive Efford: That may be commendable for you, but we have had difficulty at times getting other online platforms in front of us. Several times you have said, “That is no longer our policy,” and at one point you said “thankfully”. Does that not suggest that TikTok has been more keen to grow its footprint in the market and taking a very laissez-faire attitude towards protecting people from online harms and the spread of disinformation?
Theo Bertram: No, absolutely not. I have worked at tech companies for 10 years or more now, European ones and American ones, and we were always accused of it being a wild west. That was probably true 15 years ago. I can remember being invited into No. 10, “Tell us what we should do about tech policy.” None of that happens anymore. The way the tech industry is seen now is with much more scrutiny, much more criticism, much more regulation. That is the world that this company I now work for grows up in. That means that you have to have built in from the start the highest standards to compete with the existing companies.
Q117 Clive Efford: But TikTok has been accused of accessing information on people’s Apple Macs, on their iPads, their iPhones, switching off security, circumventing security even though it has been switched on, circumventing Google policy—your former employer. Do these practices still go on?
Theo Bertram: The thing that you are referring to is the MAC, which is the multiple access control, which is not the Apple Mac. It is not that type of word. MAC refers to a specific feature that allows you to identify the device. The reason why we were doing that, in line with our privacy policy, was that we were trying to make sure that we could identify where a device has multiple accounts or something like that—the kind of behaviour that you would typically associate with spams or bots and generally abusive behaviour. I think that is what you are referring to.
Q118 Clive Efford: Did this not allow you to identify, for instance, the wi-fi networks that people have visited and, therefore, the locations and things they have looked at, and was that information not valuable to you?
Theo Bertram: I can talk you through—and I am happy to send it to you—exactly what we collect. As I said, the reason we collect that information was in order to prevent abusive behaviour.
Q119 Clive Efford: You separated off children under 13 into a separate platform. Why was that?
Theo Bertram: That is in the US and that was part of the agreement with the FTC.
Q120 Clive Efford: What was the reason for it? What was behind it? Why was there concern about young people having access to TikTok’s general platform?
Theo Bertram: The US regulator, the FTC, asked us to create that particular product in order to have a TikTok for under-13s. But TikTok in the UK is only for those over 13 and we do not have the equivalent TikTok Kids here.
Q121 Clive Efford: What is the policy at the moment towards blocking people who groom children? It was that a paedophile found online got a seven-day restriction, then a month, and then on a third-time strike they would be out permanently. Is that acceptable?
Theo Bertram: It is completely untrue. Let me explain exactly how it works. We work closely with NCMEC and IWF, the key law enforcement. At the point where any of this type of content is identified, the immediate action is a suspension. The reason for that is not that that is the end of the story, which is how it is reported. That is pending the legal investigation and the advice that we then get from NCMEC or IWF or other legal bodies. Of course, in every single incident like this, where it is the case that it is, in fact, offending content in this way, they would not only be banned from our platform but there would be legal action as well. What was reported in this case was one part of the process but not the subsequent, so it was portrayed as if all we did was suspend them, whereas that is the immediate initial response while the investigation takes place.
Q122 Clive Efford: Can I ask about the general ethos of the approach of TikTok? Effectively, you have created a playground where people are meant to go and have fun, but the purpose of creating it is to advertise and you make your money through the advertisements that you can sell through that platform. When does fun become exploitation? For instance, we have children’s swing parks in our parks, we have playgrounds in schools and various other places. We do not have advertising plastered around the fences and walls of those places. When does it stop being fun and start being exploitation by a global network?
Theo Bertram: We have zero tolerance for any type of bullying or abusive behaviour on our platform. We have protections in place, and I described the protections we have put in place for direct messaging. We have family pairing that enables parents to protect their children. We removed more than 100 million videos this year as part of our protections. Our platform is somewhere people can come and have fun. Our job is to keep them safe. I know this Committee’s job is to focus on the places where we have slipped up or moments where we haven’t got things right, but the overwhelming experience of our users on our platform is one of great positivity and creativity.
Q123 Clive Efford: I accept that you do not accept that TikTok has been lax in any way, but do you accept that it has caused harm at any stage in its history?
Theo Bertram: Of course I accept that there are moments when we have got things wrong, but overwhelmingly I think TikTok is a force for good.
Q124 Clive Efford: This is one of my problems. We play this sort of game of whack-a-mole, which is the term that has been used in another place, with online platforms trying to catch up with regulation, but if we were in a field like pharmaceuticals we would not allow them out there to do harm before we had tested what they were doing. Do you think that the framework of regulation is sufficient at the moment to prevent companies cutting corners in order to make profits?
Theo Bertram: There are two things there. First, the framework for regulations is going to get tougher, absolutely. We accept that and we will comply with the online harms regulation when it comes through.
Secondly, let me tell you why the comparison with the pharmaceutical industry does not work. The drugs that we all take are made by a small number of companies. It is not the case that we all make drugs for each other and then randomly take them. The world of user-generated content is where the people creating content are not some group of aliens sitting on another planet. They are your constituents and the vast majority of those people are creating fun content or creating perfectly ordinary content. We are dealing with the small minority that create content that is bad, but in the same way that I don’t think it is fair to describe all of your constituency on the basis of the few bad actors, I don’t think it is fair to portray all of user-generated content, all of social media, based on what the bad guys do.
Q125 Clive Efford: I don’t mind the fun side of it. I just want to have a safe place for people, particularly children, to have fun. Can I refer to the answer you gave my colleague who questioned you previously, which was about moderators? You said you had 363, if I have the figure right. My recollection—I do not have the figures in front of me—is that when Germany regulated and introduced quite a stiff fine, the number of moderators employed by Facebook, for instance, went up to several thousands; I think it was something like 20,000. Wouldn’t you be an even faster growing company in the country if we were to regulate in that area?
Theo Bertram: Facebook has 15,000 globally; I don’t think they have 20,000 in Germany. We have 85 in Berlin, but we have over 10,000 globally. The NetzDG law is the first of its kind and we are going to see that roll out. The Digital Service Act will come at European level. The UK has its online harms Bill. I think you are right—this is an inevitable process, yes.
Q126 Clive Efford: Do you have to wait for regulation to employ more moderators or do you think you will be doing that of your own volition?
Theo Bertram: We will inevitably continue to employ more moderators, but I don’t think legislation is the thing driving the hiring of those moderators. We are hiring more moderators, more Trust and Safety, because we want to continue to improve our platform. We have more or as many as our competitors and we have been around for only two years.
Q127 Kevin Brennan: Good morning, Theo. How many of your videos contain music?
Theo Bertram: I don’t know off the top of my head. Music drives the platform. I don’t know how we would measure it, but I think if you flick through you will see music is right at the heart of what we do.
Q128 Kevin Brennan: That is quite an interesting statement, isn’t it: that music drives the platform, but you don’t know how many or what proportion of the videos contain music? Do you know how much value songs have added to your business?
Theo Bertram: I don’t know the number just because Select Committees can sometimes be a test and it is very hard to keep all of the numbers in the head at one time. I don’t think that to suggest that there is any—
Q129 Kevin Brennan: The picture that you paint is that this is a community where people create all the content. I accept there are some incredibly creative things on TikTok, but in many cases creators—the people who made the music in the first place—are having their work replayed and reinterpreted and so on through TikTok. Do you license music before you allow it to be used on the platform?
Theo Bertram: Yes, music is licensed, and we work with the regulators. Don’t forget our videos are 15 seconds, so I don’t think we are cannibalising the music industry—far from it. What we are doing is driving appetite for the music industry. Let me give you one story. There is a young woman in Blackpool who is a grime artist who recorded a song called “M to the B” and it is a very catchy 15-second clip. That clip has had billions of views around the world in places like Hungary and parts of South America. It is the most searchable song on Shazam. It is driving the consumption of music. There was a TikTok dance based around a 15-second clip of a The Weeknd song, which drove that to number one.
Q130 Kevin Brennan: I am very aware how excited lots of parts of the music business are about TikTok and the deals that have been done between record companies and TikTok and so on. Ultimately, what I am interested in are the originators of music. I accept there are these incredible breakthroughs by people via TikTok getting justly rewarded for their music and the way it is being played on TikTok. You said it is 15-second videos, but nevertheless the company itself is making its money off the back of those creators. Are they being adequately rewarded, in your view?
Theo Bertram: We have licensing in place, yes.
Q131 Kevin Brennan: We talked earlier about hashtags. What happens to the videos that contain the hashtags when you take a hashtag down? Are they all taken down as well?
Theo Bertram: It could be both. If a video breaks our guidelines, we would remove it and if we think the hashtag breaks our guidelines, we remove the hashtag as well. Removing a hashtag does not delete the videos themselves. It just makes them hard to discover.
Q132 Kevin Brennan: That is quite an interesting distinction, isn’t it? For example, when you were talking earlier about what happens in jurisdictions that have laws against LGBT people, would you take down a hashtag if the authorities requested you to? The Chair asked you about individual content but if the authorities said, “Take this hashtag down because it encourages homosexuality,” or whatever the Government in that jurisdiction was objecting to, would you take the hashtag down?
Theo Bertram: I don’t know the answer to that. I will write to you and give you the correct answer to that.
Q133 Kevin Brennan: We would be very grateful because I think that is important to know. Before coming on the Committee, I was looking on my own Twitter feed and someone had tagged me in on an article they had written about a hashtag that you cannot search in the United States. The hashtag in the United States is “#joebidenisapedo”. If you search for that hashtag you would not be able to find the videos that promote this conspiracy theory, but there is a video from a 16-year-old woman that has apparently had more than 257,000 views that has that hashtag in it. Is that the sort of thing that you would take down normally? Obviously, the hashtag has been taken down, but that video is still there.
Theo Bertram: I can’t do the content moderation in this way, but if you report that to us, we will review it and if it is a breach of our rules, we will remove it.
Q134 Kevin Brennan: Interestingly, this young woman has 421,000 followers on TikTok, so it is not that obscure.
Theo Bertram: You wouldn’t expect me to remove a video without properly reviewing it. It is very simple to make sure our moderators review it. You just hold your thumb and it will pop up with how you can report it.
Q135 Kevin Brennan: Okay. How do you think a service like TikTok impacts on the lives of prepubescent and adolescent girls?
Theo Bertram: We are an app for those over the age of 13. There was a piece of research done by Childnet that said that more than half of young people feel that social media is a place where they can express themselves where they sometimes don’t feel able to express themselves in real life.
Q136 Kevin Brennan: Do you feel that the culture of “likes” has become a currency of self-esteem for children of that age and at a very sensitive time in their development between childhood and teenagehood?
Theo Bertram: Yes, I think it is a very sensitive time. Quite understandably, you are thinking about how other social media platforms work and you are applying that to us. In this case it is a little bit different. Partly the feel of TikTok is much more come as you are. It is not about being the best that you are. It is much more about authenticity than having an Insta body. It is much more about being yourself, but it is a place where we have to be particularly careful about those who are under 18 or under 16 and we do things differently. For those under 16, there is no direct messaging. You can’t start a livestream if you are under 16.
Q137 Kevin Brennan: Okay, and you told us that earlier on. Would you support the inclusion in the online harms legislation of a duty of care for platforms like TikTok towards their users?
Theo Bertram: If that is in the legislation, of course we will comply with that.
Q138 Kevin Brennan: Theo, were you amused by how TikTok users disrupted the Donald Trump rally in June?
Theo Bertram: I can’t possibly comment on that.
Q139 Kevin Brennan: To remind the Committee, that was when apparently many thousands of TikTok users registered for the rally and an overspill area was ordered, including an appearance by Nigel Farage. Apparently, far fewer people turned up than expected because young TikTok users had deliberately used something called Alt TikTok to avoid it being picked up that they were organising to disrupt the rally by booking places and not turning up. Is that what actually happened?
Theo Bertram: I read the story about it. I don’t know that we have ever seen the evidence about that but—
Q140 Kevin Brennan: There is plenty of evidence of it online. I was just reading it before the hearing. Have you not read any of it?
Theo Bertram: I think the evidence that that actually happened is lighter than the number of stories written about it, but I have certainly read the same stories.
Q141 Kevin Brennan: I will take it by the playful smile on your face that your answer to my first question might be that you thought it was quite amusing; is that fair?
Theo Bertram: In no way do I want to express any amusement at the President of the United States.
Chair: Thank you, Theo Bertram, director of public policy at TikTok, for your evidence this morning. We are going to take a short five-minute recess while we set up our second panel.
Examination of witnesses
Witnesses: Yuan Yang and Rui Ma.
Q142 Chair: This is the Digital, Culture, Media and Sport Committee, and this is the second part of our hearing into online harms and other ethics of data. In the first part we heard from Theo Bertram at TikTok and still this morning we are joined by Rui Ma, creator and co-host of Tech Buzz China, and Yuan Yang, deputy Beijing bureau chief and former China tech correspondent at the Financial Times. Good morning. Thank you for joining us today.
Yuan Yang, I understand that you watched the whole of the previous session and that, Rui Ma, you watched part of it. Yuan, what are your impressions of what you heard from Mr Bertram? He said things such as, “TikTok is not China,” and, “We significantly go beyond GDPR,” for instance. What are your thoughts about his sort of explanation of TikTok?
Yuan Yang: The claim that Mr Bertram made that ByteDance is not a Chinese company is technically correct, but it avoids the main issues that many Governments have about the company. Technically, yes, ByteDance became an Ireland-listed company, but by the same measure you would say that Alibaba is not a Chinese company because of the way that it is financially more appropriate to register it abroad for the company’s own fundraising purpose. ByteDance is a company that is headquartered in Beijing and many of TikTok’s engineers are based in Shanghai. Although the parent company is incorporated in the Cayman Islands, the company’s staff and many of its assets are based in China. As such, I think it is appropriate to describe it as a Chinese company.
Q143 Chair: But do you think that it is China as such? The implication is not just about where the company is based or where it has most of its staff; it is about values.
Yuan Yang: I see what you mean. I don’t think it is right to say that any company is China. When it comes to influence and values, in ByteDance you can look at this question in a number of ways. First, on ByteDance’s governance, there are five members of the board of directors. Zhang Yiming is the CEO and founder, who I would say from the impression that I have of him, having spoken to him off the record and having seen his interviews in Chinese media, while being a Chinese citizen has said that he is not a Communist Party member. I think his politics are probably as flexible as any Silicon Valley technologist. He comes across as a libertarian who has to fit into a very restrictive political framework because of his being in Beijing. Apart from Zhang Yiming, there are the four investors in ByteDance, three of whom are American, and one is a Chinese citizen.
From the governance of ByteDance, it is not fair to say that it is deeply influenced on the board of governors by Chinese Communist Party values. However, as with many companies in China, the company has a Chinese Communist Party committee that sits inside the company and this is a stipulation that the party has of many tech companies, including Alibaba and other private tech companies. There is a clear vehicle for Chinese Government influence into the company.
Q144 Chair: Rui Ma, you saw some of the session. Are there any impressions you would like to add?
Rui Ma: I am more from the business side of analysing the company and I agree with what Yuan has said so far. My impression of the founder is that he is very much enamoured with Silicon Valley views, so he is what you would call a typical technology entrepreneur.
Q145 Chair: Is there anything we need to be wary of in China tech about personal data and privacy? Can the likes of ByteDance, TikTok, be trusted with our constituents’ data?
Rui Ma: When it comes to privacy and data, TikTok should be subject to the exact same rules as everyone else. The concerns that many people have here, and especially in the United States, of the Chinese Government being able to access the data are very valid and well-founded concerns that the company really needs to address. I think that they have tried to come up with proposals on how to address it. I don’t think there has been a final format that is acceptable to everyone yet, but I think that concern is valid.
Yuan Yang: I think the concern about data protection and user privacy is something that we should have for all social media companies. There is an obvious particular threat that we have to consider when it comes to Chinese companies and their sharing of data potentially with the Chinese Government. There is no evidence so far that ByteDance is sharing data with Beijing or that Beijing has an interest in ByteDance’s data, and this is because of the nature of the data. We have to consider what kind of data TikTok gets access to. Here I would say the conversation is very different to other Chinese software and hardware providers. It is very different, for example, from the conversations that we have had over, say, Huawei, which is a comms provider and therefore has much more thoroughgoing access to its users’ data. When it comes to TikTok, the data access is more similar to Instagram or even to Facebook.
They have two kinds of data, broadly. One is data that identifies the user. That could be data that identifies the phone that they are using. It can also be data that identifies what kinds of device they are using, and these are often used for targeting, and that is a business model of many social media apps. The second kind of data that they take is, of course, the data that you would need to feed into the app in order to make it functional, such as the videos you upload and the captions you add. TikTok, unlike say Gmail or a private messaging provider, is primarily a public use app. You intend the videos you post on it to be seen by, hopefully, millions of people and you go viral. I wouldn’t say that there was a real user security concern with the kinds of data that TikTok has with public users.
Q146 Chair: TikTok has recently faced several lawsuits, regulatory action, over the use of children’s data in the United States. How has this affected perceptions of TikTok and how do you think the company has responded in the US?
Rui Ma: From what I have read, the company basically settled the lawsuit immediately over its use of children’s data. That was from Musical.ly, the app that it acquired to form TikTok Global. I think overall it is hard to say. Anecdotally, I don’t perceive from the people I talk with that their perceptions of TikTok have been particularly swayed by this particular lawsuit, meaning that if they had other concerns the outcome of this particular lawsuit didn’t really change their minds. It seems more that the other concerns, including the recent executive orders from the US Government and so on, have been more of a driver in how people perceive the company.
Q147 Chair: What is the long game in that respect? You mentioned the executive orders. What is trying to be achieved?
Rui Ma: The executive orders are trying to protect the national security of the country and protect US citizens’ data. The proposal has been that the easiest way to do that is to make TikTok US, at least, a completely US company.
Q148 Chair: Ostensibly, it is stated that it is about data, but there is an undercurrent here, isn’t there, of competition between China tech and Silicon Valley? This is a key battlefront in that respect. Do you have anything to say about that in relation to Trump’s move?
Rui Ma: I can talk about what people have said about that, outside of the executive order. I think the general sentiment among a lot of my peers here in Silicon Valley is that this move is motivated not just by—or they support the move not just because of the security interests but also because of the sense of reciprocity, because many Silicon Valley companies are prevented from operating normally in China. Therefore, there is a sense that it is something fair to do by targeting a Chinese company and not giving them access to the United States market as well.
Q149 Chair: It is part of a trade war, effectively. Yuan, do you have anything to add to that?
Yuan Yang: I will add an opinion on the executive order. On the order against ByteDance and also the one against Tencent, the company that runs WeChat, another social media app in China, it appears that they were rather rushing to get them out. Taking a step back from all of this, they play into a huge political spectacle that also suits President Trump right now in the months preceding the November election. I think there are a number of agendas at play in the background between China and the US, and only a few of those interests are to do with user security.
Q150 Chair: How big a challenge is China tech to Silicon Valley?
Yuan Yang: I think other Chinese companies that are looking into going to the US will face the same problems as ByteDance does. It has significantly decreased the appetite of Chinese companies thinking that they can enter the US on equal competition grounds with local companies. If you are asking me from an industrial or sector point of view how much does the development of tech giants in China threaten Silicon Valley, I think in many cases the Chinese companies have been models for Silicon Valley giants to learn from. A lot of the things, for example, that Uber or Grab have done in the US have been inspired, if not directly copied, from Chinese companies like Meituan or DiDi. That is the nature of the competition worldwide. Tech companies look at what each other are doing and try to do the same.
I think there is also an element in which there is significant tech use in the consumer sector. There is a very live industry in Beijing that tries to figure out the minutiae of what consumers want and deliver that to them in an app optimised exactly for their tastes. Of course, with the cheap labour in China, it takes off a lot quicker than in the US.
Q151 Chair: From what you observed with these apps, is there anything that you think needs to be addressed in upcoming UK online harms legislation? Yuan, would you be able to answer that first?
Yuan Yang: I am quite sceptical of the user privacy concerns around TikTok. I think there is a significant concern about accountability and transparency. If you think about TikTok as like the organisation I work for, which is a newspaper, we have editors who moderate content, who think about what content our readers would benefit from receiving. There is a whole editorial process and there are also regulations in UK law for media and also in the Financial Times staff what guidelines we would use to decide what is newsworthy, what is printable, what is significant for our readers.
TikTok is a social media platform, not a newspaper, but by and large it commands people’s time, people’s energy, interest and content in much the same way as a traditional news organisation does. However, the way in which its editors, let us say—or rather its algorithm plus its human moderators—recommend content is largely opaque, as is the case for all the other social media platforms that we use nowadays, Twitter, Instagram and so on, that recommend content to users.
The big public interest concern around a platform like TikTok is how we know that the recommendation algorithm is serving the interests of users and not, for example, of simply advertisers, or even is not leading to harm. It might be unintended harm, but harm is created through the misuse of the algorithm.
In the distant future, I think the possibility of Chinese Government influence on the recommendation algorithm is a possibility, although I think it is a very distant possibility and I think that site so far has proved fairly robust, from what we can see, to obvious Chinese Government influence.
Q152 Chair: However, you did outline the fact that they do actually have effectively a Chinese communist shadow board as well.
Yuan Yang: A party committee.
Q153 Chair: Yes, a party committee, exactly, which is ostensibly influence and it is also in the future as well, as we have new entrants into the new marketplace—new TikToks and new ByteDances if you like. Can you envisage a time in which we effectively do see algorithms pushing propaganda on a world view, so to speak, in the west, in the United Kingdom?
Yuan Yang: You mean pushing propaganda to suit Beijing’s aims in the west?
Chair: Yes.
Yuan Yang: I think this is a possibility, but I think so far this has not been the case. Beijing’s focus and interest is very much on curtailing online content within China and for Chinese citizens, which is what takes up 99.9% of President Xi and the Chinese Communist Party leadership’s concerns when it comes to online content. They are much more focused on domestic media and all of the censorship that goes on there.
I would say that is not currently a priority for the Chinese Communist Party. If you consider that the main priority of the CCP is its own survival within China, it is very likely that it is going to focus more domestically than on international content platforms like ByteDance in the future. Of course, you can never say never, but I would also say that it is very difficult for us to write politically about a company or even to legislate over a company based on complete hypotheticals that have not yet been proven.
Rui Ma: I agree with Yuan on the emphasis of the Government and what kind of content they want to control. However, at the same time, for the algorithm output I think something the Committee can consider is that a lot of these acts of censorship—whether they were intentional, arose out of a bug or whatever—were actually reported and discovered by either media or other users and then told to the media. That is because the great thing about TikTok is that it is meant to be a public forum. This is a place where people share content for others to see so, therefore, acts of censorship or suppression are much more easily discovered and, in fact, this is an output of the platform. If the Committee could determine what is an acceptable threshold or what are the results you expect to see from a platform that does not engage in censorship, then you can audit the results with what you see on the platform.
In addition, you could ask for more algorithmic transparency, although algorithms are very complicated and I imagine especially TikTok’s at this stage are not going to be really intelligible, particularly to a human. However, you could still arrive at an estimate of what the algorithm should output and verify and audit it that way. Therefore, in the case of censorship, especially outright censorship, it is fairly easy to uncover.
Q154 Chair: To pick up on that final point there, what you are saying is that effective regulation of algorithms is something that is beyond people—it is something that would itself need an algorithm in order to do it?
Rui Ma: I am not saying the regulation of algorithms—
Chair: Sorry, regulatory oversight of algorithms is probably the correct way of putting it.
Rui Ma: No, what I am saying is that for many algorithms—again, I have never seen TikTok’s algorithm—if someone were just to explain the logic and the maths to a person, you would not be able to physically compute what the expected result is to an exact degree. For example, with TikTok the algorithms are working with probably tens of thousands—I know they started off with nearly 10,000—variables on users. Those are things that you would not be able to hold in your human head.
Yuan Yang: I would say to your question that regulation oversight of algorithms is entirely possible, even if it is just a mere human designing it. There are many academics who work on algorithmic transparency and algorithmic accountability. Especially for an app like TikTok, what you want to find out is the answer to the question, for example, are Uighurs being disadvantaged somehow in their thread compared to videos about other Muslim minorities around the world, or some other comparative case? It is entirely possible to compare video A and video B and look physically at how well they are spreading throughout the platform network and do these kinds of more simple comparisons, even if we are not understanding the intricacy of the recommendation algorithm.
I would say that when it comes to seeing how content is recommended and pushed in front of people’s faces, it is entirely possible to legislate over that and to regulate that by means of stipulating transparency measures that social media platforms would have to meet in order to be regarded as a trustworthy or reliable source of content.
Mr Bertram mentioned the transparency centre that they are opening in LA. I have previously enquired as to whether media can visit virtually or whether any academics have visited the transparency centre, and have yet to hear back from them about that. Once there are academics visiting or once it is opened to be toured virtually, I would be very keen to see what academics working on those transparency issues are saying.
Q155 Chair: Yes, it sounds like a non-transparency centre that they are opening up. However, it is quite interesting because Theo Bertram did say how they were welcoming the potential for academic oversight. One of the constant themes on this Committee is the fact that social media get to pick their academic partners. The problem with that is that really it is not academic oversight as such; it is a partnership, almost a business partnership.
Yuan Yang: Yes. The problem there is also the model of access to the company in return for writing presumably positive things about the company. Of course, it is a model you can sometimes see, for instance, in journalism as well as academia. Therefore, yes, I would support independent academics as opposed to ones they pick.
Q156 Clive Efford: Can you comment on the relationship between Douyin and TikTok? They share the same branding but offer different functionality. Do you think Douyin still influences TikTok over the development of apps?
Rui Ma: I can talk about that first. Douyin was an app that was created in 2016 by ByteDance and is a short-video app that TikTok takes after. Douyin leveraged the existing recommendation engine that ByteDance had created for its suite of earlier products, including this news app called Toutiao, and applied that to short video. TikTok is using the same recommendation engine but has a very different feature set from Douyin, which is because there are additional features and design elements of the app that are effectively localised for each geographic region that it operates in. In China you have additional functionality by virtue of the fact that the app has been around longer, there is a bigger audience and the ecosystem there looks different.
One of the big things that people talk about is that Douyin has, for example, a really big presence in livestreaming, especially recently livestreaming e-commerce. These are things that do not exist in the TikTok app because of the associated functions, like payments for e-commerce. TikTok has livestreaming but does not have a lot of the same functions because, again, it is a different user base. A lot of the user habits that you have in China, like livestreaming e-commerce, is a habit that does not exist yet in the US or in the west more generally. You need a lot of stakeholders—creators, brands that want to distribute through such channels, multichannel networks that manage these business operations for you, as well as users who want to purchase this way—in order to make a function like that work.
While the recommendation engine is something that is shared across multiple ByteDance products, the feature sets are different. I use both TikTok and Douyin and they have pretty different feels. As a user, the content I see is different, the features are different, and the design is somewhat different. I am not sure that necessarily the UK-European version will always follow the Chinese product because, again, the ecosystems and user behaviours are different.
Yuan Yang: TikTok makes itself unavailable to users within China and sometimes we describe Douyin very much, as Rui said, as the Chinese equivalent of TikTok. The two apps are different in some of their functions but in terms of their underlying brand slots they are very much the same. The company makes sure that TikTok is not available to Chinese users and also that Douyin is only available to Chinese users, with the very confusing exception of Hong Kong. Neither TikTok nor Douyin are properly available to Hong Kong users since the national security law, so that is the only region in the world that does not get these short videos.
Q157 Clive Efford: I take it from those answers that there is already evidence that TikTok is diverting from Douyin and will become a different platform entirely in terms of its content and the user experience?
Yuan Yang: In terms of content, yes, it is already diverting and that is something the company worked very hard at trying to do. For its own survival in China it is very important that it does not allow foreign content on to its Chinese platform and, therefore, creates a completely separate ecosystem of content for foreign users.
Q158 Clive Efford: You will be aware of the controversy around Huawei. People, particularly politicians, often talk about TikTok in the same way they talk about Huawei. Is that fair?
Yuan Yang: Maybe I will speak to this because I have done a lot of reporting on Huawei in the last few years. As I said to the honourable Chair, I think the two companies are very different in terms of the threat they could potentially pose. Huawei, as a telecom company, given the current state of telecom standards worldwide, has a lot of access to the data on the phone calls and the messages sent over its network. TikTok, as a social media app mostly used for public sharing of videos that people want to be seen by others, I do not think has much access to very sensitive user data. Therefore, in terms of the threat that they pose, they are very, very different.
I would also say that Huawei is very important to the Chinese Communist Party. They are very different. ByteDance, if you like, in the eyes of Beijing is a very recent tech start-up and is not really systemically important to Beijing. I think Huawei is much more of a national champion that has been in the country for decades and has built the country’s telecom infrastructure. It was the first Chinese company to go out and command a significant portion of foreign infrastructure, so it is much more of a national point of pride as a company, and Huawei is much more systemically important in China. I think the two companies differ very much on those two grounds.
They have a point of similarity, though, which is that they obviously have both been caught up in President Trump’s line of fire and have been subjected, I would say, to very politicised conversations on security in the US. Many of the solutions we have to resolve those problems are about a common standard for data security and for standards either in the telecom space or in the social media space. I think going after one or two different Chinese companies is not that much of a long-term solution to this issue.
Q159 Clive Efford: Rui, in terms of data protection, do you think we have robust enough regulations in the USA and elsewhere to protect against the misuse of data by apps like TikTok?
Rui Ma: I am not a data security law expert or anything. My impression as a consumer is that these things are constantly being refined and updated because of new emerging business models and new ways that companies are using data. As we learn more about how these companies work, more regulations or updated regulations would have to be put in.
Q160 Clive Efford: Would you both say that we should be concerned about TikTok’s algorithm being used to spread disinformation or to censor specific topics in countries like the UK?
Rui Ma: As Theo Bertram was saying, first, you can get a transparent look at its moderation policies and decide if those are sufficient, assuming they abide by those policies very closely. Secondly, because of the fact that the output of the algorithm is public for everyone to see, then the Committee can look, verify, audit and see whether or not such censorship or suppression is going on. The question I would put forth is how often is that going on. Is that real time and what are the resources and the methodologies? However, assuming you can work out a way for that to happen, there is absolutely a way to make sure that is not happening.
Q161 Clive Efford: You mentioned moderators. We were given that figure of 363 moderators. Do you think enough resources go into that area of their operation?
Rui Ma: I personally was a little surprised to hear that it was not higher, but I understand that the company is pretty new and is expanding its moderation efforts overseas. My understanding is that a lot of the moderation used to happen inside of China and as the app exploded in popularity it had to rush to recruit moderators more locally in all the markets it was expanding into. I looked—not this month, but last month when I was doing research—and saw a lot of job ads for content moderators all across the globe for TikTok.
Q162 Clive Efford: Is there any evidence of censorship in countries outside of China about issues such as the repression of the Uighurs, Tibet, Taiwan and the incident in Tiananmen Square?
Yuan Yang: No evidence that I have yet seen. There has been a very interesting report by the Australian Strategic Policy Institute, ASPI, which talks about the way that hashtags—for example, the Russian word for “gay”—are shadow-banned, so you can post things using the hashtag but if anybody else clicks on the hashtag they cannot find any other videos tagged with the same hashtag. It looks to the user like they are not being banned but the hashtag has actually been banned, hence the name shadow-banning. The report found that for a number of these hashtags that could be deemed offensive to the local Government, say in Russia, the company has gone over and beyond what was necessary as stated in the law.
This goes back to Mr Bertram’s Q&A earlier on when he was asked, “Do you do what is requested or what is required?” This is a really important distinction because in many cases, including in China, tech companies—I am not just talking about ByteDance but tech companies like Apple, for example—I would argue go over and beyond what they might be explicitly required to do by law. Because of their need to maintain local Government relations, they do what is asked of them in order to maintain relations with the security department, with the police and so on. I would imagine that ByteDance has similar problems, judging by the outcome of this ASPI report, in other countries. There is an issue as to how much it obeys and how proactive it is in censoring this kind of content that might be offensive to the local Government.
Q163 Clive Efford: If I have that right, you are saying there is no evidence, but in the application there is a suggestion that there is because of the blocking of the hashtags.
Yuan Yang: Yes. To be clear, the report I am talking about was not about Chinese content censorship. It is about censorship not in terms of what pleases the Chinese Government but in terms of what pleases the Russian Government or other Governments around the world that have more restrictive attitudes to speech. There is evidence that TikTok in the past has censored hashtags in those areas. As TikTok does not operate in China, it is looking at non-Chinese Government issues.
Q164 Clive Efford: Would you say that is an area that needs to be investigated further and we need to understand what is going on?
Yuan Yang: I think so because it is something that clearly has happened and where we do not yet have the requisite transparency from TikTok to know why this happened or whether it will happen again. I think the main issue here is one of transparency. We do not know how it decided that a hashtag like the hashtag for “gay” would be offensive, who made that decision and if there is a flow of responsibility throughout the organisation that can answer to a decision like that, or if it was just a recently hired graduate student who happens to be a content moderator and was put in charge of a huge amount of duties—as often happens in tech companies when they are rapidly expanding—and they make a call somewhere and that is the reason why a whole bunch of videos are now unsearchable.
Q165 Clive Efford: Following on from that, TikTok has been criticised in some countries for demoting content relating to LGBT or body positive content. Is there evidence that that is taking place in certain countries?
Yuan Yang: That is an issue where there has been evidence, which ByteDance has spoken to, apologised for and has said it would rectify its behaviour. The root cause of this issue again comes from ByteDance setting out to try to avoid any kind of conflict on its app, whether it is conflict with a local Government or any kind of politicisation on the app or even exposing its users to, as it said, bullying. However, avoiding conflict is not a consistent and helpful philosophy for life, let alone moderation of a very widely used app. The problem the company faces now is in trying to find what kind of philosophy or what kind of driving values it will use in its global approach to content moderation, having learnt the tough way that its previous model of trying to avoid anything that could be politically contentious is not going to work.
Q166 Damian Hinds: We have heard about the product divergence between TikTok and Douyin. Apparently, the operations are totally separate, and it sounds like there are quite limited synergies between them, apart from some shared branding. It strikes me, Yuan, that so long as there is a Chinese ownership stake in this business its valuation is going to be discounted because, whatever the truth about what may happen in the future, there will always be a concern about what may happen, what the influence of the Communist Party may be on the company and how regulators in the US and elsewhere will react. From a valuation perspective, if you were an investment banker advising this company on how to get the most out of this sale, what reasons could you give for why it was advantageous to maintain a Chinese ownership stake?
Yuan Yang: I will let Rui go first on the business side, but I want to add to one of the premises of that question. The two apps are not completely operationally separate in that in respect of ByteDance engineers based in Beijing or Shanghai, lots of engineers are working on what you might call the back-end foundations that supply lots of different ByteDance apps, including Douyin and including TikTok, which is the way the recommendation algorithm works, so there is some engineering overlap between the two apps. There is certainly one body of staff that supplies engineering to all of these different apps that ByteDance runs. That is one conversation on the question, but I will hand over to Rui.
Rui Ma: I was going to say the same thing. While the apps look different and have different functionality, in their essence the recommendation engine, which is the core innovation that ByteDance stumbled upon and has been working on since the founding of the company and which it is world class at, is shared across these two apps and other apps that ByteDance runs as well. For this reason—I was an investment banker before so it is not super-hypothetical—if I was advising ByteDance, because the algorithm is so valuable and you can build additional products on top of it—because it has built Toutiao, Douyin, TikTok and maybe in the future there is more stuff that can be leveraged out of this technology—it would make sense to keep it within the corporate entity.
Q167 Damian Hinds: They could also either sell it or license it. I wonder if you could address that question; it is not meant to be a tricky question at all. It is, as they say on Twitter, a genuine question. Isn’t the corporate valuation of the company eroded by the Chinese ownership majority stake and doesn’t that suggest that more value would be created out of the sale by selling it outright?
Rui Ma: Maybe if, for example, the TikTok Global entity or the business accounted for the majority of the valuation. However, the fact of the matter is, as of today, basically all of the revenues for ByteDance come from its mainly Chinese market that needs these algorithms to continue running. In fact, they are still growing. Douyin just announced reaching 600 million daily active users and it is pushing out all sorts of other products as well, so this is not a market they are going to give up. Therefore, from a valuation perspective, it is crucial that they keep it.
Q168 Damian Hinds: You can split intellectual property. Hilton Hotels did it, for example, decades ago to a US operation and an ex-US operation and there are other examples in history. You can license your technology, you can sell it at a point in time and develop two systems in parallel. I am trying to get to this point about risk and the harm that must be done to the company’s valuation by the uncertainty of what may happen in regulatory terms in the future. Surely that must outweigh any arguments against the administrative inconvenience of having to separate out intellectual property and other assets for the company.
Yuan Yang: I was also going to add, in terms of the prospects for sale of technology versus licensing, the big headache for ByteDance now is that President Trump has had his fingerprints all over the deal and has made it very much a US political issue so that Beijing’s pride is also in the balance. A few weeks ago, the Chinese Government issued a new list of export controls, including controls over the algorithm that powers ByteDance, the AI recommendation algorithm, saying, “If you want to export and sell these to a foreign entity, then you have to go through this new review process.” Many of us watching this play out interpreted that as Beijing trying to stamp its red line on the deal by saying, “The US can push us around, but we do have some defences, too.” Therefore, a sale of the technology is not just down to ByteDance investors and what they want, it is also down to whether the Chinese Government allow them to sell the technology.
I will also add that I am not sure it makes a lot of business sense for a US entity to try to replicate that technology and develop it even in the short run, although I do hear the point you make that risk is a long-running issue for the company.
Q169 Damian Hinds: Rui, can I ask you about the other partners in this arrangement? It sounds more like a joint venture than a sale. A lot has been said about Oracle’s involvement. I am fascinated by Walmart’s involvement—of course, the parent of Asda here in Britain. According to the Walmart side, it says it is excited about what it calls the partnership and its ability to “serve omnichannel customers as well as grow our third-party marketplace and advertising businesses”. I wonder if you could say a word about what Walmart’s involvement here suggests about the future of retail, the merging of channels, and whether this is about the market for “merch”. Is it about influences or is it a pure advertising channel? What do you think is the principal synergy?
Rui Ma: When Walmart announced it was throwing its hat into the fray, I think a lot of us were confused. However, as its statement proves, it does have some reasons to go after this asset. Not only is TikTok very well liked by Generation Z, which is a very lucrative market that brands like Walmart drool after, but also—as I said earlier about Douyin, for example—we have seen China leverage short video and pairing that up with influencers to reimagine e-commerce.
In the form of livestreaming e-commerce, I forget the exact number of revenue it is expected to generate this year but, for example, some of the top livestreamers selling goods on a competing platform, Taobao, are doing upwards of $400 million a month, which is equivalent to a mall’s pull for a whole year. That is something many, many platforms in the US are looking at. Amazon is trying with Amazon Live, which is also livestreaming e-commerce, and I think Walmart sees this as a great way to get on the next wave of retail. Again, it is not certain that things will unfold exactly as they have in China, but there is reason to believe there is enough momentum and enough similarity between the two markets that that could be an outcome.
Q170 Damian Hinds: Finally from me, Yuan, you were mentioning earlier the Chinese Communist Party committees within tech companies, not just TikTok but tech companies generally. Can you tell us a little bit more about them? What do they talk about? What do they do? How often do they meet? What is their relationship to the core management of these businesses?
Yuan Yang: I would love to know more about them, but I can tell you what I do know. This is a drive that President Xi has been pushing for the last few years, this idea of party leadership in all companies. As with many CCP drivers, it may well be a drive that has no specific end goal and its aim is simply that it is a positive thing in general that the Communist Party is involved in some way in these companies. That is not to say that there is a CCP blueprint for ByteDance or a blueprint for Alibaba or whatever, but there is just a general sense that we want to have some say in the company even if we are not exactly sure what we want to do with the company.
The Communist Party committee in ByteDance appears most active, at least most widely written about, in 2018 when the company came under a series of damning indictments from the Chinese Government because it produced a few different social media channels in China that I think your mother definitely would call trashy. They had what the Chinese Communist Party regarded as a bolder content, and bolder content is actually illegal in China, so those apps were shut down. In fact, there was a time in 2018 when it looked like ByteDance, the whole company itself, might get heavily penalised, if not shut down. At that point, the party committee inside ByteDance came into life and held lots of meetings, trying to raise the ideological and political purity of the company, and the leadership had to make apologies. Mr Zhang himself had to write an apology saying he would do better and abide by better social values in the longer run.
I do think a lot of these party meetings, denouncements, apologies and criticisms are really for political show. It is what happens when a regulator slaps you and you have to then say, “I am very, very sorry and I really, really apologise.” Whether Mr Zhang may actually believe it or not, he had to and that is the price of doing business in China. That is what we know about the party committee at a public level.
Q171 Kevin Brennan: I was interested in what you were saying there. I thought Theo Bertram presented the Committee with a picture of a progressive, forward-looking company very, very interested in the welfare of its users, very interested in co-operating with this Committee and with the Government in terms of any legislation around online harms. At the end of the session I got the distinct feeling that the Committee, talented as we all are, had failed to land a single blow on the witness.
Were there questions that we should have asked Theo Bertram, representing TikTok, which we did not ask or answers we should have pursued further in your opinion from what you saw of that session? Rui, I know you only saw part of it so I will ask Yuan first.
Yuan Yang: First, we all recognise that TikTok in broad terms, having managers who may be individually concerned for its users, is also trying to maximise its reach and the amount of time people spend on its platform, and the investors on the board are clearly trying to maximise their investment return. All these things that Mr Bertram asserts exist do exist against the backdrop of what the company is trying to do overall as a multinational company making tech platforms.
I would be certainly very interested in finding out what commitment TikTok can make to transparency for its algorithm and for its data privacy practices—although I think the algorithm is much more important—and transparency for content moderation, so the human side of content moderation as well as the technical side. Saying we have a hypothetical transparency centre, which as far as we can tell nobody has visited yet or no research has been done with, is not quite enough there.
Certainly, it is very difficult to get any guarantee of lack of Chinese Communist Party influence in the company. For a number of reasons, even if Zhang Yiming himself wanted to give such promises he would not be able to talk about them publicly, certainly not to the UK Parliament. It is very difficult to see how to resolve that whole issue, other than to continue to allow ByteDance to localise and devolve itself in the UK and the US and the various different markets that it operates in.
Q172 Kevin Brennan: Rui, was there anything we missed that we should have been pressing on?
Rui Ma: I do not know about necessarily getting an answer from Mr Bertram or TikTok immediately, but one of the things that troubles me with companies wholly run on an algorithm, of which TikTok is an example, is this is a new breed of company and we do not know what the impact is on society. A lot of the solutions we have talked about so far are about understanding censorship, suppression, content moderation and so on, but what about the long-term effects? I would love to see more commitment from the company on engaging academics or maybe this Committee to study longer-term effects, because this is new to humankind and we do not know how it is going to change how we live.
A lot of the other questions that the Committee asked have been asked in China as well. Things like what about the fact that sometimes more negative, vulgar or inappropriate content tends to go viral and what is the platform doing about that; maybe it has content moderation policies in place that we are happy with, but what about the long-term effects; does content necessarily all become worse and worse over time or do we even know that happens because there is no longitudinal study of what is going on?
Also, on filter bubbles, I think Mr Bertram gave an answer that I have heard before from the company about how the algorithm works and how they try not to have filter bubbles so you are not just exposed to the same point of view over and over and become resistant to other points of view. However, is there proof that over the long term that is not what is happening to your constituents or to the citizenry? For me personally, not just periodic audits and making sure that they adhere to policy, but a longer-term study would be interesting. I do not believe that has been done anywhere.
Q173 Kevin Brennan: I was interested in what you were saying, Rui, about how the company operates in the Chinese version of the platform. Would it be unfair to describe it as almost evolving there into an online shopping channel—you call it e-commerce but basically it is a new way of getting people to shop—whereas in the west it is at the moment still simply a platform that will make its money through—I will say “exploiting”, you can interpret the phrase—exploiting the musical content and the way users add to it in order to make a profit and also to drive advertising to the site? What would your comment be on that as a way of framing what the company does, and does that mean in the west it is going to develop into a sort of shopping channel?
Rui Ma: My answer will be not necessarily; I can just tell you what has happened in China. Starting in 2016, Alibaba and also a ByteDance competitor called Kuaishou went big into livestream e-commerce. Livestreaming itself has been around in China for over a decade in the form of entertainment—people showing off their talents and getting tipped. That has been a big business in China for a long time now. I think in the west you do not see that nearly as much; you see that more on the gaming side and it is starting to reach beyond gaming. About four years ago, as I was saying, e-commerce started to take advantage of this so that you can buy products as you are watching the livestream. Of course, this required the app to have a lot of different features and requires the creator to be working with brands or making their own products, but most likely working with brands that are willing to invest in this kind of distribution channel.
In China this is quite mature. Douyin is one of the late movers into this field. Their competitor, Kuaishou, has moved a lot more product in e-commerce, but because of Douyin’s sheer size it is probably quickly catching up. We do not know what the exact numbers are yet. It is a distinct possibility that it is a way that we here in the west will also consume. A lot of venture capitalists in Silicon Valley certainly are very excited about the prospect.
Q174 Kevin Brennan: When Theo Bertram in his evidence said music is the driver of TikTok, is that also true in the Chinese version?
Rui Ma: Music was one of the core features that the team initially decided on when they launched Douyin because they were copying Musical.ly, which was very music based. Douyin today is definitely beyond music. There is a lot of professional generated content or what I call PUGC, professional user generated content, where you have teams of people, whether it be amateur filmmakers or whatever, making longer content. In China I have often said that it feels a lot more like YouTube than TikTok does, which is still very much the one minute and under short video.
Q175 Kevin Brennan: Sorry, Yuan, I did cut across you. Was there something you wanted to add to that?
Yuan Yang: Going back to the question you posed around what else you could have asked Mr Bertram, I think we are all grappling around the issue that there is an unevenness of regulation here. We are not quite sure what the model of regulating a social media platform looks like and yet there is a long history of broadcast media regulation at least in the UK for, say, TV channels. Teenagers spend a lot more time on TikTok than they do watching TV, so the onus of regulation and what content is influencing—people have moved.
But platforms like TikTok do not consider themselves primarily as editors or as content moderators or as channels. If you were to ask Mr Bertram what the editorial policy of TikTok is and what the editorial values are, they may find that a very strange question to think about because they would rather not have to have such a policy, whereas newspapers or TV channels will do so because they are used to themselves as content curators and creators.
Kevin Brennan: Thank you. We should have asked you that before the session, shouldn’t we? Thank you very much.
Chair: Thank you, Rui Ma, creator and co-host, Tech Buzz China, and Yuan Yang, deputy bureau chief and former chief China tech correspondent of the FT. Thank you very much for your evidence today. That concludes our session.