Supplementary written evidence submitted by TikTok
Thank you again for the invitation to give evidence to the Welsh Affairs Select Committee on 10th May. During my testimony, I committed to following up with the Committee on a number of issues. I have therefore provided commentary on these points below.
During the discussions with Wayne David MP, we talked about Project Clover, TikTok's comprehensive plan for ensuring that UK and EEA users' data is protected. While I explained the core of Project Clover during the session, for clarity, the key elements are:
1. Data Stored in Europe, for Europe: We are moving our UK and EEA users' data to data centres in Ireland and Norway, meaning all UK and EEA users' data will be stored locally.
2. Comprehensive gateways for who can access what: We are further strengthening our internal gateways, which ensure that only employees who need specific data in order to do their jobs can access that data. Importantly, this will mean that 'protected personal data' will not be accessible to employees in China.
3. Third-Party Oversight: In an industry first for platforms like ours, we will be appointing a European third-party IT security company, which will oversee and monitor the operation of our gateways, providing additional scrutiny of our practices.
More information on Clover can be found here. We will also ensure we keep individual Members of the Committee, and the House more widely, up to date with key milestones in delivering on this programme, which is already underway.
In answer to a question from Rob Roberts MP, I committed to providing the Committee with information on the number of underage (i.e. under 13 years old) users we remove from the platform.
As mentioned during the Committee session, we were the first platform to be transparent about these figures and remain the only platform to publish them on a regular basis. Each quarter, we publish the number of accounts of suspected under-13s we remove at a global level. From October to December 2022, the latest figures available, we removed 17,877,316 accounts suspected of being underage. I can also confirm that we have shared the country-level data with Ofcom as our regulator under the Video-Sharing Platform (VSP) regime.
In a related question, Ben Lake MP asked whether we had information about the average time between an account being created by an underage user and it being identified and removed. I committed to checking whether we held this data. Having made enquiries, I can confirm that, unfortunately, this is not readily calculable.
Rob Roberts MP asked for the reasoning behind TikTok (and other platforms) requiring users to be at least 13 years old, and why we had settled on this threshold.
There are two key reasons for the age 13 threshold. Firstly, in the EU and the UK, the 'Digital Age of Consent' is set at 13. This is confirmed by the Information Commissioner's Office, which states:
"If you are relying on consent as your lawful basis for processing, when offering an online service directly to a child, in the UK only children aged 13 or over are able to provide their own consent."
Secondly, for global platforms like TikTok, this approach is mirrored in the US under the 'Children's Online Privacy Protection Rule', which sets 13 as the minimum age at which certain important data handling processes can operate.
It is important to note that TikTok takes an age-appropriate design approach to our platform, and so not all functionality is available from age 13. For example, our Direct Message services are not available to users under 16, and for 16 and 17 year olds are set to 'off' by default. There are also a number of services only available to users aged 18 and over. These include the ability to host a LIVE stream or to send gifts as part of our LIVE services, and the ability to use monetisation features. More information on these issues can be found here. For those under 18, we also have a default one-hour daily screen time limit. To support parents of teen users, we also provide a 'Family Pairing' system that enables parents to set preferences for their teens' accounts. More information on these tools can be found here.
Wayne David MP also asked us about our position on the Online Safety Bill. As mentioned during the Committee session, we are in favour of the Bill and have always supported regulation on safety issues as an important part of scrutiny for companies like ours. As also mentioned, TikTok is already regulated by Ofcom as a VSP.
I committed to providing further detail on the areas of concern we still have with the Bill. These concerns primarily relate to future-proofing the legislation and ensuring that it works effectively for content platforms like TikTok in addition to more traditional social networks such as Facebook or Twitter.
Our concerns relate to clauses 12(6) and 12(7)(b) of the user empowerment duties, which require services to "filter out content originating from unverified users from their feed". We support the intent behind this clause. However, we believe that in practice the clause will not achieve this shared goal and could exacerbate the risk of so-called 'filter bubbles' by reducing our ability to ensure that users see a diverse range of content. I have attached a confidential, more detailed briefing for Committee Members.
Thank you again for the Committee's time and we look forward to continued engagement with you in the future.
25 May 2023