Written evidence submitted by TikTok (OSB0181)

 

Executive Summary

TikTok strongly supports the Government's plans to legislate on online harms. It is our view that the user-generated content sharing industry should operate within a clearly defined legal framework established by Parliament. Our submission seeks to contribute to this goal and is guided by the following overarching principles:

       Principles-based approach: We welcome the principles-based approach contained in the Bill, which recognises that each content sharing platform is different and is used differently by users. Our submission is intended to reflect how the new regulation will operate in relation to our platform and the existing work of the industry. We think that the principles-based approach is appropriate to ensure flexibility across the range of platforms and services that will be in scope, without compromising the effectiveness of the legislation.

       Legal clarity and certainty: To ensure that the proposed legislation brings clarity to the roles and responsibilities of platforms as well as those of Ofcom and the Secretary of State, and to the sequencing and timelines within which various legal obligations will take effect. We believe that additional clarity on these roles and responsibilities will facilitate compliance and ultimately enhance safety while minimising the potential impact on free expression caused by over-moderation. We look forward to working with the Committee and the Government on how it is implemented.

       Proportionality: To ensure that the obligations contained in the proposed legislation are proportionate to the risks and allow for appropriate balancing of competing rights, especially safety, privacy and freedom of expression and information.

       Practical “workability”: Alongside other factors, the safety of users is heavily influenced by the measures put in place by platforms (including the examples we have listed below). Accordingly, it is imperative that the obligations imposed on platforms under the proposed legislation are both practical and workable so that they will have the intended positive real-world impact as envisaged by the legislation.

In structure, our submission mirrors the Call for Evidence. We set out why we agree with the objectives of the Bill and the systems and processes approach it takes, and how these could be better achieved in respect of child protection, drawing on international examples. On content in scope, we explain the challenges of the existing exemptions as drafted, and our view that, if legal but harmful content is included, there must be clearer legal definitions of harms. On services in scope, we address our expected obligations as a Category 1 company and our broad agreement with the rationale behind categorisation.

In the section on the powers of Ofcom we set out the ways in which the regulator should be encouraged to work with industry, academics and experts on Codes, and the continuing risk of placing too much responsibility on the Secretary of State, which would not set a suitable international precedent. In our final section, we identify aspects of the legislation that could undermine its intended impact, and address practical challenges in the timeline for implementation of the Bill as currently drafted.

 

Introduction to TikTok

 

TikTok values the chance to respond to the Committee’s invitation to make written submissions. The aim of this response is to help inform the Committee's work with a view to Parliament introducing the online safety regime that is most effective, efficient and workable, and fit for purpose in the 21st century.

 

As a challenger platform with safety at the core of what we do, we consider that TikTok is especially well placed to provide insights on the proposed legislation. Our intention in this submission is to share our perspectives and highlight where we consider further clarity is needed at this stage, to discuss where there are potential gaps or conflicts in the legislation that could undermine its intended impact, and address practical challenges in the implementation of the Bill as currently drafted.

 

TikTok is a global, short-form video platform that provides its users with a vibrant, creative experience in a fun and safe environment. Our mission is to inspire creativity and bring joy. For TikTok, creative ideas matter more than social connection, and people on the platform are celebrated for being their authentic selves. The content tends to be light-hearted, authentic, real, heart-warming and truly fun, but it can also serve to educate our user community about the diverse range of issues of importance to them. TikTok opened its UK office in 2018 and we have grown our UK presence since then. We are proud to be engaged in the conversation around how we can play our part to make the UK the safest place to be online.

 

Indeed, the intentions behind this Bill reflect TikTok’s own prioritisation of safety from our inception. Ensuring a safe and secure environment for TikTok's users is our top priority now and will remain so regardless of new legislation. A key part of our regional strategy for prioritising user safety involved the establishment of our EMEA Trust & Safety Hub in Ireland. This dedicated regional hub allows for an even greater focus on strengthening policies, technologies and moderation strategies and ensuring that they complement both local culture and context in the region, including the UK. The hub also collaborates closely with regional regulators, policymakers, government and law enforcement agencies where required in the continued pursuit of promoting the highest possible standard of user safety.

 

To provide context to our submission, we set out below a brief overview of our commitment to safety and details of some of our safety practices.

 

Our commitment to safety

 

From inception TikTok has taken a safety-by-design approach to protect our users’ safety and well-being. For example, our direct messaging feature is designed so that it is not possible to send an attachment (i.e. a non-TikTok video or image), and direct messaging is not available to under-16s. There are three main ways we approach safety, which are briefly explained below:

 

  1. Our Terms of Service and Community Guidelines reflect our values and establish the kind of behaviour we expect from our community of users. We enforce these rules using a combination of cutting-edge technology and thousands of safety experts based around the world. For UK users, this work is led from our EMEA Trust & Safety Hub in Ireland, supported by an expert team of Trust and Safety moderators in the UK.

 

  2. We develop robust safety policies and features, including default settings, restricting direct messaging and livestream to over-16s and, through Family Pairing, enabling parents to work with their teen to help them manage their TikTok experience. We actively promote many of these features to our users to ensure they have a genuine impact, and to ensure our users and their parents and care-givers are aware of digital safety issues. Recent examples include our Guardian's Guide, and new digital safety resources for parents and care-givers.

 

  3. We collaborate with industry partners to make the digital world safer for everyone. We work with issue experts and safety organisations, and we are participants in industry-wide initiatives such as the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, the EU’s Code of Practice on Disinformation and the Code of Conduct on Countering Illegal Hate Speech Online.

 

However, there is no finish line when it comes to protecting the TikTok community. We work each day to learn, adapt, and strengthen our policies and practices to keep our community safe. In the last year alone we have made numerous announcements including:

 

       October 2020: we implemented improved notifications to provide clarity to users around content removals;

       December 2020: we strengthened our Community Guidelines to better support the well-being of our community (and all users were notified);

       January 2021: we changed the accounts of both existing and new users under the age of 16 to private by default;

       February 2021: we introduced new prompts to encourage users to consider the impact of their content before they share it, and announced a partnership with INHOPE, a global network of 47 child protection hotlines that help fight child sexual abuse material online;

       March 2021: we launched our European Safety Advisory Council and introduced new features to control comments and prompts to promote kindness;

       April 2021: we announced our European Transparency and Accountability Centre;

       May 2021: we launched #FactCheckYourFeed, a new media literacy campaign to promote critical thinking and rolled out a refreshed Safety Centre with new guides and resources aimed at supporting digital safety and security conversations among families;

       June 2021: we announced that we are now publishing quarterly Community Guidelines Enforcement Reports as part of our broader transparency reporting efforts, including the number of suspected underage accounts removed, and we joined the Technology Coalition, an organisation that works to protect children from online sexual exploitation and abuse;

       August 2021: we further enhanced the protective measures we have in place for our teenage users, including changing default DM settings for 16 and 17 year olds to ‘No One’, limiting push notifications and proactively asking younger users to set privacy settings when they upload their videos; and

       September 2021: we announced new Family Pairing resources offering digital safety advice developed in collaboration with teens and online youth safety experts, and added additional well-being resources to support our community.

 

We believe many of these features demonstrate our continued commitment to providing a safe and enjoyable experience for our users and are in the spirit of the Bill. We look forward to working with the Committee and the Government over the coming months on how it is implemented.

 

Our approach to age assurance

 

We also acknowledge the focus and desire to learn more about the age assurance systems applied by online platforms, and we would like to use this opportunity to outline our approach. TikTok applies a risk-based approach to age assurance measures, which includes continual efforts to keep users who do not meet the minimum age requirement off the platform and to provide an age-appropriate experience on the service. TikTok users have to be 13 years old or older to have an account on our platform. In the first quarter of 2021 TikTok removed 7,263,952 users for potentially being under the age of 13. This is less than 1% of all accounts on TikTok. By using TikTok, users confirm they are over the relevant age (our Terms of Service state that people “may not… access or use the Services if you are not 13 years or older”). TikTok retains a 12+ App Store rating and a Google Play Store rating of “Parental Guidance Recommended”, which are the most appropriate options from those offered by these app store providers. Parental controls on each app store allow parents to block apps with such ratings on their children’s devices.

 

To help keep people off TikTok if they are not yet old enough to use it, we have designed a neutral age gate that requires people to fill in their complete birthdate, rather than simply clicking a pre-populated minimum age. We do not prompt users to enter a ‘correct’ date of birth. If someone does not meet our minimum age requirement, we suspend their ability to attempt to re-create an account using a different date of birth. While most people understand the importance of being truthful about their age, some do not provide the correct information, which is a challenge many online services face. That's why our commitment to enforcing our minimum age requirements does not end at the age gate, and we take a number of additional approaches to identify and remove suspected underage account holders.
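For illustration only, the age-gate logic described above can be summarised in the short sketch below. This is not TikTok's actual implementation: the names used (MINIMUM_AGE, age_gate, suspended_devices, device_id) are hypothetical, and the real system relies on many more signals than a single date-of-birth check.

    # Minimal illustrative sketch of a neutral age gate (hypothetical names throughout).
    from datetime import date

    MINIMUM_AGE = 13           # minimum age to hold an account
    suspended_devices = set()  # devices blocked after an under-age attempt

    def age_on(birthdate: date, today: date) -> int:
        # Full years elapsed between birthdate and today.
        had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
        return today.year - birthdate.year - (0 if had_birthday else 1)

    def age_gate(birthdate: date, device_id: str) -> bool:
        # The sign-up form is 'neutral': nothing is pre-populated, so the user
        # must enter a complete date rather than accept a default minimum age.
        today = date.today()
        if device_id in suspended_devices:
            return False  # an earlier under-age attempt blocks re-creation with a new date of birth
        if age_on(birthdate, today) < MINIMUM_AGE:
            suspended_devices.add(device_id)
            return False
        return True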

 

We train our safety moderation team to be alert to signs that an account may be used by a child under the age of 13. We also use other information provided by our users, such as in-app reports from our community, to help identify potential underage accounts. When our safety team believes that an account may belong to an underage person, the account will be removed unless the user can provide appropriate evidence to prove they are over the minimum age.

 

We consider that age assurance is an industry-wide challenge but we are deploying the above measures to both assess the level of risk on our platform and act accordingly. Of course we know there is more work to be done and we are committed to working collaboratively with industry peers, regulators, and key stakeholders to find appropriate solutions.


Objectives of the draft Bill

 

Consistent with our commitment to safety, TikTok welcomes the systemic approach to regulation taken by the Government, and recognises that the regulatory framework outlined is one that looks at systems and processes rather than individual pieces of content. It is our view that a principles-based approach where common principles to protect users are established in law, and where the regulator works closely with platforms to ensure they have appropriate systems to tackle the harms on their platform, is a good way to avoid common challenges of other regulatory proposals in this area.

 

Innovation and the UK’s economic growth

 

Given the draft Bill focuses on outcomes and platforms’ ability to protect their users online through a duty of care, we believe that the Bill will be applicable to a wide variety of platforms and services, and is broadly future-proofed by not prescribing the specific technical means that must be used to achieve compliance. We consider that this builds on the commitments made to promote innovation as part of the DCMS Plan for Digital Regulation, published in Summer 2021.


We would encourage Ofcom to use its proposed powers (in Part 4 of the Bill) to require platforms to use specific technologies only once all other avenues have been explored. We believe that continuing to take an evidence-based approach that favours an assessment of outcomes, not mandated procedure, is the best way to incentivise the creation of newer and better safety technologies in the future. Critical to this will be a clear understanding and definition of the terms “prevalent” and “persistently prevalent” in the Bill, given their importance in assessing compliance. We would welcome early clarity on the understanding and definition of these terms. This will ensure that platforms are motivated to evolve their approaches to safety, and will also avoid putting a predetermined burden on new entrants into the market.

Protections for children

As set out above, we already take upholding safety on our platform very seriously and are deeply committed to ensuring a safe and positive experience for all our users, especially our younger users.

 

As regards the protection of children under the proposed legislation, we would make several observations. First, we welcome the risk-based approach through the proposed children’s risk assessment requirement as being the most appropriate approach. Second, we consider that certain aspects of the proposed legislation could make the underlying goals difficult to achieve in practice and at scale. For example, in many situations it may be difficult to determine what constitutes “non-designated content that is harmful to children” and therefore to implement effective measures, raising the risk of over-moderation of legitimate speech. Furthermore, we consider it would be operationally challenging to take account of the specific characteristics of a particular child when applying the “child of ordinary sensibilities” assessment. Finally, given the recent entry into force of the Age Appropriate Design Code (AADC) we think that it would be a welcome addition if the Secretary of State and Ofcom were required by law to have regard to its content (and/or to consult with the ICO) before introducing child-specific regulations and/or guidance. Given the overlap in safety and privacy issues, this measure, we consider, would help to ensure consistency in protecting the best interests of children in the UK.

 

International comparisons

 

As already articulated, TikTok welcomes the UK government’s focus on systems and processes rather than individual items of content or specific instances of behaviour; this could clearly underpin a proportionate and nuanced framework that sets an international example. Nonetheless, without clearer definitions and a more definite process for designating harms, there is a risk of an open-ended or overlapping number of harms and obligations on platforms in the UK. The risk is that this encourages over-moderation, or makes it hard for platforms to fulfil their duties, in such a way as to undermine the effectiveness of the overall approach.

 

We would highlight to the UK government the EU’s proposed Digital Services Act’s Good Samaritan provision, which serves to create legal clarity on platforms’ responsibilities and enables more proactive safety measures without risking additional liability. Similarly, we would draw attention to both the EU Audiovisual Media Services Directive and the UK AVMS Regulations, where the complaints processes are defined and limited in scope such that there is a clear distinction between measures for “reporting content” and “complaints about those measures”, thus clarifying platforms’ specific obligations.

 

Content in scope

 

TikTok supports an approach to regulation that seeks to avoid a generalised definition of harm, focusing instead on a number of clear categories and definitions of harm, and we would encourage the UK government to refine its approach further to achieve this clarity. As it stands, the Bill contains multiple and potentially overlapping definitions which may be challenging to implement in practice; for example “content that is harmful to children”, “primary priority content that is harmful to children”, and “non-designated content that is harmful to children”. Further complexity is added where the Bill envisages that these terms should be assessed by reference to a “child of ordinary sensibilities”. Greater legal clarity on these types of overlapping categories will support focused action by platforms on the harms most pertinent to the individual service.

 

These challenges are further compounded when considering content which may have a “direct or indirect” impact on an individual. Moderators are unlikely to have sufficient context to assess, consistently and effectively, the potential indirect consequences of an adult consuming certain content. Given these difficulties, and the risk of inconsistent moderation, such a duty should instead be limited to content which has a direct impact only. This is particularly true for adults, as it would be consistent with the policy intent to impose a lighter form of obligation for content that is potentially harmful to adults.

 

This challenge has been addressed in different regulatory proposals and we consider international approaches to be a helpful guide here. For example, we note that the EU’s proposed Digital Services Act excludes “harmful but legal” content from the core operative obligations, and instead focuses on illegal content. Meanwhile, Ireland’s draft Online Safety and Media Regulation Bill provides clarity and certainty through a limited number of clearly defined categories of harm captured by that regulation.

 

While the UK government is choosing not to mirror either of those approaches explicitly, we believe that the UK’s legislative framework needs to establish precision and clarity over what constitutes “harmful but legal” content if it wants to include this content within the scope of the Bill.

 

More broadly, the Government should seek to consult, coordinate and align with these other regulatory regimes where possible, to contribute to better compliance, more effective and consistent moderation, and ultimately improved safety. For example, as highlighted above, the inclusion of a Good Samaritan provision and the retention of the prohibition on general monitoring obligations would enable broader safety features to be implemented for UK users.

Further to this, and acknowledging the Government’s intention for some "harmful but legal" content to be within scope, we would urge that the regulatory requirements and expectations for this type of content be different, and enforced differently, from those for clearly illegal content such as terrorist content and child sexual abuse material. Part of this will be creating a clear process to designate and define the categories of harms in scope, both for adults and for children. While this is currently left to secondary legislation, we advocate a ‘double check’ process whereby Ofcom conducts research and collects evidence on the prevalence and risk of suspected harm, before recommending clear definitions for Parliamentary approval. Recognising the changing nature of harms, this should also include a process to amend and remove categories of harm, to ensure that the list of priority harms does not become overly burdensome or risk curtailing freedom of expression.

In addition, we clearly support the approach to CSEA and terrorist content outlined in the legislation, which builds on our existing approach and work with government and regulators. We have been working closely with Ofcom and the Home Office to understand the requirements posed by the interim Codes of Practice. We now consider it important that, alongside the passage of the legislation, Ofcom shares full details of the draft and final statutory Codes in good time to give businesses the legal clarity they need to implement them effectively. This would reflect both the serious nature of these harms and the progress already made through the interim Codes and the ongoing dialogue between the Government, regulators and industry on this type of content.

 

We believe that it would be beneficial to industry if the legislation were precise in how it stipulates the timelines within which these Codes are to be adopted and enter into force. Clear timelines would enhance the ability of platforms to ensure they are compliant with these statutory Codes of Practice. Specifically, clear timelines would contribute to platforms’ ability to implement the internal cross-functional activity and programmes needed to put operational measures in place and enhance overall safety in these important areas. In addition, we would welcome a lead-in period from the date of a code’s publication before it enters into force (at present, the Codes would enter into force on their date of publication, leaving little or no time for preparation by platforms).

 

Risk Assessments

 

TikTok considers that the risk assessments by Ofcom set out in the draft legislation are an appropriate mechanism for providing a genuine, detailed assessment of the suitable threshold for addressing harms. We would now welcome an approach that gives sufficient time for companies to undertake their individual assessments, reflecting that different platforms will likely want and need to take nuanced approaches to these assessments.

 

While TikTok places a heavy emphasis on the importance of assessing and mitigating risk, an unduly onerous approach to risk assessments would not only have negative impacts on innovation on platforms and product updates, but could also delay the launch of new, or enhancement of existing, safety features. The current high volume and intensity of risk assessments to be completed in short time periods represents a shift away from a focus on the outcomes achieved and towards administration and process, potentially to the detriment of continued user safety.

 

TikTok is committed to improving the safety of our platform, and the experience of our users. As outlined at the beginning of this response, we are continually innovating and adapting the design and operation of our service to launch or improve safety features available to our users. To ensure these administrative steps do not have the unintended consequence of reducing user safety, we would encourage a more flexible approach to risk and impact assessments. This could be achieved by extending the three-month deadlines to provide more time, or by moving away from the requirement to conduct a formal risk assessment before each change to the platform is rolled out and towards a periodic update, which would capture the latest developments while reducing the impact on innovation.

 

Journalistic Content

 

We agree that freedom of expression needs to be protected as part of the Bill and we are glad that this is something the Government has had front of mind when drafting the legislation. However, TikTok has some remaining concerns about the proposed exemptions for content of democratic importance and journalistic content set out in the legislation. While we are clear that the environment needs to be safe enough to allow everyone to articulate their views, even unpopular views, this should not act as an excuse for hate speech or misinformation. We are concerned that if this exemption is as blanket as it appears in the draft legislation, it could, on the one hand, become a backdoor for bad actors on the platform and, on the other, fail to exempt content posted by individual journalists rather than news publishers.

 

We would draw attention to the fact that TikTok already accepts that, in order to protect free expression, some content should remain on the platform in the public interest. Our Community Guidelines explicitly allow exceptions for content under certain circumstances, such as educational, documentary, scientific, or artistic content, satirical content, content in fictional settings, counterspeech, and content in the public interest that is newsworthy or otherwise enables individual expression on topics of social importance. However, if the Government intends for this exemption to be applied much more broadly, as is implied in the draft legislation, there also needs to be clarity on the classification of a news provider and the definition of news provider content. In particular, we do not consider it would be appropriate for individual content moderators to make these sometimes finely balanced distinctions at scale.

 

Specifically, it would be very difficult for moderators to consistently and reliably assess how content should be categorised under the definitions of journalistic content or democratic importance currently set out. Every piece of content would have to be individually considered against these new metrics with (as it currently stands) no clear legal framework or parameters. This could lead to inconsistent enforcement and either free speech being curtailed or harmful content not being moderated because of an understanding that it fell under these exemptions.

 

Services in scope

 

Each online service is different and is enjoyed differently by its users, which means that a ‘one size fits all’ approach is not realistic when it comes to this kind of regulation. As such, TikTok welcomes the UK government’s efforts to establish a differentiated approach, where common principles and duties are established in law, but where the regulator works to ensure that individual platforms use the systems most appropriate to them to achieve these ends. Within this framework Ofcom should be given sufficient flexibility as the regulator to reflect the nuances of different types of services and to act accordingly through its risk assessments and Codes of Practice. To ensure this approach is balanced, proportionate, and ultimately effective, it is equally important that Ofcom sets clear regulatory expectations through its risk assessments and regulatory guidance.

 

In line with this, we broadly agree with the Government's proposed approach to establish category thresholds that are based not only on size of audience but also on the functionalities offered. We consider that this ensures that arbitrary value judgements are not made about harm or impact based only on the size of a platform. In practice, we consider that this should mean that the risk around functionalities is assessed according to evidence received, rather than public perception.

 

Role of Ofcom

 

We strongly welcome the Government’s designation of Ofcom as the responsible regulator for this area, and we look forward to continuing our constructive relationship with them. We hope that they will be able to work with the Secretary of State so that together they can provide the ‘double check’ function, mentioned previously, on the designation of priority harms and Codes of Practice.

 

However, while we understand that oversight of the regulator is essential for confidence in any regulatory framework, we are concerned that the proposals as currently drafted risk putting sole or final onus on the Secretary of State to make finely balanced legal decisions. For example, we recommend that the Secretary of State be required to consult with Ofcom before making any regulations to define ‘priority illegal content’. In addition, we support the idea that Ofcom should be the responsible authority for drafting Codes of Practice, but have some concerns about the burden of responsibility placed on the Secretary of State in authorising these. In the future, this could increase the risk of reactive decisions motivated by political pressure that do not address systemic harms (on an evidence-based approach), but rather single instances of notable content.

 

Relatedly, TikTok would like to see greater clarity on the wider process that will be used to create new Codes going forward. The Government has said that Ofcom will "consult with relevant parties" during the drafting of Codes, and it would be helpful to specify whether this will include industry (as in the proposed process in Ireland, which would require the responsible regulator to first consult with relevant parties including academics, civil society, technical experts and industry). We believe the existing consultation process could be strengthened by requiring Ofcom to consult with relevant academics, civil society groups, industry, and technical experts in the development of the Codes. In the case of age-inappropriate content it would also be helpful to consult with child development experts, the ICO, parents and carers, and young people themselves. Formally including these groups in this process will enable Ofcom to give due regard to its obligations and duties, while helping to make sure that the Codes produced are workable, effective and technically feasible. Consultation with the ICO will also ensure the approach taken is consistent with the approach already being taken in the ICO’s Age Appropriate Design Code (and we note that the proposed Irish legislation in this area would impose a requirement to consult the relevant data protection authority).

 

In addition to clarity on the process, it would also be valuable to establish a sense of how many Codes the industry can expect because, as currently drafted, this could be interpreted as open-ended.

 

Media Literacy

 

More broadly, we believe that Ofcom has an important part to play in facilitating media literacy, and we have been encouraged by the regulator’s focus on this area to date. We agree that media literacy is fundamental to empowering users with the skills, tools and information they need to critically assess what they see online and to identify and respond to it proactively. These educational and resilience-based initiatives are a key part of a comprehensive and inclusive strategy that empowers users of all ages to navigate the online world safely and securely.

 

We strongly consider that media literacy content and campaigns should be tailored to the unique structure and user base of each online service. For example, TikTok is a full-screen, video-first sharing platform. Content that performs well on TikTok needs to be creative, authentic and engaging. This applies equally to media literacy campaigns, just as it does to user-generated content. Campaigns or content designed for other platforms that are text-based or static will not be as effective as TikTok-native interventions. We therefore believe it is best for platforms, in collaboration with Ofcom, to decide how best to achieve their own media literacy obligations, taking into account these nuances and reflecting the detailed understanding of their userbase’s needs and requirements.

 

With this in mind, we have already taken a range of actions of our own to help improve users’ media literacy and to raise awareness of the tools and information available to them. Our website features a range of media literacy tools and resources to keep our users safe and informed. These are brought together in our Safety Centre, which provides information and advice including resources on anti-bullying, Covid-19 educational videos, links to the websites of our safety partners, and a page for parents who want to find out more about the platform. Furthermore, TikTok's Youth Portal offers both in-app tools and educational content to help teens learn about digital safety. Topics include internet security, personal privacy and community best practices.

 

Our educational video series, “You're in Control”, presents TikTok's safety and privacy controls in an accessible and easy-to-understand fashion. We created this short-form video series, and involved several of our most popular creators, to educate users about safety in the TikTok format they're most accustomed to viewing. The videos can also be accessed directly in-app @TikTokTips. These videos reinforce our Community Guidelines and offer users mini 'how to' tutorials including how to: block a user; report inappropriate behaviour; filter comments; make an account private; and set screen time limits.

 

Finally, we also run dedicated media literacy campaigns. For example, this summer we launched our #FactCheckYourFeed campaign to equip the TikTok community with the skills they need to critically engage with content, navigate the app safely, and guard themselves against potential harm. Collaborating with top creators and expert partners like Citizens Advice, The Jo Cox Foundation, The Student View, and the British Dietetic Association (BDA) to promote insightful content on the app, TikTok has produced five instalments on complex topics including spotting vaccine misinformation, making financial decisions, identifying misleading information about diet and exercise, and improving media literacy and critical thinking skills. In a time when it's more important than ever that we can trust and engage with what we're watching online, the #FactCheckYourFeed campaign aims to encourage the TikTok community to dig a little deeper and think a little wider.

 

Conclusion and ensuring compliance

 

In conclusion, TikTok welcomes the principles underlying the Online Safety Bill and looks forward to working together with government departments and Ofcom towards their intended objective of making the UK the safest place in the world to be online. This legislation has the potential to be both effective and future-proofed, whilst avoiding the pitfall of encouraging blanket over-moderation.

 

Alongside this, we consider that timely compliance with the Bill will be best achieved by having a clear and workable timeline for when obligations come into effect, clearly defined duties for platforms, and a clear sense of where the relevant regulatory powers lie. This will help to get this legislation right the first time round, and enable companies to implement these measures with maximum efficacy, interacting with the appropriate regulators and stakeholders as they implement their obligations.

 

Therefore, TikTok would encourage the Committee to consider, firstly, whether the Government could provide further clarity about when the various powers and changes set out in the Bill will come into force, especially if dates vary across different provisions. This will ensure that platforms can better prepare for its implementation and thus enhance practical compliance.

 

And secondly, we would encourage the Committee to consider this draft legislation through the lens of avoiding any regulatory duplication or divergence, for example between the remits and approaches of Ofcom and the ICO. While certain mechanisms have now been established for cooperation, like the Digital Regulation Cooperation Forum, we consider that certain provisions in the Bill could lead to inconsistent approaches between regulators. For instance, Ofcom’s mandate under s. 36 (5) to prepare a code of practice in relation to “the protection of users from unwarranted infringements of privacy” potentially ventures into the ICO’s sphere. Similarly, in respect of age assurance issues, these measures are already well developed in the ICO’s AADC. In such cases, Ofcom should be required to consult with the relevant other regulator, so as to limit any inconsistencies or confusion about what is required of platforms.

 

Taken together, we consider that this additional clarity will help ensure that the Bill achieves its aims; not only in policy terms but in terms of practical compliance.

 

28 September 2021
