Written evidence submitted by Microsoft (OSB0076)

 

Introduction

 

Microsoft welcomes the opportunity to provide these comments on the UK Government’s Draft Online Safety Bill (“OSB”). We share the Government’s belief that technology companies have an important role to play in helping make the internet safer. We agree that companies like ours should take meaningful steps to ensure that people don’t misuse their services to harm others, and we are committed to continuing our work with UK parliamentarians to advance that goal. At the same time, as a company with strong links to the UK’s vibrant tech sector, through our nationwide partner network and our support for start-ups through our accelerator programmes, we are acutely aware of the need not to overburden innovating SMEs with disproportionate regulation.

 

Microsoft has a long history of working with the UK Government, civil society, academia, and others in industry to advance online safety. We are a founding member of the Global Internet Forum to Counter Terrorism, committed to preventing terrorists and violent extremists from exploiting digital platforms. In partnership with Dartmouth College, we developed PhotoDNA, donated it to the U.S. National Center for Missing and Exploited Children, and make it freely available to companies so that they can quickly identify and remove known images of child sexual exploitation and abuse on their services. We are a founding member of the Digital Trust and Safety Partnership, which promotes a safer and more trustworthy internet. We are a member of the Internet Watch Foundation, a charity that works in close cooperation with technology companies, law enforcement and government to remove child sexual abuse images and videos from the internet wherever in the world they are hosted. We are a signatory of the EU’s Code of Practice on Disinformation and of the Christchurch Call to eliminate terrorist and violent extremist content online. We adhere to the EU’s Code of Conduct on Countering Illegal Hate Speech Online, and we invest heavily in tools and resources to promote digital civility and online safety. In 2018, we launched the Defending Democracy Program to protect democratic institutions and processes and fight disinformation. These are just a few examples of Microsoft’s many investments in digital safety.

 

In all of these efforts, we also recognise the need to act thoughtfully in order to protect other important interests. These include the rights to freedom of expression, access to information, the confidentiality of private communications, and privacy more broadly. To this end, we are a member of the Global Network Initiative, were an early supporter of the UN Guiding Principles on Business and Human Rights, and have adopted an integrated approach to human rights decision-making across our business.

 

Against this backdrop, we welcomed the UK Government’s proposal, in its April 2019 Online Harms White Paper, to adopt a “duty of care” approach to online safety, based on principles and accountability. We supported this approach because we understood that it would require providers to take responsibility for online safety on their services through their terms and conditions, while at the same time affording them the flexibility to adopt appropriately balanced content moderation solutions based on the type of service, user expectations, risks of harm, prevalence of illegal content, and other relevant factors.

 

We are encouraged to see that the OSB still references this duty of care (see sects. 5, 17), and that it seeks to pursue a proportionate and risk-based approach (OSB’s Explanatory Notes, para. 7). We are concerned, however, that several aspects of the OSB seem inconsistent with that approach. Rather than adopt a risk-based and contextual framework, the OSB in many cases imposes prescriptive, one-size-fits-all requirements, without regard to the different types and risks of harms that different services confront, and without adequate recognition of the threats to freedom of expression and other fundamental rights that the proposed mandates would entail. Although these mandates might be appropriate for certain services and in certain contexts, they will be entirely inappropriate in others. If not amended, these aspects of the OSB could have the unfortunate effect of leading providers to over-remove content from their services in order to avoid any risk of non-compliance.

 

Summary

 

As UK policymakers consider how best to address these concerns, we urge them to keep three guiding principles in mind:

 

First, any strengthening of the UK’s current content moderation regulations will impact other important interests, including in particular privacy, freedom of expression, and freedom of access to information. The final OSB text should ensure that regulated actors can take account of all of their interests in complying with their legal obligations, and that they are not incentivised to “over-censor” content in order to avoid liability.

 

Second, lawmakers in many other countries—including those with less robust traditions on due process and the rule of law—are watching the UK’s actions with keen interest, and some may seek to justify their own restrictions on the internet by pointing to the UK regime. It will be important that the UK’s final online safety regime is fully consistent with the democratic values and human rights principles that are core to the UK’s social identity and central to its terms of engagement with the world.

 

Third, the OSB regime should recognise the immense operational challenges that could arise if the UK were to adopt content moderation regulations that are not aligned with those in other major markets. The internet is borderless by design. If the Government imposes regulations that diverge significantly from those of other countries, online service providers might have no other option than to limit their offerings to the UK, depriving UK users of access to content and services available to users elsewhere in the world. This would have a detrimental effect on the UK’s ability to compete successfully in the global economy.

 

These comments set out the key changes that in our view are necessary to help ensure that the final legislation strikes the appropriate balance, specifically:

 

  1. Clarify that it is the responsibility of government to determine legality of content and the responsibility of service providers to enforce their respective terms and conditions.

 

  2. Clarify that the OSB’s obligations do not apply to services offered to enterprise customers.

 

  3. Remove any obligation on providers to interfere with private communications.

 

  4. Revise the obligations on providers of search services to ensure adequate protection of freedom of expression and access to information.

 

  5. Refine the criteria for designating Category 1 services.

 

  6. Revise provisions that are overbroad or vague.

 

  7. Clarify that regulators may not issue “use of technology notices,” or use other OSB measures, to require the removal of specific items of content.

 

  8. Clarify that providers cannot be penalised for reasonable, good-faith efforts to comply with the law.

 

  9. Clarify the provisions for transparency reporting.

 

  10. Align the OSB with online content regulations in other major markets.

 

The balance of these comments explains these points in more detail.

 

1.       Clarify that it is the responsibility of government to determine legality of content and the responsibility of service providers to enforce their respective terms and conditions.
 

Microsoft supports the Government’s efforts to eradicate illegal content and activity online. To this end, the OSB imposes a variety of “illegal content duties” on providers of user-to-user (“U2U”) services, including to minimise “the presence of priority illegal content,” “the length of time for which priority illegal content is present,” “the dissemination of priority illegal content,” and to “swiftly take down such content” as soon as the provider becomes aware of it (sect. 9(3)). Section 21 imposes similar obligations on providers of search services, including to “minimise the risk of individuals encountering” either “priority illegal content” or “illegal content that the provider knows about” (sect. 21(3)). Illegal content, in turn, includes any content constituting an offence “of which the victim or intended victim is an individual (or individuals)” (sect. 41(4)).

 

Identification of online content which is illegal, however, is not always a simple task. Often, a provider will not be able to determine, from the face of the content or from the information available to it, whether a specific item of content or a user behaviour would likely be adjudicated illegal by the government. For instance, content that accuses someone of a crime, invites others to invest in a financial scheme, or warns of a future attack could be deemed legal, or could constitute an offence directed at one or more victims. Providers often will have no way of knowing which is the case. Requiring service providers to act as adjudicators, deciding whether they need to remove content in the face of such uncertainty, incentivises over-removal of content in order to avoid liability. As a result, the current OSB framework could have serious negative repercussions for UK internet users’ fundamental rights to privacy, freedom of expression and access to information.

 

Rather than require providers themselves to determine whether any particular content may be illegal, we urge the Government to distinguish between the responsibility of government, which is to adjudicate legality, and the responsibility of platforms, which is to enforce their terms and conditions. The two are likely to diverge in the grey space where legality is unclear. In these cases, platforms should not be faced with liability for making what the government may later deem to be “the wrong call.” Governments may obtain the “right” result by issuing a duly-executed court order for specific illegal online content.

 

Focusing provider efforts on enforcement of their terms and conditions, rather than on adjudication of legality or illegality, promotes the ultimate goal of user safety. Legality determinations require a binary choice: remove or retain online content. Enabling providers to use other methods of protecting user safety (for example, labelling content with a warning, or placing an interstitial over the content so that users can avoid confronting it if they wish) allows providers to better balance the promotion of user safety with the rights to privacy, freedom of expression, and freedom of access to information in circumstances where content is not prima facie illegal. And, as previously mentioned, should the government subsequently make a determination of illegality, a duly-executed court order should resolve all doubts.

 

2.       Clarify that the OSB’s obligations do not apply to services offered to enterprise customers.

 

The OSB’s obligations should apply solely to online service providers with a direct relationship to the generator or distributor of online content, and not to providers of services for enterprise institutions. The harms that the OSB seeks to address are associated almost exclusively with consumer-facing online services. For services offered to enterprise customers (e.g., a school, an engineering firm, a government department), the customer itself typically determines who the end users are, imposes its own rules on acceptable content, and takes primary responsibility for enforcing those rules vis-à-vis its own employees and users. This makes sense; one could imagine that a school’s rules might need to be different from an engineering firm’s rules. Imposing the OSB’s content moderation obligations onto providers of enterprise services would be highly problematic and in many cases infeasible. For instance, in many cases, providers of enterprise services may not have a legal right to scan or remove content that an enterprise customer (or that customer’s employees or customers) uploads to the service; on the contrary, the relevant contract might prohibit the provider from doing so. Also, as a technical matter, a provider might have no means of identifying or removing specific items of content (e.g., specific files or photos) that an enterprise customer stores on the service. Indeed, most enterprise customers would object strenuously if their provider of cloud services monitored or unilaterally removed their content.

For these reasons, Microsoft welcomes the fact that the OSB intends to exclude enterprise services from its scope. The OSB defines the provider of a U2U service as “the entity that has control over who can use the service (and that entity alone)” (sect. 116(2)), and the provider of a search service as “the entity that has control over the operations of the search engine (and that entity alone)” (sect. 116(5)), and further states that a person who provides an “access facility” to a U2U service is not to be regarded as a person who has control over who can use that service (sect. 116(4)). The OSB’s Explanatory Notes indicate that these provisions are intended to cover most enterprise services, stating that:

“[E]nterprise software, such as software-as-a-service products, also counts as an access facility; therefore, [Section 116(4)] makes clear that, where multiple entities (or individuals) may be involved in the provision of a service to the end user, it is only the entity with control over who can use the service which is to be considered the service provider. For example, if entity A buys software from software company B on a software-as-a-service basis, and the software enables entity A to create a regulated service, entity A (rather than software company B) is to be considered the service provider” (para. 730).

Unfortunately, this critical Explanatory Note is not reflected in the OSB’s text. To remove any doubt that services provided to enterprise customers fall outside the scope of the OSB, policymakers should add an explicit exemption for enterprise products and services in the OSB’s text. This could be achieved, for instance, by including a version of paragraph 730 of the Explanatory Notes (quoted above) into either the definition of access facilities in Section 93(11), or in the list of exempt services in Schedule 1 to the OSB. Whatever approach the Government adopts, the text of the OSB itself should make absolutely clear that all services provided to enterprise customers fall out of scope. Failure to do so could lead to unworkable results; for instance, providers who offer cloud storage services to enterprise customers often will be in no position to assess whether the service is “likely to be accessed by children,” and therefore should not be required to comply with the OSB’s obligations on such services.

 

 

3.       Remove any obligation on providers to interfere with private communications.

 

People and organisations today rely on a range of online services to engage in private communications. This includes not only email and messaging services, but also online video-conferencing, group chats, and other communications services. People and organisations are willing to use these services because they trust that providers will respect their confidentiality and keep their communications private and secure.

 

The OSB would threaten this trust by requiring providers to interfere with a wide range of online communications. This is because Schedule 1 exempts only a small number of online communications services from the OSB’s scope—namely, email services (para. 1), SMS and MMS services (para. 2), and “services offering only one-to-one live aural communications” (para. 3), defined to mean “communications [that] consist solely of speech or other sounds conveyed between two users . . . [and] do not include, and are not accompanied by, any written message, video or other visual image” (sect. 39(6)(a), (b), emphasis added).

 

These exemptions are too narrow. There is no legitimate reason to require providers to interfere with private communications merely because they take place using video or other non-aural means of communication. There also is no legitimate reason to require providers to interfere with private communications merely because they involve more than two people. Requiring providers to interfere with non-aural communications, or with online communications involving three or more people, would constitute an unprecedented interference with the rights to private life and correspondence and would likely conflict with providers’ legal obligations in other jurisdictions.

 

To resolve these concerns, the OSB should be revised to exclude from scope all online services that are used to engage in private communications, whether by individuals or by organisations. To the extent the OSB imposes obligations covering user communications, they should apply only to communications that can reasonably be viewed as “public” based on objective and verifiable criteria, such as the nature of the communication and whether the sender specifically selects the recipients of the communication. If public authorities wish to compel a provider to engage in such interference, they should obtain a formal order to that effect, reviewed and approved by an independent authority, as required under current UK law.

 

4.       Revise the obligations on providers of search services to ensure adequate protection of freedom of expression and access to information.

 

The OSB would impose extensive obligations on providers of internet search services, most of which are similar to those imposed on providers of U2U services. This approach fails to recognise that search services differ fundamentally from U2U services. Search services do not facilitate users’ “upload” or “sharing” of content (sect. 2(1)). Rather, they help users discover information stored elsewhere on the internet. Individual users may have entirely valid reasons for wanting to search for online content that others might find objectionable, such as for academic, journalistic, or other research or investigative purposes, or simply to inform themselves about current events; but those purposes are not determinable by the search service. Insofar as a search service merely points users to information stored elsewhere on the internet but does not itself host the content, content removal obligations that might be appropriate when applied to certain U2U services will raise heightened risks to freedom of expression and access to information when applied to search.[1] 

 

The OSB often fails to recognise these points. As a result, several of the obligations that the OSB would impose on search services are neither proportionate nor workable, and retaining them in the final legislation would pose potentially serious threats to fundamental rights. For example:

 

         Search services likely to be accessed by children. The OSB would impose heightened obligations on search services “likely to be accessed by children,” including potentially to remove any search result that might be “harmful to children”[2] (see sects. 17(3), 19(2), 22-24). A search service is “likely to be accessed by children” if: (1) it is “possible for children to access the service or any part of the service” and (2) “there are a significant number of children who are users of the service or that part of it” or “the service, or that part of it, is of a kind likely to attract a significant number of users who are children” (sect. 26(5)).[3] Providers of search services, however, typically do not require users to sign in to an account in order to enter queries or obtain search results; this ability to search anonymously is vital to enabling people to exercise their fundamental right to access information. Search engines also typically do not place limits on who can use the service; as a result, it is “possible” for children to access most search services, and providers generally have no way of knowing how many of their users are children, or whether their services are “likely to attract” children.

 

In light of this, it is difficult to imagine how providers of search services could comply with the OSB’s obligations without adopting measures that would significantly chill freedom of expression and access to information for all users. Presumably, providers would need to operate two separate search services, one for children and one for adults, and would need to impose age verification or similar measures as a condition of using either of these services.[4] In addition to raising data protection and privacy concerns, this could significantly limit the ability of all users to search anonymously, and could also impinge upon the ability of children (especially older children) to exercise their fundamental rights.

 

Microsoft appreciates the need to protect children from harmful online content, including content accessible through a search engine. For instance, we have built a number of features into our Bing search engine, such as Bing SafeSearch, and offer a range of other resources to help parents protect their children from inappropriate online content, for instance through the Microsoft Family Safety program. Providers of search engines should not, however, be required to censor lawful online content, nor should they be required to age-gate users or operate separate search services for different age groups—all of which would inevitably chill freedom of access to information and other human rights. Microsoft encourages policymakers to remove the OSB obligations directed at search services likely to be accessed by children and instead impose the relevant content moderation obligations on the website operator that is hosting the objectionable content at issue.

 

         Complaints procedure for search services. Section 24(4)(c) of the OSB would require providers of search services to establish a complaints procedure for operators whose web pages were demoted or removed from search results as a result of the provider complying with its safety duties under the OSB. This obligation fails to appreciate how search engines operate. Search engines crawl and rank many millions of webpages per day. In their efforts to deliver the webpages that are most responsive to a user’s query, they can affect the rankings of many thousands of web pages. Requiring search providers to respond to complaints from every website operator who believes that their webpage is not ranked highly enough would be unworkable. A better solution would be to require search engine providers to publish a summary of the main parameters against which they rank web pages, similar to the obligations imposed on providers of search services under Article 5 of the EU Platform-to-Business Regulation. This would provide a useful level of transparency to web publishers, without imposing on providers of search services obligations that are impossible to meet.

 

5.       Refine the criteria for designating Category 1 services.

 

The OSB would impose heightened obligations on providers of U2U services designated as “Category 1” services (see, e.g., sects. 7(5)-(7), 11, 12(3)-(5)). Schedule 4 requires the Secretary of State to promulgate regulations specifying the conditions for designating U2U services as Category 1 services, based on “(a) number of users, and (b) functionalities” (Sched. 4, para. 1(1)).

 

First, we would urge policymakers to clarify how the number of users of a service will be calculated. For instance, it would be helpful to clarify that this number is calculated within a specified timeframe (e.g., one month), and only includes users who actively use the specific feature or functionality of the service that falls within the OSB’s scope. In addition, relying on the number of users of a service as one of the primary factors in determining the level of obligations can result in platforms floating in and out of the higher category over time, which would be difficult to monitor and implement.

 

Second, what is meant by ‘functionality’, and which functionalities will be considered, would also benefit from further clarification. In some sense, the function of a platform is defined by its users: which tools they make use of, and how. Platform design choices, content policy decisions, back-end systems, and culture or stated purpose will be major factors in defining the risk profile of a platform, and in many cases the pure ‘functionality’ of the tools is difficult to separate from these.

 

Third, given the Government’s stated desire to limit Category 1 services to the “highest-risk in-scope services” (Impact Assessment, para. 115, emphasis added), we urge policymakers to revise Schedule 4 to require the Secretary of State to consider a wider range of factors than just the number of users and functionality of the service. In particular, because general-purpose social media services often present among the greatest risk profiles of all U2U services, the factors used to designate Category 1 services should more clearly align with the characteristic features of those services. To this end, policymakers should consider limiting the Category 1 designation to services that meet each of the following cumulative criteria: (1) they take specific actions to enable the wide dissemination of content, and most content is shared publicly; and (2) they take specific actions to boost the virality of certain content or to amplify certain content to specific users or groups of users, or enable their users to disseminate content virally. Other factors the Secretary of State might wish to consider include the historical prevalence of illegal content on the service and whether users engage anonymously through the service. We encourage the Government to designate services as Category 1 services on the basis of these factors, to ensure that only the highest-risk services are in scope.

 

 

 

6.       Revise provisions that are overbroad or vague.

 

Several provisions set out in the OSB are overbroad or vague, which could lead providers to over-censor content on their services in order to avoid the risk of liability. In particular, we urge policymakers to clarify or narrow the following provisions:

 

 

         The definition of content “harmful to adults.” The OSB would impose additional obligations on Category 1 services in respect of content that is “harmful to adults” (see sect. 11). Although section 46 empowers the Secretary of State to promulgate regulations designating certain categories of content as “harmful to adults” (sect. 46(2)(b)(i)), it also lists other types of content that are per se harmful to adults, including where “the provider of the service has reasonable grounds to believe [either] that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibility” (sect. 46(3)), or that the “fact of the content’s dissemination” might have such an impact (sect. 46(5)). These vaguely worded provisions have the potential to capture a wide range of legal content (e.g., a video of a mugging, a first-hand account of war crimes, a photo of police brutality), and providers often will have no reasonable basis to make these nuanced and fact-specific risk determinations. Retaining this definition could create significant uncertainty and lead providers seeking to avoid liability to over-censor content. Instead, the OSB should either eliminate this category of content, or limit the definition to specific, readily identifiable categories of content set out in the legislation itself, such as online threats of physical harm and advocacy of self-harm.

 

 

 

7.       Clarify that regulators may not issue “use of technology notices,” or use other OSB measures, to require the removal of specific items of content.

 

Under existing UK law, government authorities may require online service providers to remove content alleged to be illegal only on the basis of a court order. This ensures that authorities comply with due process and duly establish, before an independent tribunal, that the content in question is in fact illegal before requiring providers to remove it.

 

Sections 64 and 65 of the OSB empower Ofcom to issue “use of technology” notices to providers of U2U and search services, respectively. These notices can require the provider not only to “use accredited technology to identify” terrorism content or CSEA content, but also “to swiftly take down that content (either by means of the technology alone or by means of the technology together with the use of human moderators . . .)” (sect. 64(4)(a)-(b); see sect. 65(4)(a)-(b) for similar obligations on search services). The OSB does not appear to require Ofcom to obtain judicial review or approval of these notices before it issues them. Although it appears that recipients may appeal these notices to the Upper Tribunal (sect. 105(2)), the OSB is unclear whether providers may bring such appeals in order to challenge all grounds of such notices, or only the requirement to implement certain technologies.

 

Although sections 64-65 of the OSB are somewhat unclear on the precise powers they confer, to the extent they permit Ofcom to require service providers to remove specific items of illegal content or URLs from their services, the OSB should require Ofcom first to obtain an order mandating such removal from an independent tribunal. Requiring service providers instead to appeal use-of-technology notices to the Upper Tribunal, and from there to a judicial tribunal, would remove the vital procedural protections for freedom of expression that exist today. It could also have the practical effect of defaulting to censorship, shifting the burden of proof away from the government having to prove that the content in question is illegal, and onto the provider having to prove that it is legal.

 

8.       Clarify that providers cannot be penalised for reasonable, good-faith efforts to comply with the law.

 

The OSB requires providers of U2U and search services to use “proportionate systems and processes” to protect users from encountering various categories of content on their services (see, e.g., sects. 9(3), 10(3)). In doing so, providers must also take into account rights to freedom of expression and privacy (sect. 12), along with other interests in discrete cases (e.g., protection of “content of democratic importance” and “journalistic content” with respect to Category 1 services).

 

The OSB provides no clear guidance, however, on how service providers are to weigh these competing obligations. Moreover, given the scope of the OSB’s obligations, the breadth of the relevant definitions (see above), and the lack of clarity in what they require in practice, providers inevitably will need to make difficult judgment calls about whether any specific item of content is problematic and, if so, whether they must remove it or, in the interests of freedom of expression and privacy, address it in some other fashion (e.g., by alerting users that the content might be harmful or inaccurate).

 

Consistent with the notion of a duty of care on which the OSB is based (see sects. 5, 17), the final legislation should ensure that providers cannot be penalised for undertaking reasonable, good-faith efforts to comply with their obligations under the OSB. This should cover both situations in which a provider removes or blocks access to content that it reasonably believes to fall within the scope of a relevant definition (e.g., content harmful to children), and situations in which the provider decides not to remove such content, for instance because the provider concludes that doing so would unduly impinge on fundamental rights or other important interests.

 

To this end, we encourage policymakers to add two new provisions to the OSB. First, providers should be immune from liability under any law for any action they take in reasonable, good-faith belief that their action is consistent with their obligations under the OSB. Second, where a provider decides not to remove or block access to content based on a reasonable, good-faith belief that the OSB permits it to take such action, the provider should be immune from liability under the OSB or other UK law, and should also remain eligible for the limitations on liability for such content set out in the UK’s Electronic Commerce (EC Directive) Regulations 2002 (“EC Regulations”). The EC Regulations have been a foundation of UK internet law and policy for nearly two decades, and nothing in the OSB should alter the careful balancing of interests between content providers, users, and online intermediaries set out in those Regulations.

 

9.       Clarify the provisions for transparency reporting.

 

Microsoft recognises the role that meaningful transparency plays in supporting the wider aims of accountability and compliance with regulatory requirements. To this end, Microsoft publishes a range of transparency reports including content removal requests, law enforcement requests, and an annual human rights report. These can be found on Microsoft’s Reports Hub.  Microsoft supports empowering regulators, users and civil society through these reports, which reflect both voluntary and regulatory requirements, as well as reflecting the nature of Microsoft products and services.

 

As outlined in Part 3, Chapter 1 (p44), companies of all categories will be required to follow Ofcom’s guidance on “producing an annual transparency report”, which will be subject to Ofcom’s requirements. To the extent these transparency requirements include mandated disclosures about the details of how providers’ automated safety systems and defences work and operate, they could damage the effectiveness of those systems by providing valuable information to bad actors looking to subvert them. While we think it appropriate that platform providers are transparent in their use of automated defences, it is important that any regulatory reporting requirements strike a balance between documenting the steps taken and ensuring that the information disclosed does not undermine the platform’s efforts in this regard.

 

It is important that, whether through the Bill itself or in future decisions taken by the Secretary of State or Ofcom, sufficient flexibility is ensured to allow providers to publish reports in a manner that is most meaningful to users and that meets the intended policy objectives of this particular provision. Products and services have to be specifically engineered to make it possible to obtain specific data, and transparency reporting can therefore require significant re-engineering if a service has not been designed to gather or retain certain types of data. It will accordingly be important to allow platforms a grace period during which to ramp up to meet any future reporting requirements. Additionally, any new transparency reporting requirements should not unnecessarily impact users’ right to privacy by requiring excessive tracking or collection of personal data. Transparency requirements should focus on core requirements and not dictate the precise manner in which reporting is provided. As previously stated, different platforms have different risk profiles, content policies, and moderation tools, and thus a prescriptive or uniform reporting template will not be effective.

 

10.   Align the OSB with online content regulations in other major markets.

 

As UK policymakers are aware, the UK’s content moderation regulations do not exist in isolation. The European Union is in the process of updating the 2000 Electronic Commerce Directive in the form of the proposed Digital Services Act, while policymakers in other major markets, including Australia and the United States, have proposed their own laws or regulations governing content moderation. In many cases, these proposals take quite different approaches to promoting online safety from those set out in the OSB. In some cases, the OSB would impose obligations on service providers that diverge from, or even conflict with, those set out in these other proposals.

 

This concern is particularly relevant to any requirement in the OSB for service providers to scan private communications. For instance, such a requirement might be in tension with applicable EU data protection regulations as set forth in the E-Privacy Directive, the GDPR, and the forthcoming E-Privacy Regulation. Likewise, many third countries impose safeguards against wiretapping, and may define the concept of “wiretapping” such that it limits the ability of service providers to monitor any type of communications content without the user’s explicit consent.

To avoid placing service providers in an unavoidable conflict-of-laws position, we urge policymakers to align the OSB’s requirements as closely as possible to those of the UK’s main economic partners. In addition, we urge policymakers to include a conflict-of-laws provision in the final regulatory framework, which would excuse service providers for non-compliance in circumstances where compliance with the OSB would violate another law to which the service provider is subject. 

*  *  *  *   *

 

Microsoft appreciates the opportunity to share these views on the  Draft Online Safety Bill with the Committee. We look forward to further discussions on this important initiative.

 

21 September 2021

 

 


 


[1] See, e.g., Council of Europe, Recommendation CM/Rec(2012)3 of the Committee of Ministers to member States on the protection of human rights with regard to search engines (4 Apr. 2012) (stating that “[s]earch engines play a crucial role . . . in exercising the right to seek and access information, opinions, facts and ideas” and that “[s]uch access to information is essential to building one’s personal opinion and participating in social, political, cultural, and economic life,” and recommending that member States “ensure that any law, policy, or individual request on de-indexing or filtering is enacted with full respect for relevant legal provisions, the right to freedom of expression and the right to seek, receive and impart information” and that “[t]he principles of due process and access to independent and accountable redress mechanisms should also be respected in this context”), available at https://search.coe.int/cm/Pages/result_details.aspx?ObjectID=09000016805caa87.

[2] We address the scope of the term “harmful to children” below.

[3] The OSB adds that a provider may conclude that it is not “possible for children to access a service” only if “there are systems or processes in place that achieve the result that children are not normally able to access the service or that part of it” (Section 26(3)).

[4] The OSB’s Explanatory Notes state that “[a]ge verification . . . provide[s] the highest level of confidence about a user’s age.” (para 164).