Written evidence submitted by the Competition and Markets Authority

 

 

Competition and Markets Authority submission to the DCMS Sub-Committee on Online Harms and Disinformation’s call for evidence on online safety and online harms

 

  1. We have set out below the Competition and Markets Authority’s (CMA) observations in response to your call for evidence on online safety and online harms. Specifically, this submission relates to your questions about whether the proposed Online Safety Bill will encourage platforms to put in place adequate systems and processes for effectively protecting consumers and whether the content of the Draft Bill might lead to tensions with existing protections under consumer law.

 

  2. Online platforms play a significant role in people’s lives, ensuring that online content has greater visibility and reach than ever before. This undoubtedly brings huge benefits for UK citizens in how they work, shop and communicate with each other. However, in the CMA’s experience, online platforms also allow rogue traders and other ‘bad actors’ to reach consumers more easily and frequently with content that can, and does, cause those consumers real economic and financial harm. Content such as paid-for product endorsements by celebrity ‘influencers’ that have not been labelled as advertising, or fake or misleading consumer reviews, is generally considered unlawful under consumer protection law.

 

  3. Given the scale, frequency and rapid dissemination of such content, it is often not possible to address consumer protection issues as they arise on a reactive, case-by-case basis. Individual examples of content may be removed, only to reappear quickly elsewhere. To effectively address many issues and ensure a high level of consumer protection, platform operators need to implement appropriate preventative measures themselves, including proactive measures.

 

  4. The CMA has taken enforcement action under consumer law to ensure platform operators take reasonable and proportionate steps – including proactive steps – to tackle economically harmful illegal content or activity when it occurs on or is facilitated through their platforms. For the reasons explained below, we consider it is important that platform operators’ responsibilities for tackling such content should be explicitly clarified in law.

 

  5. The CMA is also concerned that, although the proposed new online safety legislation, as set out in the Draft Bill, is useful in expressly conferring responsibilities on platforms, it risks inadvertently setting a lower standard of consumer protection on platforms for economic and financial harms than that already envisaged by current law, and established by the CMA’s enforcement work. It therefore risks being seen as overriding current law, weakening consumer protections. We set out these concerns in further detail at paragraphs 14 to 15 below. We then go on to set out the options available to address those concerns at paragraphs 16 to 19.

 

Action by the CMA to protect UK consumers

 

  6. The CMA has statutory powers to enforce consumer protection legislation, including the Consumer Protection from Unfair Trading Regulations 2008 (CPRs).[1] The CPRs establish a high level of consumer protection by preventing unfair business practices which are capable of distorting consumers’ economic decision making. The CPRs apply to platform operators where they are traders – i.e. where they are acting for business purposes – and are engaged in commercial practices concerning consumers (as defined broadly by the legislation). The CPRs include a general prohibition on unfair trading which requires traders to exercise ‘professional diligence’ towards consumers (including those consumers who are not their direct customers).[2]

 

  7. Under existing law, where platform operators fail to act with professional diligence – that is, fail to take adequate steps to effectively address economically harmful illegal content on their platforms – and that failure is likely to distort UK consumers’ economic behaviour, they are likely to infringe the general prohibition in the CPRs.

 

  8. Over recent years the CMA has assumed a lead role in this area by taking important enforcement action in a number of cases aimed at ensuring that platform operators provide adequate protections for consumers. For example, we have taken action to tackle the trading of fake and misleading online reviews on Facebook and eBay,[3] and paid-for endorsements of products by celebrity influencers that were not labelled as advertising, on Instagram.[4]

 

  9. We consider that existing consumer law requires platform operators to take reasonable and proportionate steps to effectively protect consumers from economically harmful illegal content. Such steps are likely to include (where appropriate):

 

 

 

 

 

 

 

 

Difficulties of using the existing law to ensure that UK consumers are effectively protected

 

  10. Platform operators qualifying as ‘traders’ under the CPRs must always comply with consumer law as far as their own commercial practices are concerned. However, in the CMA’s experience, platform operators often dispute the nature and extent of their legal responsibilities concerning illegal content on their websites. Broadly speaking, they contend, for example, that the existing law may not make them responsible for the third-party content they carry, may not impose general monitoring obligations on them and requires only that they remove illegal content when notified or made aware of it.

 

  11. The CMA, by contrast, considers that the CPRs, at the very least, require platform operators to exercise professional diligence when they are engaged in commercial practices by taking reasonable, proportionate and effective steps in relation to the content they carry. That includes taking proactive measures to prevent economic harm to consumers arising from that content.

 

  12. While the CMA has been able to secure effective outcomes for consumers reflecting our view of the law, the lack of consensus between enforcers and platform operators over the correct application of existing law can, nonetheless, have two unwanted consequences:

 

 

 

  13. The CMA has already recommended to the Government in the Digital Markets Taskforce Advice that powers to tackle economically harmful content online should be strengthened and clarified.[5]

 

Impact of the Draft Online Safety Bill (‘OSB’) on protections for UK consumers

 

  14. We are concerned that, while it may be intended to provide greater protections for UK citizens online, there is a significant risk that the OSB, in its current form, reduces the standard of consumer protection in relation to a range of economically harmful online content below that which we consider is currently required of platform operators under the CPRs. In particular:

 

 

 

  15. Our reading of the OSB, as currently drafted, is that certain economically harmful content that may be illegal under consumer protection law – in particular, material infringing the CPRs – is likely to be within scope.[7] If content causing economic harms is to be included within the scope of the Draft OSB – setting express legal standards beneath those that we consider are required under the CPRs – there is a significant risk that the OSB is seen as overriding existing protections under consumer protection law. In particular:

 

 

 

 

How might these concerns be addressed?

 

  16. We have identified two potential approaches to effectively managing these risks:

 

 

 

  17. Although we consider that Option 1 is theoretically workable, there are likely to be significant practical barriers to its successful implementation. In particular, the following two steps would be required:

 

 

 

  18. We consider that Option 2 is more likely to be feasible on the basis that the Government has previously stated that its policy intention was to exclude from scope harms resulting from breaches of consumer protection law[9] and it has recently begun to work with the CMA on how best to achieve that aim. Option 2 requires the following three steps to be successful:

 

 

 

 

  19. The advantage of Option 2 is that it would ensure that different harms are addressed by the legislative and regulatory regimes that are best placed to deal with them. It would also ensure that the high level of consumer protection achieved by the CMA under the current consumer regime is expressly secured and not undercut in future, without altering the stated policy intention of the OSB and inadvertently lowering standards.

 

 

 

 

 


[1] The CMA is a non-ministerial government department which is committed to using its tools - including existing consumer and antitrust enforcement, market studies and merger control - to protect consumers and foster innovation in rapidly developing digital markets. A Digital Markets Unit (DMU) has recently been established within the CMA to begin work to operationalise the future pro-competition regime for digital markets. The CMA is also committed to working with other UK regulators, under the auspices of the Digital Regulation Cooperation Forum, to ensure a greater level of cooperation and coherence, given the unique challenges posed by regulation of online platforms.

[2] Regulation 3(3), CPRs. The CPRs define ‘professional diligence’ as ‘the standard of special skill and care which a trader may reasonably be expected to exercise towards consumers which is commensurate with either (a) honest market practice in the trader’s field of activity, or (b) the general principle of good faith in the trader’s field of activity’ – Regulation 2(1), CPRs.

[3] We took action to ensure that Facebook and eBay were taking proactive steps to effectively prevent the trade of fake and other types of misleading reviews on their platforms, including through the use of new automated technology to identify and remove this content. See https://www.gov.uk/cma-cases/fake-and-misleading-online-reviews.

[4] Following CMA action, Instagram committed to introduce protections against hidden advertising posted by celebrity ‘influencers’ on its platform by introducing new automated technology to proactively detect endorsements which have not been clearly and prominently disclosed. Where it identified a suspected incentivised endorsement, Instagram committed to automatically refer platform users to its branded content policies, while also prompting them to confirm whether the post contained an incentivised endorsement. Instagram also committed to improve its tools for brands, so that brands could more easily identify hidden advertising in respect of their products, and take the appropriate steps to remedy this. See https://www.gov.uk/cma-cases/social-media-endorsements.

[5] Taskforce advice to the Government, Recommendation 13b, Annex G. 8 December 2020. https://assets.publishing.service.gov.uk/media/5ffc304a8fa8f5640b6dafab/Appendix_G_-_A_modern_competition.pdf

[6] As defined in Clause 41, OSB.

[7] See Clause 41(4)(d), OSB. We do not consider that such content is effectively excluded by the exemptions in Clause 41(6) of the Draft OSB. While ‘harm’ is defined by Clause 137, OSB, to mean ‘physical or psychological harm’, the definition of ‘illegal content’ is not linked to this concept.

[8] I.e. economically harmful content that would amount to an offence under consumer protection law.

[9] See Paragraph 2.4, Government response to Online Harms White Paper Consultation https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response.

[10] While we would suggest that content that constitutes an offence under consumer law is excluded, we would not object to other enforcers’ and consumer stakeholders’ desire for fraud offences to be designated ‘priority illegal content’ within the scope of the Bill where this can be achieved without undermining consumer protection in relation to matters covered by consumer law (in the ways set out above).

[11] To ensure that operators are clear on what the law is likely to require of them, any duty would need to be underpinned by guidance, reflecting our expectation that platform operators should adopt a risk-based and proportionate approach to protecting consumers, having regard to, for example, the platform’s business model, the nature of its online content, the specific risks of harm to consumers arising from that content and other practical considerations.