Written evidence submitted by the Competition and Markets Authority
Competition and Markets Authority submission to the DCMS Sub-Committee on Online Harms and Disinformation’s call for evidence on online safety and online harms
- We have set out below the Competition and Markets Authority’s (CMA) observations in response to your call for evidence on online safety and online harms. Specifically, this submission relates to your questions about whether the proposed Online Safety Bill will encourage platforms to put in place adequate systems and processes for effectively protecting consumers and whether the content of the Draft Bill might lead to tensions with existing protections under consumer law.
- Online platforms play a significant role in people’s lives, ensuring that online content has greater visibility and reach than ever before. This undoubtedly brings huge benefits for UK citizens in how they work, shop and communicate with each other. However, in the CMA’s experience, online platforms also allow rogue traders and other ‘bad actors’ to reach consumers more easily and frequently with content that can, and does, cause those consumers real economic and financial harm. Content such as paid-for product endorsements by celebrity ‘influencers’ that have not been labelled as advertising, or fake or misleading consumer reviews, is generally considered unlawful under consumer protection law.
- Given the scale, frequency and rapid dissemination of such content, it is often not possible to address consumer protection issues as they arise on a reactive case-by-case basis. Individual examples of content may be removed, only to reappear quickly elsewhere. To effectively address many issues – and ensure a high level of consumer protection – platform operators need to implement appropriate preventative measures themselves, including proactive measures.
- The CMA has taken enforcement action under consumer law to ensure platform operators take reasonable and proportionate steps – including proactive steps – to tackle economically harmful illegal content or activity when it occurs on or is facilitated through their platforms. For the reasons explained below, we consider it is important that platform operators’ responsibilities for tackling such content should be explicitly clarified in law.
- The CMA is also concerned that, although the proposed new online safety legislation, as set out in the Draft Bill, is useful in expressly conferring responsibilities on platforms, it risks inadvertently setting a lower standard of consumer protection for economic and financial harms than that already envisaged by current law, and established by the CMA’s enforcement work. It therefore risks being seen as overriding current law, weakening consumer protections. We set out these concerns in further detail at paragraphs 14 to 15 below. We then go on to set out the options available to address those concerns at paragraphs 16 to 19.
Action by the CMA to protect UK consumers
- The CMA has statutory powers to enforce consumer protection legislation, including the Consumer Protection from Unfair Trading Regulations 2008 (CPRs).[1] The CPRs establish a high level of consumer protection by preventing unfair business practices which are capable of distorting consumers’ economic decision making. The CPRs apply to platform operators where they are ‘traders’ – i.e. where they are acting for business purposes - and are engaged in commercial practices concerning consumers (as defined broadly by the legislation). The CPRs include a general prohibition on unfair trading which requires traders to exercise ‘professional diligence’ towards consumers (including those consumers who are not their direct customers).[2]
- Under existing law, where platform operators fail to act with professional diligence – that is, fail to take adequate steps to effectively address economically harmful illegal content on their platforms – and that failure is likely to distort UK consumers’ economic behaviour, they are likely to infringe the general prohibition in the CPRs.
- Over recent years the CMA has assumed a lead role in this area by taking important enforcement action in a number of cases aimed at ensuring that platform operators provide adequate protections for consumers. For example, we have taken action to tackle the trading of fake and misleading online reviews on Facebook and eBay[3], and paid-for endorsements of products by celebrity ‘influencers’ that were not labelled as advertising, on Instagram.[4]
- We consider that existing consumer law requires platform operators to take reasonable and proportionate steps to effectively protect consumers from economically harmful illegal content. Such steps are likely to include (where appropriate):
- Taking proactive steps to identify and assess the systemic risks of harmful content on the platform.
- Implementing effective reporting and flagging mechanisms to make it easy for consumers, businesses and other parties – e.g. law enforcement – to report potentially harmful content to the platform.
- Implementing specific proactive measures to identify this content and effectively mitigate its effects. For example, subject to the nature and extent of the systemic risks posed to consumers, this is likely to include implementing appropriate automated and manual moderation systems to effectively identify and remove content and respond to evolving threats or abuse.
- On becoming aware of the presence of such content – whether through notifications to the platform or by the platform’s own proactive means – investigating promptly and removing it where appropriate.
- Taking additional reasonable steps to effectively mitigate the risks of harm to consumers by identifying and removing similar content - e.g. checking other online content posted by the same user.
- Applying appropriate and effective sanctions to deter this content/activity in future - such as banning repeat offenders – and keeping records of sanctions.
- Proactively ensuring that systems, policies and processes for the prevention, detection and removal of this content remain effective and keep pace with evolving threats/patterns of abuse. For example, through regular testing, reviewing and updating systems and processes – including any automated systems – when operators receive or become aware of new information (whether through notifications or their own proactive means).
Difficulties of using the existing law to ensure that UK consumers are effectively protected
- Platform operators qualifying as ‘traders’ under the CPRs must always comply with consumer law as far as their own commercial practices are concerned. However, in the CMA’s experience, platform operators often dispute the nature and extent of their legal responsibilities concerning illegal content on their websites. Broadly speaking, they contend, for example, that the existing law may not make them responsible for the third party content they carry, may not impose general monitoring obligations on them and requires only that they remove illegal content when notified or made aware of it.
- The CMA, by contrast, considers that the CPRs, at the very least, require platform operators to exercise professional diligence when they are engaged in commercial practices by taking reasonable, proportionate and effective steps in relation to the content they carry. That includes taking proactive measures to prevent economic harm to consumers arising from that content.
- While the CMA has been able to secure effective outcomes for consumers reflecting our view of the law, the lack of consensus between enforcers and platform operators over the correct application of existing law can, nonetheless, have two unwanted consequences:
- It can lead to enforcement action becoming unnecessarily contentious, lengthy and expensive for all concerned. This can impede effective and efficient resolution of concerns, harming the interests of UK consumers during the period of dispute and leading to increased enforcement costs which must be met by the taxpayer.
- More generally, many platform operators are likely to remain unclear about the full extent of their legal responsibilities in connection with economically harmful content posted on or facilitated through their platforms, or to remain hostile to the CMA’s view of those responsibilities. As a result, those operators may fail to implement appropriate systems and processes to effectively tackle such content until regulatory action is taken. Some operators might even consider their interests best served by waiting to see whether regulators prioritise and execute successful enforcement action against them before taking proactive steps, to the detriment of UK consumers.
- The CMA has already recommended to the Government in the Digital Markets Taskforce Advice that powers to tackle economically harmful content online should be strengthened and clarified.[5]
Impact of the Draft Online Safety Bill (‘OSB’) on protections for UK consumers
- We are concerned that, while it may be intended to provide greater protections for UK citizens online, there is a significant risk that the OSB, in its current form, reduces the standard of consumer protection in relation to a range of economically harmful online content below that which we consider is currently required of platform operators under the CPRs. In particular:
- The Draft OSB gives platform operators explicit duties in relation to ‘illegal content’.[6] Those duties include, under Clause 9(3), a requirement to take down this content when they are notified of it or become aware of it. This would suggest that the legislation envisages a merely reactive ‘notice and takedown’ standard for such content.
- It is only in relation to ‘priority illegal content’ – set out in secondary legislation – that platform operators would have an express heightened duty to ‘minimise’ the presence of such content, broadly reflecting the proactive approach that the CMA has applied vis-à-vis platform operators’ responsibilities for economically harmful content when enforcing the CPRs.
- Our reading of the OSB, as currently drafted, is that certain economically harmful content that may be illegal under consumer protection law – in particular, material infringing the CPRs – is likely to be within scope.[7] If content causing economic harms is to be included within the scope of the Draft OSB - setting express legal standards beneath those that we consider are required under the CPRs – there is a significant risk that the OSB is seen as overriding existing protections under consumer protection law. In particular:
- It will become even more difficult for consumer protection enforcers - such as the CMA and Trading Standards Services - to ensure that platform operators take appropriate and effective action to tackle online content that is likely to infringe consumer law. For example, some platform operators may seek to argue more forcefully that consumer law does not require them to take adequate proactive steps to ensure that consumers are effectively protected from economic harm arising from the content present on, or facilitated by, their platforms.
- Further, or in the alternative, some platform operators may contend that the professional diligence requirements under the CPRs cannot require them to do more than comply with the newly created – and reactive - provisions in the OSB. Although we would dispute this argument, it would nevertheless pose another significant obstacle to swift and effective enforcement in the interests of UK consumers.
How might these concerns be addressed?
- We have identified two potential approaches to effectively managing these risks:
- Option 1 – Maintain the existing scope of the OSB and strengthen operators’ duties in relation to economically harmful illegal content (particularly content infringing the CPRs); or
- Option 2 – (a) Amend the scope of the OSB to ensure that harms resulting from breaches of consumer protection law are not in scope and make clear on the face of the legislation – or, if not, in Explanatory Notes or in a Ministerial statement in Parliament – that its provisions are without prejudice to, and complementary to, operators’ existing duties and responsibilities under other legal regimes (particularly consumer protection), and (b) Use an alternative or existing legislative initiative to ensure the necessary protections for consumers (and legal certainty for platforms).
- Although we consider that Option 1 is theoretically workable, there are likely to be significant practical barriers to its successful implementation. In particular, the following two steps would be required:
- The ‘illegal content’ safety duties in the Draft OSB would need to be strengthened so as to require the taking of specific, proactive measures in relation to this content[8] where appropriate, reflecting the CMA’s view of the requirements under consumer protection law and its enforcement outcomes. This could be achieved by, for example, ensuring that content constituting an offence under certain consumer protection legislation is designated ‘priority illegal content’ under Clause 44, OSB.
- The enforcers of the online safety legislation would need to be resourced in such a way as to enable them to prioritise consumer protection harms. This could be achieved either by increasing the online safety regulator’s budget so that it is able to take on work previously performed by the CMA and other regulators, or by giving the CMA, FCA, ICO, Trading Standards Services and other consumer protection enforcement bodies concurrent powers to enforce the new legislation.
- We consider that Option 2 is more likely to be feasible on the basis that the Government has previously stated that its policy intention was to exclude from scope harms resulting from breaches of consumer protection law[9] and it has recently begun to work with the CMA on how best to achieve that aim. Option 2 requires the following three steps to be successful:
- First, Clause 41(6) of the OSB would need to be amended to exclude from scope content amounting to an offence under consumer protection law.[10]
- Second, it would need to be made explicitly clear – in the manner described in paragraph 16 above – that the OSB’s provisions are without prejudice to, and complementary to, operators’ existing duties and responsibilities as applicable under other legal regimes (particularly consumer protection).
- Third, to resolve any disagreements on platform responsibility – as described in paragraphs 10 to 12 above - legislation should be introduced or amended to put it beyond doubt that platform operators have the responsibilities described in paragraphs 4 and 9 in respect of illegal content that harms consumers economically. This could be achieved by amending the CPRs to confirm, explicitly, that where operators act as ‘traders’ and are engaged in ‘commercial practices’ they have a duty to take reasonable and proportionate steps - including, where appropriate, proactive steps - to effectively tackle economically harmful illegal content that is likely to distort UK consumers’ economic behaviour.[11] Incorporating the change specifically within the CPRs would mean that other consumer enforcers - including Trading Standards Services and sector regulators - would be able to invoke the revised provisions in their enforcement work. This would explicitly clarify platform operators’ existing responsibilities under consumer protection law and aid swift and effective enforcement.
- The advantage of Option 2 is that it would ensure that different harms are addressed by the legislative and regulatory regimes best placed to deal with them. It would also ensure that the high level of consumer protection achieved by the CMA under the current consumer regime is expressly secured and not undercut in future, without altering the stated policy intention of the OSB or inadvertently lowering standards.
[1] The CMA is a non-ministerial government department which is committed to using its tools - including existing consumer and antitrust enforcement, market studies and merger control - to protect consumers and foster innovation in rapidly developing digital markets. A Digital Markets Unit (DMU) has recently been established within the CMA to begin work to operationalise the future pro-competition regime for digital markets. The CMA is also committed to working with other UK regulators, under the auspices of the Digital Regulation Cooperation Forum, to ensure a greater level of cooperation and coherence, given the unique challenges posed by regulation of online platforms.
[2] Regulation 3(3), CPRs. The CPRs define ‘professional diligence’ as ‘the standard of special skill and care which a trader may reasonably be expected to exercise towards consumers which is commensurate with either (a) honest market practice in the trader’s field of activity, or (b) the general principle of good faith in the trader’s field of activity’ - Regulation 2(1), CPRs.
[3] We took action to ensure that Facebook and eBay were taking proactive steps to effectively prevent the trade of fake and other types of misleading reviews on their platforms, including through the use of new automated technology to identify and remove this content. See https://www.gov.uk/cma-cases/fake-and-misleading-online-reviews.
[4] Following CMA action, Instagram committed to introduce protections against hidden advertising posted by celebrity ‘influencers’ on its platform by introducing new automated technology to proactively detect endorsements which have not been clearly and prominently disclosed. Where it identified a suspected incentivised endorsement, Instagram committed to automatically refer platform users to its branded content policies, while also prompting them to confirm whether the post contained an incentivised endorsement. Instagram also committed to improve its tools for brands, so that brands could more easily identify hidden advertising in respect of their products, and take the appropriate steps to remedy this. See https://www.gov.uk/cma-cases/social-media-endorsements.
[5] Taskforce advice to the Government, Recommendation 13b, Annex G. 8 December 2020. https://assets.publishing.service.gov.uk/media/5ffc304a8fa8f5640b6dafab/Appendix_G_-_A_modern_competition.pdf
[6] As defined in Clause 41, OSB.
[7] See Clause 41(4)(d), OSB. We do not consider that such content is effectively excluded by the exemptions in Clause 41(6) of the Draft OSB. While ‘harm’ is defined by Clause 137, OSB, to mean ‘physical or psychological harm’, the definition of ‘illegal content’ is not linked to this concept.
[8] I.e. economically harmful content that would amount to an offence under consumer protection law.
[9] See Paragraph 2.4, Government response to Online Harms White Paper Consultation https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response.
[10] While we would suggest that content that constitutes an offence under consumer law is excluded, we would not object to other enforcers’ and consumer stakeholders’ desire for fraud offences to be designated ‘priority illegal content’ within the scope of the Bill where this can be achieved without undermining consumer protection in relation to matters covered by consumer law (in the ways set out above).
[11] To ensure that operators are clear on what the law is likely to require of them, any duty would need to be underpinned by guidance, reflecting our expectation that platform operators should adopt a risk-based and proportionate approach to protecting consumers, having regard to, for example, the platform’s business model, the nature of its online content, the specific risks of harm to consumers arising from that content and other practical considerations.