Written evidence submitted by Carnegie UK


  1. We submitted evidence to the Committee’s inquiry into Economic Crime in December 2020, which focused primarily on questions around consumers and economic crime.[1] The debate on how to tackle online fraud - along with its prevalence - has intensified since then, so we are pleased to provide this updated submission, which takes into account developments in the subsequent 12 months and sets out our proposals for ensuring that the Government's Online Safety Bill - when it is introduced into the House next year - can adequately tackle the problem.
  2. We would be happy to provide further information on our work in writing or to discuss it with Committee members at a future evidence session.
  3. As set out in our previous submission, Carnegie UK has been working on a public policy project on online harm reduction for the past three years, building on our original proposals for a statutory duty of care, enforced by an independent regulator. All our work can be found here.[2] As we noted in that previous submission, many of the online harms that are covered by the Government's legislative proposals - in particular child sexual abuse and exploitation - have been exacerbated by the Covid-19 crisis. The delays in bringing forward the draft Online Safety Bill and the length of time before the regime is fully implemented[3] are regrettable in this regard.
  4. There is also a significant swathe of online harms outwith the current scope of the draft Online Safety Bill. Despite significant representations before and since the publication of the draft Bill from a large network of consumer and financial bodies[4] as well as the Financial Conduct Authority, the National Crime Agency and the City of London Police,[5] online fraud and paid-for advertising remain outside the scope of the Bill; indeed they are specifically ruled out.[6]
  5. The concession, made at the time of the Bill's publication, to include user-generated scams does not go far enough and, indeed, as Martin Lewis and others have argued, opens up an even bigger incentive for scammers to use paid-for advertising as a means to target victims.[7] Meanwhile, in the recent "Perimeter Report", which is due to be discussed with the Economic Secretary soon, the FCA's Chief Executive has specifically highlighted the "real risks to consumers from outside our remit from both online advertising and from those using exemptions to sell products to ordinary customers. Change is needed and we will continue to push for powers where we need them."[8]
  6. Despite the strength of the evidence put forward, the Government continues to resist calls for online fraud to be brought within the scope of the Online Safety Bill[9], referring instead to other mechanisms to deal with the problem such as the Online Advertising review (date unconfirmed) and the Fraud Action Plan (ditto)[10], or to separate legislation. Ministers frequently refer to their desire to resist a "Christmas Tree Bill", with so many additional components added that it is in danger of collapsing under its own weight. As we have argued in our recent proposals for amendments to the draft Bill,[11] the complexity of the Government's regulatory design means that the Bill is unlikely to stand up to much amendment at all, never mind be effective in tackling at a systemic level the prevalence of harms that are already within scope or might need to be added at a later date as the landscape shifts and more harms emerge. Simplifying it now will help both expand the scope in a coherent, future-proofed way and make it fit for further amendments where necessary, without complicating the framework for those who need to abide by it and enforce it.
  7. The pre-legislative scrutiny period should be for addressing strategic issues, one of which we feel is the complexity of the Bill. We set out in summary below the proposed amendments we have submitted to the Joint Committee,[12] as well as recapping our proposals for "regulatory interlock", which we believe would help bring in the expertise of other regulators to assist Ofcom in addressing online fraud when it takes on its powers, without overburdening it.

Amending the Online Safety Bill: a foundation duty and a new definition of harm


  8. The pre-legislative scrutiny process is for tackling strategic issues. A leading strategic problem is that the existing Bill is very complex indeed. Without restructuring and simplification, it is not a good place to start for future amendment: there is a risk of legislation so complex that the regime does not work. We welcome the indications from the DCMS Secretary of State at a recent Committee hearing that the final Bill will be very different to the draft[13] and we hope that it is along the lines of the suggestions we make below.


  9. Our recommendations include:


Taken together, these changes result in a radical simplification of the Bill, remove the need for downstream SIs and make the regime stronger and more future-proof.

  10. We believe the Bill is strengthened by:

  11. We also believe the Bill is simplified by:

  12. Our new definition of harm is as follows:

“harm” means any of -

(a) physical, psychological, or economic harm experienced by an individual;

(b) harm to public safety, public health and national security;

(c) harm to relations between those who share a protected characteristic within the meaning of the Equality Act 2010 (or a subset of that characteristic) and those who do not; and

(d) harm to democratic debate or to the integrity and probity of the electoral process.


  13. This defines the types of harm, not the level of harm, which is specified elsewhere in the Bill; in this we follow the distinction that can already be found in the draft Bill. Pre-legislative scrutiny has drawn attention to harms that go beyond harm to the individual. This amendment brings a concept of societal harm into scope, but seeks to constrain it to issues that are germane to the Bill. The language is drawn from Ofcom reports as well as from the draft Bill itself. The amendment encompasses most of the issues set out in Table 1 (p31) of the Online Harms White Paper. It also introduces the concept of economic harm to the individual; this definition would include financial crime, online fraud and scams - which, as noted above, have been the subject of strong representations - but does not include economic harm to companies.


  14. This new definition of harm, coupled with the foundation duty and related amendments, would give Ofcom as the online safety regulator significant powers to hold social media companies to account for their risk assessments in relation to online fraud occurring on their platforms, and for the measures they take to mitigate the risks they identify.


  15. While we do not explore this in detail in our most recent publication, there are two further areas where the draft Bill is lacking. There is no mention of co-designation (which had been referenced in the Government's Final Response on the White Paper), which would give Ofcom the power to enable other specialist regulators to work within the regime; nor is there a duty on regulated services to cooperate with other regulators, such as the police, the ASA and the FCA, which would be a further strengthening measure.


  16. We shared with the Committee previously our proposal for “regulatory interlock”: if any competent regulator or other statutory body identifies a new vector for online harm that breaches their own specialist regulatory regime, they should be able to hand a dossier to Ofcom to assess and, if appropriate, process within the online harms regime. Such interlocking regulation would protect consumers and increase the effectiveness of regulators such as the Financial Conduct Authority that find it hard to get purchase with online companies. More detail on this approach is in our blog post from last autumn.[14]


December 2021







[3] See recent note from Ofcom re its estimates for time after Royal Assent to produce, e.g., Codes of Practice:






[6] See clause 39 (2) (f), which rules out “paid-for advertising” as a type of user-generated content to be regulated; and clause 46 (8) (b) (i), which rules out content that is harmful to adults if the “risk of physical or psychological impact flows from – the content’s potential financial impact”.






[9] See the recent appearance by DCMS and Home Office Ministers at the Joint Committee on the draft Online Safety Bill, where Secretary of State Nadine Dorries said her "legal advice is that it would not work and it would extend the scope of the Bill in a way that would not be appropriate and would not meet the objectives of the Bill, which protects children. That is the only reason I am not including it. It needs its own Bill." Damian Hinds was less categorical: "The question is not whether we bear down on it, but how best to bear down on it, whether it is in this piece of legislation or elsewhere. We are very alive to the need to move. I hear very frequently and regularly from the financial services sector as well as from individuals—all our constituents from whom we hear depressingly often—about having been victims or their parents having been victims of scams, as you rightly say. We need to work more on it. The question is whether this is exactly the right vehicle or something else is more appropriate."

[10] In response to a letter which we sent jointly with other organisations to the former Digital Minister in 2020, Ms Dinenage referred to the need for a “proportionate” approach that would not duplicate or conflict with the existing work of government, regulators and other bodies and referred to “the Home Office’s activity with law enforcement to tackle fraud and HM Treasury’s work with the financial sector on tackling economic crime.”