No longer the land of the lawless: Joint Committee reports

14 December 2021

A landmark report which will make the tech giants abide by UK law has been published by a group of MPs and Peers today.


The Joint Committee on the draft Online Safety Bill, chaired by Damian Collins MP, recommends major changes to the Online Safety Bill, which is due to be put to Parliament for approval in 2022.

This new law will finally make internet service providers responsible for what happens on their platforms, including serious harms like child abuse, fraud, racist abuse, the promotion of self-harm and violence against women, for which there was previously little enforceable sanction.

Chair's comments

Damian Collins MP, Chair of the Joint Committee on the draft Online Safety Bill, said:

“The Committee were unanimous in their conclusion that we need to call time on the Wild West online. What’s illegal offline should be regulated online. For too long, big tech has gotten away with being the land of the lawless. A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life.

“The Committee has set out recommendations to bring more offences clearly within the scope of the Online Safety Bill, give Ofcom the power in law to set minimum safety standards for the services they will regulate, and to take enforcement action against companies if they don’t comply.

“The era of self-regulation for big tech has come to an end. The companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.”

Main conclusions and recommendations

During a packed inquiry, MPs and Peers heard from victims of online harms (Molly Russell’s father, Rio Ferdinand, Martin Lewis), a Nobel Prize-winning journalist (Maria Ressa), academics and experts, the big tech companies, Ofcom, Facebook whistleblowers Frances Haugen and Sophie Zhang, and the Government, and received hundreds of pages of written evidence. As a result, they concluded that:

  • Big tech has squandered its chance to self-regulate. The companies must obey this new law and comply with Ofcom as the UK regulator, or face sanctions.
  • Ofcom should set the standards by which big tech will be held accountable. Their powers to investigate, audit and fine the companies should be increased.
  • Ofcom should draw up mandatory Codes of Practice for internet service providers. For example, it should write a Code of Practice on risk areas like child exploitation and terrorism. It should also be able to introduce additional Codes as new features or problem areas arise, so the legislation doesn’t become outdated as technology develops.
  • Ofcom should also require service providers to conduct internal risk assessments to record reasonably foreseeable threats to user safety, including the potentially harmful impact of algorithms, not just content.
  • The new regulatory regime must contain robust protections for freedom of expression, including an automatic exemption for recognised news publishers, and acknowledge that journalism and public interest speech are fundamental to democracy.
  • Scams and fraud should be tackled by bringing harmful paid-for advertising, such as scam adverts, within the scope of the Bill.
  • Service providers should be required to create an Online Safety Policy for users to agree to, similar to their terms and conditions of service.

The Committee also believes the Bill should be clearer about what is specifically illegal online. They believe it should not be up to the tech companies to determine this. The Committee therefore agrees with the Law Commission’s recommendations about adding new criminal offences to the Bill. They recommend that:

  • Cyberflashing be made illegal.
  • Deliberately sending flashing images to people with photosensitive epilepsy with the intention of inducing a seizure be made illegal (known as Zach’s law).
  • Pornography sites be given legal duties to keep children off them, regardless of whether they host user-to-user content.
  • Content or activity promoting self-harm be made illegal, as is already the case for suicide.

Further, the report recommends that individual users should be able to make complaints to an ombudsman when platforms fail to comply with the new law. It also recommends that a senior manager at board level, or reporting to the board, be designated the “Safety Controller”. In that role they would be liable for a new offence: failing to comply with their obligations as regulated service providers where there is clear evidence of repeated and systemic failings resulting in a significant risk of serious harm to users.

This Bill, with all of these recommendations included, will be a huge step forward in keeping people safe online and protecting victims. An annex to the report outlines 13 case studies in which evidence the Committee heard translated directly into improvements to the Bill. These cover online fraud, extreme pornography, racist abuse, self-harm, cyberflashing, Zach’s law, deepfakes, misogyny, incitement to riot, protecting candidates, democratic elections and religious discrimination.

Further information