About 5Rights Foundation
5Rights develops new policy, creates innovative frameworks and technical standards, publishes research, challenges received narratives and ensures that children's rights and needs are recognised and prioritised in the digital world. While 5Rights works exclusively on behalf of and with children and young people under 18, our solutions and strategies are relevant to many other communities.
Our focus is on implementable change and our work is cited and used widely around the world. We work with governments, inter-governmental institutions, professional associations, academics, businesses, and children, so that digital products and services can impact positively on the lived experiences of young people.
The following 14 key issues are set out with corresponding amendments to the Bill. Text in red indicates additions or amendments to the draft Bill.
Having a single duty to meet the safety objectives is a more straightforward and enforceable structure for the Bill. A duty of care would futureproof the Bill and ensure that the regulator is not always behind the curve as new technologies and products (and associated risks) emerge. Statutory codes of practice issued by the regulator will support regulated services to fulfil the online safety objectives.
Action: Re-introduce an overarching duty of care for services to meet the online safety objectives.
The Bill as currently drafted only applies to user-to-user and search services, leaving a number of services that create risks for children out of scope. The definition of regulated service does not match existing regulation, such as the Age Appropriate Design Code (AADC), which applies to all Internet Society Services (ISS) likely to be accessed by children (“any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services, likely to be accessed by children.”)
This will leave children unprotected in many online environments, such as app stores, e-commerce sites and pornography sites that do not host user-generated content. The status of commercial EdTech providers is also unclear, leaving room for companies to potentially argue for immunity under the internal business services or public bodies exemptions in Schedule 1.
Children have a right to protection wherever they are online. All services “likely to be accessed” by children, whatever their size, business model or nature, and including those that do not fall under the definition of a user-to-user or search service, must be designed with their safety in mind. Regulatory harmony across regulation that protects children online is imperative to aid compliance and enforcement.
Action: Amend the scope so all services “likely to be accessed” by children will be regulated services.
The draft Bill has been rebadged, since the government published its full response to the Online Harms white paper in December, as legislation designed to address harmful content. The single focus on content, rather than on content and activity, does not account for the range of risks children are exposed to online.
The definition of harm should build on definitions in existing regulation, particularly the definition of harm to children as anything which “might impair the physical, moral or mental development of persons under the age of 18”, included in the Communications Act 2003 and Ofcom’s video-sharing platform guidance.
The Bill does not tackle advertising or financial and consumer harms. If these harms are left out of scope, children remain at risk from age-inappropriate advertising, online scams, gambling-style features and inappropriate commercial pressures that can lead to the accrual of debt, financial losses and service/contract lock-ins. The Bill is a historic opportunity to bring online advertising under a single regulatory regime, but as currently drafted, most advertising remains out of scope, with paid-for ads (a contract between the provider and the advertiser) given a specific exemption. This means the statutory rules for paid-for advertising under the current VSP regime will be lost when that regulation is superseded by the Online Safety Act.
Action: The language of ‘content and activity’ should be reinstated whenever the Bill refers to content and the definition of harmful content should be amended to be simply a definition of “harm”. The exemption for paid-for advertising should be removed.
The Bill contains several duties in relation to terms of service, reporting and redress, and other “systems and processes” designed to mitigate and manage harm. As written, however, the Bill does not require services to meet minimum standards in these areas. Without setting the bar, the Bill will not establish the necessary standards for safety that both users and providers of regulated services would like to see.
Minimum standards for services likely to be accessed by children set out in a statutory code of practice are needed to usher in a new world of digital design that considers children’s safety first. They would set out clear expectations and ensure that services, both big and small, understand that some design choices are never appropriate in relation to children.
These should include minimum standards for safety by design, the child risk assessment process (including the definition of risk and harm), as well as published terms, age assurance, and moderation, reporting and redress systems. This is the single most important change to make the Bill a systems and processes Bill, rather than one about individual children, products and services, or pieces of content.
Under the government’s proposals for a new pro-competition regime, large companies that have been designated ‘strategic market status’ by the Digital Markets Unit will need to adhere to a mandatory code of conduct designed to govern the relationships between dominant firms and their users. If the government can introduce mandatory codes to regulate market competition, it can also introduce mandatory rules for online safety.
Action: Services likely to be accessed by children must be required to meet minimum standards of safety by design, published terms, age assurance, moderation, reporting and redress (and others as the regulator sees fit), to be set out in a single statutory code of practice for child online safety prepared by Ofcom.
The requirement to carry out a children’s risk assessment is currently focused on the identification of content that poses a risk to children. Services should be required to assess risks against the 4 Cs risk framework (to include contact, conduct and contract risks, as well as content), and to publish their risk assessments, both to drive transparency and to build knowledge across the sector.
The regulator must also set out minimum standards for the child risk assessment process, to ensure parity, quality and efficacy of the assessments. If the risk assessment is to remain the prime mechanism for risk reduction, greater emphasis must be put on the duty to prevent and mitigate risks across each risk category.
Action: Regulated services must be required to prevent, mitigate or effectively manage risk in accordance with risk profiles drawn up by Ofcom and set out in minimum standards for the child risk assessment, accounting for all risks to children across the 4 Cs of online risk.
The Bill requires regulated services to assess whether it is possible for children to access their service or part of it. Only services likely to be accessed by children will need to meet the safety duties relating to children. To ensure this approach is effective, Ofcom must set out minimum standards for age assurance that are privacy-preserving, rights-respecting, proportionate to risk and purpose, easy for a child to use, and accessible and inclusive; that enhance a child’s experience rather than merely restrict it; and that offer a high level of security, transparency and accountability, with clear routes to challenge and redress.
These minimum standards should be implemented on a shorter timescale than the Bill itself, closing a gap in the current legislative framework and giving Ofcom and industry the opportunity to prepare for the full implementation of the Bill.
Action: Ofcom must set out mandatory minimum standards for age assurance solutions in advance of the Bill passing into law, and the Bill itself must require Ofcom to establish the levels of age assurance required as part of its risk profiles.
The spread of harmful content and activity is supercharged by the automated systems and algorithms (AI systems) of services, with shocking outcomes such as promoting self-harm material, or suggesting suicide sites. Children should not be expected to understand or take action against automated decision-making or algorithmic unfairness. Nor should they be expected to ‘police’ the community rules or terms and conditions that are perpetually broken and unenforced by the service itself.
The Bill must give Ofcom a duty to investigate the automated decision-making systems and algorithms of regulated services that impact on children, and ensure these systems conform to UK laws and obligations concerning children. This would require services to provide information as requested by the regulator, such as information relating to the design goals, inputs and outcomes of algorithms, and to allow access to personnel from product, governance and marketing teams.
Where there is evidence or an indication that products and services are discriminating against or systematically disadvantaging individuals or groups of young people or violating their rights, Ofcom should set out a mandatory course of action for compliance.
Action: Ofcom should be given the duty, power and resources to scrutinise the design, operation and outcomes of the algorithms and automated systems used by regulated services, and the power to set out mandatory compliance action.
Clause 9 introduces the concept of ‘priority’ illegal content, and only requires services to ‘minimise’ its presence on their services. Services must not be granted a licence to take a hands-off approach to the presence of illegal content and activity on their platforms: a recent BBC investigation revealed the details of a ‘compliance manual’ handed to moderators at OnlyFans, instructing them to escalate an account to senior management only after at least five examples of illegal content had been identified. If content is illegal, it should not be tolerated or left to a service’s best efforts; rather, all possible action should be taken to prevent its presence in the first place, and then to remove it.
Action: Services must be required to prevent and remove (rather than ‘minimise’) illegal content and activity to reflect its status as simply against the law.
The current definition of ‘journalistic content’ is so broad (news publisher or regulated content that is generated for the purposes of journalism and is UK-linked), it is unclear who would not be covered by it, undermining the very purpose of the duty – to protect news publisher content. Similarly, the definition of ‘content of democratic importance’ is so far-reaching, that it could be reasonably attached to any content. At best, this will create confusion for service providers about how to categorise user content. At worst, it will create significant loopholes that bad actors can easily exploit to challenge the removal of harmful content or activity.
Action: The definitions of ‘journalistic content’, ‘recognised news publisher’ and ‘content of democratic importance’ must be clarified to protect journalists and news organisations but not act as a back door for abusive behaviour or content reaching children.
The counterbalancing duties to protect freedom of expression, privacy and content of journalistic or democratic importance could have the unintended effect of undermining the responsibility services have to put in place appropriate safety policies and procedures. Children have existing rights under the United Nations Convention on the Rights of the Child, as set out in General comment 25 on children’s rights in relation to the digital environment. These rights must not be undermined by other aspects of the Bill.
There is no indication in the Bill of how services are to balance these duties with the online safety objectives and safety duties. This creates a very real possibility that services will interpret their duties to have regard for freedom of expression as an instruction to leave hateful and abusive content on their platforms, potentially producing a ‘chilling effect’ that discourages marginalised groups from exercising their own rights to free expression. The most recent Girls’ Attitudes Survey conducted by Girlguiding found that 71% of girls aged 7-21 have experienced some form of online harm in the past year. Among 11-21 year olds, these harms include sexist comments (50%), appearance pressures (45%), harassment (28%), unwanted sexual images (26%) and bullying (21%), all of which are likely to affect their desire to express themselves online.
A child’s right to freedom of expression cannot be considered in isolation from their other rights in the digital world, particularly their rights to freedom of thought and protection from undue influence. While consideration of the impact of regulation on fundamental human rights is welcome, service providers should have a legal duty to protect all rights held by children, and design and operate services in a way that considers their best interests as paramount. This will protect not only a child’s freedom of expression, but their freedom of thought, their right to participation and their right to access information.
Action: A clear hierarchy of intent should be set out, stating that the best interests of the child should be the primary consideration when services consider their duties in relation to freedom of expression.
Ofcom’s remit must be extended to include the power to take complaints from minors. While Ofcom will accept super-complaints, the absence of a function to consider individual complaints does not tally with the conventional understanding of a duty of care in negligence law, and deviates from the established process in UK data protection law, enforced by the ICO. Super-complaints concerning children should be prioritised by Ofcom, and the Bill must ensure individual children are given a mechanism through which their complaints can be considered.
The Bill also reserves the right to issue criminal sanctions against individual company directors, but only when they have failed to comply with information requests from the regulator. The government would need to be persuaded to introduce director liability on the basis of significant failure across the market. Without individual director liability for failures to comply with the duty of care, it is hard to see how the largest tech companies, whose enormous wealth and cash reserves can easily absorb even the heaviest fines, will be sufficiently incentivised to comply with the regime. The Online Safety Bill should follow the precedent set by the Gambling Act 2005 and the Companies Act 2006 to hold individual responsible directors to account.
Action: The Bill should give individual children the right to complain to Ofcom, appeal against a decision of Ofcom, bring legal proceedings against a regulated service provider and claim compensation from a regulated service provider for any damage suffered as a result of their non-compliance with the Online Safety Act. Ofcom should be given the power to enforce financial and criminal sanctions against individual company directors for failures to fulfil the duty of care to children.
In multiple places throughout the Bill, the Secretary of State is given the power to amend or repeal provisions (clause 3) and to direct Ofcom to modify its guidance (clause 33). This undermines the independence of the regulator, and its power to effectively enforce codes of practice and guidance.
Action: The powers of the Secretary of State must be tempered to safeguard the independence of the regulator. Any proposals from the Secretary of State to amend or repeal provisions of the Bill should come before Parliament, and references to the power of the Secretary of State to direct Ofcom should be removed.
The government has recently published its Online Media Literacy Strategy, including a framework of best practice principles and a welcome emphasis on data privacy. In order to meet the aims of the strategy, the Bill must give Ofcom the duty to set out minimum standards for the content and delivery of education initiatives for children, and the strategy must be supported with sufficient investment and expertise.
Education initiatives developed by the private sector such as Google’s ‘Be Internet Awesome’ and Facebook’s ‘My Digital World’ offer resources and workplans to schools at little or no cost, both in the UK and around the world, but teach children to accept certain service design elements as ‘unavoidable’ risks when in fact they could and should be tackled at a design level by those very same companies. Ofcom’s media literacy evaluation framework must set standards for providers of digital and data literacy programmes for children, particularly those that are, or are funded by, tech companies.
Action: The Bill must task Ofcom with setting standards for educational initiatives designed for children.
The Bill must require Ofcom to draw on the considerable expertise of those working in the field of child online protection and children’s rights, and to capture the views of children themselves, to ensure the regulation, including the codes of practice and any related guidance, fairly represent and respond to the needs and views of children.
Action: The Bill should reference General comment 25 on children’s rights in relation to the digital environment and require Ofcom to seek out the views of both children and experts in the field of child online safety and children’s rights.
Age Appropriate Design Code, Information Commissioner’s Office.
Communications Act 2003.
Guidance for providers on measures to protect users from harmful material, Ofcom.
Over a three-month period in 2020, the Advertising Standards Authority identified 159 age-restricted adverts which broke advertising rules by targeting their ads at services with high numbers of child users, including 70 different gambling ads and ten different alcohol ads. See: https://www.asa.org.uk/news/protecting-children-online.html
Children are put at risk of financial harm through the presence of micro-transactions, loot boxes (which contain an unknown mix of lower and higher value rewards and prizes), and other in-app purchases. It is estimated between 25% and 40% of UK children who play online games have made a loot box purchase. Children as young as four are spending money online, and 5Rights research has shown that 80% of the top 50 ‘free’ apps deemed suitable for children aged 5 and under on the Apple UK App Store contain in-app purchases. Additionally, 1 in 10 children report making in-app purchases accidentally. See: Children as young as four are spending money online, The Telegraph, April 2021.
Video-sharing platform regulation, Ofcom.
The 4Cs: Classifying Online Risk to Children, Sonia Livingstone and Mariya Stoilova.
5Rights has recently published its Pathways report, looking at the role of system design in children’s online experiences. It reveals the way in which children are offered inappropriate content and contact even when they have been identified as children. Pathways offers irrefutable evidence that should spur the government to take a closer look at the role of algorithms in automating and promoting harmful outcomes for children.
OnlyFans: How it handles illegal sex videos, BBC.
Girls’ Attitudes Survey, Girlguiding.
General comment No. 14 on the right of the child to have his or her best interests taken as a primary consideration, United Nations Committee on the Rights of the Child.
Part 5 and Schedule 7 of the Gambling Act 2005 concern operating licences issued by the Gambling Commission, including powers to revoke licences and impose financial penalties.
Companies Act 2006.
Online Media Literacy Strategy.