Written evidence submitted by the 5Rights Foundation (OSB0096)

Key issues and proposed amendments

About 5Rights Foundation

 

5Rights develops new policy, creates innovative frameworks, sets technical standards, publishes research, challenges received narratives and ensures that children's rights and needs are recognised and prioritised in the digital world. While 5Rights works exclusively on behalf of and with children and young people under 18, our solutions and strategies are relevant to many other communities.

 

Our focus is on implementable change, and our work is cited and used widely around the world. We work with governments, inter-governmental institutions, professional associations, academics, businesses, and children, so that digital products and services can have a positive impact on the lived experiences of young people.

 

Summary

 

  1. The online safety objectives are clear and robust, but an overarching duty of care to meet these objectives is needed.
  2. The definition of “regulated services” must bring into scope all services that create risks for children.
  3. The Bill must address all types of harm in relation to its safety objectives, not only harmful user-generated content.
  4. Minimum standards for services likely to be accessed by children must be mandatory and set out in a statutory code of practice to meet the safety objectives and the policy intent of the Bill.
  5. Greater clarity is needed about the nature of risk and the risk assessment process.
  6. Minimum standards for proportionate, privacy-preserving and secure age assurance mechanisms must be established by Ofcom.
  7. Ofcom must have a duty to investigate the impact of algorithms and automated decision-making systems.
  8. There needs to be greater urgency in reducing the presence of illegal content and activity.
  9. The definitions of ‘journalistic content’ and ‘content of democratic importance’ must be reviewed to ensure they do not undermine the safety duties.
  10. The best interests of the child must be given primary consideration by services likely to be accessed by children.
  11. The Bill must extend Ofcom’s enforcement powers so it can issue sanctions against company directors and act on behalf of children.
  12. The powers given to the Secretary of State must be reviewed to ensure the independence of Ofcom.
  13. To deliver the government’s Online Media Literacy Strategy, the Bill should give Ofcom responsibility for producing minimum standards for educational initiatives delivered to children.
  14. Ofcom must be required to consult with experts in the field of child online safety and children’s rights to ensure the Bill delivers on its stated objectives.

 

Key issues and proposed amendments

 

The following 14 key issues are set out with corresponding amendments to the Bill. In the proposed amendments, text to be deleted from the draft Bill is shown in square brackets; all other quoted text is shown as it would read once amended.

 

  1. The online safety objectives are clear and robust, but an overarching duty of care to meet these objectives is needed.

 

The Full Government Response set out a single duty of care requiring services to improve the safety of their users. This has now been cut from the Bill. The list of duties for regulated services (part 2, chapters 2 and 3) does not provide the same clarity as the online safety objectives (chapter 5, clause 30). Specific duties will improve the safety of digital services and products, but this approach does not recognise the way risks are interconnected, cumulative and quick to evolve. It will fail to futureproof the Bill and leave places for companies to hide.

 

A single duty to meet the safety objectives would give the Bill a more straightforward and enforceable structure. A duty of care would futureproof the Bill and ensure that the regulator is not always behind the curve as new technologies and products (and associated risks) emerge. Statutory codes of practice issued by the regulator will support regulated services to fulfil the online safety objectives.

 

Action: Re-introduce an overarching duty of care for services to meet the online safety objectives.

 

Proposed amendment:

Clause 1: Overview of Act

 

(3) Part 2 imposes [duties] a duty of care on providers of regulated services to fulfil the online safety objectives and requires OFCOM to issue codes of practice relating to those [duties] objectives

 

  2. The definition of “regulated services” must bring into scope all services that create risks for children.

 

The Bill as currently drafted applies only to user-to-user and search services, leaving a number of services that create risks for children out of scope. The definition of regulated service does not match existing regulation, such as the Age Appropriate Design Code (AADC), which applies to all Information Society Services (ISS) likely to be accessed by children (“any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services, likely to be accessed by children.”[1])

 

This will leave children unprotected in many online environments, such as app stores, e-commerce sites and pornography sites that do not host user-generated content. The status of commercial EdTech providers is also unclear, leaving room for companies to potentially argue for immunity under the internal business services or public bodies exemptions in Schedule 1.

 

Children have a right to protection wherever they are online. All services that are “likely to be accessed” by children, whatever their size, business model or nature, and including services that may not fall under the definition of a user-to-user or search service, must be designed with children's safety in mind. Regulatory harmony across the regulation that protects children online is imperative to aid compliance and enforcement.

 

Action: Amend the scope so all services “likely to be accessed” by children will be regulated services.

 

Proposed amendment:

Clause 3: Meaning of “regulated service”

 

(2) “Regulated service” means—
(a) a regulated user-to-user service,
(b) a regulated search service, or
(c) all services likely to be accessed by children

 

  3. The Bill must address all types of harm in relation to its safety objectives, not only harmful user-generated content.

 

Since the government published its full response to the Online Harms White Paper in December, the draft Bill has been rebadged as legislation designed to address harmful content. The single focus on content, rather than content and activity, does not account for the range of risks children are exposed to online.

 

The definition of harm should build on definitions in existing regulation, particularly the definition of harm to children as anything which “might impair the physical, moral or mental development of persons under the age of 18”, included in the Communications Act 2003[2] and Ofcom’s video-sharing platform guidance.[3]

 

The Bill does not tackle advertising or financial and consumer harms. If these harms are left out of scope, children remain at risk from age-inappropriate advertising[4], online scams, gambling-style features and inappropriate commercial pressures that can lead to the accrual of debt, financial losses and service/contract lock-ins.[5] The Bill is a historic opportunity to bring online advertising under a single regulatory regime, but as currently drafted, most advertising remains out of scope, with paid-for ads (a contract between the provider and the advertiser) given a specific exemption. This means the statutory rules for paid-for advertising under the current VSP regime[6] will be lost when that regulation is superseded by the Online Safety Act.

 

Action: The language of ‘content and activity’ should be reinstated whenever the Bill refers to content and the definition of harmful content should be amended to be simply a definition of “harm”. The exemption for paid-for advertising should be removed.

 

Proposed amendments:

 

Clause 45: Meaning of “content and activity that is harmful to children”

(2) “Content and activity that is harmful to children”, in relation to a regulated service, means content or activity that is—

(a) (in the case of a user-to-user service) regulated content and activity in relation to that service, and

(b) either—

(i) of a description designated in regulations made by the Secretary of State as primary priority content that is harmful to children (see section 47),

(ii) of a description designated in such regulations as priority content that is harmful to children

 

All references to “content” throughout the Bill to be amended to “content and activity”.

 

***

 

Clause 39: Meaning of “regulated content”, “user-generated content” and “news publisher content”

(2) “Regulated content”, in relation to a regulated user-to-user service, means user-generated content, except—

(a) emails,

(b) SMS messages,

(c) MMS messages,

(d) comments and reviews on provider content (see subsection (5)),

(e) one-to-one live aural communications (see subsection (6)),

(f) paid-for advertisements (see subsection (7)), and

(g) news publisher content (see subsection (8)).

 

Remove subsection 7

 

  4. Minimum standards for services likely to be accessed by children must be mandatory and set out in a statutory code of practice to meet the safety objectives and the policy intent of the Bill.

 

The Bill contains several duties in relation to terms of service, reporting and redress, and other “systems and processes” designed to mitigate and manage harm. As written, however, the Bill does not require services to meet minimum standards in these areas. Without setting the bar, the Bill will not establish the necessary standards for safety that both users and providers of regulated services would like to see.

 

Minimum standards for services likely to be accessed by children set out in a statutory code of practice are needed to usher in a new world of digital design that considers children’s safety first. They would set out clear expectations and ensure that services, both big and small, understand that some design choices are never appropriate in relation to children.

 

These should include minimum standards for safety by design, the child risk assessment process (including the definition of risk and harm), published terms, age assurance, and moderation, reporting and redress systems. This is the single most important change to ensure the Bill remains a systems-and-processes Bill, rather than one about individual children, products and services, or pieces of content.

 

Under the government’s proposals for a new pro-competition regime, large companies that have been designated as having ‘strategic market status’ by the Digital Markets Unit will need to adhere to a mandatory code of conduct designed to govern the relationships between dominant firms and their users. If the government can introduce mandatory codes to regulate market competition, it can also introduce mandatory rules for online safety.

 

Action: Services likely to be accessed by children must be required to meet minimum standards of safety by design, published terms, age assurance, moderation, reporting and redress (and others as the regulator sees fit), to be set out in a single statutory code of practice for child online safety prepared by Ofcom.

 

 

Proposed amendments:

 

Clause 29: Codes of practice about duties

(1) OFCOM must prepare a statutory code of practice for providers of regulated services describing recommended steps for the purposes of compliance with duties set out in section 9 or 21 (safety duties about illegal content) so far as relating to terrorism content.

(2) OFCOM must prepare a statutory code of practice for providers of regulated services describing recommended steps for the purposes of compliance with duties set out in section 9 or 21 (safety duties about illegal content) so far as relating to CSEA content.

(3) OFCOM must prepare one or more statutory codes of practice for providers of regulated services describing recommended steps for the purposes of compliance with the relevant duties (except to the extent that steps for the purposes of compliance with such duties are described in a code of practice prepared under subsection (1) or (2)).

(4) OFCOM must prepare a statutory code of practice for providers of services likely to be accessed by children for the purposes of compliance with the duties in section 10 or 23 (safety duties for services likely to be accessed by children) and with the online safety objectives in section 30.

 

***

 

Clause 30: Online Safety Objectives

(2) The online safety objectives for regulated user-to-user services are—

(a) to design and operate a service in such a way that—

(i) For services likely to be accessed by children, to meet the minimum standards of safety by design, published terms, age assurance, moderation, reporting and redress as set out in a statutory code of practice prepared by OFCOM (under section 29, subsection (4))

 

  5. Greater clarity is needed about the nature of risk and the risk assessment process.

The requirement to carry out a children’s risk assessment is currently focused on the identification of content that poses a risk to children. Services should be required to assess risks against the 4 Cs risk framework[7] (to include contact, conduct and contract risks, as well as content), and to publish their risk assessments, both to drive transparency and to build knowledge across the sector.

The regulator must also set out minimum standards for the child risk assessment process, to ensure parity, quality and efficacy of the assessments. If the risk assessment is to remain the prime mechanism for risk reduction, greater emphasis must be put on the duty to prevent and mitigate risks across each risk category.

 

Action: Regulated services must be required to prevent, mitigate or effectively manage risk in accordance with risk profiles drawn up by Ofcom and with minimum standards set out for the child risk assessment, accounting for all risks to children across the 4 Cs of online risk.

 

Proposed amendments:

 

Clause 7: Risk assessment duties

(9) A “children’s risk assessment” of a service of a particular kind means an assessment to identify, assess and understand such of the following as appear to be appropriate, taking into account the risk profile that relates to services of that kind—

(b) the level of risk of children who are users of the service encountering the following by means of the service—

(i) content risks

(ii) contact risks

(iii) conduct risks

(iv) contract risks

giving separate consideration to children in different age groups, and taking into account (in particular) algorithms used by the service and how easily, quickly and widely content may be disseminated by means of the service;

 

***

 

Clause 61: Risk assessments by OFCOM

(3) OFCOM must develop risk profiles for different kinds of regulated services, categorising the services as OFCOM consider appropriate, taking into account—

(a) the characteristics of the services, and

(b) the risk levels and other matters identified in the risk assessment.

(4) OFCOM must develop a child risk assessment, covering content, contact, conduct and contract risks to children.

 

  6. Minimum standards for proportionate, privacy-preserving and secure age assurance mechanisms must be established by Ofcom.

The Bill requires regulated services to assess whether it is possible for children to access their service or part of it. Only services likely to be accessed by children will need to meet the safety duties relating to children. To ensure this approach is effective, Ofcom must set out minimum standards for age assurance mechanisms that are privacy-preserving, rights-respecting, proportionate to risk and purpose, easy for a child to use, accessible and inclusive; that enhance a child’s experience rather than merely restrict it; and that offer a high level of security, transparency and accountability, with clear routes to challenge and redress.

These minimum standards should be implemented on a shorter timescale than the Bill itself, closing a gap in the current legislative framework and giving Ofcom and industry the opportunity to prepare for the full implementation of the Bill.

Action: Ofcom must set out mandatory minimum standards for age assurance solutions in advance of the Bill passing into law, and the Bill itself must require Ofcom to establish the age assurance requirements for each risk level as part of its risk profiles.

 

Proposed amendments:

 

Clause 30: Online Safety Objectives

(2) The online safety objectives for regulated user-to-user services are—

(a) to design and operate a service in such a way that—

(i) For services likely to be accessed by children, to meet the minimum standards of safety by design, published terms, age assurance, moderation, reporting and redress as set out in a statutory code of practice prepared by OFCOM (under section 29, subsection (4))

 

***

 

Clause 61: Risk assessments by OFCOM

(3) OFCOM must develop risk profiles for different kinds of regulated services, categorising the services as OFCOM consider appropriate, taking into account—

(a) the characteristics of the services, and

(b) the risk levels and other matters identified in the risk assessment, and

(c) the age assurance requirements in relation to risk levels

 

  7. Ofcom must have a duty to investigate the impact of algorithms and automated decision-making systems on children.

 

The spread of harmful content and activity is supercharged by the automated systems and algorithms (AI systems) of services, with shocking outcomes such as promoting self-harm material or suggesting suicide sites.[8] Children should not be expected to understand or take action against automated decision-making or algorithmic unfairness. Nor should they be expected to ‘police’ the community rules or terms and conditions that are perpetually broken and unenforced by the service itself.

 

The Bill must give Ofcom a duty to investigate the automated decision-making systems and algorithms of regulated services that impact on children, and to ensure these systems conform to UK laws and obligations concerning children. This would require services to provide information as requested by the regulator, such as information relating to the design goals, inputs and outcomes of algorithms, and to allow access to personnel from product, governance and marketing teams.

 

Where there is evidence or an indication that products and services are discriminating against or systematically disadvantaging individuals or groups of young people or violating their rights, Ofcom should set out a mandatory course of action for compliance.

 

Action: Ofcom should be given the duty, power and resources to scrutinise the design, operation and outcomes of the algorithms and automated systems used by regulated services, and the power to set out mandatory compliance action.

 

Proposed amendments:

 

Insert in Part 4 – OFCOM’s powers and duties in relation to regulated services, Chapter 1 – General Duties:

Clause 59: Duty to conduct investigation into the operation of algorithms and automated decision-making systems

(1) A duty:

(a) To investigate, where there is evidence that an automated system is discriminating against or systematically disadvantaging groups of people or violating their rights, exploiting vulnerabilities, manipulating or withholding information in a way that disadvantages the user, and to set out a mandatory course of action for compliance.

(b) To take enforcement action against regulated services that fail to take the mandated course of action.

(c) To publish results of any action required to support future compliance across the sector.

 

***

 

Clause 70: Power to require information

(4) The information that may be required by OFCOM under subsection (1) includes, in particular, information that they require for any one or more of the following purposes—

(a) the purpose of assessing compliance by a provider of a regulated service

(m) the purpose of investigating algorithms and automated systems, in relation to OFCOM’s duty under section 59

 

 

  8. There needs to be greater urgency in reducing the presence of illegal content and activity.

 

Clause 9 introduces the concept of ‘priority’ illegal content, and requires services only to ‘minimise’ its presence. Services must not be granted a licence to take a hands-off approach to the presence of illegal content and activity on their platforms: a recent BBC investigation revealed the details of a ‘compliance manual’ handed to moderators at OnlyFans, instructing them to escalate an account to senior management only after at least five examples of illegal content had been identified.[9] If content is illegal, it should not be tolerated or left to a service’s best efforts; all possible action should be taken to prevent its presence in the first place, and then to remove it.

 

Action: Services must be required to prevent and remove (rather than ‘minimise’) illegal content and activity, reflecting the fact that it is simply against the law.

 

 

Proposed amendment:

Clause 9: Safety duties about illegal content
 

(3) A duty to operate a service using proportionate systems and processes
designed to—
(a) [minimise] prevent the presence of priority illegal content or activity;
(b) minimise the length of time for which priority illegal content or activity is present;
(c) [minimise] prevent the dissemination of priority illegal content or activity;
(d) where the provider is alerted by a person to the presence of any illegal content or activity, or becomes aware of it in any other way, swiftly take down such content or stop such activity.

 

  9. The definitions of ‘journalistic content’ and ‘content of democratic importance’ must be reviewed to ensure they do not undermine the safety duties.

 

The current definition of ‘journalistic content’ is so broad (news publisher content or regulated content that is generated for the purposes of journalism and is UK-linked) that it is unclear who would not be covered by it, undermining the very purpose of the duty: to protect news publisher content. Similarly, the definition of ‘content of democratic importance’ is so far-reaching that it could reasonably be attached to any content. At best, this will create confusion for service providers about how to categorise user content. At worst, it will create significant loopholes that bad actors can easily exploit to challenge the removal of harmful content or activity.

 

Action: The definitions of ‘journalistic content’, ‘recognised news publisher’ and ‘content of democratic importance’ must be clarified to protect journalists and news organisations but not act as a back door for abusive behaviour or content reaching children.

 

  10. The best interests of the child must be given primary consideration by services likely to be accessed by children.

 

The counterbalancing duties to protect freedom of expression, privacy and content of journalistic or democratic importance could have the unintended effect of undermining the responsibility services have to put in place appropriate safety policies and procedures. Children have existing rights under the United Nations Convention on the Rights of the Child, set out further in General comment 25 on children’s rights in relation to the digital environment. These rights must not be undermined by other aspects of the Bill.

 

There is no indication in the Bill of how services are to balance these duties with the online safety objectives and safety duties. This creates a very real possibility that services will interpret their duties to have regard for freedom of expression as an instruction to leave hateful and abusive content on their platforms, potentially producing a ‘chilling effect’ that discourages marginalised groups from exercising their own rights to free expression. The most recent Girls’ Attitudes Survey[10] conducted by Girlguiding found that 71% of girls aged 7-21 have experienced some form of online harm in the past year. For 11-21 year olds, these harms include sexist comments (50%), appearance pressures (45%), harassment (28%), unwanted sexual images (26%) and bullying (21%), all of which are likely to have an impact on their desire to express themselves online.

 

A child’s right to freedom of expression cannot be considered in isolation from their other rights in the digital world, particularly their rights to freedom of thought and protection from undue influence. While consideration of the impact of regulation on fundamental human rights is welcome, service providers should have a legal duty to protect all rights held by children, and to design and operate services in a way that treats their best interests as paramount.[11] This will protect not only a child’s freedom of expression, but also their freedom of thought, their right to participation and their right to access information.

 

Action: A clear hierarchy of intent should be set out, stating that the best interests of the child should be the primary consideration when services consider their duties in relation to freedom of expression.

 

 

Proposed amendments:

 

Clause 5: Providers of user-to-user services: duties of care

(5) All providers of Category 1 services must comply with the following duties in relation to each such service—

(c) the duties about rights to freedom of expression and privacy set out in section 12(3), (4) and (5)

(d) each of the duties to protect content of democratic importance (see section 13)

...

(g) to ensure the best interests of the child are a primary consideration, in accordance with the UN Convention on the Rights of the Child and General comment 25 on children’s rights in relation to the digital environment.

 

***

 

Clause 10: Safety duties for services likely to be accessed by children

(11) In relation to duties under this section, section 12(2) (duties about rights to freedom of expression and privacy), the best interests of the child are a primary consideration.

 

***

 

Clause 12: Duties about rights to freedom of expression and privacy

(2) A duty to have regard to the importance of—

(a) protecting users’ right to freedom of expression within the law

(b) protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures.

(c) for services likely to be accessed by children, ensuring that the best interests of the child are a primary consideration

 

 

  11. The Bill must extend Ofcom’s enforcement powers so it can issue sanctions against company directors and act on behalf of children.

 

Ofcom’s remit must be extended to include the power to take complaints from children. While Ofcom will accept super-complaints, the absence of a function to consider individual complaints does not tally with the conventional understanding of a duty of care in negligence law, and deviates from the established process in UK data protection law enforced by the ICO. Super-complaints concerning children should be prioritised by Ofcom, and the Bill must ensure individual children are given a mechanism through which their complaints can be considered.

 

The Bill also reserves the right to issue criminal sanctions against individual company directors, but only when they have failed to comply with information requests from the regulator. The government would need to be persuaded to introduce director liability on the basis of significant failure across the market. Without individual director liability for failures to comply with the duty of care, it is hard to see how the largest tech companies, whose enormous wealth and cash reserves can easily absorb even the heaviest fines, will be sufficiently incentivised to comply with the regime. The Online Safety Bill should follow the precedent set by the Gambling Act 2005[12] and the Companies Act 2006[13] to hold individual responsible directors to account.

 

Action: The Bill should give individual children the right to complain to Ofcom, appeal against a decision of Ofcom, bring legal proceedings against a regulated service provider and claim compensation from a regulated service provider for any damage suffered as a result of its non-compliance with the Online Safety Act. Ofcom should be given the power to enforce financial and criminal sanctions against individual company directors for failures to fulfil the duty of care to children.

 

Proposed amendments:

 

Part 5: Appeals and Complaints

 

Chapter 3:

Clause 109: Complaints from children

(1) A child, or a representative of a child, may make a complaint to OFCOM that any feature of one or more regulated services, or any conduct of one or more providers of such services, or any combination of such features and such conduct, is, appears to be, or presents a material risk of—

(a) causing them significant harm

(b) significantly adversely affecting their rights, as set out in the United Nations Convention on the Rights of the Child and General comment 25 on children's rights in relation to the digital environment

(c) causing significant unwarranted infringements of privacy, or

(d) otherwise having the effect of impairing their physical, moral or mental development

 

***

New clause:

Clause 88: Offences in connection with the duty of care

(1) An offence under this section may be committed only by a person who—

(a) is given a warning in relation to a regulated service, and

(b) is the provider of that service, and references to “person” are to such a person.

(2) A person commits an offence if the person fails to comply with a safety objective relating to the duty of care.

 

 

  12. The powers of the Secretary of State must be reviewed to ensure the independence of Ofcom.

 

In multiple places throughout the Bill, the Secretary of State is given the power to amend or repeal provisions (clause 3) and to direct Ofcom to modify its guidance (clause 33). This undermines the independence of the regulator, and its power to effectively enforce codes of practice and guidance.

 

Action: The powers of the Secretary of State must be tempered to safeguard the independence of the regulator.  Any proposals from the Secretary of State to amend or repeal provisions of the Bill should come before Parliament, and references to the power of the Secretary of State to direct Ofcom should be removed.

 

Proposed amendments:

 

Clause 3: Meaning of “regulated service”

Subsections 8-12 of clause 3 should be removed.

 

Clause 33: Secretary of State’s power of direction

Clause 33 should be removed.

 

Clause 39: Meaning of "regulated content", "user-generated content" and "news publisher content"

Subsections 12 and 13 of clause 39 should be removed.

 

Part 6: Secretary of State’s functions in relation to regulated services

Part 6 should be removed.

 

  13. To deliver the government’s Online Media Literacy Strategy, the Bill should give Ofcom responsibility for producing minimum standards for educational initiatives delivered to children.

 

The government has recently published its Online Media Literacy Strategy[14], including a framework of best practice principles and a welcome emphasis on data privacy. In order to meet the aims of the strategy, the Bill must give Ofcom the duty to set out minimum standards for the content and delivery of educational initiatives for children, and the strategy must be supported with sufficient investment and expertise.

 

Education initiatives developed by the private sector, such as Google’s ‘Be Internet Awesome’ and Facebook’s ‘My Digital World’, offer resources and workplans to schools at little or no cost, both in the UK and around the world, but teach children to accept certain service design elements as ‘unavoidable’ risks when in fact they could and should be tackled at a design level by those very same companies. Ofcom’s media literacy evaluation framework must set standards for providers of digital and data literacy programmes for children, particularly those providers that are, or are funded by, tech companies.

 

Action: The Bill must task Ofcom with setting standards for educational initiatives designed for children.

 

Proposed amendment:

 

Clause 103: Media Literacy

(4) OFCOM must prepare guidance—

(a) about the evaluation of educational initiatives mentioned in subsection

(3) by persons providing them,

(b) about the evaluation, by providers of regulated services, of any actions taken by them in relation to those services to improve the media literacy of members of the public, and

(c) about the evaluation, by persons developing or using technologies and systems mentioned in subsection (1)(b), of the effectiveness of those technologies and systems in improving the media literacy of members of the public.

(d) about minimum standards that educational initiatives designed for children must meet

 

  14. Ofcom must be required to consult with experts in the field of child online safety and children’s rights to ensure the Bill delivers on its stated objectives.

 

The Bill must require Ofcom to draw on the considerable expertise of those working in the field of child online protection and children’s rights, and to capture the views of children themselves, to ensure that the regulation, including the codes of practice and any related guidance, fairly represents and responds to the needs and views of children.

 

Action: The Bill should reference General comment 25 on children’s rights in relation to the digital environment and require Ofcom to seek out the views of both children and experts in the field of child online safety and children’s rights.

 

Proposed amendments:

 

Clause 29: Codes of practice about duties

(5) Before preparing a code of practice or amendments under this section, OFCOM must consult—

(f) persons whom OFCOM consider to have relevant expertise in equality issues and human rights, in particular—

(i) the right to freedom of expression set out in Article 10 of the Convention,

(ii) the right to respect for a person’s private and family life, home and correspondence set out in Article 8 of the Convention, and

(iii) the rights of the child as set out in the United Nations Convention on the Rights of the Child and General comment 25 on children's rights in relation to the digital environment

 

***

 

Clause 99: Research about users’ experiences of regulated services

(1) Section 14 of the Communications Act (consumer research) is amended as follows.

(2) After subsection (6A) insert—

“(6B) OFCOM must make arrangements for ascertaining—

(e) the views of children who are users of regulated services

 

***

 

Clause 101: OFCOM’s report about researchers’ access to information

(1) OFCOM must prepare a report—

(a) describing how, and to what extent, persons carrying out independent research into online safety matters are currently able to obtain information from providers of regulated services to inform their research

(3) In preparing the report, OFCOM must consult—

...

(g) Child online safety and child rights experts

 

 

September 2021


[1] Age Appropriate Design Code, Information Commissioner’s Office

[2] Communications Act 2003

[3] Guidance for providers on measures to protect users from harmful material, Ofcom.

[4] Over a three-month period in 2020, the Advertising Standards Authority identified 159 age-restricted adverts which broke advertising rules by targeting their ads at services with high numbers of child users, including 70 different gambling ads and ten different alcohol ads. See: https://www.asa.org.uk/news/protecting-children-online.html

[5] Children are put at risk of financial harm through the presence of micro-transactions, loot boxes (which contain an unknown mix of lower and higher value rewards and prizes), and other in-app purchases. It is estimated that between 25% and 40% of UK children who play online games have made a loot box purchase. Children as young as four are spending money online, and 5Rights research has shown that 80% of the top 50 ‘free’ apps deemed suitable for children aged 5 and under on the Apple UK App Store contain in-app purchases. Additionally, 1 in 10 children report making in-app purchases accidentally (see Children as young as four are spending money online, The Telegraph, April 2021).

[6] Video-sharing platform regulation, Ofcom.

[7] The 4Cs: Classifying Online Risk to Children, Sonia Livingstone and Mariya Stoilova

[8] 5Rights has recently published its Pathways report, looking at the role of system design in children’s online experiences. It reveals the way in which children are offered inappropriate content and contact even when they have been identified as children. Pathways offers irrefutable evidence that should spur the government to take a closer look at the role of algorithms in automating and promoting harmful outcomes for children.

[9] OnlyFans: How it handles illegal sex videos, BBC

[10] Girls’ Attitudes Survey, Girlguiding

[11] General comment No. 14 on the right of the child to have his or her best interests taken as a primary consideration, United Nations Committee on the Rights of the Child

[12] Part 5 and Schedule 7 of the Gambling Act 2005 concern operating licences issued by the Gambling Commission, including powers to revoke licences and impose financial penalties.

[13] Companies Act 2006

[14] Online Media Literacy Strategy