Call for Evidence
The Joint Committee on the Draft Online Safety Bill was appointed in July 2021 to conduct pre-legislative scrutiny of the Government's draft Bill, which would establish a new regulatory framework to tackle harmful content online. The Committee invites interested individuals and organisations to submit written evidence to this inquiry. The deadline for written evidence is 23:59 GMT on Thursday 16 September 2021. The Committee will make recommendations in a report to both Houses by 10 December 2021.
Draft Online Safety Bill
The draft Bill would establish Ofcom as an independent regulator of providers of online services where users can generate and share content or search content. Ofcom would oversee a statutory duty of care on these providers. To comply with their duties, providers would be required to set up systems and processes to reduce the presence of illegal content and activity, and of content and activity that may harm children. A small number of high-risk, high-reach user-to-user services would additionally have to reduce the presence of content that is legal but may harm adults. Ofcom would have the power to impose penalties of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher, on non-compliant providers.
In scrutinising the draft Bill we aim to:
- Clarify and examine the Government's policy objectives;
- Assess whether the Bill as drafted would achieve the Government's policy objectives;
- Identify any unintended consequences of the Bill;
- Identify whether there are any gaps in the Bill; and
- Make recommendations to improve the drafting of the Bill.
Areas of interest
We welcome submissions on any aspect of the draft Bill. We are particularly interested in views on any, or all, of the questions outlined below. We are also interested in views about the novel legal concepts included in the draft Bill. In your submission, please prioritise what you consider to be the most important aspects of the draft Bill. We do not expect submissions to cover all of the questions below.
- Will the proposed legislation effectively deliver the policy aim of making the UK the safest place to be online?
- Will the proposed legislation help to deliver the policy aim of using digital technologies and services to support the UK’s economic growth? Will it support a more inclusive, competitive, and innovative future digital economy?
- Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?
- Does the draft Bill make adequate provisions for people who are more likely to experience harm online or who may be more vulnerable to exploitation?
- Is the “duty of care” approach in the draft Bill effective?
- Does the Bill deliver the intention to focus on systems and processes rather than content, and is this an effective approach to moderating content? What role do you see for, for example, safety by design, algorithmic recommendations, minimum standards, and default settings?
- How does the draft Bill differ from online safety legislation in other countries (e.g. Australia, Canada, Germany, Ireland, and the EU Digital Services Act), and what lessons can be learnt?
- Does the proposed legislation represent a threat to freedom of expression, or are the protections for freedom of expression provided in the draft Bill sufficient?
Content in Scope
- The draft Bill specifically includes child sexual exploitation and abuse (CSEA) and terrorism content and activity as priority illegal content. Are there other types of illegal content that could or should be prioritised in the Bill?
- The draft Bill specifically places a duty on providers to protect content of democratic importance and journalistic content. What is your view of these measures and their likely effectiveness?
- Earlier proposals brought content that could lead to societal harm, such as misinformation and disinformation, within the scope of the Bill. These types of content have since been removed. What do you think of this decision?
- Are there any types of content omitted from the scope of the Bill that you consider significant, e.g. commercial pornography or the promotion of financial scams? If so, how should they be covered?
- What would be a suitable threshold for significant physical or psychological harm, and what would be a suitable way for service providers to determine whether this threshold had been met?
- Are the definitions in the draft Bill suitable for service providers to accurately identify and reduce the presence of legal but harmful content, whilst preserving the presence of legitimate content?
Services in Scope
- The draft Bill applies to providers of user-to-user services and search services. Will this achieve the Government's policy aims? Should other types of services be included in the scope of the Bill?
- The draft Bill sets a threshold for services to be designated as 'Category 1' services. What threshold would be suitable for this?
- Are the distinctions between categories of services appropriate, and do they reliably reflect their ability to cause harm?
- Will the regulatory approach in the Bill affect competition between different sizes and types of services?
Algorithms and user agency
- What role do algorithms currently play in influencing the presence of certain types of content online and how it is disseminated? What role might they play in reducing the presence of illegal and/or harmful content?
- Are there any foreseeable problems that could arise if service providers increased their use of algorithms to fulfil their safety duties? How might the draft Bill address them?
- Does the draft Bill give sufficient consideration to the role of user agency in promoting online safety?
The role of Ofcom
- Is Ofcom suitable for and capable of undertaking the role proposed for it in the draft Bill?
- Are Ofcom’s powers under the Bill proportionate, whilst remaining sufficient to allow it to carry out its regulatory role? Does Ofcom have sufficient resources to support these powers?
- How will Ofcom interact with the police in relation to illegal content, and do the police have the necessary resources (including knowledge and skills) for enforcement online?
- Are there systems in place to promote the transparency, accountability, and independence of the regulator?
- How much influence will a) Parliament and b) the Secretary of State have on Ofcom, and is this appropriate?
- Does the draft Bill make appropriate provisions for the relationship between Ofcom and Parliament? Is the status given to the Codes of Practice and minimum standards required under the draft Bill appropriate, and are the provisions for scrutiny of these sufficient?
- Are the media literacy duties given to Ofcom in the draft Bill sufficient?