Written evidence submitted by the Molly Rose Foundation (OSB0149)

 

Introduction

 

The Molly Rose Foundation (MRF) welcomes the draft Online Safety Bill, which introduces legally enforceable and proportionate duties of care to mitigate risk for young and vulnerable people online. It is an important step towards making the UK the safest place to be online, but the Bill must be both robust in its scope and swift in its introduction.

 

The proposed legislation should not be seen as a single opportunity to make the internet safer, nor considered in isolation. In practice it will stand alongside other legislation, such as the Age Appropriate Design Code in the UK and the Digital Services Act in the EU. To remain effective, the Online Safety Bill will need to build on its foundations and be flexible and swift to change whenever needed, as well as integrated with other UK and global legislation.


About MRF

 

The Molly Rose Foundation is a Charitable Incorporated Organisation formed in memory of Molly Russell, who took her own life aged 14. The charity’s aim is suicide prevention, targeted towards young people under the age of 25. We want to help connect those suffering from depression or other mental illnesses that put them at risk of suicide with the help, support and practical advice they require.

 

Registration: registered as a charity with the Charity Commission for England and Wales in August 2018 (Charity No. 1179482).

 


Summary of the MRF response

 

Meeting objectives

 

As well as legislating to remove harmful online content, the Bill must promote future digital developments that are both competitive and safe by design, thus delivering a flourishing digital economy that always aims to protect its users.

 

In the current era of unregulated tech, the users most exposed to risk are children and other vulnerable people. They are least well equipped to understand fully the techniques used to keep them online and the positive feedback loops of algorithmic amplification that profoundly influence their viewing habits. The Online Safety Bill must look after the interests of these users and compel the platforms to consider their safety, from the initial design process onwards. Content that is ‘legal but harmful’ will also need to be regulated if young people are to be safe online.

 

Legislation must be applied flexibly, appropriately and dynamically in a fast-moving sector that can outpace a traditional legislative process. The “Duty of Care” approach is therefore likely to be effective, as it encourages platforms to share responsibility with the regulator. However, in the longer term, the effectiveness of this form of regulation will need to be continuously monitored, and the regulations and guidance amended if the legislation is ever found to be circumvented.

 

The powers of the regulator will need to be swiftly applied to sanction companies that are not fulfilling their Duties of Care.

 

Minimum safety standards should be established by the regulator, and an overarching safety Duty of Care should be included in the Bill to ensure these are met. Again, this should be reviewed regularly to allow proven new practices to be adopted both widely and quickly. This will also ease the compliance burden placed on companies, especially start-ups, as a clear framework will be provided for developing their safe practices without always starting from first principles.

 

As in other industry sectors, the sharing of best practice should be encouraged by the Bill, so that all online industries can make use of suitable new developments proven to make the digital world safer.

 

 

Services in scope

 

MRF accepts some initial limits on the scope of the draft Bill, to ensure it does not become an unwieldy and time-consuming piece of legislation. Other measures for future inclusion, such as those addressing societal harm, the introduction of age verification or assurance, and the prevention of online scams, should be prioritised and added as soon as they have been fully considered. From the outset, this legislation should be designed to grow and change as needed in the future.

 

Similarly, other services that are not user-to-user should become regulated in the future if an identified need for new forms of online safety arises. The designation of the categories of company described in the draft Bill, and the effect this may have on competition, especially for start-ups, should be continuously reviewed.

 

MRF believes that new technology might require not just new legislation but also a new form of legislation. The speed of development in this sector means the Online Safety Bill may need to be treated as ‘living legislation’ if it is to keep pace. An ongoing Joint Committee could provide governance of, and guidance for, the regulator and introduce amendments whenever required.

 

 

Algorithms

 

Algorithmic amplification should be carefully controlled. Unintended consequences of such machine learning are frequent, and they often introduce a likelihood of harm. MRF believes the development of algorithms to do good should be encouraged (the recent introduction of the R;pple suicide prevention tool being an example).

 

All platforms should include sufficient human moderation in their safety planning. This human touch is important, both to keep procedures rooted in human experience and to feed back into the algorithmic moderation that will be doing the majority of the work. Human moderation is often arduous for those who undertake it, so their wellbeing also needs to be addressed by the companies, whether they engage moderators directly or via third-party businesses.

 

The draft Bill should provide a clear channel for users to report harmful content and a mechanism for handling their complaints. It is not clear to us at MRF that this is sufficiently provisioned in the Bill as drafted. Potentially this could involve an existing organisation, such as the UK Safer Internet Centre, which might allow the channel to be established with greater efficiency.

 

 

The role of Ofcom and Government

 

MRF has been supportive of Ofcom being the regulator: it has relevant pre-existing experience and, as an established regulator, it can take up this new role more quickly.

 

Ofcom will need sufficient additional resources and staff. A process to regularly review the role of the regulator should be introduced, so that alternatives, such as a separate regulator with specific responsibilities, can be assessed and established if ever judged preferable.

 

It is MRF’s view that the financial sanctions proposed are proportionate and give Ofcom sufficient power to discourage the prioritising of profit over user safety. However, MRF strongly advocates that criminal liability for senior managers should also be an available sanction from the outset. To prompt and embed a new era of social responsibility in the tech industry, it is essential to challenge and change the established corporate culture found at many of the global tech companies.

 

MRF believes that, for good governance, the regulator should have an appropriate degree of independence. We are concerned that, as drafted in the Bill, the Secretary of State has a high level of individual influence. This may allow for the rapid introduction of measures to improve online safety, but it also allows scope for other influence to be brought to bear on the regulator. The relationship between Parliament and the regulator should therefore be reviewed and sufficient independence maintained. An ongoing Online Safety Joint Committee would provide an independent body to introduce and moderate amendments to the legislation.

 

 

Evidence, analysis and conclusion

 

A reliable body of independent research is required to assess the effects of online technology and the effectiveness of the proposed regulation. MRF believes that tech companies should be compelled to provide anonymised data to bona fide academic institutions for independent analysis. This should be funded by an industry levy.

 

In addition to tech company data, other sources of reliable data should be utilised for evidence gathering. For example, individuals affected by online harms sometimes contact one of a number of support services, and these services can in turn provide a near ‘real-time’ source of independent anonymised data. To take one example, the UK 24/7 text support line ‘Shout’ has an established dataset of 35 million messages. The support ‘Shout’ provides, via an average of 1,300-1,500 conversations per day, also allows for up-to-date analysis across demographics and from hour to hour. The combined data from such support services would provide valuable insight into harms and safety online.

 

On occasion MRF has sensed a measure of trepidation about the introduction of this Bill, both from those involved with its development and from other agencies. New, complex legislation is necessarily daunting, but we urge the Committee to move forward boldly. The fear of not getting everything right should not limit the scope of this legislation; instead, checks and balances should be included so that, after independent review, any minor unintended consequences can be amended. Such measures will also help to allay the fears of those concerned more about free speech than online safety. As an organisation born of the tragic loss of a young life, MRF urges the Committee to do all it can to help ensure the potential of other young people is not dimmed by what they experience online.

 

In closing, we think you should ‘move fast and mend things’ to help rebalance the effects of an industry famously encouraged to ‘move fast and break things’, tragically, sometimes at great cost to its users.


Advocacy and references

 

The Molly Rose Foundation is pleased to be among the signatories to a letter published on 2nd September 2021 in The Times, supporting the ICO’s Age Appropriate Design Code as it comes into force in the UK. https://mollyrosefoundation.org/september-2021/

 

https://mollyrosefoundation.org/good-morning-britain-interview-with-ian-russell-as-part-of-their-mental-health-awareness-week-coverage/

 

The inability of platforms, in September 2020, to control egregious content, and the way their algorithms pushed it to young people, is clearly evidenced in the media reporting of suicide content migrating across platforms: https://www.bbc.co.uk/news/technology-54069650

 

Example statistics generated by the ‘Shout’ text line:

65% of conversations are with young people

25% of conversations with children under 13 discuss self-harm

32% of conversations with children under 13 discuss suicide

Conversations with school-age children peak after 10pm

 

https://mollyrosefoundation.org/bbc-news-report-22nd-january/

 

27 September 2021
