Written evidence submitted by The Lord Bishop of Oxford, Steven Croft (OSB0212)



03 November 2021

Dear Colleagues

“For it is not light that is needed but fire; it is not the gentle shower but thunder. We need the storm, the whirlwind and the earthquake…the conscience of the nation must be roused” (Frederick Douglass).

Thank you for your diligence and labour on the draft Online Safety Bill. I have watched or read most of the sessions and I am deeply grateful for your careful and perceptive gathering of evidence. I write at this stage of your deliberations to add my voice and perspective to the process of translating all of this evidence into effective legislation.

As some will know, I am a member of the APPG on Artificial Intelligence; I was a member of the House of Lords Select Committee on Artificial Intelligence in 2017-18 chaired by Lord Clement-Jones and was a founding Board Member of the government’s Centre for Data Ethics and Innovation from 2018 to May 2021.  I am currently a member of the Lords Select Committee on the Environment and Climate Change and lead for the Lords Spiritual on climate and on new technology.  I currently serve as a member of the Ada Lovelace Institute Rethinking Data Project, which is looking at global models for regulation.  I was Bishop of Sheffield for seven years, from 2009 to 2016, before moving to Oxford.

The Diocese of Oxford is a network of more than a thousand churches, chaplaincies and schools across the three counties of Oxfordshire, Berkshire and Buckinghamshire.  As a diocese we are responsible for the education of more than 50,000 children in 283 primary and secondary schools. We have a long-standing concern for the mental health of children and young people and are working actively to provide chaplaincy support to schools and resources for good self-care.

The severity and range of online harms

The evidence presented to the joint committee is horrific in both the severity and the range of online harms it describes.

The harm to the mental health of children and young people in the area of body image and self-worth is profoundly disturbing and all too recognisable from both statistics and stories of tragic incidents of self-harm and suicide among young people.  Our society has acquiesced in giving children unrestricted access to a wide range of harmful material online. I was less familiar with the detail of the online financial scams and harms outlined by Martin Lewis and others, but they are almost equally disturbing.

The dangers of violent extremism and the role played by social media have been well documented, as has the harm those in public life experience daily through corrosive online abuse. I was unaware of the deliberate harm caused to those with epilepsy through targeted trolling.  It is abundantly clear from all of the evidence gathered that all forms of racism, antisemitism and hatred are multiplied through the current misuse of technology.  In addition to the harm caused to individuals, the values of our whole society are continually being undermined by unregulated novel technologies, threatening democratic rights and freedoms in the present and the future.

The responsibility of government

The primary responsibility of government is the protection of all its citizens. Every person is of equal worth. The evidence to the Committee has demonstrated beyond doubt that severe harm is being caused through unregulated technology and social media to children and young people, to minorities on the grounds of ethnicity, gender or religion, and to those in public life. 

Technology continues to generate changes without regulation. Jonathan Haidt and Jim Steyer argued on 21st October that the effects of social media on the mental health of young women have only been felt sharply since 2012, with a hockey stick effect in the graphs of self-harm and suicide. They argue that the mental health effects of social media use over just the last nine years will be felt for a generation to come. This tidal wave of harm is building and, if unchecked, will cause untold damage across society in the coming years.  I appreciate that the Committee is developing regulation for the United Kingdom only, yet the evidence brought has been of harm done across the world by technology which no government has yet been successful in regulating.

The evidence brought to the Committee by Frances Haugen, Sophie Zhang and others has given a definitive answer to the question of whether the big tech companies have the will and capacity to self-regulate now or in the future. It is clear that they do not, nor is it in their commercial interest to do so.  I was able to be present in the Vatican last year when Pope Francis signed the Rome Call for AI Ethics. I have been present in meetings convened by the Archbishop of Canterbury this year with representatives of the technology companies discussing, among other things, online harms.  Almost all presented as immature in their understanding of themselves, the power they wield and the danger their technology presents to society.

I would encourage the Committee to navigate its revision of the Bill by these three key triangulation points:

  1. The scale and range of the harms described;
  2. The primary duty of government to protect all its citizens, and especially the vulnerable;
  3. The proven inability of big tech to self-regulate.


The safest place in the world to go online?

The government’s strapline for the online harms legislation at an earlier stage was to make the UK the safest place in the world to go online. I would encourage the Committee to abandon this ambition.


Nowhere in the world can it be said to be safe to go online at present. Nowhere. To be the safest place in an unsafe world does not take us to where we need to be.

I would suggest instead a more modest and reasonable but far-reaching goal: for the UK to be the first place in the world where it is safe to be online. Safe, not safer, should be the goal.

To reach this goal does not require a fresh set of ethics, values or standards. We simply need to take the ethics, values, standards and vision of a good society which apply in the offline world, and which are the basis of our laws, freedoms and democracy, and apply them, through regulation, law and standards, to the online world.

To be clear, among many other elements this would mean:


The Online Safety Bill

I am by nature a gradualist. I believe most change occurs slowly and carefully, in stages. By this measure, the present draft of the Online Safety Bill is a significant step forward – but only a step.

I have become convinced over the last year that the steps outlined there are not enough. This is one of those moments when Parliament has the opportunity to draw a line in the sand.

My appeal to the Scrutiny Committee in general terms is to take every opportunity possible to strengthen the legislation.  In particular I would encourage you to:

  1. Assert the need for a single overarching duty of care. The original White Paper proposed a single proportionate approach based on duty of care, and this overarching principle should be clearly articulated on the face of the bill.


  2. Ensure the provision of robust and adequate age verification to mitigate evident harm to children and young people across a wide range of fronts.


  3. Ensure that harms to society are expressly included in scope. These harms are not accidental or incidental, let alone unavoidable. They are directly related to specific business models and the active creation of algorithms that propagate and magnify harmful material to boost profit. Business must be accountable for real but additional harms to the whole of society where these arise because of the way an algorithm has been crafted and deployed.


  4. Ensure that powers to scrutinise companies’ use of algorithms, and their effects on users, are adequate. Given the failure of self-regulation, transparency that is sufficient to ensure effective scrutiny must be compelled by legislation. Evaluation must be both adequate and unambiguously independent. Scrutiny must include harms to individual users, but evaluation must also include broader harms to society.


  5. Strengthen the provisions for director liability. There is clear and accumulating evidence of the ways in which companies have misled, prevaricated, and resisted any reform to date. Therefore, powerful sanctions, albeit as a last resort, are now needed to bring about significant change to operating models that have been enormously personally profitable to directors.


  6. Ensure the powers of the Secretary of State are redrawn. It is crucial to preserving confidence in the independence of any regulator, in this case OFCOM, that it can never be susceptible to arbitrary political interference.


  7. Offer the government guidance on dovetailing. There are complex questions about which particular harms should be incorporated and which should remain out of scope, and there are legitimate concerns about the bill proliferating in scope. The Committee should consider highlighting to government the need and urgency for dovetailing where it judges some harms to be inadequately covered at present but right to remain outside the scope of this specific bill.


In conclusion:

The way in which humanity learns to live with the power of new technologies will be one of the defining moral questions of the 21st Century alongside the challenge of climate change and radical inequalities of wealth. 

The Online Safety Bill presents a significant opportunity to arrest the harmful development of new technologies and by that means to enable the immense good which is possible through the wise use of such technology to prevail.  As several witnesses have said, it is not possible to imagine such a Bill being pioneered in the United States or in the rest of the world outside Europe.  The legislation has the potential to be world changing.

This is therefore a moment to be bold and radical in a way which transcends party politics.  I look forward to seeing the results of the Committee’s work and to a strengthened Online Safety Bill becoming law.

With kind regards




Church House Oxford, Langford Locks, Kidlington, Oxford OX5 1GF

Tel: 01865 208222