Corrected oral evidence: Consideration of government's draft Online Safety Bill
Thursday 4 November 2021
Watch the meeting: https://parliamentlive.tv/event/index/6cc46469-1a1f-4ac0-bd12-d0a308b90a97
Members present: Damian Collins MP (The Chair); Lord Clement-Jones; Lord Gilbert of Panteg; Darren Jones MP; Baroness Kidron; Lord Knight of Weymouth; John Nicolson MP; Dean Russell MP; Lord Stevenson of Balmacara; Suzanne Webb MP.
Evidence Session No. 17 Heard in Public Questions 274 - 294
I: Rt Hon Nadine Dorries MP, Secretary of State for Digital, Culture, Media and Sport; Chris Philp MP, Parliamentary Under-Secretary of State (Minister for Tech and the Digital Economy), Department for Digital, Culture, Media and Sport; Rt Hon Damian Hinds MP, Minister of State (Minister for Security and Borders), Home Office; Sarah Connolly, Director, Security and Online Harms, Department for Digital, Culture, Media and Sport.
Nadine Dorries MP, Chris Philp MP, Damian Hinds MP and Sarah Connolly.
Q274 The Chair: Good morning. Welcome to this final public evidence session of the Joint Committee on the draft Online Safety Bill. This morning we are pleased to welcome the Secretary of State, Nadine Dorries, along with Minister Chris Philp from the DCMS, Minister Damian Hinds from the Home Office, and Sarah Connolly from the Department for Digital, Culture, Media and Sport. Good morning, and welcome.
Nadine, Secretary of State, I will start off. This Bill has been four years in the thinking, planning and making. You are the Secretary of State who will introduce it to the House of Commons. How important do you think this legislation is?
Nadine Dorries: Good morning, Chair, and members of the committee. May I begin by thanking you all for the work that you have done so far on this committee? The extent to which you have probed this Bill has been incredibly important, as has the expert witness evidence that you have taken. I see this very much as a joint project between myself, as Secretary of State, the department, and the Joint Committee. I will explain why I think that is the case a bit later.
I wanted to say thank you because, to answer your question and to your point, I think this is possibly the most important piece of legislation to pass through Parliament, certainly in my 17 years here, because of the implications and because it is a novel piece of legislation. It is also ground-breaking. It absolutely has to be watertight, so it is extremely important because of what the Bill can, and will, deliver.
The Bill has to be watertight because it has three important core principles. First, and most important, the Bill is there to protect our children. Secondly, it is there to remove illegal content online. Thirdly, it is to make platforms respond to content that is legal but which is harmful. Those three principles themselves highlight how important the Bill is, and how much it has to deliver. It also has to walk a very thin tightrope between freedom of expression and those protections.
I believe we can achieve that with the work you have done and the work we are doing because, since my appointment as Secretary of State, I have been working with officials to move this Bill further and to strengthen those regulations. I believe, hopefully, with your recommendations and the work that we are doing in the department, we will be in a place to be able to deliver on those core principles in a way that will change the culture of the world online. Make no mistake, this is novel; it is ground-breaking, and the world is watching what we are doing on online safety and harms and protecting individuals.
It is impossible to put into words how important this legislation will be.
The Chair: Thank you very much. You say that you have been looking at progressing the Bill since you were appointed as Secretary of State. By that, would it be fair to assume that, as far as you and the department are concerned, the Bill as published in draft form earlier this year is not the Government’s final word on the legislation?
Nadine Dorries: No, it is not the Government’s final word. It is not my final word. I have been pushing on a number of areas, which I hope to be able to highlight this morning. It is not the final word because of the work that you have been undertaking. I want to reassure you that we are awaiting your recommendations as soon as possible, and we will be looking at them very seriously indeed. At the risk of saying too much, I want to reassure you that they will be very carefully and very seriously looked at. I see this as very much a joint effort on behalf of all of us.
Q275 The Chair: Thank you. I am sure all the members of the committee will appreciate those sentiments. I want to start off, in terms of the subject areas we want to discuss this morning, with your powers. That seems an appropriate place to start.
I am sure that your officials have been monitoring the work of the committee so far. The powers of the Secretary of State have been a fairly consistent theme that people have brought up. I wonder how you see those powers, as currently drafted, and whether you think it would be more appropriate to allow more parliamentary scrutiny of the operation of the online safety regime, and whether there are any proposals to adjust that or add additional powers and responsibilities to the regulator.
Nadine Dorries: I do. I have seen the comments regarding the powers of the Secretary of State and, I would say on an equal level, the powers that will be vested in an independent body, Ofcom. I do not believe that the Bill goes far enough in terms of scrutiny.
For example, I think there is a clause in the Bill which says that within two to five years we will have to re-examine the legislation. That is not good enough, because when we had the idea of this Bill—the genesis of it—TikTok had not even been heard of. It is a rapidly changing landscape. Therefore, I think the parliamentary scrutiny required here is exceptional. We do not know what will happen the day after this Bill receives Royal Assent.
To go back to my earlier comments about the work you have undertaken, the Bill requires Ofcom to publish accounts and to publish an annual report. It requires us to review the Bill within two to five years. It also requires Ofcom to appear before the DCMS Committee when it is scheduled. I do not see any of those requirements as enough. I think that there is a role for a committee like yours to continue, in the same way as the Human Rights Joint Committee does, to work to scrutinise the Bill, moving forward.
I know there is an assurance that Secretary of State powers will be exercised through secondary legislation, published, laid before Parliament and passed by statutory instrument. Those powers allow the Secretary of State to give guidance to the independent regulator when concerns are raised. I understand that people may feel that is not enough. I think a body with a similar role to yours, with ongoing scrutiny of both the landscape of what is happening on the internet and what is happening with regard to Secretary of State powers and the role of the regulator, is something that will reassure both Parliament and everybody who has an interest in this. It will provide the reassurance that there are eyes on this weekly.
As I said, we do not know what will happen the day after the Bill becomes law. This Bill has to be watertight. It may be that secondary powers need to be enacted at times, and a committee such as yours may make recommendations to that effect. Therefore, it will be really important that we work as a team—because I think teamwork delivers really well—with a role for a committee like yours, working together with the Secretary of State and with DCMS to provide a watching brief on this when it becomes an Act. That is something I have asked officials to work on, and to look at how we can include a requirement in the Bill that that scrutiny takes place, moving forward.
I believe that a Joint Committee is the best committee, because it has people from both Houses, where concerns are raised. It has an extraordinary degree of expertise from both Houses. Already, a committee like yours is at a stage probably far more advanced than any of us. You have probed the life out of this Bill. You know every line of the Bill. You have taken evidence from expert witnesses on the Bill. The people on this committee probably understand it more than anyone else. Therefore, why would we want to dispose of expertise that we can continue to use to assist in the passage of the Bill and once it becomes an Act?
I have probably said too much, but I think I have given you a fair flavour of what I am looking to have included in the Bill.
The Chair: That is very clear.
Nadine Dorries: I must also say that it is subject to write-around and to scrutiny by parliamentary counsel. I just put those caveats in there.
The Chair: We are pleased to hear the leadership you are demonstrating on that point in particular. We have probed the life out of the Bill. I just hope we can pump some life back into it by the time we produce our report.
That sort of ongoing scrutiny is important, but there is still the question of the mechanism for amending the guidance the Government give to the regulator through the Secretary of State. One of the comments we have received is that this should at least be done in Parliament through a more open process. A negative statutory instrument can be an effective way to introduce an emergency power or provision if needed—we can all envisage a scenario where national security might require an intervention—but there is a question as to whether changes should instead be made through an affirmative statutory instrument, at least considered by a committee of both Houses, allowing the Minister to set out on the record their reasons for bringing the change. Is that a reform you have considered as part of the Bill?
Nadine Dorries: I think you have probably just spoken to my previous answer. I look forward to seeing what your recommendations will be.
The Chair: Thank you very much.
Q276 Lord Clement-Jones: Good morning. It is very good to hear of your flexible approach to the Bill, Secretary of State, and I hope that perhaps you will be as flexible in the area that I will be talking about, which is the powers of the regulator.
The draft Bill in its current form limits the type of information that Ofcom can require in transparency reporting and risk assessments, but does not set binding minimum standards. We have explored that question at some length. We had a very good session with Dame Melanie. Why did the Government decide that codes of practice should not be directly binding on companies? It seems paradoxical, if we are trying to make sure that they observe certain standards, that in a sense they can decide how to deliver their duties in a way that suits them. It limits the powers of the regulator to a considerable degree.
Nadine Dorries: The duties are binding. I think the provisions in the Bill are very robust on the work of Ofcom in holding those platforms to account. Again, I would be very interested in seeing what your recommendations are with regard to those powers. Holding the platforms to account on their terms and conditions is the role of Ofcom. If we regard the platforms and their terms and conditions as the car, Ofcom has the power to lift up the hood, look under the engine and expose what those platforms are doing that causes harm and facilitates illegal content.
I am interested in what you think about that and what your recommendations are, but I think the powers in the Bill, which have been tested by parliamentary counsel, probably are sufficient. The duties are binding. They are binding by their terms and conditions. The question that I have asked within the department is similar to yours. The question I have asked repeatedly is, “This Bill holds platforms to their own terms and conditions. What if they decide, the day after this Bill becomes an Act, that they will change their terms and conditions?” I think that is why it is important—to the points raised earlier by the Chairman and me—that we keep a watching brief on this Bill, moving forward, in order that it remains watertight, and on the Act when it becomes an Act.
We need to take a step back and look at what is happening now with regard to the terms and conditions. Platforms tell us that they abide by them. We know they do not, but they tell us that they do. Advertisers accept that they do, and users accept that they do. The terms and conditions provide a degree of reassurance. If, the day the Bill becomes an Act, platforms decide that they are going to change their terms and conditions, they are in effect admitting that they have facilitated harmful and illegal content and have not removed harmful and illegal content up until that point. I am not sure what message that sends to advertisers or to users. I think it will be an incredibly bold platform that decides, after all they have said in the past, to change the whole basis on which they operate.
There is a lot to this answer. Chris is dealing with the technical side of the Bill and will be taking it through committee. I do not know if you want to add to that, Chris.
Chris Philp: Thank you, Secretary of State. As the Secretary of State said, the duties set out in the Bill are binding. What the code of practice does is set out a way, which Ofcom has specified, and Parliament has approved, that those duties can be met. It is open to service providers to meet those duties by some other equivalent mechanism, but there is a burden of proof, as it were, on the operators to demonstrate to Ofcom that the measures being taken, if they are different from the code of practice, are indeed equivalent.
It will of course be open to Ofcom, as the regulator, to say, if a particular operator is doing something different from the code of practice, “We think that is not equivalent and therefore you are not meeting your duty”. The point of bite is at the duty level, and it is up to the operator to demonstrate that they are delivering those duties. Ofcom can pull them up and impose penalties should they fail to meet those duties.
I agree with the point the Secretary of State made regarding terms and conditions, which of course apply to the legal but harmful arena, where we need to be extremely vigilant. Clearly, in the areas of “illegal content” and “legal but harmful to children”, we are specifying those centrally and therefore not relying on terms and conditions. I think the point the Secretary of State made on the T&Cs, or the terms of service as they are sometimes called, is a critical one. Vigilance by us, by Ofcom and indeed by an expert Joint Committee will be critical on an ongoing basis.
Nadine Dorries: We must also remember that we have given Ofcom teeth—some may say fangs—and it has the ability to fine the companies. It has the ability to put in an expert panel. It has the ability to request a huge amount of information from those organisations, which provides the transparency that it needs in order to hold them to account, both to their duties and the code of practice.
Ofcom, as a regulator, has been given an extraordinary amount of power to be able to do that. Again, I underpin that with the fact that that is why we require ongoing parliamentary scrutiny. It is an independent body that has been given an extraordinary amount of power in order to keep our children and individuals safe. Therefore, we need that scrutiny. The question you have asked is one that we will be able to keep appraising on an ongoing basis.
Lord Clement-Jones: Of course, Dame Melanie has also predicted that there will be a fair amount of pushback by platforms in terms of legal action and so on. It is the ability to insist on T&Cs, or particular terms and conditions, which could be the crux of this and could be the gap through which platforms try to squeeze.
Nadine Dorries: You may have noticed that in my very first sentence I used the word “watertight”. It is for exactly that point that this Bill or this Act, on the face of the Act and with the secondary powers that it endows, has to be watertight. Again, that is why we are very much looking forward to your recommendations. We have to ensure that it is watertight.
I heard Dame Melanie’s evidence. I believe that this will set off a culture change in our online environment and landscape. There will be huge kickback because you have to follow the money. People are making a huge amount of money from these particular platforms and sites. Of course, there will be kickback. We must not forget that the world is watching what we are doing in legislating to hold the platforms to account. That is why it has to be watertight. That includes the codes of practice and the terms and conditions. It is facilitated by the powers that we will be bestowing on the independent regulator.
Lord Clement-Jones: We also have the same sort of concerns about the ability of Ofcom to insist on particular forms of risk assessment when it audits, and so on, and indeed its powers of audit. Are you equally open to suggestion about whether or not we could tighten up the provisions there, with a greater ability to specify on the part of Ofcom?
Nadine Dorries: I can reassure you that if you have recommendations that can pass parliamentary counsel, do not dilute the scope of the Bill, are in line with the three core principles, and help to make the Bill watertight and ground-breaking given its novel nature, then we will be very much open to them. If there is something that we can see helps deliver, we will look at it.
Chris Philp: To add to the Secretary of State’s point, I certainly reiterate the point about us being open to recommendations and suggestions. As you consider the question of enforcing risk assessments, I would suggest paying particular attention to Clauses 7 to 19, which set out which risk assessments need to be made, and which require companies, critically, to take into account Ofcom’s own guidance about risk assessments set out in Clause 62. You have to follow it through. It then goes to Clause 82, under which the risk assessments set out in Clauses 7 to 19 are enforceable. A notice can then be given under Clause 80 requiring companies to bring their risk assessments into compliance.
It is a slightly complicated path to follow through, but having followed that through, if you feel there are gaps, we would be very interested to hear about them.
Lord Clement-Jones: We take some comfort from the VSP guidance as well, which in a sense is a bit of a precursor to all this.
Then of course there is the question of algorithmic inspection, which is absolutely crucial. We have doubts about whether or not Ofcom has enough ability to look under the bonnet of what the algorithms are doing, particularly on amplification, which we have heard a great deal about from our Facebook witnesses, indeed our ex-Facebook witnesses as well. Are you confident that there is enough power accorded to Ofcom?
Nadine Dorries: I am. I would like to use this opportunity, if I may, to explain why I have a particular interest in this. In my previous role as Minister for Mental Health and Suicide Prevention for two years, I made a point of meeting the parents of children who had taken their own lives. I cannot put into words how devastating it is to sit down with parents of children who have taken their own lives needlessly. It was not that those children went online and looked for the means to do so; algorithms took them in that direction, whether to pro-anorexia sites, suicide chatrooms or self-harm sites.
In my previous role, I also received many prevention of future death reports from coroners highlighting the same concerns. We know, fundamentally, that algorithms cause harm. What happens online with algorithms transfers into real life.
As someone who did my previous job and worked with this on a daily basis, I am limited in what I can say because, of course, we do not want to draw attention to some of the things that happen online. For those of us who are parents and those of us who are very aware through our work of the dangers caused to young people and children, I just want to provide you with reassurance that this is something I am very interested in.
I believe that Ofcom, with the regulatory framework we have set, has the powers to request full transparency of how those algorithms are used and promoted. You took information from Frances Haugen. You have information on how they are used and how it works. We think that Ofcom has the powers to lift the lid on the algorithms, and the power to set huge fines.
Let us not underestimate the fines. It is 10% of global turnover. I think Facebook turned over £86 billion last year. That is a considerable amount of money. There is also the criminal liability on individuals. I think my advice to people like Mark Zuckerberg, Nick Clegg and others who want to take off into the metaverse would be to stay in the real world. This Bill will be an Act very soon. It is the algorithms that do the harm. The Act will be there, and they will be accountable to the Act.
I want to reassure you that this is something that I am personally very invested in. I am assured that Ofcom has the powers. I think that Chris will probably give us the technical points in a moment. I know that you have done a lot of technical probing yourselves, so you will be aware of that. I go back to the original and substantive point that ongoing scrutiny is why this will be important.
Lord Clement-Jones: Please do. I will just clarify, on the criminal sanctions point, that, as you know, there has been debate about when they come into effect. They are in the draft Bill. Many people are saying that they should come into effect alongside the rest of the Bill. You almost seem to be implying that you agree with that.
Nadine Dorries: Yes. Absolutely. I say to platforms, “Take note now. It will not be two years. We are looking at truncating that to a very much shorter timeframe”. That is one of the areas where, as the Secretary of State, I want to go further with this Bill. It is just nonsense that platforms are being given two years to make themselves ready for what would be criminal action. They know what they are doing now. They actually have the ability to put right what they are doing wrong now. They have the ability now to abide by their own terms and conditions. They could remove harmful algorithms tomorrow.
I believe we heard that they are putting 10,000 or 20,000 engineers on to the metaverse. Rebranding does not work. When harm is caused, we are coming after it. Put those 10,000 or 20,000 engineers now on to abiding by your terms and conditions and removing your harmful algorithms. If you do not, this Bill will be watertight. I am looking at three to six months for criminal liability. Obviously, I will put in the rider that that is subject to being probed and examined by parliamentary counsel and write-arounds and others, but that is certainly what I am looking to put in the Bill.
Platforms know now—they know today—what they are doing wrong. They have a chance to put that absolutely right now. Why would we give them two years to change what they can change today? Remove your harmful algorithms today and named individuals will not be subject to criminal liability and prosecution.
Chris Philp: On the technicality that the Secretary of State asked me to comment on, clearly it is in places such as Clauses 49 and 50. Clause 49(1) allows Ofcom to specify the kind of information it needs. Clause 49(4) lists, over a page and a half, the kinds of information that Ofcom can require. Under Clause 50, Ofcom will prepare guidance on the exact information that will satisfy it.
Ofcom will have extremely wide-ranging powers to make sure that the information required is coughed up, or delivered up, I should say. That is critical because, as the Secretary of State says and as Frances Haugen made clear to the committee in her evidence just a week or two ago, the way that social media firms are allowing algorithms to run rampant is completely unacceptable. They are serving up content to children, to vulnerable people and to members of the public more widely as well, simply based on the ability of that content to make money for the platforms by driving engagement. As far as I can see, they have regard primarily, or even in some cases exclusively, only to the objective of making money. There is no or scant regard for protecting the people using those services.
I have constituency cases where children have seen that content. We know about the terrible Molly Russell case, where we believe Instagram actively pushed content towards her, promoting suicide. That is completely unacceptable and irresponsible. This Bill is designed to stop it.
The Chair: Thank you. Joining us remotely, Suzanne Webb.
Q277 Suzanne Webb: Thank you very much. It is great to see you all here. I really welcome the fact that you are talking about an accompanying audit committee, as I call it. I think it is the right thing to do. I am sitting here nodding incredibly robustly at the comments as to why the tech companies do not just do something now. It is such a valid point. They know what is going on, and they need to follow their moral compass. They do not need to wait for the Bill. They just need to get on with it now.
All the time I have been sitting on this committee and listening to the evidence, what has struck me is what is happening to user safety. I am deeply unconvinced that some of the tech companies take it seriously or that it has the right level of prominence and attention in the tech companies themselves. Some heads of safety do not even have a dotted line to the board or an audit and risk committee. I think that is shocking. It should be the case.
Ofcom gave really good evidence, but Ofcom is only as good as the information it will be given. Some suggest that the tech companies may well be marking their own homework. You have also touched on the need for tech companies to stay in the real world and not the metaverse, and to be accountable. What is the Government’s view on giving explicit duties to the regulator to audit not just the companies’ risk assessments, including the effects of algorithms and functionality, but effectively the governance?
To make this work, the tech companies have to put in a robust governance structure. I think it is really important that that structure is audited. We are not just looking at the outputs. We are looking at the means for the tech companies in the first instance to get themselves there. If they do not have those structures in place, they are not going to be able to have robust outcomes. I would suggest, basically, that Ofcom has some powers to go in and look, to make sure that they have the right level of structure and governance in the first place in order to deliver what they need to do.
Nadine Dorries: We are not against that. Again, I emphasise that we are very interested and excited to see what your recommendations will be on that. It is interesting. I will turn to Sarah. She has been with the Bill since it started, and I wonder whether this has already been probed in the past, before we arrived, at the time of the genesis of the Bill. I will hand over to Sarah, if that is okay, in case it has been discussed and there is a reason why it is not possible.
Sarah Connolly: The short answer is that we looked at it a little bit and did not, at the time, take it any further. As the Secretary of State says, we would be interested in understanding your recommendation and your rationale for it. It is quite an interesting idea. As you can imagine, over the lifetime of this work, we have gone through various iterations of where the right pressure points are, if you like. We would be really interested to understand a bit more of the granularity of what you are thinking in this case.
Nadine Dorries: Basically, Suzanne, how do you think it could work? How could you make it tight and robust in a form that would pass scrutiny by parliamentary counsel? The recommendation would be welcome. As I suspected, it has already been probed at the beginning of the Bill, but if you want to put it in a recommendation, we are certainly willing to look at it.
Chris Philp: The approach the Bill seeks to adopt is to specify the result we want to achieve and leave the companies to deliver that result. They may do it in the way you are describing, or they may do it in other ways. That may have been the approach taken when the matter last came up, but any new ideas are gratefully received.
Suzanne Webb: My thought process is that in the early days, when the Bill is there and everyone is putting in place everything that they need to do, there is some form of scrutiny to make sure that companies have all the support and advice that they need as well, to make sure that they have the appropriate governance structures. It may be something that is short-lived, temporary and just at the beginning of the Bill, and does not need to continue. It would be just in the early days, something between three and six months, to make sure that everyone was on the same page. I appreciate your thoughts. It is definitely something we can talk about.
Nadine Dorries: Sarah has just whispered in my ear that it is a good idea. She has run this for six years, so I would say get that in your recommendations and we will start work on looking at it now, I promise you. We will take it away and look at it now. Thank you.
Suzanne Webb: Brilliant. Thank you.
Q278 The Chair: Secretary of State, you and Mr Philp have set out the powers in the Bill for legal requirements on the companies; there is some content that is illegal and there is some content that the Bill describes as being legal but harmful. A very important piece of work that has been done in conjunction with the Government publishing the draft report is the work of the Law Commission looking at modern communication offences.
In our work, we are considering what impact those recommendations might have on the Bill. They set out a series of quite important reforms with regard to offences relating to self-harm, knowingly untruthful statements and so on. What is your view on the recommendations of the Law Commission’s report? Obviously, we have seen some press reports this week suggesting that the Government are minded to accept some of those recommendations.
Nadine Dorries: Yes, the press reports have stolen my thunder really. Obviously, we asked the Law Commission to review the draft Bill as it is, and the existing legislation on abusive and harmful communications. What it has done is given us some really useful information that helps to make the Bill watertight. Not all of its recommendations are for DCMS. I will hand over to my colleague, Mr Hinds, in a moment, but the recommendations that we are minded to accept are these.
I will read this out, so that I do not get the legality of it wrong: “an offence to capture the most serious and genuinely threatening communications, where they are sent or posted to convey a threat of serious harm; a false communications offence, where someone sends a communication they know to be false and intends to cause non-trivial emotional, psychological or physical harm; and a harm-based communications offence that captures communications sent with the intention of causing harm and without reasonable excuse”.
I know my honourable Friend is interested in epilepsy. Other parts of government are taking this forward. Again, subject to the usual conditions of probing by parliamentary counsel, write-around and others, I am minded to accept those recommendations by the Law Commission. I think they will only serve to strengthen the Bill and add to it in a really positive way. I am very grateful to the Law Commission for its recommendations.
I will hand over to Mr Hinds because some of the recommendations by the Law Commission—this is a cross-government department Bill, as you know—relate to the Home Office.
Damian Hinds: Thank you, Secretary of State. As you know, Chair, there are three sets of material that the Law Commission has been looking at: the communications offences, a DCMS lead; the Ministry of Justice material; and the Home Office, on hate crime.
To be clear on the offences that the Secretary of State has been talking about, yes, we think they are a useful addition to what is available to law enforcement in tackling particularly violence against women and girls. It is a reflection of the way the world has changed, how domestic abuse changes and how stranger abuse changes as well over time. We welcome this work.
The Chair: Is the recommendation on content that promotes or glamourises self-harm one of the Home Office areas of responsibility? I wondered if that was something you were able to speak on today.
Damian Hinds: On the organisational technicality, I think that would be a DCMS lead. For the avoidance of doubt, I am happy to say that, yes, I welcome anything that minimises the glorification and promotion of self-harm, eating disorders and other harm to young people.
Sarah Connolly: To be clear, that particular Law Commission recommendation—self-harm—is Ministry of Justice.
The Chair: Is it envisaged across government that there will be a response at the same time? Where the Government are minded to accept those recommendations, how will they be introduced? Will they be introduced through this Bill or through other amendments to legislation?
Sarah Connolly: Ministry of Justice colleagues are looking at the options in their space. We wanted to try to bring forward what we could bring forward in this Bill as quickly as possible, subject, as the Secretary of State says, to the usual clearances and write-arounds. That is the intention.
The Chair: To be clear, for the purposes of this Bill, it would mean that those offences then fall into the category of not being legal but harmful. They are defined in law as offences, and the companies would have to address them. It would not be a matter of what their terms of service said about them.
Nadine Dorries: Correct.
Sarah Connolly: The provisions that pertain to communications in this Bill would be criminal, yes.
Q279 John Nicolson: Secretary of State, ladies and gentlemen, thank you so much for joining us. Secretary of State, is there anything that you have heard in the evidence thus far that has surprised you?
Nadine Dorries: It has probably reinforced my suspicions and general knowledge about the way algorithms work. I knew from speaking to parents in my previous role, and I knew from the prevention of future deaths reports that I received from coroners, that there were suicide chatrooms and that young people were directed in a harmful direction on the internet. I suppose the evidence that you received—I also met with Frances Haugen and others—reassured me that we are doing the right thing, and confirmed what I think we all knew was happening within these platforms anyway.
John Nicolson: Clause 46 of the Bill sets out the meaning of content that is harmful to adults as well. Do you think, for instance, to call somebody a “public school posh boy f-wit” would fall within that? You did not tweet “f-wit”, but I am a Presbyterian boy and I will not say the full Anglo-Saxon. Does that seem appropriate to you?
Nadine Dorries: I am not actually sure what you are referring to.
John Nicolson: I am referring to your tweet about James O’Brien, the journalist. I have it here. “To be fair, I think the fact that Mr James O’Brien is a public school posh boy f-wit has more to do with it than him being a journalist”.
Nadine Dorries: I thank you for reminding me of that comment. I receive a huge amount of abuse online.
John Nicolson: I have just quoted a bit.
Nadine Dorries: Thank you, because what you allow me to do is to highlight the importance of this Bill in terms of misinformation and disinformation. That comment may have been as a result—if you would like to look at Mr O’Brien’s tweets—of some of the appalling things that, as a woman, are directed towards me online. Yes, I may have sent that tweet—
John Nicolson: Well, you did. It is not “may”.
Nadine Dorries: I did, indeed. It is also quite interesting that the honourable Member himself has tweeted about me, I think a dozen times, in recent times and I know has mentioned my name a number of times during this committee hearing. I think that some of us sometimes need to take stock—
John Nicolson: None of it offensive.
Nadine Dorries: —of how we behave. I think calling someone a bigot and a homophobe probably is offensive.
John Nicolson: We could run through a list of your past votes—
Nadine Dorries: Yes, do, but I am not sure how that is relevant to this Bill.
John Nicolson: We would support that, but nevertheless you are the witness here, not me. There is a duty to protect journalistic standards in this, under category 1. You also said that you would like to nail a journalist’s testicles to the floor using your own front teeth. Does that seem appropriate?
Nadine Dorries: At the time, I sent that tweet as a mother to a journalist who was hanging around outside my young daughter’s house, taking photographs of her without her knowledge. As a mother, the lioness probably roared up in me and I think it was totally appropriate, because not only was he hanging around outside her house, he followed her on a dog walk with her dog. So, yes, I think any mother in my situation would have done exactly the same thing and—
John Nicolson: As the Secretary of State, your job—
Nadine Dorries: I had not actually finished my answer.
The Chair: Let the Secretary of State finish and then you can come back in.
Nadine Dorries: Thank you. I would just like to say that I think it was the lioness in me, which is probably present in every mother and every parent across the land, who, when faced with a journalist acting without permission—it was pre-Leveson; I think we are talking about 11 years ago, so a long time ago—would have said or done exactly the same thing.
May I say also today as Secretary of State, and 11 years on, that my daughter is much older now, and if a journalist was hanging around outside her house—by the way, the only reason she knew he was taking photographs of her inside her house was because a neighbour alerted her, and again putting this into the context of misinformation and disinformation—as an older woman now, 30 years old, she would probably take care of that situation very well herself.
John Nicolson: I am sure she would, and I hope she would not use language like that because obviously it is a distressing —
Nadine Dorries: I hope a journalist would not hang around a young woman’s home and follow her on a walk and photograph her inside her house without her knowledge.
John Nicolson: It is a truism to say that journalists behave improperly. We all know that and we have discussed it many times in this House. We are not talking about journalists’ behaviour; we are talking about your reaction to it and some of the language that you use, which I think is important.
We have also talked about the definition of journalists, and Tommy Robinson’s name has come up a number of times. You have retweeted Tommy Robinson, the far-right racist leader of the English Defence League, or whatever it is called these days. Do you think it is appropriate to retweet Tommy Robinson?
Nadine Dorries: I do not know what retweet it was. I think if I was aware it was Tommy Robinson I probably deleted it. I imagine I was not aware at the time.
John Nicolson: The give-away of course was the fact that he used his name, Tommy Robinson.
Nadine Dorries: I probably did not know who Tommy Robinson was because I do not follow the far right.
John Nicolson: Really. Four years ago, you did not know who Tommy Robinson was?
Nadine Dorries: I would not have retweeted Tommy Robinson if I had known. I know who he is now, but I would not have done so if I had known at the time. As somebody who does not follow the far right, I do not believe I would have known who he was.
John Nicolson: I am astonished you did not know who Tommy Robinson was. I doubt there is a member of this committee who did not know who Tommy Robinson was four years ago.
Chris Philp: Are these questions designed to scrutinise the Bill or personally to attack the Secretary of State?
John Nicolson: I will come to you in a second, if you do not mind.
Chris Philp: Because it seems to be the latter.
John Nicolson: I am talking to the Secretary of State. Let us move on to disinformation. Of course, that is not part of the Bill but many people think it should be. In the run-up to the Hartlepool by-election you tweeted that Boris Johnson was bringing jobs to Hartlepool, and you mentioned 180,000 well-paid jobs. The population of Hartlepool is 92,000.
Nadine Dorries: I was talking about jobs in the region. I think a missing comma, because of the limit on the number of characters that you are allowed on Twitter, allowed that to be misinterpreted grammatically. I was talking about jobs in the region. If you read that tweet back, the sentence ends at Hartlepool, and then talks about the number of jobs coming to the region.
John Nicolson: You also retweeted a doctored video about Keir Starmer that falsely claimed he had obstructed the prosecution of grooming gangs when he was Director of Public Prosecutions.
Nadine Dorries: I was retweeting something one of our Whips had tweeted, and then removed it when it was pointed out to me that it was incorrect.
John Nicolson: The reason all this matters, to address your colleague’s point—
Nadine Dorries: I am glad we are getting to the substance of your investigation.
John Nicolson: —is that somebody less benign than you might one day be Secretary of State. The Bill gives enormous powers to the Secretary of State. The boss of Ofcom specifically said that she thought that some of the powers that you had were too strong. Do you agree with her?
Nadine Dorries: No, I do not. I think they are absolutely necessary for the Bill to work, and to be able to provide guidance when necessary to Ofcom. I can understand why an independent body that feels it has a very important job to do thinks that further guidance might be onerous but, no, I do not think that the powers are too strong.
It goes back to my substantive point that the role of a committee such as yours would be necessary, ongoing, to provide an additional layer of parliamentary scrutiny. It is not for Ofcom to say whether the role of the Secretary of State is too powerful; it is the role of Parliament. The powers will come before Parliament in a way that is published and scrutinised, using secondary powers, and my suggestion as Secretary of State, so as to hold me to account, is that there is an even further layer of scrutiny, from both Houses, not just from the Commons, by a committee such as yours that can continue to scrutinise and investigate Secretary of State powers.
John Nicolson: Specifically, the power that the Ofcom boss was referring to was Clause 33(3)(a), where you can direct Ofcom to modify a code of practice to ensure that the code of practice reflects government policy. A lot of people think that is an overreaching power that gives you enormous influence, perhaps excessive influence.
Nadine Dorries: I do not believe it does, but I absolutely understand the concerns that some people may have. They are novel powers, possibly, in their reach, but then so is this Bill, so is the work of this committee. We are dealing with a unique situation in our time, as parliamentarians, and in the world. The power of the internet, the force of the internet on people’s lives, is something that we have to legislate for because it is causing harm to individuals and to our children, and, therefore, they are extraordinary powers. I will hand over to Chris for the technical part of that answer.
Chris Philp: Thank you. On the technicalities of it, you referred to the power of direction under Clause 33(1) where the Secretary of State may direct Ofcom to modify a code of practice submitted under Clause 32(1). It is 32(1) where Ofcom would submit a code of practice to the Secretary of State, which is then laid before Parliament. The code, which may be modified by the Secretary of State—yes, that is correct; you are right in saying that—is then subject to parliamentary procedures under Clause 32(3). Whether or not the Secretary of State exercises her powers of modification, the code, modified or not, is subject to parliamentary scrutiny and voting under the subsequent subsections of Clause 32.
John Nicolson: That is specifically what the Ofcom boss was concerned about in her evidence.
Chris Philp: I hope Clause 32 on parliamentary approval offers reassurance on the point.
John Nicolson: Let us see. Very quickly, Secretary of State, I have been lobbied on a number of different issues—without payment, I would add. To get a bit more detail about epilepsy, would the sending of flashing images with the aim of triggering epileptic seizures—I know you touched on this in referring to my colleague Mr Russell—fall within the scope of what is harmful to children?
Nadine Dorries: That is a recommendation by the Law Commission. It is being dealt with by other government departments and the Ministry of Justice. It is not being dealt with in DCMS.
John Nicolson: Would you consider doing it?
Nadine Dorries: It is not being dealt with in DCMS.
John Nicolson: What about cyber flashing, which is a growing problem online, with adults and children getting unwanted images of nude males?
Nadine Dorries: It is the same; it is the MoJ.
John Nicolson: Thank you.
The Chair: May I clarify something on the Law Commission proposals? I appreciate they reflect other areas of government, but are the Government accepting all the recommendations made in the report the Law Commission published in the summer?
Nadine Dorries: I have listed the ones that we are accepting. As I say, it is cross-government and there are a number of departments they pertain to. Sarah?
Sarah Connolly: It is the Ministry of Justice. I would not wish to speak for Ministry of Justice officials, or indeed Ministers, but we have looked in the round at the Law Commission recommendations.
The Chair: Would it be fair to say that you are not aware of any objection to them?
Nadine Dorries: I do not think it is for us to say, Chair. I think you need to speak to the MoJ.
Chris Philp: On the epilepsy question, and perhaps Sarah can correct me if I get this wrong, under the lawful but harmful to children pillar of the Bill, if a platform conducting a risk assessment becomes aware that a particular group of its users are epileptic, and there are measures it can take to prevent them seeing certain content, then, aside from the Law Commission recommendations, just under the Bill as constructed, a duty would arise under those circumstances. Sarah, perhaps you can correct me if I have got that wrong.
Sarah Connolly: That is absolutely perfect.
The Chair: In practice, that would mean that the regulator could take action against a company for failing to mitigate that content.
Chris Philp: Yes, if there were reasonable steps they could take. Clearly, they would need to know that the child was epileptic, or that there were reasonable steps they could take to identify the child as epileptic. In those circumstances the duty might engage, separately from the Law Commission recommendations.
The Chair: Certainly on the basis of the evidence we have received, children were being targeted with flashing images because they were known to be epileptic, so in that circumstance the platform should know as well and intervene to stop it.
To be clear, the Law Commission recommendation obviously creates criminal offences against the person who is committing those crimes, but under this regulatory regime it would also allow the regulator to insist on removal of such content.
Chris Philp: Yes, indeed. Once those offences are criminalised, and we have said already that the areas that fall in the DCMS ambit will be, that is right. It would move from either of the second two categories—the lawful but harmful category, whether for children or adults—into the first category of straight-up illegal, and there would be a hard duty on the social media platforms to take action, regardless of what their terms and conditions may say. You are quite right; the fact of making this stuff illegal has a flow-through implication that really strengthens the provisions of the Bill, which then apply because it becomes illegal.
Q280 Lord Gilbert of Panteg: Secretary of State, Ministers, Sarah, good morning, and thank you for the evidence so far. Before I get stuck into my questions about freedom of expression, I think something very clear has come through from the evidence this morning, if I may say so. You have a real passion for this Bill, for getting it right, and for having it properly scrutinised. For those of us who for two or three years were worried about whether the Bill was ever going to see the light of day and get through Parliament, I think you have dispelled that hesitation in our minds.
May I start with some initial general thoughts on the Bill, Secretary of State? You described very powerfully your experience as the Minister for Mental Health and the impact that you saw on children, and on the families of children, who harmed themselves or committed suicide. You have talked about the way in which you and your family have been harassed online, and that must have made a huge impact on you. Now you have come to the Bill and you are immersing yourself in its detail. You have been quite strident already, subject to write-around, about some of the things you have come to a view on.
Nadine Dorries: I have just had a message; some of that has been cleared.
Lord Gilbert of Panteg: When you saw the Bill, was there anything that struck you as, “Yikes, it seems wrong that that isn’t in there”? I will tell you what struck me when I saw the Bill. It does not appear to me that it will deal with children accessing commercial pornography, and I thought that was a big thing. Are there any big things that struck you that you think are either missing, or go too far or not far enough, in the Bill that you have not already outlined?
Nadine Dorries: To your first point, thank you for that question because you have given me an opportunity to explain one of the other areas where I want to go further.
I do not want to leave anyone in any doubt. I decided when I took up my appointment that I needed to grasp the nettle of this Bill. There are a lot of noises off on both sides of the argument—people wanting to go further, people thinking the Bill went too far. I can understand the difficult environment in which the Bill sat. It has been described to me as just awful and horrible as a Secretary of State to have to take through a piece of legislation like this, but I do not see it like that.
Thank you too, because not only do I, as a woman—as most women do—experience horrendous abuse online, but abuse online has real-life consequences. My abuse online, over years from one person, actually moved to physical stalking, with someone moving halfway across the country to rent a house a mile from mine because they were no longer satisfied with their online harassment and felt that they had to go further.
I am also someone who passionately believes in freedom of expression. In case anyone had not noticed, I am a Conservative, and I am pretty strong on that, so I also understood the difficulty between finding the sweet spot between protecting freedom of expression and protecting children by removing illegal content and legal but harmful content.
The Bill in its present form, and in its enhanced form, I hope, will provide greater protections for freedom of expression. For example, there will be an appeals process. Platforms are not going to be able to arbitrarily remove democratic content willy-nilly as they please. I cite the video by David Davis, which was perfectly legal, and perfectly acceptable and appropriate, and was taken down by algorithms that swept up what was just a democratic video and democratic content. Platforms will not be able to do that in future. They put it back up, I am aware, but it took a long time and was a difficult process. That will not happen in future once this Bill becomes law.
Freedom of expression is enhanced as a result of the Bill. I read commentary earlier this morning claiming that the Bill will destroy freedom of expression. It will not. It will provide an appeals process. It will provide greater space for freedom of expression, and it will provide reassurances.
On pornography, the Bill does not stop anybody accessing commercial pornography; it does not prevent children accessing it. This is something else I heard a great deal about from parents in my previous ministerial role, and from people concerned. Young children who may explore pornography online find themselves bombarded with images on their phones as a result of algorithms. It is my mission to ensure that a child’s innocence is not wiped away by an algorithm.
I do not believe that the Bill goes far enough in preventing children from accessing commercial pornography. That is tied into age verification and there are elements of that that I have asked officials, subject to parliamentary counsel and write-around, to look at further, to see whether we can do more. I realise that there is a gap. I am not going to call it a loophole. There is a gap, and I think we need to close that gap somehow if we can.
I am also interested in the committee’s take on this in your recommendations; I think it is definitely an area we can probe and move the Bill a bit further on.
Q281 Lord Gilbert of Panteg: Thank you, Secretary of State. I certainly think we will have some recommendations for you in that area, but it is very good to hear that.
I will move to the actual provisions in the Bill to protect freedom of expression. Protecting freedom of expression is an objective of the Bill for a start, but those are words. That is fair enough; I think they are good words. The substance comes in a number of specific provisions: the Clause 12 provision to have regard to the importance of protecting users’ rights to freedom of expression; the category 1 service duty to protect content of democratic importance in Clause 13; the category 1 service duty to protect journalistic content in Clause 14; and the exclusion of news publisher content in Clause 39, defined in Clause 40.
Can we go through those and get your thoughts on how they work and whether they go far enough? We start with the general provision to have regard to the right of freedom of expression. It seems to me that, if that is to be more than a tick-box, there needs to be some quite specific codification of it by Ofcom at some stage, about what is expected and required of companies, because they could just say, “We have had regard to freedom of expression”. It is just one specific point, and you may want to follow up in writing.
A general point is that in the press release launching the Bill your predecessor said the freedom of expression measures would remove the risk that platforms could “over-remove content” in complying with the Bill, but in a reply that you sent to the House of Lords Select Committee freedom of expression inquiry you said that the duties would reduce the risk of over-removal. I wondered whether that was a shift in expectation or just a change in words. Do you think the general regard for the right to freedom of expression is something that Ofcom needs to codify?
Nadine Dorries: I will take some of the points that you have just asked and then hand over to Minister Philp to answer. Yes, I believe there will be a code in place for that. You are quite right to highlight the exceptions for commercial journalism. I wrote an op-ed recently, and I actually regard journalism as another platform to hold online platforms to account. They are completely exempt from the provisions of this Bill. Of course, some people raised the point about citizen journalists and how that applies. It will be up to the platforms to look at the content of the citizen journalist and decide whether they genuinely are one, or whether they are someone posting something harmful and abusive online. That will again be regulated by Ofcom. There is that protection.
You have asked for some quite detailed technical information on some quite detailed clauses. I will hand over to Minister Philp. I am not sure if Sarah wants or needs to come in, but I will hand over to Minister Philp to answer some of that.
Chris Philp: Subsection (2) of Clause 12, as you say, Lord Gilbert, imposes a general duty. That being a duty, it is obviously legally binding. Yes, there will be a code of practice that supports that duty, as we have heard before, laying out how the duty can be met, but it will be open to platforms to meet that duty in an alternative way, provided that Ofcom can be persuaded that the alternative genuinely meets the duty.
Obviously, paragraphs (a) and (b) below subsection (2) talk about protecting users’ freedom of expression within the law, and protecting users from unwarranted infringements of privacy. Some of those things are defined a little bit further in Clause 23. That general duty, while high level, will be buttressed by a more detailed code of conduct and, as a duty, will have legal teeth. I think it sets a very important principle that is amplified further in some more specific areas, which we are probably going to talk about in your next questions.
That duty is important. It is worth remembering that at the moment this whole space is totally unregulated. As we sit here today, there is no duty at all on social media companies to have regard to freedom of speech. The fact this will now be in a Bill and, ultimately, in an Act of Parliament, and I am sure we can find ways of improving and fine-tuning it, means it creates freedom of speech duties and protections for journalistic and democratically important content. While I am sure they are not perfect and we can try to improve them, currently there is nothing at all. This is, as in so many areas, a huge step forward.
Q282 Lord Gilbert of Panteg: Let us have a quick look at content of democratic importance. Concern has been expressed that it will be so narrow that it will apply only to privileged politicians, or so broad that it applies to virtually everything that is discussed. I think Ofcom would welcome some steer from Parliament before it tries to make that interpretation.
It seems to me that one of the problems goes back to the issue of diversity of thought. If you are a west coast billionaire, you probably have a very narrow range of opinions in your head, and everybody you know thinks the same. If they are defining the breadth of political debate, it will be a very narrow range of political views, and much on either side of it will be regarded by them as extreme. Do you accept that there is some sort of philosophical discussion about what democratic debate is and what constitutes a range of opinions?
Nadine Dorries: I think the Bill is quite clear on that. If it is democratic debate and it is political debate, if it is not harmful and if it is not abusive and therefore illegal, it is allowed. The Bill will not provide restrictions on what people can say politically, or in any other way that is legal and not harmful. The Law Commission recommendations are quite clear, and our provisions already in the Bill are quite clear: if it causes physical or psychological injury, of course it would not be allowed. On the parameters of what is acceptable and what is not, what is more relevant are the parameters of what is not acceptable, which I think are quite clear.
Again, if you have recommendations on how we could tighten that up and make it more robust and go further, we would be very happy to look at them.
Lord Gilbert of Panteg: I think we will come back to that. The trick is giving Ofcom sufficient space in which to think about this and develop codes, and giving it the kind of steer about something that many people think is for parliamentarians and democracy to form a view on. You usefully say you would like us to consider that.
On journalistic content, obviously, the concern is that anybody could easily define themselves as a citizen journalist. It is arguable that the journalistic content provision does not amount to very much, but the one thing it contains is the expedited appeals process if your content is taken down. One can easily see that being overwhelmed if everybody positioned themselves as a citizen journalist. Have you found a way of cracking that?
Nadine Dorries: The responsibility for that would be on the platforms to make an assessment as to whether the person posting as a citizen journalist actually was one. The platforms have the power to analyse who is on their platform, to know who is on their platform, their history of content, and whether what they are saying is legitimate or not. Every member of society can have a political opinion, but that does not mean they are classified as a citizen journalist; it just means they are exercising their democratic right and producing democratic content. I can feel a technical point coming up on my left, which I will allow Minister Philp to answer. I will finish there because I know he is very keen to give you the technical answer. I am so glad he is here.
Chris Philp: I am sorry if I was twitching visibly. I was just going to add to the Secretary of State’s points in relation to Clause 14 that the test applies to journalistic content. It applies to the content they are writing, not the identity of the person. We are talking about category 1 services, the very largest companies such as Facebook. The judgment they will have to make is not so much about the identity of the individual, and whether they are a journalist, a citizen journalist or neither of those, but whether the content itself meets the test of being journalistic. There is clearly an element of subjectivity and judgment in that, and they will have to look at each bit of content and make that determination, and the codes of practice will, hopefully, assist in that. They are codes of practice that of course Parliament will vote on, as we discussed a bit earlier, under Clause 32, from memory. Hopefully, that adds a bit of additional colour. I will do my best to restrain my enthusiasm in the future.
Lord Gilbert of Panteg: The Bill describes journalistic content as content generated for the purposes of journalism. Would you like to see if we can have a better go at that?
Nadine Dorries: Do, please. I go back to my substantive point. This is teamwork. We are doing this together. Please do not hesitate in making any recommendations that you think will improve this Bill because you are doing the work with us—anything you can think of to improve it, which will pass parliamentary counsel and write-around, and you think you can get into the Bill. I do not think we can say this often enough. It is novel. It is ground-breaking. The world is watching and it has to be watertight because it has a very important job to do the day it becomes an Act. If you can make it better now, please, be our guests. Do it.
Lord Gilbert of Panteg: Another area is news publishers. I have been reading some of your words too, and I have been reading an article you wrote in the Manchester Evening News.
Nadine Dorries: That is very disconcerting.
Lord Gilbert of Panteg: For anybody who is in doubt about your views on journalism, it is a passionate defence of journalism and why it is crucial that we support journalism. You pay tribute to journalists and the work they do. It is an article that I read with interest. One thing you say in it is that our democracy depends on journalism. You paid tribute to the Bedford Times.
Chris Philp: If we are on local papers, may I put on record my admiration for the Croydon Advertiser as well, please?
Lord Gilbert of Panteg: Any more bids? I was going to mention the Pontypool Free Press and Herald of the Hills, which I think has a poetic title that will beat any others.
Nadine Dorries: We will stay with the Bedford Times.
Lord Gilbert of Panteg: You go on to talk about protecting journalists and journalism and news media organisations and creating a level playing field with the platforms. It seems to me there are two aspects to that. The first is not covered by this Bill but it is important, and that is creating a much more transparent and fair process by which platforms pay for quality journalism. I know that elsewhere in your department you are looking at that. The digital advertising market and the work you are doing on that is really important.
It also goes to the provision in the Bill relating to news media organisations and the news publisher provision. The Bill basically puts publishers out of scope so that platforms need not deal with content on news publisher sites. The Bill has a pretty good crack, I think, at defining what constitutes a news publisher, and I do not think we need to do much with it.
In the most extreme cases, Silicon Valley takes its own view, based on its business interests, its political sensibilities and sometimes whim, in taking down legitimate news stories published by news media organisations. YouTube suspended talkRADIO’s account, even though Ofcom found no problem with it. Twitter took down the New York Post for 16 days in the run-up to the 2020 election, later admitting it was a quick interpretation based on no other evidence. Meanwhile, it leaves up a pile of lies peddled by Chinese state media about the genocide in Xinjiang.
What seems to me a much better protection, which would meet the objective in your article, would be to have a positive provision, whereby Silicon Valley, the platforms, cannot take down news coverage content. Not that it is outside the scope of the Bill, but if content is published by a news publisher that is subject to redress as defined in the Bill, they simply cannot take it down.
Nadine Dorries: I was just asking for some advice because I think we could only do that with registered commercial news publishers within the UK. I do not think we could stop platforms taking down the Washington Post, for instance. I will hand over to Sarah.
Sarah Connolly: I am interested in understanding what the difference is between what you are suggesting and leaving it completely out of scope.
Lord Gilbert of Panteg: At the moment, it is out of scope so the platforms do not need to consider news content.
Chris Philp: That is in relation to the harm provisions. Do the duties to protect journalistic content in Clause 14 not apply?
Lord Gilbert of Panteg: I am talking about the news publisher provision, putting news publisher content out of scope of the Bill, Clauses 39 and 40. What I am saying is why do you not go further and say not only is it outside the scope of the Bill but that there would be a positive requirement for platforms not to remove news publisher content?
Chris Philp: I understand the request, which is different from the contents of Clauses 39 and 40, but is not what you are asking done already in Clause 14 with the duty to protect journalistic content? If we take down the BBC website or take down the Times of London, would that not violate Clause 14?
Lord Gilbert of Panteg: I do not think journalistic content covers it. It would be a positive requirement not to take down stuff that comes within the definition in Clauses 39 and 40; that is, newspapers’ and broadcasters’ content.
Nadine Dorries: I think some work may have been done on that over the years, so I will hand over to Sarah to answer that.
Sarah Connolly: I apologise, it might be me this morning, but I am slightly struggling. Given that this is content that is out of scope, it will appear on platforms anyway, and, therefore, because it is out of scope, assuming it is not illegal of course, it would stay up. It more or less—
Lord Gilbert of Panteg: They could still take it down. There is nothing that stops them taking it down. They can still take down talkRADIO.
Nadine Dorries: Why do we not look at this? Why do you not do something in your report that clarifies how you would like this to look better, and a recommendation about how we can do that? It is a really useful line of questioning because it gives us advance notice so that we can take this away and look at it. Again, if you could put a recommendation in the report, the stars can align and maybe we could do something.
Sarah Connolly: I think it would be quite interesting to unpick the legal risk. I think in what you are suggesting the platforms would carry quite a lot of legal risk. It would be helpful, if you are thinking of recommendations, perhaps to give us a bit more.
Lord Gilbert of Panteg: We can come back to you on it. I do not think it would carry a legal risk. What I am trying to get at is to meet the objective the Secretary of State sets out in her article by stopping platforms taking down talkRADIO or UK journalism, as defined in the Bill, rather than just putting it outside the scope of the Bill.
Nadine Dorries: I think I understand what you are asking for. If you can put something in a recommendation, we will start work on that, now that we have advance notice of what you are looking at, and then, hopefully, by the time your recommendations come forward on 10 December—
The Chair: By the 10th.
Nadine Dorries: As soon as possible would be wonderful, because we want to get the Bill on the Floor of the House, so the sooner you can make your recommendations, the better.
Chris Philp: For clarity, are you suggesting a hard prohibition on platforms taking down content produced by organisations defined by Clause 40? I think that is your suggestion.
Lord Gilbert of Panteg: That is my suggestion. I think that is much closer to what the Secretary of State defines as protection in her article.
Chris Philp: As one clarifying question, does Clause 14 as drafted, which protects journalistic content, not come very close to doing that already?
Lord Gilbert of Panteg: No, I do not think it does.
Chris Philp: Why not?
Lord Gilbert of Panteg: I still think the platform, on the basis of its own sensibilities, can take stuff down. It does not.
Nadine Dorries: There would be a right of appeal, but it could if it wanted to. We need to look at that, and that is work we need to do.
Lord Gilbert of Panteg: A right of appeal is no good for a news organisation because by the time—
Nadine Dorries: The right of appeal we are putting into the Bill has to be a swift response, so it would not be down for long if it was down. Again, I cite the David Davis video, which was long and tortuous to get back up. If they took down democratic content, Ofcom would be able, quite quickly, to keep an eye on that, and make sure it got back up quite quickly. We have that provision on the right of appeal and making them respond and act quickly, but I think we need to take it away and do some work on it.
Lord Gilbert of Panteg: Thank you.
Q283 The Chair: On these freedom of speech questions, obviously the Bill makes social media platforms responsible for enforcing actions that relate to speech, content, and things people have posted. It is doing that at the request of the state, at the request of the Government. Therefore, it would be reasonable to assume that the provisions under the ECHR would apply in that circumstance. If the speech is illegal under criminal law, it is quite clear that there is responsibility to act. Where speech is legal but harmful as set out in the Bill, it would be interesting to know what the department’s view is on compliance with ECHR requirements, with regard to freedom of expression in particular.
Nadine Dorries: We believe that the provisions in the Bill comply with Article 10 of the ECHR. We hope to have that advice ready. It is forthcoming, and it will come forward as part of the passage of the Bill. We have taken advice and we think it complies. Obviously, we are in a state of doing further work on the Bill, so that assessment needs to be remade. We will look at your recommendations and the Law Commission recommendations. The work is ongoing. We believe we comply with Article 10 of the ECHR, and we of course take advice on that.
The Chair: I believe we have written to the department separately about that.
Nadine Dorries: I think you have written to officials and not to me, but I have seen the letter. I cannot give it to you now because work is ongoing. As soon as we have our final determination, we will be able to bring it forward as a part of the passage of the Bill.
The Chair: You are quite right, we did not write to you, we wrote to officials, but if it is possible to have an official response from the department before we publish our report that will be helpful in our final deliberations as well.
Nadine Dorries: I warn you that we may not be able to. Subject to all the conditions we are subject to, we may not be able to give you the final decision. As you know, parliamentary counsel are very careful about what we can say, so we may not be able to give you the exact decision, because there is work ongoing in moving the goalposts. The Bill is at this moment in the department being made tougher and stronger, and going further. It is work in progress at the moment.
The Chair: I do not think there is anything I can add to what Stephen Gilbert said in his questioning about journalistic exemptions and exemptions for news organisations. I want to ask about democratic content as well, but more as clarification. If someone was spreading information online saying that in the general election coming up, whenever it is, you can vote on a Friday as well as a Thursday in an attempt to mislead people into not voting, or indeed if someone had published a deepfake film with you, Secretary of State, saying things you had never said—
Nadine Dorries: That is not unusual.
The Chair: —with the purpose of discrediting you as a political figure and perhaps trying to hamper your attempts to be re-elected to the House, some might say that it was all part of the democratic process. Others would say it was clearly knowingly false information, certainly in the case of trying to mislead people about how and when you could vote, which is trying to suppress voting. Would that be covered as part of the exemption for democratic content, or would it be regarded as harmful to democracy and therefore should be removed?
Nadine Dorries: I stand ready to be corrected on both sides, but my belief is that it would be removed as democratic but harmful. Misinformation and disinformation would come under the first category. What was your second point?
The Chair: They are both the same. They would certainly be knowingly false. If you are accepting the Law Commission recommendation on knowingly false, they would be covered by that. What I want to be clear about is where the thresholds are. Although it might be considered democratic content because it is part of a debate about an election, it is exempt, because it is knowingly false, and therefore not part of legitimate debate, and it is trying to mislead people and perhaps influence them into not voting.
Sarah Connolly: It is not straightforward. It is a very good question. It would fall under disinformation and the provisions in the Bill on that, and would be taken down, again, subject, of course, to platforms’ T&Cs. Platforms change their T&Cs in the run-up to elections. We have seen that previously here and in the most recent election in the US. I would expect it to hit a threshold where it would be removed under the disinformation provisions.
Nadine Dorries: I think Minister Philp would like to add to that.
Chris Philp: As Sarah said, that analysis relies on the terms and conditions set by service providers remaining as currently drafted. If they changed them, that duty would potentially fall away. It is worth stressing that risk.
In relation to the new criminal offence being proposed on false communication, if it met that test, it would be a criminal matter and would fall under the first pillar, and, regardless of terms and conditions, would be subject to removal. For clarity, the test under the new proposed Law Commission offence is communications where the defendant knows the communication to be false, and, moreover, intends to cause non-trivial emotional, psychological or physical harm to a likely audience, and has no reasonable excuse.
It would be a question of fact as to whether the disinformation you are describing fell under that. The knowingly false bit would be met. The no reasonable excuse bit would be met. The hardest limb to meet would be whether it will cause non-trivial emotional, psychological or physical harm. If, and only if, that test is met would it fall into the criminal bucket.
The Chair: What you are saying is that the provision, as it is a matter of criminal law, would trump the requirement to protect content of democratic importance.
Chris Philp: In your example they would pull in the same direction. Yes, if it is straight-up illegal, that duty, the bucket 1 duty, to prevent illegal content would apply. It has to meet the test I just set out, which is quite a high test.
The Chair: With regard to voter suppression, trying to persuade people they could vote on different days, or that they needed to bring a form of ID they did not need to bring in order to vote, is clearly false. Where we would want additional clarity is for something like a deepfake film of a politician.
We saw that in America with Nancy Pelosi, when there was a film that made her sound as if her speech was impaired, as if perhaps she was drunk or had taken medication before giving a press conference. Facebook thought it was okay and YouTube thought it was bad, and took it down. The platforms had different policies. Films like that could have malicious intent, or the person who created them could say, “It’s satire as far as I am concerned”. I think there would need to be some clarification, particularly as we could easily envisage what we are used to in election campaigns anyway—grainy footage of people making unguarded comments being released to the media. We are living in a world where that could easily be created totally falsely digitally.
What would we expect a company to do in that situation? There is a tension between the provisions around knowingly false content set out by the Law Commission, and the protection of democratic speech, and probably the regulator should be able to give the companies some clear direction on that.
Nadine Dorries: I think Sarah has a comment to make.
Sarah Connolly: There are two things. The other thing we are doing in this space is talking to Cabinet Office colleagues about the Elections Bill and how that might look. There is another strand of work focused on exactly the things you are worried about, which is worth flagging.
The other thing is a slightly meta point—that was an accident, I did not mean to do that. There is something about the way this Bill has been constructed; it is all focused on systems and processes and changing levels of risk and harm. Where we are now seeing things like deepfakes, and where we had things such as up-skirting and down-blousing two years ago, the entire framework is able to flex and to bend, and to move towards identifying those risks, putting the onus on the platforms and the companies in scope to identify those risks, and then to do something about it, with Ofcom’s support.
You are absolutely right: deepfakes are on the rise. Part of the complexity of this Bill—I was going to say convoluted—is deliberately to make sure that it has the flex, and that Ofcom has the flex, to identify that risk and to take action before it becomes a problem.
The Chair: I have one final question on this. Would it be fair for us to assume that, if offences are created in the Elections Bill going through Parliament, the regulator would regard that as forms of illegal content and therefore would have the power to require companies to comply with the Elections Bill?
Sarah Connolly: Exactly. If it is illegal offline, it is illegal online.
The Chair: Thank you.
Q284 Baroness Kidron: Secretary of State, can I first of all welcome the fact that the first on your list of three was protecting children, and that that has been a drumbeat through your evidence? Thank you for that.
I want to tease out some things around parents’ expectations of the Bill. Everybody said it is deeply technical and it is a bit complicated and so on, and perhaps some of our suggestions will be aimed at that. People in the wider world, particularly parents, expect children to be protected wherever they are online as a result of this Bill. At the moment, the scope is user to user and search.
As you well know, we have other regulation—the age‑appropriate design code—that has a broader scope. Why are children not covered in all the same places that the code covers them, particularly because that would give some sort of regulatory alignment for the ICO and Ofcom? They are very likely, in some of the sorts of situations that you have already described, to have to work jointly to take action. That is my first question.
Nadine Dorries: Thank you for your comments, and thank you for the work that you have done on the age‑appropriate design code. I think people said when you first did that work that the internet would crash. Actually, you have changed the way the internet works in your own way, in that every online provider, whatever they are doing, looks at your age‑appropriate code. I think the impact that you have had with that code is underrated. I want to thank you for that.
You are quite right; we have an aligned passion on keeping children safe. I do not want to see children’s innocence wiped away by an algorithm. I do not want to see them injured, harmed, bullied or any other of the adverse impacts that online use by children has on their lives by wrongfully promoted algorithms. Your work is essential towards achieving that goal.
I have to disagree with you about children on all platforms online. The first point is that we need to keep the scope of the Bill very tight in order to keep it watertight and effective, so that it works. This is not the Bill to fix all online problems and harms. It is important to say that. This Bill is not to fix the internet. This Bill is solely aimed at platforms that we know do harm to children. For example, as a busy working mum, I would often, when my girls wanted to buy clothes, give them my credit card and say, “Go on ASOS. Your limit’s 50 quid”. There is no harm that could happen to them on a retail site. There was no need for an age. They were using my credit card. Someone will tell me I was wrong to do that.
The Chair: I do not think you are the only person who has ever done that.
Nadine Dorries: That did not actually happen, of course. I am just using it as an example.
Baroness Kidron: I was going to ask to borrow your credit card.
Nadine Dorries: It never happened. Oh my gosh, what have I just said?
It is not appropriate that all sites that children can access have to have barriers for children to enter those sites. I would cite retail sites as one. We have looked at this. I do not want you to think that we do not respect the age design code. It is hugely important in the department, and it has huge respect, but we believe it would widen the scope of the Bill, which we are seriously concerned about. That is the legal advice. We do not believe that it would add benefits in a way that would equal the risks of widening the scope of the Bill. As I mentioned with pornography earlier, I am really concerned about that. That is not ASOS; it is something far worse. I want to reassure you that we are looking at age verification.
Here is an interesting point. When the Bill began, there was no way to endorse age verification online. The original proposals were that if you wanted to watch pornography you would have to go to the Post Office or somewhere, get an ID card, your porn pass, and then access your porn sites or whatever it is that consenting adults do online in their edgy corners of the internet.
Now, as the tech world is advancing in the way it is, age verification sites online are developing because, for online banking and other means, people now need to be able to have ID online so that they can verify their bank accounts and use them. The environment and the landscape are already moving in that way, which is why we are looking at age verification and how we can do more on age verification; there are now ways, using passports and other things, to verify age online. We are looking at that and any recommendations that you have.
I do not think we can bring retail and shopping sites in, and other sites on the internet that are not harmful. We need to focus on the big sites such as Facebook, Instagram, Twitter and the others where we know algorithms do harm, we know harmful algorithms are promoted, and we know that, as a result, children take their own lives, and self‑harm.
There is an interesting statistic that I am not sure the committee is aware of. Anorexia is the most fatal of all mental health diseases; one in four young women diagnosed with anorexia dies as a result. It is one of the most fatal illnesses that anyone can contract, and the seriousness of that impact is hugely underrated. One in four young girls who develops anorexia will die as a result, and there has been a 22% increase in the number of young girls diagnosed with anorexia. There is a bar that has to go into the anorexia bracket for those who have been diagnosed with anorexia in recent months. There is a very strong school of thought that that is around lockdown, that young girls spending more time on sites such as Instagram and Facebook, and wrongful promotion of algorithms, have produced that result. It is those harmful sites where this Bill needs to be tight in scope and focus to reduce the most harm.
Baroness Kidron: Thank you for the answer. There was a very famous case on Amazon, where, when you went for a school bag, you were offered a knife. People who bought this might buy that. The point that I would like to make is that all companies likely to be accessed by children already have to do a risk assessment.
Nadine Dorries: It is as a result of your work.
Baroness Kidron: Yes, and it is a bit of an irony that they will have to do a risk assessment without considering any harm, including harms by algorithms. I will come back to you on that, but I think it is an issue.
Nadine Dorries: Please put it in your report. Please make the recommendation. If you agree on it, please put it in the recommendation. We will very seriously look at it.
Baroness Kidron: Thank you. I am glad you raised the question of age assurance. One of my concerns is that many of the detractors of the Bill talk about the fact that age assurance will affect adult privacy. I was grateful to you for what you said earlier about adults; it is terribly important that we have minimum standards that ensure privacy-preserving age assurance. At the moment, there is nothing in the Bill that talks about minimum standards. There is, I know, an immense amount of work going on about voluntary standards.
I am concerned about two things. One is that we do not want just a standard for the provider; we also need to ensure that the services using age assurance do not demand more information than is required, and that it should not be voluntary. As you will have noticed, Melanie gave us a statistic that 50% of those on TikTok, which has an age limit of 13, are 10 year-olds. That is failure at such a vast level that voluntary no longer works.
Nadine Dorries: I know that you are referring to the huge amount of work that you are doing in your Private Member’s Bill on this as well. We have huge sympathy with it, but I go back to the substantive point. I want this Bill to work; I want it to be legally watertight and robust because it is novel and ground-breaking.
I go back to the original comment that I made. It may not be that we can do all of that in this Bill now, but the enhanced levels that I would like to propose in the Bill for parliamentary scrutiny as time moves on do not mean that we will stop. There will be recommendations that a committee with a role like yours will be able to continue to make as time moves on. Ofcom’s annual report will highlight the work that it has done and the problems it has encountered. There may be times at certain pulse points in the future when we can stand back, look at how the Bill has worked and say, “Did we get it absolutely right? Is there more that needs to be done?”
The whole point of the additional parliamentary scrutiny is that we will be able to say, “This will be work in progress. This will become an Act. This will become the face of the Bill. This will be law”. There will be times moving forward when we will be able to adapt the Bill as a result of the processes that we are going to put in place where enhancements can be made. Chris, do you want to come in on that?
Chris Philp: Thanks, Secretary of State. I have one or two points, which I will ask Sarah to confirm once I have made them.
On the question about age assurance with privacy attached, the existing data protection regulations—data protection is a separate topic from this Bill—mean that an organisation can only use data for the purpose for which it was collected. If data is collected either from an adult or a child for age verification purposes, it can only be used for that purpose and for no other.
On your point about TikTok, and indeed other services such as Facebook, which have a purported age limit of 13—we know, as you said, that very many children under the age of 13, in fact, use them—they will have to do the age‑related risk assessment as a duty under this Bill and then stick to the results of it, and that will be enforced by Ofcom. It will be very hard for TikTok, Facebook and all the rest of them to ever say, “We think it is okay for somebody under 13 to use our service”, when, for the last 10 years for Facebook, and two or three years for TikTok, they themselves have specified 13. If they repeat that age of 13 in their risk assessment, as I would expect them to do because they have been publicly saying it for years, Ofcom’s enforcement powers will apply.
In relation to all of these age matters, there will be some primary priority matters where people under 18 will be completely prohibited from looking at them, or more general priority matters where they have to have regard to age, which will be enforced with teeth by Ofcom. Those are all important points to make. I can see you want to reply.
Baroness Kidron: I am itching now.
Chris Philp: Yes, it is clearly contagious. Sarah, was that all?
Sarah Connolly: It was all right.
Chris Philp: Okay, good.
The Chair: I think Minister Hinds wants to come in on that as well. Briefly, please.
Damian Hinds: Thank you, Chair. Baroness Kidron, I want to re‑emphasise and amplify some of the points about the importance of age assurance. We are all paying close attention to your Private Member’s Bill and the debate and discussion around that.
From a Home Office point of view, I want to stress that age assurance is obviously important in the context of, for example, children’s access to pornography and some of the onward links there to violence against women and girls, and children developing, in some cases, very unhealthy attitudes and approaches. It is also very important in the context of curtailing child sexual exploitation and abuse, because with good age assurance it is much easier to spot cases where there is inappropriate contact from adult to child.
I might stray, if I am allowed to, outside my Home Office brief for a moment. When there is a nominal minimum age limit for some of these sites but every other kid in school is on them, it is quite difficult for a lot of parents to say to their child, “You’ve got to abide by the rules”. A lot of parents are looking to us to make it easier to set those boundaries.
Nadine Dorries: It is so much easier to say to your child that it is illegal to do that.
Damian Hinds: Yes; that it is just not possible.
Nadine Dorries: Yes.
Baroness Kidron: I am very glad about all those comments because I think they drive towards mandatory standards. I will come back to you on that. One of the things I am concerned about is that 400,000 kids are on TikTok, underage, 10 years old, driving all the issues that you have talked about right now. We should not be waiting a moment longer—not a moment longer.
Damian Hinds: Baroness Kidron, to be fair, let us not isolate TikTok.
Baroness Kidron: No, I know it is not just TikTok.
Damian Hinds: We know from the Ofcom research that it is all social media.
Baroness Kidron: There was evidence we got this week, which is why I am mentioning it.
Nadine Dorries: We are on the same page.
Baroness Kidron: But maybe I am running a bit quicker.
Nadine Dorries: I would love to be running alongside you, but I have to have parliamentary counsel and write‑around. Getting this Bill workable, passable by Parliament, keeping the scope tight and getting it into law is a feat on its own. It is not the “fix everything on the internet” Bill.
Baroness Kidron: No, I understand.
Nadine Dorries: That just is not going to be possible. I would love to be running alongside you, and I am doing everything I can to take and to answer your concerns. That is why we are looking further at age verification. That is why we are looking further at children accessing commercial pornography online. I am going as fast as I can.
Chris Philp: On the speed point, we are looking at how we will implement this following Royal Assent. There are certain areas—the one we are discussing, for the reasons we all understand, is one of them—where we will try to find ways of expediting implementation. We had a meeting on that a day or two ago. We are very conscious of that.
I have a further supplementary point on risk assessments. We have constructed this so that there is no wiggle room for platforms that may try to fudge their risk assessment in relation to children. Ofcom will do its own sector risk assessment first, and the companies’ own risk assessments will be measured against that. This is a point that may have been made earlier. We will make sure that they cannot get themselves some sort of get-out-of-jail-free card by fudging or diluting their risk assessment. That will not be acceptable at all.
Baroness Kidron: Okay, that is good to hear. I have one more very specific question that I know is close to your heart, Secretary of State. As you know, we have heard evidence from a family whose child died, and they are involved in an internecine struggle to get the material from Facebook. This is also something that the Molly Russell case threw up.
Do you believe that codes of conduct about moderation and about report and redress should be mandatory, should have minimum standards and should allow bereaved parents to access the accounts of their children so that they know what their children were seeing in the moments before they died?
Nadine Dorries: That comes under the provision in the regulatory framework for platforms to have that transparency with Ofcom. I will give that to Sarah on the legal side.
Sarah Connolly: It is something we have certainly talked about before. My recollection is that it is not legally straightforward, which is part of the reason that—
Baroness Kidron: You need some help.
Sarah Connolly: It relates to various coroner cases, I know. We might want to have a separate conversation about it. It was something that we discussed in the department at one point, and there were some challenges in that space.
Baroness Kidron: I know that and I would appreciate that. This is the overall point that I was trying to make at the beginning. The minute that the next case comes up and parents cannot go to Ofcom as individuals and they cannot get the information from the tech companies, we are back to, “What was all that about?” That idea that children are safe—
Nadine Dorries: I can say what we would like it to be about. We would like it to be about platforms behaving responsibly, being transparent and abiding by their terms and conditions so that parents do not need to go to Ofcom. Parents can go to Ofcom, but we do not want Ofcom having a to and fro with parents.
I know that Melanie at Ofcom raised a point about an ombudsman, which is a slow and onerous process. We do not want to get into that. We want to get into making platforms behave responsibly as quickly as possible, under a legal framework, and that is what we are focused on. Hopefully, with the transparency and all the powers in Ofcom’s regulatory framework, we will get to that place quickly. With Ofcom’s horizon scanning and risk assessment, having been allocated £100 million, they should be on the case now, so that when this Bill becomes an Act, given that the opposition party and Keir Starmer have pledged to support the Bill—who would not want to support it?—and given that we want to make it an Act as quickly as possible, hopefully we will be in that position fairly quickly.
Baroness Kidron: I have two more questions. One is for Mr Hinds. The whole issue of making more things illegal creates a whole enforcement requirement. In a recent meeting, someone involved in enforcement against child sexual abuse material said specifically that if they never got another new case it would take them 10 years to get through their backlog. There is an intrinsic resource problem. The more things you make illegal, the more you have to rely on enforcement. What happens then? Could you speak to that issue a bit?
Damian Hinds: I recognise the problem that you are talking about. On the other hand, there is a lot we can do to remove impediments, improve process and take away barriers that stop us identifying and bringing to justice perpetrators of these horrendous crimes. There is the use of ever‑improved automatic scanning for the hash-matching technology, for example, and making sure that there are good data flows from the companies. By the way, the co‑operation to date from a number of social media companies in this area is good, but we want it to keep improving. We want to close down those spaces. We need to keep striving ever more to do that.
Baroness Kidron: I was thinking more generally. As we—the Ministry of Justice—create more things that will be illegal now, the things we have talked about, how do you think the enforcement community will manage chasing after illegality if the platforms do not do it automatically?
Damian Hinds: It has to be some of both. There are today a lot of people dedicated to tracking down exactly those categories of crime. Crime is changing. The scale of child sexual abuse and exploitation online is staggering. Much of our approach relies on good working with the platforms. There is particular concern—I do not know if we will come on to it later—about the use of end‑to‑end encryption, which in principle is a good technology that helps to protect people online, but the way it can be deployed in a manner that closes down two of the absolutely fundamental tools that we have for identifying and removing child sexual abuse material is a very concerning development that we need to act against.
Baroness Kidron: My final question is about having automatic transparency. One of the things that was very interesting about Frances Haugen’s testimony and other conversations she had while she was here was that she offered some thoughts about what might be useful for the regulator to know automatically. For example, if you took the 10 most viral pieces of information on Facebook every month, and Facebook compulsorily had to say what they were, she said that behind the scenes they would be fighting very hard to make sure they were not misinformation.
There are other automatic transparency pieces. Do you have the appetite for something in that regard? At the moment, in the way it is constructed, Ofcom has to require information. It has good information powers, and Melanie was very clear about that, but there may be a role for automatic transparency, which would go some way towards giving researchers access.
Nadine Dorries: We looked at Frances Haugen’s information very carefully as well. What she spoke about very powerfully were the transparency and risk assessments. I believe that is exactly the sort of system that we are putting in place with this Bill. I will hand over to Sarah because she may have something to add.
Sarah Connolly: For total clarity, there are two kinds of information-gathering bits that come in the Bill. The first are the information-gathering powers that Ofcom has—“We would like to know X, Y and Z”—lifting the bonnet. What you are talking about are the transparency reports that the category 1, 2A and 2B providers have to produce. It will be for Ofcom to decide exactly what will go into the transparency reporting. There is an interesting conversation to be had about what would be really helpful in that space.
We have done quite a lot of thinking already about the sorts of things. We have tried to do it in quite a conscious way. You might want the number of things that platforms removed to go way up as soon as the legislation kicks in because it says they are doing the thing that we want them to do. The chances are that they will be quite edgy about those numbers going way up. There is something about having quite a mature conversation with the regulator and the platforms about the things that will actually shine a light. The top 10 things that are going viral is quite an interesting idea that is reasonably available and might fall into that space. It is a conversation that we would want to have with the platforms and with Ofcom, to get a sense of what those little nuggets are and how Ofcom might use them.
Baroness Kidron: Okay. Thank you.
The Chair: Thank you. John, do you have a question on this topic?
Q285 John Nicolson: I do. I have a very quick question for you, Minister. Earlier, we were talking about Ofcom, specifically the code of practice and whether or not it could be changed to reflect government policy. I want a bit of clarification on that. When the modified code is laid before Parliament, will the House get a chance to scrutinise the modified code and debate it?
Nadine Dorries: Yes.
John Nicolson: Good. I just wanted to check that.
Nadine Dorries: When it is published, Parliament will scrutinise it, and it will be debated.
Chris Philp: As I referenced in my previous answer to your question—the one that did not involve abusing the Secretary of State—
John Nicolson: There was no abuse of the Secretary of State.
Chris Philp: Clause 32 is the critical one. I will step through that again. The Secretary of State’s powers to direct Ofcom arise, as you said in your original question, under Clause 33(1). Let us say, hypothetically, that Ofcom tables a draft code of practice. The Secretary of State, let us hypothetically say, exercises her power under Clause 33(1) and modifies it. What then happens is that that code of practice is considered under Clause 32(3) onwards. If you read through those, there is a 40‑day period. Parliament can resolve not to approve the code of practice.
John Nicolson: Parliament will get the chance to vote on it.
Chris Philp: Yes, as set out in Clause 32(3) onwards.
John Nicolson: Each change of the code will result in a parliamentary debate and then a vote.
Sarah Connolly: In the event of Clause 33(1), the Secretary of State’s power of direction, Ofcom would make modifications to the code of practice that had been submitted previously. It is an overarching power. All the codes of practice that would be impacted by any direction from the Secretary of State to Ofcom would then come back to Parliament, be laid before it and subject to scrutiny. I am not 100% sure whether or not—
Nadine Dorries: We can write to you to confirm that.
Chris Philp: It does. Clause 32(3) says, “If, within the 40‑day period, either House of Parliament resolves not to approve the code of practice”, et cetera.
Nadine Dorries: Okay.
The Chair: What would be the mechanism for Parliament not approving it?
Nadine Dorries: A vote.
The Chair: That is the fundamental question. On our reading of the Bill, it can be done through a negative SI without a vote. A Member could pray against the SI, but the Government are not required to bring it to a vote.
Sarah Connolly: We will double‑check, but I think that is correct.
The Chair: When we discussed this earlier in the session—
Nadine Dorries: We will write to you and confirm that.
The Chair: In terms of scrutiny of the process, one of the questions we have raised is that if it was done through an affirmative SI, which would then mean a vote and a debate, that would be a more transparent process.
Nadine Dorries: I think it may. There may be a consideration for a point that somebody—I think it might have been Baroness Kidron—raised, which is that there may be a need to act rapidly.
The Chair: Yes, indeed.
Nadine Dorries: I am trying to think of an example; I suppose, online radicalisation. There may be a need for the Secretary of State to guide Ofcom with powers to deal with an issue that is arising and needs immediate concern. We need to go back and look at it.
This might lead me into making the point that there are those who have made the comparison between this Bill and online radicalisation. I think I want to mention our colleague, David Amess. There are people who said, following David’s horrific, appalling and desperately upsetting murder, that this Bill should not be conflated with David Amess’s murder. That is not the case. David’s murderer was radicalised online, and therefore it is not right for anyone to make the assertion that those two instances should not be conflated. They are very much linked. There may be times when those powers would be needed under a negative SI to deal with other dangerous things that happen online.
The Chair: It is absolutely right to say, as you set out, that there will be circumstances, either because of a threat to an individual or national security, where the Government want to move very fast. The London riots a few years ago would be a situation like that. I could easily envisage that. The question is whether, as part of normal procedure, amendments to the code should be subject to an affirmative process of the House where, as you and the Minister described earlier, it would be laid before the House, the House could debate it and there would be a vote at least in committee if not on the Floor of the House.
Nadine Dorries: We will look at that.
The Chair: Thank you. Very patiently and observing us from a distance has been Jim Knight. Jim, over to you.
Q286 Lord Knight of Weymouth: Thank you very much. Secretary of State, I very much welcome your stance and the inclusive way in which you want to deal with this with the committee, although I worry slightly that being a member of the committee might now be a life sentence, from what you said.
Nadine Dorries: You were up for it.
Lord Knight of Weymouth: I am also hearing from you that you want the Bill to be watertight. I am aware that you have been bounced by the Prime Minister into having to introduce it this year.
Nadine Dorries: That depends on you.
Lord Knight of Weymouth: Of course. We are all aligned. We are on the same page on the broad principles of the Bill and what the Government are trying to achieve, but there is a bunch of detail where the committee might want to come forward with changes. Do we have time for those? You have a Bill Minister who is clearly all over the Bill. He is chomping at the bit to go out and defend it and get on with it.
Chris Philp: And improve it.
Lord Knight of Weymouth: Do we have time to make significant changes if that is what this committee wants you to do?
Nadine Dorries: That is a very valid question. Thank you for that. Minister Philp is full of energy. He is no different from you when you were a Minister. I remember you were very much the same. It is great. We are a good double act. It is very good that he is really into the technical side of stuff, and that is fantastic.
Yes is the short answer. When I arrived in this role, the first question I asked was, how long do we have to make this Bill tougher, stronger, watertight and really impactful, and how do we get this Bill to solidly protect children? That has to be the No. 1 principle of the Bill: how do we protect children? I was told, “You have four months, Secretary of State”. That clock is ticking down now. It is two‑and‑three‑quarter months.
That is why I am urging the committee to get your recommendations to us as soon as possible. We have been watching the evidence sessions. Officials have been taking points away already. I have mentioned the areas where I want further movement. There are other areas. I do not know if anyone will ask me about anonymity, but that is something else that I am looking at. We are doing lots of work.
The Chair: Yes, I know.
Nadine Dorries: Poor Minister Philp has been thrown into his office to deal with it and make it work. The poor officials are also working round the clock to get it to work, but we are down to two‑and‑three‑quarter months. The Prime Minister wanted the Bill to come in before Christmas. He probably felt the committee was going to report much earlier than it was. The date given to us by you when we checked was 10 December. Yes, the Prime Minister was right. Yes, we can get this Bill on the Floor of the House before Christmas, but that would not give us time to incorporate your recommendations. This is an instance and a situation whereby your recommendations are part of the teamwork in improving this Bill. We want to take your recommendations, and, wherever possible, subject to the hoops that we have to go through, we can incorporate them in the Bill.
We want to do your recommendations justice and get them to parliamentary counsel. Jim, you know how that works. We will give your recommendations to the parliamentary lawyers, and they will say, “Oh yes, but what about?”, and that “what about?” takes a while.
The analogy of the aubergine was given to me. We want to ban aubergines. Parliamentary counsel come back and say, “Ah, but what about the sellers of seeds of aubergines?” Those are the kinds of conversations we will go through.
Getting your recommendations as quickly as possible, and getting them to parliamentary counsel to work on as fast as possible and to write‑around, means we get a Bill to the Floor of the House. So it is really over to you, guys. How quickly can you get the work to us?
Lord Knight of Weymouth: Undoubtedly, we agree that we want to get this implemented as quickly as we can in order to protect children and adults from harm.
The Bill is long. It takes a while to read it. It is complicated. One thing that could simplify it would be to go back to the position in the White Paper and have a single duty‑of‑care approach rather than the multiple safety duties—
Nadine Dorries: It does not work. We have tested it legally. It does not work. How do you define a duty of care? How do you define all the requirements that would come under that duty of care? We would love that to work. I spoke to Secretary of State Jeremy Wright and others, who worked on that overarching duty of care in the Bill, but it does not pass parliamentary counsel unfortunately. If it does not pass the lawyers in Parliament, we cannot do it.
Lord Knight of Weymouth: If we want to come back with recommendations around a single duty of care, we need to work pretty hard on getting it legally watertight for you.
Nadine Dorries: You do. The definitions within that duty of care are huge, onerous and difficult legally to make tight and applicable. I am not going to tell you what to do, but I would probably put your efforts into other parts of the Bill, because we have already been there and we know that it would be almost impossible to get that into the Bill. Jim, that is why the Bill is so technical. That is why I am so grateful to have Minister Philp. That is why the Bill is so long. It is a technical, long Bill, but in order to meet the criteria of watertight it has to be.
Q287 Lord Knight of Weymouth: Okay. I think I can anticipate in that vein your answer to my next question. The inclusion of societal harms, which was in the White Paper, has now been rowed back on. In recent months following the European Championships, we had the issues of racism in football. Following the murders in Plymouth, we had the issues of incel, misogyny and attacks on women and, most recently, anonymity. Every month, something else crops up for which the answer is this Bill. Are you confident that excluding societal harms is the right answer in terms of the expectations around what this Bill can do?
Nadine Dorries: You rightly anticipated what my answer was going to be. I am sorry, Jim. I suppose I should now refer to you as Lord Knight. I do apologise.
Lord Knight of Weymouth: It is fine, Nadine.
Nadine Dorries: I have known you for such a long time. It is for the same reason: the legal definition of societal harm is too difficult to put into law. We want a Bill that works. I am not sure if you were here during the passing of the Dangerous Dogs Act in 1991. That is the rock that no one wants to die on again. If we put societal harms into the Bill, I am afraid we would not be able to make it work. We have looked at it. We have explored it. We have probed it. Legally, it is just a non‑starter, I am afraid.
Chris Philp: It is worth stressing that where the content is—
Nadine Dorries: Here he goes.
Chris Philp: Thank you for your kind remarks earlier. I agree that we make a good double act. Where the content is illegal, that is in scope of the first pillar. Equally, if it can be demonstrated that the disinformation/misinformation content causes physical or psychological harm to individuals, again that will fall into the scope of the third pillar—the legal but harmful. It is worth saying that, to that extent, the content could be included if it meets one of those two criteria.
In addition to that, outside legislation, we have the cross‑Whitehall counter disinformation unit, which is led from DCMS but draws from the Cabinet Office, the FCDO and others.
Lord Knight of Weymouth: Will that become permanent? There is a question as to whether that is a temporary or permanent unit.
Chris Philp: I think Sarah is responsible for it, so she is well placed to answer.
Sarah Connolly: I am responsible for it. It has been stood up now continuously for just over two years. Unless the Minister or Secretary of State want to tell me anything otherwise, I do not think we are intending to close it imminently.
Chris Philp: We certainly want to continue with its work, which is more important now than ever. I spoke at a conference a couple of days ago with representatives from 15 like‑minded countries across North America and Europe about how we need to work together to counter disinformation, and our unit’s work is critical to that.
Lord Knight of Weymouth: Thank you. Within this area, my perception is that a lot of the problematic content comes through in private groups within these social media platforms. Are you confident that there are enough powers for the regulator to be able to access private messaging, private groups, which is where a lot of this harmful content is being perpetrated?
Nadine Dorries: I will hand that question over to Minister Hinds. Would you be happy to take that?
Damian Hinds: Yes, sure. Lord Knight, you are absolutely right about the central role of private channels and a further trend in that direction. The provisions in the Bill on Ofcom’s powers and on the responsibilities of the platform do not change as a result of content being on a private platform, or private part of the platform, versus a public part.
There are three stages. There is public, there is private, and then there is private and encrypted, and therefore impossible even for the platform itself to see. But the responsibilities are the same in each of those cases. The bespoke technology, systems and processes, approaches and solutions may be different, but the responsibility remains.
Lord Knight of Weymouth: While I have your attention, Minister Hinds—
Damian Hinds: You always have my attention, Jim.
Lord Knight of Weymouth: Are you also happy that there are enough powers, or do you think it would be helpful if the Bill had more powers for the regulator to work with other regulators, in particular the security services, in respect of the security needs of the country and being able to interrogate that sort of content?
Damian Hinds: The need for partnership working is clear. That does not start when this Bill gets on the statute books; it has started already. We as the Home Office play our part in putting people together and making sure things are linked up and joined up. But, no, you are absolutely right. Ofcom in this new world will have a very big responsibility on behalf of us all in some of the most sensitive and most important areas of society. So, yes.
Q288 Lord Knight of Weymouth: Thank you. I have two other quick questions. It is a follow-up, Secretary of State, to the answer that you gave in respect of an ombudsman. I accept that ombudsman processes can be a bit tedious, tortuous and long-winded. Given that Ofcom as a regulator will not be dealing with individual complaints, what form of redress do you think we need to open up for your constituents and the public to be able to complain more easily if we are not going to have an ombudsman route?
Nadine Dorries: The job of Ofcom is to ensure that the platforms themselves respond in a way that is fast and positive to individual complaints. We all know in this room how it works. You will receive something that is highly abusive—death threats.
Following David’s death, Mark Francois made a speech in the House where he spoke about an MP who had received a threat that she would be locked in a burning car and the sender of the threat wanted to watch as the flesh melted from her face. That was me. It took Twitter a very long time to track down who that person was. It was someone sitting in a dormitory in Oxford University. It took a very long time, and they were very unresponsive and refused to remove the content until the police became involved. That is not good enough for me and for everybody else who makes complaints to platforms about content that is illegal, or that is legal but harmful and meets those criteria. The role of Ofcom is to ensure that platforms remove content rapidly and responsibly, behave responsibly and transparently, and abide by their terms and conditions.
When this becomes law and receives Royal Assent, the powers within this Bill and the powers within the regulatory framework that will be bestowed on Ofcom will ensure that individuals will have the right responses from those platforms, which will have to act quickly and responsibly and bear in mind the consequences if they do not. The consequences if they do not are a fine. Ofcom will have the ability to investigate, to seize equipment, to enter premises, and there are the final criminal sanctions that rest, I believe, with me as Secretary of State. If that content is left online and results in harm, particularly to children, those criminal sanctions will be enacted.
We cannot have platforms promoting algorithms, facilitating cyberbullying or allowing content that causes children, particularly, to take their own life, to self‑harm or causes them psychological or physical injury. That is why I made the comment to Mark Zuckerberg and Nick Clegg that this will be law. “You need to act now. You need to use the powers that you have now to respond to people when they contact you and highlight what is going wrong on those platforms right now”.
Lord Knight of Weymouth: For clarity, once this is implemented and all those sanctions create the behavioural effect that you want from the platforms, if I am still not happy with how my complaint is dealt with by a social media platform, I should come to Ofcom and say, “You are not fulfilling your duties properly”, and Ofcom needs to be resourced to deal with all those incoming people who are disgruntled.
Nadine Dorries: There are a number of layers to that question. The first is, yes, individuals can go to Ofcom. So can organisations. If you are running a women’s refuge, and you are subjected to heaps of misogynistic abuse online, that is the super‑complaint that can go to Ofcom, and it deals with that.
How do we measure this? Ofcom will be able, through transparency powers, to know how many complaints were submitted to platform providers, how many they acted on and how quickly. That will all be part of its reporting process to Parliament. That will be subject to levels of parliamentary scrutiny where—I refer back to your life sentence—if they are not acting fast enough we can do further work. I am hoping that platforms—which can fix this problem now, which have the powers now; they do not need to wait for the Act—start doing it now.
Q289 Lord Knight of Weymouth: My very final question in a way is broad. We have talked almost exclusively about user‑to‑user content on social media platforms in this session. Search is also in scope. What are your concerns about the harms of search?
Nadine Dorries: Searches are in scope. They are not category 1, are they?
Sarah Connolly: It depends.
Nadine Dorries: Some of the answers to these questions are so difficult. That is a technical question. I will hand that over to Mr Philp.
Chris Philp: Search could fall into category 1 if it meets the criteria set out, which is, as currently drafted, both of a certain scale and of a certain potential to cause harm. One could, of course—I mention in passing—consider changing that “and” to an “or”. Beyond that, there might also be a requirement to meet the transparency demands for search under category 2A, which means they have to disclose more information.
Lord Knight of Weymouth: There are issues relating to search engines and advertising, but I know my colleague, Dean, will ask about advertising, so I will leave that to him.
The Chair: Dean Russell.
Nadine Dorries: Moving swiftly over.
Q290 Dean Russell: I am mostly going to be asking about advertising. May I begin by thanking you, Secretary of State, and all of you for your evidence today? It is incredibly welcome that you are all supportive of amendments and strengthening this Bill, because that has very much come through in our witness statements.
On the mental health side—I have seen first-hand the brilliant work you have done around mental health and your passion for it—one thing that has come through in our witness statements, in particular from Martin Lewis from MoneySavingExpert, was the risk of scam ads, and the fact that, at the moment, you could post user‑generated content that is harmful and it would be covered by this Bill, but if it was pulled you could put 10 quid behind the same content and, as an ad, it would not be covered by this Bill.
Could you explain first why online advertising has not been included? Secondly, is there scope for us to make suggestions around that within the Bill?
Nadine Dorries: I will hand over to Minister Hinds for that, but I will make a number of points. I did say that this is not the “fix everything on the internet” Bill and that we need to focus on protecting children, illegal content, and legal but harmful content. That has to be our tight scope on this Bill. We cannot fix everything, but we are doing work on online advertising. That is happening. What you talk about is so important that it needs a Bill of its own because it would be lost within this Bill and it would widen its scope too much. I have full sympathy, but this is not the Bill for that. I will hand over to Minister Hinds on that.
Damian Hinds: Thank you very much, Secretary of State, and thank you, Dean, for bringing this up because it is an incredibly important area. We have seen huge growth, of course, in fraud. As other crime categories have been coming down for a long time, mercifully, I am afraid some criminals have moved on to the internet and a different type of crime—distance crime—with typically no witnesses, and it can often be perpetrated from abroad. It is a very different set‑up altogether.
We brought user‑generated content fraud into the scope of the Bill. That is important for certain categories of fraud such as romance fraud, which is a particularly horrible type of fraud. But I am also very conscious of the role of paid advertising particularly in other categories of fraud, and we need to bear down on it.
The question is not whether we bear down on it, but how best to bear down on it, whether it is in this piece of legislation or elsewhere. We are very alive to the need to move. I hear very frequently and regularly from the financial services sector as well as from individuals—all our constituents from whom we hear depressingly often—about having been victims or their parents having been victims of scams, as you rightly say. We need to work more on it. The question is whether this is exactly the right vehicle or something else is more appropriate.
Dean Russell: Thank you. May I expand on that very slightly? I appreciate that you do not want a catch-all and also that it creates loopholes. From the evidence we have had from the platforms, Facebook in particular, I did not come away from that feeling that they were going to do everything that we need them to do. I agree with the comments of other colleagues that we should not really need this Bill, to be honest, and we are having to force them down that route. Their income comes from advertising; that is their sole source.
Even if we cannot put forward recommendations specific to the Bill, would it be helpful for us to put together some thoughts about how other elements could play into aligning this Bill with the need to protect people from scam ads? I do not know whether we are allowed to do that within our recommendations. There is such a quick shift from user‑generated content to being able to turn it into advertising for the same content by putting money behind it. If we cannot put it in this Bill, we could make recommendations about how that could work in another form.
Nadine Dorries: We are interested to see any recommendations. Can I clarify on the fraud advertising element? This Bill will tackle user‑generated advertising—for example, someone going on to Facebook and saying, “Join my Ponzi scheme”. If it is generated by an individual user on Facebook, it applies. It also applies to romance scams. As we have all read and all heard, there are some awful stories about what happens there. So it does also include those.
It does not exactly exclude advertising; it just includes the advertising that we know we can make work within the scope of this Bill so that this Bill does what it is supposed to do and what it says on the face of the Bill.
In DCMS, we have an online advertising programme that we are working on. Any recommendations that you give to us in your report would go as part of the consideration of what we are doing in the online advertising programme. They would not be wasted recommendations; they would be used.
As I said, scam advertising is too big a piece of work to fit into this Bill. I worked with Martin Lewis before when I was in mental health. I had the wording of a letter changed that bailiffs sent out to an individual so that it did not cause distress. I understand his intentions are always really thoughtful and genuine. But in the context of legal application, we have to absolutely guarantee that what we put in the Bill works, and the scope of this Bill is about protecting children No. 1, illegal content No. 2, and legal but harmful content No. 3. We need to make those adverts illegal, and that needs to be, as the Minister said, its own piece of work in another department. But please bring your recommendations to us.
Chris Philp: Dean, to add to that, if there is anything in this space based on the evidence you have heard that you think is really targeted at a particular part of the advertising space, maybe in connection with fraud, and is very important and very urgent but self-contained and targeted—it is not too expansive because that would run into the issues the Secretary of State laid out—suggest it if you see something urgent, important and targeted.
Nadine Dorries: Also be aware, we are not going to say no because we do not like it. I have absolutely 100% sympathy with the deepfake videos and the advertising. Absolutely, I would love to include them in this Bill. I would love to put what Martin Lewis is campaigning on in here, but my legal advice is that it would not work and it would extend the scope of the Bill in a way that would not be appropriate and would not meet the objectives of the Bill, which protects children. That is the only reason I am not including it. It needs its own Bill.
Dean Russell: I understand. Thank you.
Damian Hinds: To be clear, fraudulent advertising—advertising of scams—is illegal. It is illegal whether it happens on Facebook or it happens on a street corner. That remains the case. The question is about what is in which piece of legislation. Chair, your terms of reference are to do with what is in the Bill and what you think should be in this Bill.
The Chair: Indeed.
Damian Hinds: You are at liberty to say when you think something should be in this Bill. The Secretary of State is saying, quite rightly, that we need to think about what will be most effective. My overarching principle on this is that we need to align people’s objectives. Banks, for example, lose quite a lot of money on the back of having to recompense people for having been defrauded. As you, Dean, rightly identified, certain platforms make money out of advertising that may itself then contribute to that defrauding, so we need to make sure that everybody’s incentives are the same to minimise these horrible acts preying on vulnerable people and preying on everyone like us. It is a crime that can and does reach into every segment of our society now.
Dean Russell: There are a couple of areas around that that I would like to turn to, not advertising but algorithms. I know we talked about it earlier. One thing that has come through in the evidence to me as well has been the increased addictive nature of algorithms. It seems to me that, even though the witness from Facebook said that it has never attempted to create algorithms that are more addictive, we see that they are, especially on things like Instagram, TikTok and so on. One area where there seems to be a massive gap is in research or the publication of research into harms to children, and teenage girls in particular, around things like dopamine levels in the brain and all of that.
Those different platforms vary in their take on whether they have done research or whether they have research. Is there anything within the scope of this Bill where we could require them to publish their research related to algorithms and the impact that has had on the psychology of children?
Nadine Dorries: I do not know if we can compel people to publish research in this Bill. That would extend the scope of the Bill and be one of those offshoots or side acts that would dilute the Bill. It is one of those things that would get debated in Parliament, and I think it would be difficult. But you are absolutely right. Research is done particularly into the effect on the body image of young girls and on anorexia, but there is not enough. I would certainly love the evidence on that because that would be more power to our elbow.
It is important to state again that people will say you are giving extraordinary powers to Ofcom. The reason why we are giving it extraordinary powers is because of the fundamental question of algorithms. The platforms are the car. Lifting the lid of the bonnet and getting underneath the engine is the algorithms. We need to give Ofcom those powers in order to bust the myths around algorithms, to demand transparency on algorithms and on what algorithms platforms are using. They hide behind smoke and mirrors. They literally hide behind them. They look at you and say one thing, and behind the mirror there is something else going on. We know that for a fact. You have taken evidence on that yourself. That is why Ofcom needs those powers.
I hope, as a result of this, once we get more evidence on the algorithms and once we have more information, that maybe that in itself will stimulate more research into the impact of those algorithms.
Dean Russell: Hopefully, perhaps the Joint Committee that you have mentioned could look into that further as we go forward as well.
Sarah Connolly: We would expect Ofcom to publish its own research in this space. It already does fairly regular research under its existing powers. We would expect it to do more in this space and make that publicly available in addition to what the Secretary of State has been saying about the transparency and information-gathering powers.
Q291 Dean Russell: That would be incredibly welcome. Finally, because it was mentioned earlier—not so much a trailer—anonymity is something that I wanted to discuss, and it has come up very regularly in our witness statements. There are different schools of thought that have come up. Do you have complete anonymity because that protects whistleblowers; or do you have an element of anonymity that you could call traceability, in which case there is a known individual but they can use whatever name and profile they want; or do you continue anonymity as it stands now?
What are your thoughts with regard to this Bill? My take is that, based on the statements so far, traceability seems like the best of those three options. What is your take on that?
Nadine Dorries: First of all, it is important to say that this Bill will end anonymous abuse because it will end all abuse, whether it is by Joe who is not really Joe, or Sally who is not really Sally. It will end all abuse.
It is important to say that anonymity online is important for some groups and some individuals—Afghan refugees; people who are in this country for various reasons legitimately and are working and need anonymity; whistleblowers; and domestic abuse victims. There are many groups who go online and need anonymity. I do not think we can ban anonymity. We are looking at ways—and this is something else we are already looking at, and I do not know if you are doing recommendations on this or not yet—that we might go further in allowing people to choose whether or not they can see anonymous accounts.
I do not know if you have seen this, but recently the FA is providing guidance to its own football players on how they apply filters so that they do not see racist abuse or abuse with key words in it. We are looking at how that can be applied so that people do not see anonymous accounts. Traceability is key in that.
There was a reason why that student who posted that vile tweet about me was tracked down to a dorm room at Oxford University. We know that Twitter knew exactly who they were, where they were, what their IP address was, and what their location was. We know they have this information already. What we are looking at, and the way we want to go further, is that individuals can decide that they do not want to see anonymous accounts. That is what we are doing on that. But we cannot ban anonymity. There are lots of people who need that. I know there are people who say you must. Interestingly, there are civil libertarians saying to me, “You need to ban all anonymous accounts”. I do not think we do. What is under that anonymous account is the platforms knowing who those people are and us looking at ways we can stop people seeing those accounts.
Dean Russell: Absolutely.
Chris Philp: I want to endorse what the Secretary of State said about traceability. It is important that, where law enforcement agencies need to find out who is behind the account, they can do so.
Nadine Dorries: Quickly.
Chris Philp: Exactly. First, not just in most cases but in all cases, and, secondly, quickly. I had a constituency case a couple of years ago where there was some terrible abuse of a family whose son had been brutally murdered, and some abuse was being sent to the victim’s 17-year-old sister via Snapchat. To get Snapchat to reveal to the British police who was doing that proved very difficult and took a very long time. Eventually Sajid Javid, who was Home Secretary at the time, ended up getting it sorted out, but he had to phone the American Justice Minister to do it. It should not be like that at all.
Damian Hinds: I want to stress that it is important that we re-emphasise this. People operating anonymously should not think that they are untouchable and unfindable. We can and do find people through the Investigatory Powers Act and the Police and Criminal Evidence Act. There are powers to do so. On the dark web, we have an important investigatory capability.
The key point is the one that Chris mentioned. It is about speed and ease. There should not be impediments to finding people in this way.
There is a further thing: the perception that, when somebody is anonymous, it is difficult for law enforcement to track them down. That probably impedes some complaints being made in the first place, so it is all about that facilitation.
Dean Russell: I have one final, very quick question. It is very quick because it is about—
The Chair: “Very finally” has become one of the expressions of this committee.
Q292 Dean Russell: It is very final and quick because it is about immediacy, if you will excuse the pun.
One of the things that I am very conscious of is the expectation that, when this Bill comes in, if it comes in on a Monday and it is all law by Tuesday, all harmful content will disappear off the web or off social media. The sense I get from this is that it will take a bit of time. Even Ofcom said it will take some time.
What is your sense of timescales? When this comes in and it becomes law, how long do you think it will take for platforms to put into place all the stuff that is needed to stop harmful content?
Nadine Dorries: The short answer to that is they have the stuff in place already. I heard the evidence of Ofcom saying it thinks it will take three years. I am afraid that is not the case. The considerable funding it is about to receive, the work on risk assessment moving forward, horizon scanning, understanding what platforms are doing, and understanding where the risks are—all of that is happening now.
As I said, Dean, platforms do not have to wait for this Act to come in. This Act will come in. Named individuals will be criminally liable. This Act gives us the powers to issue huge fines and for Ofcom to seize equipment.
If I were running a platform now, I would be looking at my terms and conditions, looking at how my algorithms work, and putting in place what I needed to put in place now. I would not be waiting for this Bill to become law. It is not going to take time. This Bill will be up and running from day one as soon as it receives Royal Assent. It will be up and running. Ofcom will be working, it will be ready to go, and it will move from day one.
Dean Russell: Fantastic, that is great to hear. Thank you, and thank you, Chair.
The Chair: Thank you. Before we close, I want to pick up on a couple of points related to the discussion on advertising. Chris Philp, you said last year to the Home Affairs Committee that people traffickers advertising on Facebook for clients was morally reprehensible. That would seem to meet your criteria of being something that is a known threat, a live problem and something you want to act on. How comfortable would you be presiding over the implementation of a Bill that allowed Facebook to carry on taking money from people traffickers?
Chris Philp: I absolutely stand by what I said to that committee when I was immigration compliance Minister. Organised immigration crime is abhorrent and is putting lives at risk. Only yesterday, somebody drowned in the English Channel. As the Secretary of State said, Facebook and others can and should take action today. Whether it is through this Bill or the online advertising programme, we will force them, shortly, to stop doing this. If they have so much as a shred of decency, they will voluntarily act now today or tomorrow rather than waiting for these Bills to come into force.
We are completely committed, as a Government, to making sure that this stuff is sorted out. There are a number of ways we can do that. One is this Bill, but we need to be mindful that we do not expand the scope too far. As I hinted in my comment on the back of one of Dean Russell’s questions, if there are particular areas in the paid-for advertising space, like fraud—and you have mentioned there may be another area there—where you and your recommendations think we could take a rifle shot that picks on a particular area without blowing the whole scope up, we would be very interested to hear about that.
One way or another we will stop this. The more we can do here the better, but we do have to be careful not to stray too far outside the edge. There may be some areas, and you may recommend some areas, that are very specific and targeted that we can take on.
The Chair: I can recommend one now. I take my leave from the Secretary of State on this. It is illegal content. I asked Sarah Connolly earlier whether the offences in the Elections Bill would be enforced through online safety because they are illegal, and she said, yes, they would. That stipulates that foreign powers advertising to target UK voters during an election is an offence. If the Russians buy ads to target voters in the UK, that is an offence. In that situation, does the regulator say to Facebook, “You can’t accept those ads”, or, “You’ve got to take them down because that is a criminal offence set out in the Elections Act”, or does the regulator say, “Unfortunately, as this is advertising, I can’t give you any guidance on this”?
Where we have existing examples of illegality and the Bill stops illegal content, is the fact that it is an ad a barrier to enforcement? It is unclear as to what that would be. I speak as an ex-ad man. Asking Ofcom to take on the responsibility of assessing every claim made in every advert in the country would be an impossible task. We have systems in place to do that anyway. We do not need to disturb them. Where it is content that we know is illegal, should the regulator ask those platforms to remove that content as well? In some cases, as the Secretary of State said, it will be user generated. On Facebook, you have to have a Facebook page to run a Facebook ad. That is where the focus should be, saying, “We want illegal content removed and we do not care whether it is an ad or it is something that someone’s posted. It should all be in scope”.
Nadine Dorries: Damian wants to answer.
Damian Hinds: I want to reassure you that from both of our departments we continue to look at this. These are active areas of discussion. A lot of the categories that you are talking about would be user-generated content anyway rather than paid ads. A lot of the immigration crime, for example, would be UGC.
The challenges you raise are absolutely fair. They are not the first time they have come up. Knowing where to draw that line and how you can deal with this harmful and sometimes illegal content in advertising while not having unintended consequences through the rest of advertising, bearing in mind, of course, that advertising has very substantially moved to being an online activity through a combination of search and programmatic, can be tricky. But you are absolutely right to raise the issue.
The Chair: It is really a question of enforcement. The advertising industry does not allow illegal ads to run on other media, and the media owners will not accept them. We are dealing with traditional media where there are a far smaller number of ads being run and a far smaller number of advertisers. The job is easier. In this case, it is a question of illegal content or even ads that the ASA alerts the platforms to through its scam ads process. The platforms say they will take them down, but there is no requirement that they should.
Even in the scope of this Bill looking at the power of the regulator to make the companies enforce their own terms of service, both Google and Facebook said to this committee last week that their platform policy is not to allow illegal ads. On that basis, they could be done under the enforcement of their platform policies. The fact that this content is illegal anyway should be enough to trigger it.
Nadine Dorries: It is the job of Ofcom to hold them to account on their terms and conditions on their own platforms.
The Chair: Either way, through illegal content or implementation of terms of service, it should be looked at. It is something that the committee feels quite strongly about. It is something we have taken quite a lot of evidence on, and I am sure we will make recommendations on that to you. If Ministers are getting advice from government lawyers that there are legal problems to this, in as far as it could be shared, we will be interested to see what those arguments are so that we can consider that before we make our report.
Nadine Dorries: That may not be possible. It may be easier if you make your recommendations. I do not think your committee disbands until the Bill comes before Parliament. Am I right?
The Chair: It is a good question. David, do you know? I am advised by the clerk to the committee that it ceases to function once we have reported.
Nadine Dorries: Really?
The Chair: Yes.
Nadine Dorries: Put it in the recommendations. That is all I can say at this point.
The Chair: Okay, thank you. We have a final question from Stephen Gilbert.
Q293 Lord Gilbert of Panteg: I have two very brief points. One is on language. It strikes me that we talk about two things here. We are talking about regulating content largely by making it illegal and having a duty on platforms to take it down, and then we talk about legal but harmful content. Actually, the harmful stuff is not generally content; it is the activity of platforms and the way that they behave. Would a better definition not be to talk about illegal content and harmful activity so that it is clear? We may come back to it in the Bill.
Nadine Dorries: Lord Gilbert, we do. It is illegal content and legal but harmful content. Content that causes psychological harm rather than physical injury is deemed to be legal but harmful; there is no existing law in the Bill that it pertains to.
Sarah Connolly: The Bill is one of these tricky things. Fundamentally, it is about systems and processes. Then it is about how those systems and processes function in regard to illegal content, legal but harmful, and children’s services. It becomes very difficult very quickly because you get into the content by the very nature of the discussion, but it is the systems and processes. At the moment, the systems and processes do not function in a way that stops illegal content on platforms, stops legal but harmful, which is an agreement of their terms and conditions, and protects children in the way that we would like. I have found it helpful anyway to keep coming back to this: it is systems and processes, and content is the manifestation of it.
Baroness Kidron: In light of the metaverse, which you point out, and VR and AR and experiences and so on, I am worried about the word “content”. If it said “content and activity” rather than “content”, we would be reassured. That was in the full government response, as you know, Sarah. We have had so many people misunderstand the Bill because they think it is a content moderation Bill. If you could go back to your own language that said “content and activity”, I think it would relax a lot of people, not least the committee.
Nadine Dorries: We will certainly look at that. I make the point that this Bill also applies to the metaverse.
Baroness Kidron: Indeed.
Nadine Dorries: It was very useful that Facebook relaunched its new ideas in time for this Bill to become law.
Q294 Lord Gilbert of Panteg: That leads me, Chair, to my very, very final, final, final point, which is scope. You have talked a lot about scope today. You are quite right; we could carry on piling loads of stuff into this Bill and it will never appear on the statute book. You have been very generous about the role of parliamentary scrutiny.
Do you see a much broader role for the Joint Committee than just scrutinising the implementation in the future of this Bill to looking across the piece at all of the regulatory work in the digital area? For example, competition law has a big role to play in the way that platforms—
Nadine Dorries: Can I stop you there? I do not think we would be able to put that in the Bill because we would be asking for something in the Bill that does not—
Lord Gilbert of Panteg: No, I am not asking you to put it in the Bill. Would you think that that is something that the Joint Committee could do on an ongoing basis—that is look across the piece and not just at the Bill?
Nadine Dorries: For legal reasons and the purpose of the Bill, I am considering requesting within the Bill that a Joint Committee or a committee—it will be a Joint Committee from my perspective because I really value the expertise from the Lords, and I think it needs to be across both Houses—is established for the purpose of parliamentary scrutiny moving forward on what is a novel piece of legislation. There has never been anything like this before, and we are legislating for an ever-shifting landscape. I have to keep the focus that that is the reason. What the Joint Committee then does as part of its purpose of establishment would be over to you. There is a case to be made that that would be very useful in terms of ongoing scrutiny.
Lord Gilbert of Panteg: We will make that case.
Nadine Dorries: Yes.
The Chair: Would the department regard the metaverse as being a user-to-user service?
Nadine Dorries: Absolutely. This Act, we have been advised, would apply to the metaverse too, so there is no hiding place.
The Chair: Very good. Thank you very much. That is the end of the session.