
 

Communications and Digital Committee

Corrected oral evidence: Freedom of expression online

Tuesday 11 May 2021

3 pm

 


Members present: Lord Stevenson of Balmacara (Acting Chair); Baroness Bull; Baroness Buscombe; Viscount Colville of Culross; Baroness Featherstone; Lord Gilbert of Panteg (The Chair); Baroness Grender; Lord Griffiths of Burry Port; Lord Lipsey; Baroness Rebuck; Lord Vaizey of Didcot; The Lord Bishop of Worcester.

Evidence Session No. 28              Virtual Proceeding              Questions 221 - 233

 

Witnesses

I: Caroline Dinenage MP, Minister of State for Digital and Culture, Department for Digital, Culture, Media and Sport; Sarah Connolly, Director for Security and Online Harms, Department for Digital, Culture, Media and Sport.

 

USE OF THE TRANSCRIPT

This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.

 



 

Examination of Witnesses

Caroline Dinenage and Sarah Connolly.

Q221         The Chair: Welcome to this meeting of the Communications and Digital Committee. I am stepping in as Chair as the normal Chair, Lord Gilbert, is unable to be present in both sound and vision for the whole of the period. We have two separate sessions and there will be a break in between them. We are very pleased to have with us the Minister of State at DCMS. She is accompanied by Sarah Connolly, director for security and online harms.

Caroline Dinenage obviously has a very busy day because, the Queen’s Speech having just happened, the news is out that a new Bill is on its way. We have had some background notes about you, Minister. Would you like to introduce yourself very briefly and say a bit about the timing, content and likely stages for the new Bill? That would probably get us off to a good start.

Caroline Dinenage: I am the Minister of State for Digital and Culture at the Department for Digital, Culture, Media and Sport. You will have heard in the Queen’s Speech today that the Online Safety Bill has been announced. It will be coming forward very soon. As you all know, the Bill will contain all the details about how the legislation will enhance people’s rights online, protect our freedom of expression, protect journalism and protect democratic political debate in the UK. It will also offer much-needed protections, particularly for children, and especially draw attention to quite a lot of the illegal content that is currently out there and not being tackled. It will draw a much closer parallel between what online platforms say they do and what they do in real life. That Bill will be published very shortly. We intend to take it for pre-legislative scrutiny before it comes before either House in full.

Q222         The Chair: It is very good to hear that pre-legislative scrutiny is definitely going to be part of the process. That is always a good way of getting controversial material out and discussed, and some of the big issues nailed, before you get into the detailed line-by-line scrutiny, so that is very good.

We have a number of questions that colleagues will want to ask, but I wondered if I could just ask you a specific one to get us going, really. The Bill is as yet unseen, but well described. We know roughly what is in it. It presumably follows on from the comments made by the Government in relation to the consultation process. However, it is clear that there are two moving parts that go with that. One is the work of the Competition and Markets Authority, and particularly its digital unit. Could you say a bit about how you see that fitting into the work of the Bill? In particular, when are the powers for that body going to be established?

In parallel with that, what is the law going to be in relation to the issues that we are dealing with here, particularly the harm side of the equation as it affects both individuals and children? There is, of course, work being done by the Law Commission, so perhaps we could tie in both those things. These are the other moving parts. How do they fit in?

Caroline Dinenage: Let me kick off, and then Sarah Connolly, our director of online harms, will be able to fill in anything that I leave out. The digital markets unit, which is being run by the Competition and Markets Authority, was announced by my department in November 2020. It is basically to allow for swift action to address concerns about competition in these really fast-moving digital markets.

It is all part of our ambition to deliver a really strategic and pro-innovation approach to governing digital technologies. We know that, in a much more competitive market, we will be able to see a lot more innovation coming through, so we really want to drive that competition. We want to promote safety and security online, and we want to support an open and democratic society.

This new digital markets unit, which was launched in April, will be housed within the Competition and Markets Authority to be able to operationalise the regime. It will eventually have statutory powers. It does not at the moment, but in the meantime the CMA will continue to use its existing powers to investigate harms to competition in the digital markets. When it is on a statutory footing, the new unit will be given a range of powers.

This will work alongside the online safety work, because competitive digital markets will be much more likely to protect things such as freedom of expression. It will also be really good for users in protecting privacy. The impacts of a pro-competition regime will be a way of boosting growth across all of society and enabling it to flourish.

You mentioned other legislative innovations. We have commissioned the Law Commission to do a piece of work on online abuse and hate speech. We have had some interim findings back, but the final report will come later this year. There may well be some recommendations that we will want to scoop up into the online safety legislation as it goes through. A number of strands of work are all happening simultaneously, as you have highlighted.

The Chair: Could I push a little harder on the digital markets unit? It does not have statutory powers yet. Obviously, it has powers within the main CMA legislation. Can you be a bit more precise about the timing of that? We have had evidence from them, which is published and available. They say that they do not need it immediately, but they would like it soon.

Caroline Dinenage: We are very well aware of the pressure and the calls for urgency on this. That is why we have said that we will consult on the form and function of it this year, and then we will legislate to put it on a statutory footing as soon as parliamentary time allows. We understand the urgency.

In the meantime, although it does not have those statutory powers of its own, the CMA has existing powers. It can use those where appropriate to investigate harm to competition in digital markets. Once the DMU is on a statutory footing of its own, it will have additional powers, but there are interim powers that can be used in the meantime.

The Chair: To be absolutely clear about it, you said that the work that is being done with the Law Commission is also going apace, but it is not impossible that that might come into the scope of the Online Safety Bill.

Caroline Dinenage: Yes. Two separate Law Commission investigations are going on at the moment. One is to do with online abuse, really, but looking at a range of different—[Inaudible.] Another separate piece of work from the Law Commission is looking at the sharing of online intimate images: things such as revenge porn, upskirting and deepfakes. That one will report back much later, so whether we will be able to include any of that we are not sure. For the first report, it is likely that, if they come out with any recommendations that lend themselves to this piece of legislation, that could be swept up in this work.

Q223         Baroness Rebuck: Minister, my question is about digital citizenship and how to improve it. All our witnesses, I would say, have acknowledged the problem of abusive and harmful behaviour online, often amplified by the platforms’ algorithms. One of our witnesses said that people just feel less inhibited online, and that digital etiquette needs to be promoted.

I was interested to read that the Church of England has called for civil society projects to “cool the temperature” of online debates by modelling new approaches to discussing challenging issues. In other words, it is looking at changing behaviour in order to avoid legislating, and certainly legislating too heavily. I know there are many educational projects in schools, some supported temporarily by tech platforms, but arguably it is not enough, because the online world is still not safe for children and young adults.

Our witnesses have also called for adult guidance and awareness raising, through libraries or other community groups. One campaigning witness put it this way: “Why not a nationwide government-led campaign?” I am really interested in what plans government has to promote digital citizenship for both children and adults.

Caroline Dinenage: It is a really important question. As well as ensuring that companies take action to keep users safe, we want to make sure that the users themselves are educated, but also empowered to make more informed and safer choices online, and to moderate the way that they communicate with each other without hiding behind the barriers that the screen offers.

We are publishing a media literacy strategy and it will be a complementary tool to the regulatory regime that we have already spoken about. Its starting point is reviewing the existing UK media literacy landscape, and then setting out plans for how to ensure a much more coordinated approach to online media literacy education. We want to make sure that it is really inclusive, not just for children but for adults, young people and those who are vulnerable or disabled, and that it explores a range of issues.

We have seen poignantly in the last year how vaccine disinformation and misinformation have spread. We have had to work really hard within my department to tackle that, alongside the platforms. We need to equip people with the resilience and the discernment to be able to question and challenge what they see online. We have also seen lots of instances of online abuse and people not keeping their data and privacy safe. We need to equip users with the skills they need to address these sorts of challenges.

In answer to your question about how people speak to each other online, we want to explore these issues about digital citizenship and reflect them in the strategy as well.

Baroness Rebuck: You mentioned the media literacy strategy, which is a foundation of digital civility. It has been delayed for quite a long time. It is meant to come out this spring. Is it imminent?

Caroline Dinenage: Yes, we hope to publish that very shortly. We have to consult widely on it because, as we have seen, Covid-19 has chucked up a few issues that are extremely pertinent to this. We wanted to make sure that we had taken account of the insights we had gathered over this period and that they were all rolled into it. That strategy will be published later this year.

Baroness Rebuck: It will be later this year now, so not in the spring. I wanted to pick up on something you said about empowerment, which is interesting for users. I was rather surprised when we talked to a bunch of students that one of them, who was very concerned about privacy, said that it was too time-consuming to alter her privacy settings. She asked why they were not presented to her the other way round, with opting out as the norm as opposed to opting in.

I worry about privacy and safety tools that platforms occasionally introduce. Everything seems to depend on the terms and conditions, which are long and complicated. I just question whether the average user has the time, the inclination or, if I speak for myself, the ability to understand some of these complex signposts. Might it be a role of government to insist that the kind of safety tools that consumers need should be presented very simply, rather like nutrition labels? If we keep putting the onus on the user to select the way in which they use social media, we have to have a way of alerting them very simply and easily to the tools that are at their disposal.

Caroline Dinenage: It is a really excellent point. To pick up that theme, we have also seen it with parents using the parental restrictions to keep their children safe online. There was a frightening statistic. Sarah will correct me if I get this wrong, but I think something like 70% of parents were aware that there were parental controls but had not necessarily triggered them.

This is why it has to be a double strand of work. The media literacy work will give people the education, support and enlightenment to understand how to keep themselves safe online and really embed that. The online safety work will also help to tackle those harms at source so that it is much harder, for example, for young children to find harmful content online.

Q224         Viscount Colville of Culross: Good afternoon, Minister. I would like to take you on to this new concept of legal but harmful, which is being talked about. In the full government response to the online harms White Paper, the definition for “legal but harmful” is that it should be content that gives rise to a reasonably foreseeable risk of a significant adverse psychological impact on individuals. Many lawyers argue that this opens the door to subjectivity and a triggering of the threshold by the most readily upset user. Are you concerned that the lack of clarity in this definition and the low threshold could easily push the tech providers to err on the side of caution and, as a result, have a chilling effect on free speech?

Caroline Dinenage: No, I am not massively concerned about this. This approach has been designed from the point of view of protecting freedom of expression. Companies can just set out very clearly their terms and conditions for this content and enforce them consistently. In the past, we have seen these kinds of guidelines being enforced in a very inconsistent way.

We are not preventing adults accessing content that they might find offensive. We are not trying to legislate against people being offended on the internet. We are not trying to legislate against adults being able to find any kind of content, as long as it is legal. It will not require companies to remove any kind of legal content that is accessed by adults. It will set out some priority categories of legal but harmful content for adults in secondary legislation.

The list will be compiled by Ofcom, really, working with expert advice. It will be subject to parliamentary oversight and democratic debate. It is likely to include categories such as online abuse and hate speech that do not quite meet the criminal threshold, but can be incredibly distressing and harmful, and can push the barriers as to what is legal. At the end of the day, companies will be very free to decide what legal but harmful content can be accessed on their services. The only difference is that they will have to assess the risks, set out very clearly how they will treat their content in their terms and conditions, and then make sure that it is consistently enforced, rather than the slightly hotchpotch system we have had up until now.

Viscount Colville of Culross: Is it not concerning that you are leaving something as subjective as this up to the companies to enforce? Would it be better, if we were going to create a definition, to have something that was in line with the existing law? For instance, at the moment there is a threshold of harm that is recognised as a medical condition, which is the standard form in civil claims for negligence. At least that is a well-established threshold. It is recognised by the law and much easier to enforce.

If you are talking about online abuse, people deciding whether they are offended, and what impact that has had on them, it is very subjective. The danger is that the companies are going to respond overenthusiastically to that. Do you not agree?

Caroline Dinenage: No, I do not agree. I have said very clearly that it is not about preventing adults being offended. We are talking about content that can be extremely harmful and emotive. We know, because of the way algorithms work and how content can be shared, particularly by the biggest platforms, that these sorts of harms can spread very quickly and affect huge numbers of people. When something kicks off and you get things such as pile-on abuse, it can massively impact somebody’s psychological wellbeing and, in some cases, their physical wellbeing.

There is quite a clear distinction between what somebody finds offensive, which is not the issue here, and what could cause real harm. There will be a legal framework for this, because we will set out some of these priority categories in secondary legislation. This will not just be pulled from thin air. It will be informed by Ofcom once it has taken expert advice on a range of themes; then it will be subject to parliamentary oversight and democratic debate. There are a number of levels to this, but the key to it is to try to encourage freedom of speech. People are much more likely to be able to go forward and speak their minds without worrying that they will be piled upon by a group of others, which can be incredibly threatening and really impact people’s wellbeing.

Viscount Colville of Culross: At the moment, looking at the government response, all the duty of care is on the platform providers and the tech companies. There is not much responsibility on the sender of the post or much focus on their intention in sending the post. Should we put some more responsibility on the user so that they act responsibly when they are on the internet?

Caroline Dinenage: Yes. I feel very strongly that our online safety is not all about what the Government or platforms are doing. I feel very strongly that we have a huge responsibility to keep ourselves and our children safe online. We cannot simply leave it up to others to do that for us. The world is too big. With the best will in the world, whatever changes the Government bring about, they will not protect people from every potential harm that is out there on the internet. The media literacy strategy is about trying to equip people with the skills and discernment to make those changes and keep themselves safer. I hope it will influence the way people talk to each other as well.

The Chair: We touched there on issues about journalistic freedom. I noticed that you made that, Minister, one of your key points when you were introducing the Bill and the work that you have done on it. That segues neatly into the Lord Bishop, who has a question to ask on that.

Q225         The Lord Bishop of Worcester: Thank you, Minister, for being with us this afternoon. More importantly, thank you for all the work that you and your staff are doing on this crucial matter. As our inquiry has gone on, it has come home to us again and again what a fine balancing act it is between freedom of expression and reducing and minimising harm.

I want to press you, as Lord Stevenson has implied, about this business of the protection of journalists. The Government have stated that the Online Safety Bill—and it is really good to know that that is coming forward very soon—will contain robust protections for journalistic content. That is clearly a really good thing as far as freedom of expression is concerned, but it is not clear yet what form these protections will take or how journalistic content will be defined. From what do the Government think journalists need to be protected?

Caroline Dinenage: Thank you for your kind words about the team. They have worked incredibly hard here. This is a huge piece of legislation that we are attempting to bring forward. No other country in the world has ever attempted anything quite so expansive as this. The team has been living and breathing this since way before I became the Minister for it in February last year. Thank you for those comments.

What you say is absolutely true. A free press is just one of the pillars of our democratic society. It is vitally important that the online harm laws do not adversely impact journalistic content. To try and answer your question a little more explicitly, news publishers’ own content on their site is not in scope. Users’ comments on news publishers’ own content will also be exempt. In addition to that, we will include in the legislation safeguards to ensure that journalistic content that is shared on in-scope services is also protected.

The Lord Bishop of Worcester: That is clearly a very good thing, but one thing that emerges from that is the question of why users’ freedom of expression would not also be under threat. Why does journalistic content need specific protection? One of our witnesses, Peter Wright of DMG Media, argued for a complete exemption for established media outlets. When he was asked whether any regime from which the media need such exemption would be compatible with users’ freedom of expression, he replied, “The answer is not really, truthfully”. I will press you a bit on that, if I may.

Caroline Dinenage: We have to be really careful to take into account the freedom of expression of journalistic content, because there are places around the world where that is simply not enjoyed. We need to make sure that we are not putting in place any legislation that is going to hamper that or in some way prevent people’s voices being heard. We are allowing freedom of expression of individuals as well as journalists, but against a backdrop of bringing forward the necessary protections, particularly for children, from harmful content, and doing much more to root out illegal content, which is really what the legislation is all about. There will not be any new legal duties for news publishers’ content as a result of this legislation.

Q226         Lord Lipsey: I have a question on age verification. None of us wants young people to access pornography. We in Parliament thought this was going to be dealt with by the Digital Economy Act 2017. It does not appear to have been, because that seems to have broken down. This is really my question: why are we waiting? When is this going to be cracked?

Caroline Dinenage: I will answer both those questions. I hope we will crack it very soon. As you have probably picked up from what I have said so far, one of the biggest thrusts of this piece of legislation is about ensuring the most comprehensive approach possible to protecting children. Part 3 of the Digital Economy Act attempted to do this, but it probably underlines better than anything how fast-moving the online world is and why it is so vitally important to have a piece of legislation that is future-proofed against this technology.

For example, the Digital Economy Act did not include social media. If you look at any of the reports on where children are most likely to stumble across online pornography, it is through social media sites. This piece of legislation will include social media, so it will go much further to protect children from a much broader range of harmful and age-inappropriate content on all the services that are in scope. It will do that by capturing not only the most visited pornography sites, but pornography on social media.

Lord Lipsey: Is this not an area where the best could very easily become an enemy of the good? Yes, it would be lovely to know exactly how social media was going to develop in the years to come and to legislate entirely to get age verification for that. Yes, it would be great to think that all 15 year-olds could be completely prevented by some mechanism from finding a way around the controls. Is it not rather important to get on with things in the meantime, put the best we can in place, and then let it evolve as the problems evolve?

Caroline Dinenage: I would probably summarise it as not so much the best being the enemy of the good, but the best being the enemy of the not good enough. The British Board of Film Classification published research in 2020 which found that only 7% of children who accessed pornography online did it through a dedicated pornography site. Most children even intentionally accessing porn were doing it across social media predominantly, as well as video-sharing platforms.[1] On top of that, you have all the legions of very, very young children who are stumbling across it accidentally on social media. These are the huge platforms that so many young children have access to, and that is why we need to make sure that they are in scope.

I agree with you. I am the mother of teenage boys and we are never going to do everything to protect them from all the sorts of learning that they like to do online. They will always find ways around it. My absolute priority is trying to protect young people from stumbling across inappropriate images, and then, further than that, making sure that we provide greater protections as and where we can. The Digital Economy Act just did not go anywhere near close to being able to achieve that.

Q227         Baroness Buscombe: Hello, Minister. It is really good to see you. I am really pleased, as others are too, that there is going to be pre-legislative scrutiny on this Bill. The more we have researched this whole issue for our part, the more we have realised, as the Bishop has already said, that this is such a fine balance. Some aspects of the criminal law have to be dealt with with such care, so that we do not have the dreaded unintended consequences by trying to do the right thing. Hats off to your team from me too, because this is a really difficult issue.

I want to just touch on the whole issue of the small players versus the unassailable big players, the monopolistic platforms and so on. We have said, and the reality is, that they are to some degree unassailable and competition in the market is important. One thing that concerns us, and we wonder if you are concerned, is that the proposed global reach of the Online Safety Bill will lead to some smaller companies blocking access to their platforms from the UK.

From your response to the consultation at the end of last year, we understand that the online safety regime will apply on a mere availability basis. Any website that can be accessed from the UK will be in scope of the legislation if it hosts user-generated content that facilitates public or private online interaction between service users, or if it operates as a search engine. This raises the possibility that services based abroad with relatively few UK-based users may prefer to block access to UK users rather than incur the costs of compliance. We know this has already happened, for example, with a fair number of newspapers in the States that have blocked access from the EU rather than have to comply with the GDPR rules and so on. How are we going to deal with this issue, which is quite a fraught one?

Caroline Dinenage: It is lovely to see you again, Baroness Buscombe, and thank you for your question. It is a really important one. What you are fearing is highly unlikely. The UK supports a very open internet. We are really committed to ensuring that people in the UK have access to diverse content. Do not forget that overseeing all this will be Ofcom, which will take a risk-based, targeted and proportionate approach to its oversight and enforcement. The focus is on protecting people in the UK. It is all about targeting enforcement activity where there is the greatest evidence of harm. We have always talked about previous legislation being vertical legislation, where you chose a harm and followed it, whereas this legislation is much more horizontal. It is much more about systems and processes, and making sure that the systems and processes are in place to keep people safe. I hope that, by doing that, we will have a much more future-proofed piece of legislation.

You are absolutely right that Ofcom will have powers to take action against companies that fail to protect their users. The other point that it is important for me to make is that this threat is a global one. So many of our international partners are developing their own approaches to internet safety. Very recently, in April, G7 leaders endorsed a set of internet safety principles that committed to protecting freedom of speech online, while agreeing that tech companies have a corporate responsibility for their users’ safety. Although we are one of the first countries in the world to introduce such legislation, and we are doing it in a more comprehensive way than many before us, so many other countries around the world are looking very closely at what we are doing with a view to emulating it and taking much of it on board in their own approaches.

I am basically trying to say that it does not apply to all internet services. Those that do not have any user-generated content are not subject to the new regime anyway. As long as those that do have no illegal content,[2] they should largely be absolutely fine under this legislation.

Baroness Buscombe: That is interesting. This is a huge role for Ofcom. Going forward, given Ofcom’s need to be proportionate and fleet of foot in order to be fair and sufficiently flexible as a regulator, something like the digital markets unit will have a fantastically good role to play here. This will be ongoing, will it not? It will be a moving feast in terms of the ongoing speed of development of technology, capability, global reach and new players coming in the whole time. The oversight will need to be remarkably broad, in a sense, with people who absolutely know what to look for and the right questions to ask. That is quite a big ask of Ofcom.

Caroline Dinenage: You have to look at it more technically than that, I suppose. There are so many companies that will not fall into the scope of this new framework. It will be very much risk-based, targeted, and proportionate to the size and capacity of the business and the risks posed to the services given to users. This will be reflected in Ofcom’s codes of practice.

I cannot stress this enough. It just has to have proportionality at its very heart. The regulatory expectations will be risk-based rather than absolute. That means in practice that the regulatory burden on low-risk companies with just a few users, whether they are in the UK or anywhere, will be minimal.

Q228         The Chair: Thank you very much indeed, Minister. You have been fantastically gracious with your time, but I have also noticed that we have not had to use Sarah Connolly at all. I just wanted to feel that her time spent on the call has not been wasted. If you felt that it was appropriate for her to come in and say anything at any point, you would do that, would you not? I will take that as a yes.

We have a bit of time, so we will ask a few more supplementary questions, if that is all right. Maybe Sarah can come in on those. Your exchange with Baroness Buscombe was very interesting, because you were talking about finding the right model for Britain in relation to our users and the companies that operate here. A lot of good points were made in that exchange.

Have you looked at any overseas examples for inspiration, if not for detail? We have had evidence from America, where they are pursuing a slightly different but quite similar track, and from Germany, where the rule is much more direct but also has some interesting points about it. Were those in your mind when you were thinking about final drafts, or is this very much a British-only solution?

Caroline Dinenage: This goes back to what I was, probably really badly, trying to describe earlier: having a horizontal approach to risk rather than a vertical one. The German legislation puts content-focused rules on companies, and our approach is more about the systems and processes they have in place to keep users safe on their services. That is the difference between our approach and the German one. As you say, we have not used Sarah’s brilliant skills, so I will pass you over to her.

Sarah Connolly: Thank you, Minister. The short answer is yes. My team spent a lot of the last four years talking to various counterparts around the world, trying to understand where they are coming out. As the Minister suggested, we spent some time with the Germans. She has pointed out the main difference in approach. A number of countries have gone down the content regulation route, whereas we have gone in a very conscious direction that is about systems and processes, for a whole set of reasons, not just the volumes that were being talked about.

We are looking at something that is slightly different, but the Irish published a draft Bill earlier this year that is not dissimilar to our approach. It is reflective of close co-operation that we have had over the years in talking to them. The EU also published its draft Bill not that long ago. Again, it is very similar in approach. You are starting to see a bit of a shift towards a British approach, if I might say so. We saw in the negotiations at the G7 that it is very definitely a global problem. We are talking to all the countries that you would expect us to in order to come to a global solution.

The Chair: Ireland is very interesting, because so many of the companies are based there, even if they are not always operating out of there. The lessons there will be helpful.

Q229         Baroness Grender: Hello, Minister. Thank you so much for all your evidence so far. These things are fast-moving, and over the last year one of the biggest lessons has been economic harm online to individuals. Individuals have been terrified into going on to what they think are government websites to pay money for Covid tests. There have been so many financial scams. I would love to know whether you have therefore shifted and adjusted your attention in this area with regard to what you are planning to put into the forthcoming Bill.

Caroline Dinenage: Good afternoon, Baroness Grender. It is lovely to see you again. I do not really want to give too much away, because the legislation is coming forward quite soon. Yes, we have been looking again at the approach to things such as economic harms and fraud, from a user-generated perspective. You will see that reflected when the Bill is published.

The Chair: Thank you for the hint. We will look carefully at that. That is a good way of putting it.

Q230         Baroness Featherstone: Do the Government have a view, or has this appeared in any way in the Bill, on whether taking down harmful or illegal content is more or less important than the risk of taking too much down, ie being risk averse and therefore suppressing a lot of information just in case?

Caroline Dinenage: It is a great question. Thank you, Baroness Featherstone. It is lovely to see you again as well. We have discovered from the Covid pandemic over the last 12 months that, when we talk about things such as vaccine disinformation and misinformation, the approach of taking things down straight away would not have been the appropriate way of addressing some people’s genuine concerns about vaccine hesitancy and what have you.

We have had an approach of informing people and trying to give them access to robust information. We have worked very closely with the platforms to try to redirect people to legitimate and reliable sources of information. We know that, once you are in this algorithm bubble and you start looking at anti-vax content, for example, you keep being bombarded with more of it. We have been working very closely with the platforms to reassure people with correct information, to signpost them to legitimate forms of reliable information, and then, as a last resort, to take information down. You can make matters worse if you censor the internet.

Q231         Baroness Bull: Thank you for the evidence so far. I wanted to go back to the online harms legislation, which I know we do not have all the detail of. Are you confident that harms related to negative body image and appearance-related bullying will be covered? There is some concern that these harms could be overlooked, because size and shape are not protected characteristics and because body image is relatively poorly understood. We are seeing the impact of poor body image and its prevalence, particularly among young people. Poor body image has been detected in children as young as five and it echoes throughout their lives. Many of us who are concerned about this are very keen to be sure that the potential for harm will be recognised in the online harms Bill. Can you say anything about that?

Caroline Dinenage: It is a big worry. I have given evidence to the Women and Equalities Select Committee on this very theme. Whether you are a parent of girls or boys, you worry about the huge pressure that they are coming under online.

There are a number of things. Of course, this legislation is really only about user-generated content. I mentioned earlier the Law Commission work that is looking at online abuse and whether any changes to legislation need to be brought about there. Sometimes it is that kind of abuse that provokes body image issues.

Taking that to a more serious level, the Law Commission is also looking at whether we can change the law to make incitement to self-harm a criminal offence. As you know, it is illegal to incite or encourage somebody to take their own life, but inciting self-harm is not. I have had very moving conversations with a number of parents whose children have found their way on to websites where there was quite a lot of user-generated content of that nature. It is very harmful indeed and often comes from a body image angle.

More broadly, a number of other things are going on. We are planning to consult later this year on online advertising, which will potentially have a bigger impact on that space. As I mentioned earlier, the Law Commission is doing a parallel piece of work about the sharing of intimate images online, which again can massively impact someone’s body image. There are a few different strands of government work going on at the same time, which are very complementary and are seeking to tackle this really pernicious issue.

Baroness Bull: That is all really good to hear. Incitement to lose weight could be an incitement to self-harm if the person has an eating disorder or is at risk of one. That gets into very complicated territory given that there is a parallel need for certain parts of the population to exercise more and lose weight.

Caroline Dinenage: Yes, you are absolutely right. Eating disorder sites, or those that in some way glamorise or promote eating disorders, could be something that comes into scope, particularly for children.

The Chair: That exchange has prompted another thought from Lord Lipsey, who would like to come back to questions about how you legislate in this area.

Q232         Lord Lipsey: It is very encouraging to realise that stuff that is harmful is going to be dealt with through Ofcom putting forward proposals that are turned into secondary legislation and then looked at by Parliament. It is not going to be a scattergun approach with harms just picked up at random as they arise. That is a very good thing.

I am not so happy about the parliamentary scrutiny. We all know, and you as a Minister particularly know, that to call parliamentary scrutiny of secondary legislation patchy is being very kind to it. Most stuff goes through on negative resolutions with no proper scrutiny at all. Are you able to give any hints that, where there is a particularly important piece of Ofcom guidance on harms, that might be subject to either primary legislation in Parliament or some form of debate as well as the formal scrutiny through the secondary legislation? In other words, is Parliament really going to be involved, or is this just a way round Parliament?

Caroline Dinenage: I am not sure that there are any plans for primary legislation at this stage, although you know as well as I do that often one piece of legislation can provoke another one. I would not ever take it off the table, but the plan is to scoop all this work up in secondary legislation.

As for Ofcom’s remit, when it comes to publishing its guidelines, it has to take into consideration advice from experts in a range of fields. That is fundamentally important, because we cannot possibly expect Ofcom to have expertise on every single harm. Yesterday, I was speaking to some family members of a gentleman who had taken his own life. They said that people who contributed to the website forums that he had been accessing used terminology or abbreviations that might mean absolutely nothing to someone like me or you, but are very pertinent and potentially very harmful to somebody who is in that kind of mind space. That is why it is fundamentally important that Ofcom takes advice from charities, organisations and medical experts in this field to make sure that, when it forms this guidance, it is cognisant of the range of aspects it has to take into consideration.

Lord Lipsey: I do not want to get back into the arguments that have gone on about the role of experts, into which we could quickly get absorbed. I want to emphasise the role of Parliament, because parliamentarians are the people who, week by week, meet people who are affected in concrete terms by some of these problems. They spot things coming over in all sorts of ways. It is terribly important that Ofcom, working through your department, does not just pay lip service to Parliament, but consults thoroughly and properly. I am not so worried about the secondary legislation procedures. I am more concerned about the attitude of mind that would accompany that.

Caroline Dinenage: I can understand that. The fact that this will go through pre-legislative scrutiny, which gets looked at by both Houses and Members from all different parts of the House, is very reassuring.

Sarah Connolly: It is absolutely the intention to make sure that this has proper scrutiny. We are, and have been, very conscious in the department for some time that this is novel and contentious, and that it needs to be done with proper oversight from Parliament. As the Minister said, this is why we were so keen on pre-legislative scrutiny. That intent will remain through the SI process.

To be clear, not having the specific harms in the primary Bill is also a conscious choice. As well as the tech moving so quickly, the list of harms moves so quickly. Rather than having to have brand-new legislation every six months to respond to a new threat or risk, we wanted to try to do it in a slightly different way. The spirit of parliamentary oversight is fundamental to this moving forward.

Lord Lipsey: I have not made myself clear, though. What you are doing with the Bill is absolutely fine, and I am glad there is pre-legislative scrutiny. I sense a willingness to engage with Parliament over the rest, but every time you say “SI” I get the impression that something is going to be sneaked through. All I really want from either of you is a statement that we are determined that, as we go through defining these online harms with the aid of Ofcom, Parliament will be thoroughly involved by whatever is the most appropriate method.

Caroline Dinenage: I was a Minister in the Administration between 2017 and 2019. I can assure you that at no stage was it possible to sneak through any SIs. Very frequently, they ended up debated on the Floor of the House. As Sarah said, we want to make sure that Parliament is very firmly included in this. We did not want to put the names of the harms in the primary legislation, because they change so much. None of us here had heard the expressions “upskirting” and “deepfakes” 10 years ago, for example. Now they are in everyday parlance. That is why we need to make sure that this Bill will stand the test of time.

The Chair: It is good to hear that. I think Lord Lipsey is also suggesting that, to help the process (not so much through the pre-legislative scrutiny, where things can be challenged, debated and discussed, but when looking at the final version of a Bill), it is useful to have draft regulations on the more contentious issues available. We certainly find this in our House, and it may be true in the Commons. Even if they are not in final form, at least one gets the sense of the language and everything else. I wonder if that thought might appeal to you.

Caroline Dinenage: Yes, that is very appealing from my perspective. Certainly at your end of the building there is huge expertise. We would be mad not to want to draw on that.

Q233         The Chair: You are certainly not mad. That puts me in mind of one final question. It is something that you are probably familiar with. A number of meetings have been held, in the context of this Bill, on how we deal with freedom of expression and wider issues. There is this question about the reliance on one regulator when in practice, as we have already discussed in our meeting today, others will have regulatory input and will need to be taken into account, not least the CMA.

A thought that this committee came up with in an earlier report was that there would be a point at which it might be necessary to see a meeting of regulators. I do not know what the common term would be for a group of regulators meeting regularly—a parliament of regulators, perhaps—but you know where I am coming from. In a sense, we already have interesting approaches that different regulators take to the same issue. The personal data regulator is different from Ofcom, which is different from the CMA. I am sure that is a common debate that you must have with officials all the time on that. Have you thought further about whether there needs to be some sort of arrangement under which the regulators share process, practice and examples, so that we get a common approach across these different areas?

Caroline Dinenage: You are absolutely right to highlight this. Sarah will correct me if I get this wrong, but plans are already afoot for the regulators to work very collegiately together. As you say, the Competition and Markets Authority has a really big footprint in this space with the Digital Markets Unit. There is also the Information Commissioner. The age-appropriate design code, which your colleague Baroness Kidron was instrumental in bringing through, is an Information Commissioner piece of work. That very much deals with people’s data, children’s data specifically, online. It is fundamental that the regulators can work in a seamless and collegiate way, because what they do overlaps so very much, as we have discussed.

Sarah Connolly: Minister, you are absolutely right. The Digital Regulation Cooperation Forum has the ICO, Ofcom and the CMA. They have voluntarily come together to work through these kinds of issues. Not only are we alive to it; the regulators are also very alive to it.

The Chair: Following up Baroness Grender’s point, you probably want to add to that the FCA and its impact.

We have used up our time, I am afraid. Thank you very much indeed. You have kept us entertained, amused and informed about what we are doing, as appropriate for a Minister with your responsibilities. It has been very good of you to give us so much time at this very difficult time of year for you. I just remind you that a transcript will have been taken, and the material will be available for anyone to look at if you want to check that afterwards. We are very grateful to you for the time you have given. Perhaps we will be in touch again to talk further about these issues, but thank you for that.

Caroline Dinenage: Thank you for having us.


[1]              Amended by witness: The research found that only 7% of children who accessed pornography online did it through a dedicated pornography site alone. Most children intentionally accessing porn were doing it across a number of sources, including social media, as well as video-sharing platforms and via image or video search engines.

[2]              Amended by witness: or, if they are likely to be accessed by children, have no content that is harmful to children,