Corrected oral evidence: Consideration of government’s draft Online Safety Bill
Monday 1 November 2021
2.30 pm
Watch the meeting: https://parliamentlive.tv/event/index/ad031a27-95cb-47c9-9656-c75b1d76d050
Members present: Damian Collins MP (The Chair); Lord Clement-Jones; Lord Gilbert of Panteg; Baroness Kidron; Darren Jones MP; Lord Knight of Weymouth; John Nicolson MP; Dean Russell MP; Lord Stevenson of Balmacara; Suzanne Webb MP.
Evidence Session No. 16 Heard in Public Questions 250–273
Witnesses
I: Dame Melanie Dawes DCB, Chief Executive, Ofcom; Richard Wronka, Director for Online Harms, Ofcom.
Dame Melanie Dawes and Richard Wronka.
Q250 The Chair: Good afternoon, and welcome to this further evidence session of the Joint Committee on the draft Online Safety Bill. This afternoon we are delighted to welcome Dame Melanie Dawes, the chief executive of Ofcom, and Richard Wronka, who is the director for online harms at Ofcom. Welcome to the committee.
I would like to start by asking about the auditing powers that Ofcom will take on as part of its responsibilities for online safety. From the evidence that we have taken so far, that seems to be a really important part of the work that the regulator will be asked to do, in particular the regulation of the systems of the companies as well as the content that is served from user to user, looking particularly at the social media platforms.
Dame Melanie, do you feel that Ofcom will have the resources it needs to look not just at content moderation but at the systems that drive content curation on social media platforms?
Dame Melanie Dawes: Thank you very much for inviting us today. We are very glad to be here and to help in any way we can with the committee’s work.
You are absolutely right that powers of audit, and information powers in the round, are a very important part of the toolkit that we think we will need as a regulator. On that specific question, we broadly think that we have the right powers, yes. Through the skilled person’s report we can do something, although it is not called an audit in the Bill. It is the same sort of thing as, for example, the Information Commissioner’s Office is able to use to get under the bonnet when it needs to, particularly if we have concerns that there is something going wrong.
More generally across the Bill, do we feel that we have what we need to act, and act quickly when we need to? The answer is broadly yes. We have clear safety duties. We think they are well written. We have a few suggestions as to how they could be made even more specific in a couple of places. We have the right kind of apparatus through risk assessments by companies, transparency reports from them and our information powers to be able to know what is going on. Finally, we have sanctions, fines and so on that we can deploy when we think we need to take enforcement action.
This is a really challenging task. There is no question about that, but we think the Bill gives us, broadly, the right overall things that we need.
The Chair: As you rightly say, the terminology used by Ofcom and the ICO is different. Indeed, the nature of the investigations would be different as well.
For the benefit of the committee, how would you see what you refer to as the skilled person’s report working?
Dame Melanie Dawes: Richard might want to come in on this. Broadly, where we believe that there is a risk, and that it is not being addressed, we are able to appoint a skilled person or to require a skilled person to be appointed by the service so that they can conduct an investigation.
Richard Wronka: Yes, that is right. First, we think that the skilled person’s report brings expertise, especially in the operation of algorithms and other systems, which can be quite complex in themselves. That is an important point. The powers in the draft Bill give the skilled person quite extensive powers to engage with the platform and require the company to engage effectively with the process. That means offering employees for interview, for instance, and offering information that they have about how their systems work. We think that is a really important part of the process.
Of course, we see that sitting within a wider package of information-gathering powers. We have broad and effective information-gathering powers that we think are in the draft Bill at the moment—for instance, the ability to require platforms to generate or obtain information and not just to hand over information they already have to hand. There is also the fact that the information-gathering powers apply to third parties and not just the regulated companies themselves. That is also a really important part of the package.
The Chair: In this case, is the skilled person someone from within the company or someone externally who is appointed?
Richard Wronka: It would be external. It would be either an appointment made by Ofcom or, if we saw fit, an appointment made by the company with our approval.
Dame Melanie Dawes: I think the Bill is deliberately providing for the fact that it should not be someone from the company. In the circumstances in which we would launch one of these reports or audits, it is because we think there is a problem. By definition, you would not be asking for somebody inside the company unless there were very particular reasons to do it. Equally, it might be something where you really need very specific skills.
Thinking about my previous role, it is rather like the way we were able to appoint audit firms to go in and look at local authority balance sheets in particular circumstances, for example. That is how I see this particular provision in the Bill. It is potentially quite wide-ranging, albeit applied in specific circumstances.
The Chair: It would be a Red Adair-type person, who would be brought in to troubleshoot the problem.
Dame Melanie Dawes: Yes, if you like.
Q251 The Chair: On the information-gathering powers, we spent some time last week with the technology companies discussing the information they put in their own transparency reports, which often does little to help us understand the scale of the problems they face. It does not matter particularly that Facebook takes down 97% of the content that it finds. I find it interesting that it boasts about the fact that it has increased that figure, which must mean that it used to find a lot of harmful content that it did not take down, but it does now take down what it finds.
That tells you nothing about its assessment of how much there is. In fact, both Facebook and YouTube were unable to give any sense of how much harmful content or hate speech their AI finds. They could not corroborate leaked information, or give any report on what the accurate figures were. Is that the sort of information you would expect to have access to as the regulator?
Dame Melanie Dawes: Yes, absolutely. We have a huge shift in the industry to try to achieve—largely, I think, through their risk assessments, at least in the first instance. We will look for them to engage with that in a way that, from the user’s perspective, looks at what is going on in their services. It is not just from their end of the telescope as to how they have designed things, but actually what happens when algorithms, user behaviour and features such as direct messaging and so on interact in the real world, and what kind of harm we then see being served up to users of different ages, different experiences and so on.
It will raise questions about what the right metrics are. Are there any that we can set that are common across the industry? It would be really good if we can find some where everybody is looking at the same things, but equally we need to recognise the fact that platforms are different and are offering different kinds of experiences. All that will be very important.
We are starting to get into it already with the video-sharing platforms regime. Next year, we will publish our first state of industry report from our perspective as a regulator on what is going on. We will be seeking to bring as many numbers and as much granularity into that as we possibly can.
The Chair: Do you expect Ofcom to take a view? If what the Wall Street Journal reported was accurate, and Facebook engineers said that they only pick up 3% to 5% of hate speech through AI, would you say, “If that’s the case, either you need to improve your AI or you need to build in more human moderation, because those figures are not good enough”?
Dame Melanie Dawes: It certainly does not sound like it is very much, does it? I think I would say to all the platforms that what we are looking for is transparency and openness, and a recognition that they might not have all the information at the moment about what is going on in their services. Part of it will be: how are they verifying what they know with the results of external researchers?
While we are on the subject, I think that tightening up the requirement to work with external researchers would be a good thing in the Bill. From my perspective, although Ofcom will do whatever we can, a lot of what we will be doing will be behind the regulatory veil. We will not be able to publish everything, for reasons of confidentiality. The more that we can have engagement with external researchers, who are shining the light wherever possible, and the more that platforms can recognise that that is in their interests, the better.
The Chair: I appreciate that there will be some things that would make the problem worse if they were public and would need to be dealt with privately, but do you feel that, if you agree on improved performance metrics that you expect from the companies, those sorts of things should be included in the regular reporting? For example, if there are figures around the effectiveness of AI in detecting hate speech as a category, is that something that the companies should regularly report on, rather than putting out transparency reports, which, out of context, are basically meaningless?
Dame Melanie Dawes: Yes. We will be producing guidance on what transparency reports should contain. That is one of the roles we have under the Bill. We will also, similarly, be setting out requirements on risk assessments and what they should contain. There is a link between the two. We cannot have everything in the public domain. We accept that. It is always true for any kind of regulatory system that you have to accept some confidentiality behind closed doors. I think the question of what the right metrics are will thread its way through, from the risk assessment through to the platform service transparency reports and Ofcom’s own cross-cutting transparency reports.
The Chair: You will have seen a lot of the information that Frances Haugen has published. Do you think that sort of internal research is the sort of information the regulator should have the right to have access to, if it requested it?
Dame Melanie Dawes: Yes, absolutely.
The Chair: The other question I want to ask based on her evidence was this. She recommended what she called sectoral analysis of the audience bases as well. A lot of the data and information that we get from the companies is very generalised, based on the fact that the vast majority of interactions on the platforms will largely be quite positive, but for people who experience hate speech and harmful content, and other forms of illegal content, their experience is probably going to be pretty awful. You do not get the true picture, because it is mixed in with everything else.
Dame Melanie Dawes: Yes. I think that is where you need experts, whether that is groups like the NSPCC or others, that are really expert in understanding the perspective of particular groups of users. We need them to be part of the system, whether that is through super-complaints in very specific circumstances or research partnerships. For Ofcom, that is very much one of the things that we are trying to build, where we need explicit partnerships with people out there who have a particular angle that is very important to bring to bear.
The Chair: I understand that the platforms’ own policies on moderation are important, and how they enforce them. The message that has come to us very clearly from the inquiry so far is that the greater level of harm probably comes from the amplification of harmful speech. There is a process that takes that harmful speech and other forms of illegal content to much bigger audiences, and that is the thing that the companies are most responsible for.
I wonder whether that is something you agree with as Ofcom embarks on this process. Once the legislation is passed, should the greatest priority of responsibility be attached to not actively promoting content that is regulated under the codes of practice, on the basis that that is the greater area of harm and the issue that the companies are most directly responsible for?
Dame Melanie Dawes: Yes, I agree with that. Whether it is illegal content or legal but harmful content, a lot of the harm is caused by the number of people who see it or are shown the material very quickly in a way that is uncontrolled and can be very damaging. That does not mean that individual pieces of content are not extremely damaging in their own right, but who sees it is the thing that we need to be most focused on.
The Chair: Who sees it and the process of that reach. Your view would clearly be that if a company is saying that it has policies in place to deal with a certain form of content, but you find examples where it has recommended that content to someone in the newsfeed, or it is failing to act against a group that is promoting it, that is the sort of thing you would come down on quite strongly.
Dame Melanie Dawes: Yes. It is about how users are able to make things more viral, and about how the companies encourage that through things like recommender algorithms and so on, as you say. In the end, the test for me is: what is it like from the user perspective? That is the thing that we really need to keep coming back to, and to shift the culture on to. It is not, “I’m here in a company, I’ve designed it this way and this should all be fine”. When you put it together and you are a user, particularly a younger or more vulnerable user, what do you experience when all those features come together? That is the real test for me of whether or not harm is being properly managed.
The Chair: I think that is right in terms of what the user sees, but it has to be clear as well in the definition of this. The role of the algorithm is not just allowing other users to expose their content to a bigger audience, although that does happen. There is clearly evidence of people understanding how you game the system through groups and advertising to drive content through it. The platforms are also making decisions themselves as to the content they think is likely to have a bigger audience. In the case of a company like TikTok, it is doing that purely on data profiling rather than even initial social interaction.
Dame Melanie Dawes: I agree with what you are saying there; absolutely.
The Chair: Thank you.
Q252 Darren Jones: Good morning. I just want to walk through how you think the risk assessments might be done in practice. My first question is: do you feel you have sufficient powers in the Bill to set minimum standards for risk assessments?
Dame Melanie Dawes: We certainly think that the way the Bill is drawn on risk assessments is good in large part—for example, the details of the kinds of functions, such as anonymity, encryption and so on that need to be addressed and looked at, which might be causing risk or indeed might be used to moderate risk in some circumstances. We think all that is good and sensible.
There are a number of areas where we think the Bill could be tightened up. In particular, there is your point about adequacy. We think there is a slight risk that a service may not identify a risk, and then not be required under the safety duties to address that risk. Simply amending the drafting of the Bill to make sure that could not happen would, for us, remove a potential loophole. Our concern is that, if we did then identify one of those problems, we would have to go all the way back to the risk assessments and get them to do it again before we were able to engage the safety duties for any kind of enforcement action.
That is a rather detailed answer, perhaps. We think we are broadly there, but with the gap of adequacy in standards not quite being strong enough at the moment.
Darren Jones: How do you think we can fix that?
Dame Melanie Dawes: I will turn to Richard on that one.
Richard Wronka: Our focus in these kinds of situations is on being able to move quite quickly to engage with the company on the steps that it needs to put in place to address the risk that it has missed or has underestimated. At the moment, the Bill frames the safety duties in relation to proportionality, which we think is a very important principle, but proportionality is defined in relation to the risks identified in the risk assessment rather than being defined in terms of a reasonably foreseeable risk. I think we would want to address that, so that we could require a company to take the steps that are needed to address reasonably foreseeable risks.
Darren Jones: Are you comfortable that you are able to define what reasonably foreseeable is without ending up in a very prolonged court case?
Richard Wronka: I am sure it would be challenging, but I think the risk assessment guidance that we will be setting out in the regime through our codes of practice, and which will set out recommended steps for companies, gives us a basis to have those discussions.
Darren Jones: Do you therefore think that your codes of practice should be binding?
Dame Melanie Dawes: They are statutory codes, but the way platforms are able to discharge their duties, particularly their safety duties, means that they can choose another route. I personally think, as I explained in the letter I wrote to the committee back in September, that at some point it is right that there is flexibility for services to determine how they address the safety duties. At the same time, that makes it potentially harder for us to prove a breach against those duties, because it leaves open so many different options through which they could be addressed.
There are a couple of areas in the safety duties where we think slightly tightening up the language would actually help to give us hooks. Those would be things like mentioning a list of options that companies can use to meet the duties, such as age verification where there is adult content that should not be exposed to under-18s, or uses of technology in other circumstances. That could go some way to giving us a little more by way of hooks to take action where we need to without undermining the fundamental principle of the Bill, which is that it is flexible. I think that is important in this very fast-moving industry.
Darren Jones: Forgive me if I have come to the wrong conclusion, but my understanding of the Bill as it is currently drafted is that you cannot tell a company to do something in a particular way.
Dame Melanie Dawes: We cannot tell them a specific action that they need to take. There are exceptions for use of technology in certain circumstances, which are almost the exceptions that prove the rule. You are right in that sense. But what we can do is enforce against the overall failure to meet the safety duties, if that is what we believe is happening. Our information powers give us a lot of flexibility, which we think we need, in asking for data, information, analysis and so on if we think there is a problem.
Richard Wronka: We would be able to make a requirement on services to take specific steps where we have identified that they have breached their safety duties. What we are not able to do through the framework set out in the regime is to set binding requirements before the event through codes of practice. There is a bit of a distinction there.
Q253 Darren Jones: Understood. One of my concerns has been the depth and breadth of the companies that you are going to be tasked with regulating. In last week’s session we uncovered that the big businesses do not really have board-level accountability for safety at the moment. In previous sessions, we have confirmed that the people programming algorithms—I often say down the corridor, but it is often in different countries—the product teams, the compliance, regulatory and legal teams and the public relations teams are very far apart from each other.
Do you feel that you have sufficient powers to be able to get a grasp of that, or do you think you need more powers to make sure that companies, when they are doing their risk assessments, really go the 100 miles that they need to go to make sure that you are getting the information you need from the point of programming as well as the point of user experience?
Dame Melanie Dawes: I am not sure that we need more powers specifically, but I think that the risk you identify, which is that some of the platforms do not really have the right governance in place or, if they do, it is not quite clear what it is and how effectively it is working, will need to change if they are to take these duties seriously and discharge their responsibilities towards them.
Although the Bill does not require them to change their governance, and I do not think it necessarily should, I do not think we necessarily need any more powers as a regulator. I think this will be one of the big issues for some platforms. Some of the smaller platforms that we are working with already under the video-sharing platforms legislation are not very well resourced when it comes to things like content moderation or user complaints. Another issue is the resourcing that will be needed in some of these companies.
Darren Jones: You just mentioned user complaints. I am keen to get your view on Ofcom being the front door for individual complaints, in a similar way to the ICO. What do you feel about that?
Dame Melanie Dawes: To be honest, that could not just overwhelm us in volume but conflict a bit with the role that the Bill gives us, which is a strategic one: looking at overall systems and processes, scanning the industry, looking across the piece at the overall harms that are there and then forensically going in where we see a failure. I think it might overwhelm anybody to have the role of receiving every individual complaint. I think we should be quite cautious there. To be honest, for Ofcom it would be a very different kind of role as a regulator. I am not sure that it works brilliantly for regulators that have that dual role. It can be quite a conflict as to whether you are coming at the more strategic end or at the more operational end.
Darren Jones: You would be more comfortable with the idea of its being referred to an ombudsman if the internal complaints procedure had been exhausted at a particular company.
Dame Melanie Dawes: Yes, I would. That is how things work on telecoms complaints at the moment. That is not to say that we will not be very interested in complaints. What we will definitely want to look at is how companies are handling their own user complaints, what their metrics are, what their timescales are, what their results are on the complaints and so on. That will be very important. I would view that as more of a strategic conversation about their overall system and process, and how it is working for users. When it comes to individual complaints, I think something rather separate from what Ofcom is doing is better, if you believe that needs to be included as part of the framework.
Darren Jones: My very last question for this section is this. You are being given quite a lot of power and responsibility. Do you feel comfortable about having appropriate oversight from Parliament or Ministers about your performance at Ofcom, or do you think that something needs to improve in that?
Dame Melanie Dawes: What we need as a regulator is a balance in relation to independence, when it comes to making the regulatory decisions that Parliament has asked us to make, whether that is setting codes or decisions in relation to particular companies. I would see the other side of the coin as our accountability to Parliament. I am well aware that this interests the committee. I think that is a very healthy debate.
The only thing I would say is that sometimes we are not able to share very much information about individual regulated entities. A conversation about horizon scanning and about where we are going on our regular reports and assessments seems to me a very healthy and potentially quite important part of this, particularly if we include other regulators in the conversation as well. We are looking at some of the same issues.
Darren Jones: Thank you.
Q254 The Chair: I want to ask a follow-up to that. Do you think you can avoid being drawn into investigations on individual cases? It is an issue that we have talked a lot about on the committee. It is often cited in the press, and government Ministers speak quite a lot about it. An example is the racist abuse of the England footballers after the end of the European Championships.
You can imagine that you would be asked, “Have the companies breached their obligations as set out in the codes of practice? Have they failed to do things you asked them to do? Do you regard that as a serious breach and failure? What action will you take? Does that failure in and of itself trigger a fine or some other form of intervention?” It is very difficult to see how you could avoid being drawn into those sorts of discussions. I wondered what your thoughts were on that.
Dame Melanie Dawes: I think we will get drawn into discussions in the media. There are bound to be events like that which still happen. We are not going to have a complete transformation, certainly not quickly, in the scale of harm that is out there. I am sure that we are going to get drawn in, and people will want to know what we are doing.
In that particular case, what would have happened had we been already regulating, and the framework had been in place? We would have known what the terms and conditions were for different platforms about their approach to legal but harmful material and how they were tackling illegal hate speech. We would have been able to ask them whether they were following those terms and conditions and whether any of the evidence that was being discussed in that particular episode suggested that they were not. We would have had actions by the company that we were hopefully already familiar with, and an ability to go in and ask them in a much more structured way what was going on.
The Chair: In terms of action, and to Darren’s question about taking enforcement action, the experience of the ICO seems to be that any time it tries to levy any kind of fine on a big tech company, the company triggers a legal process that can take a long time. It has a debilitating effect. It is almost a form of lawfare, in a way. It forces the regulator to choose the battles it is going to fight because it simply does not have the budget or resources to fight them all. In the resourcing that Ofcom has, what consideration has been given to the money that will need to be available to engage in legal actions around enforcement of fines and other things?
Dame Melanie Dawes: One of the real uncertainties on resourcing is simply the sheer legal weight that we may find against us, frankly. You are right. Some of the platforms have been quite resistant to efforts by the ICO, the CMA or indeed other agencies to take them through a legal process. It is something we have factored in.
Overall on our budgeting, we have had constructive conversations with the Treasury about the next couple of years, and with DCMS, on what we need. Once the Bill has gone through Parliament and it is clear what the scope and the overall shape of the ambition are on things like media literacy, enforcement and so on, there will come a point when we have a real conversation about the overall spending cap. The Treasury will set a spending cap for us, and we then collect fees to cover it.
The Chair: Thank you. John Nicolson.
John Nicolson: The question I was going to ask has been covered.
The Chair: Suzanne Webb, joining us remotely.
Q255 Suzanne Webb: Much of what I wanted to ask has been covered, but I have been jotting down some notes. It is not a criticism at all but a compliment that you seem very assured about what needs to be done. Thank goodness you do, because it is one hell of a job that you are going to be picking up. You have the eyes of the tech companies, the Secretary of State and the users on you for what they feel is needed to create online safety.
I have a bit of a cheeky question for you. With regard to the Bill, what is it about what needs to be delivered by Ofcom that would keep you awake at night at the moment, in what you feel needs to be addressed and needs to be done? Secondly, once the Bill is in place, what would keep you awake at night when you were up and running with it?
Dame Melanie Dawes: That is a very good question. You have put your finger on a real dilemma for us. This is, on one level, an incredibly exciting and important task, and we are all very positive about it at Ofcom. We are feeling confident that we can make a difference, but it is huge. The fact that these companies have been unregulated while they have grown so rapidly means that we have a very long way to go to bring some kind of transparency.
What is really hard is the sense of expectations. We can definitely make a difference. We can make a difference on transparency. We will absolutely prioritise making a difference for young people in lots of different respects through the regime. We want to make a difference in making it clearer how freedom of expression is being safeguarded while legal but harmful content is addressed. We feel that we know we can improve things. My concern, now and later, is that we are also clear what the Bill will and will not do, and that is why I welcome the work of the committee.
I would be concerned if the Bill’s scope grew so much that we cannot deliver against it. It is incredibly important for us all to understand some of the challenges that we are going to face, and the fact that, while we will move as fast as we can when we see a problem, we will sometimes not be able to achieve the results we want very quickly. I am very confident that we can make a difference. That is why we feel pretty excited about this. We are always balancing that against the scale of the task.
Suzanne Webb: Thank you. Following on from that, you have said what the Bill will and will not do. My biggest concern is what the tech companies will and will not do. Do you think the Bill will be agile enough? Do you think you have enough powers that will be agile enough to respond? It was mentioned earlier that we had a conversation last week with some tech companies. The concern that came out of that was how unwieldy the layers of governance in those organisations are, and the fact that safety does not sit at board level. Do you think that you and the Bill will be agile enough to do what it says on the tin, which is basically user safety?
Dame Melanie Dawes: I think the Bill has a lot of flexibility for Ministers and Parliament to come back and reshape it at the top level. The review after two years that the Government are able to carry out is a very helpful provision. There is the fact that senior management sanctions can be introduced at that stage. The flexibility that Ministers have on defining priority harms is also a helpful thing. There is quite a lot of flexibility for the Government and Parliament to come in and reshape, if you like, the infrastructure of the Bill.
As the regulator, do we feel we have enough flexibility? Yes, I think we do broadly, but I say that with an expectation that there are bound to be some things that we had not thought of, and some things that come along and surprise us all. What I would hope for then is that we can act quickly, talk to the Government and Parliament and get the changes that are needed. This is such a fast-moving industry. The announcements from Facebook last week, which I think we should now call Meta, about where it is going with this in the future show that the technology is still changing and potentially creating great benefits, perhaps, but also new harms that we will need to address.
Richard Wronka: Might I add a point? I think that our horizon-scanning capability is an incredibly important part of this. That will be informed by our information-gathering powers and our transparency reporting powers, as we have discussed already. It will also be informed by the partnership work that we are undertaking. Civil society organisations, quite rightly, are obviously very active in this space, as are law enforcement agencies, all of whom will be able to help us paint a picture of where harms are developing in the future, not just address today’s problems.
Suzanne Webb: I have one very quick question on that. Do you think the tech companies will do what they are supposed to do?
Dame Melanie Dawes: I think we will have a much better idea as a result of this Bill what we want them to do. Ofcom will have a much clearer ability to go in and get information on whether they are doing those things. I am sure that will drive change and action over time, yes. Above all, it is a cultural shift. It is a shift in how decisions are taken, how products are designed, how new services are introduced, how data is sought and how users are consulted. That will take time, but that is what we are aiming for.
Suzanne Webb: Thank you.
Q256 Dean Russell: Thank you for your evidence today. I will be frank. Last week, we interviewed Facebook. I came away from that session feeling that it was not really committed to the safety levels that we have heard about in our sessions. Call me cynical, but the timing of its Meta announcement was quite interesting. It happened within probably minutes of our session, where we were asking it about safety. It then put out what I thought was a rather toe-curling and cringeworthy video talking about safety, just after we had heard much evidence about the fact that the safety is not there.
I would like to get your sense of how much actual accountability you think the platforms will take. I know you talked about it just now, but do we need to really toughen up this Bill so that individuals within Facebook and the other platforms feel the pain if they cause harm to other people?
Dame Melanie Dawes: There are a couple of areas where I would slightly toughen the Bill. One of them is something I mentioned earlier, which is requiring them to engage with external researchers. There is an opportunity for the regulator to set some terms for that, with some accredited researchers. There is a slight risk that the European Union will make that a requirement in its Bill, and that British research groups would therefore not get the same cut of what could be a growing market. It would disadvantage UK users if we do not have the same powers as are going to come in the EU. That would be a toughening, and I think it would go to the heart of some of the engagement that needs to change.
We see some platforms opening up—Google, for example, with its centre in Dublin, where people can go in and engage with it on its algorithms and how they are designed. We see Twitter publishing research where it has had external researchers looking at whether or not it is recommending content from different sides of the political spectrum. That sort of engagement with external researchers and others, while it still has further to go, is what we want to see from all the platforms. As I say, it is cultural as much as anything.
Dean Russell: Within that, we have heard evidence where even in a public body like this, with MPs who are looking to change legislation around online safety and online harms, platforms still have not taken down certain content. I will not give examples that have been given before by colleagues, but even in this very public and visible platform they still have not removed content.
My concern is about the immediacy that will come off the back of the Bill and the powers that you have. Do you think there will be immediacy of change? The football example was used earlier. There is also the more damaging one of flashing images being shown, or sent directly, to people with epilepsy. Do you think this Bill will address that, or is it going to be two or three years down the line before we start to see any actual change?
Dame Melanie Dawes: What we would like to try to achieve, in the years before the Bill comes into force, is a deepening of our conversation with the platforms that we are not already regulating. We are already working in a formal way with quite a lot of the platforms, particularly those where young people spend a lot of their time—TikTok, Snapchat, Twitch and so on—so we are well placed. In fact, we will be writing to them all by the end of this year to confirm the steps we are looking for from them in the coming 12 months. That is already happening with some of the competitors to the bigger platforms, to be honest, those such as TikTok, which is already biting away at their audience share, particularly the younger audience share. I think that will have an impact.
What we want to do beyond that in relation to this Bill is to begin to have a deeper conversation—we have not really started the regulatory conversation yet—during 2022 and 2023 about how we are all going to get ready. In relation specifically to the sending of flashing images to people with epilepsy, the Law Commission has made recommendations to make that illegal. It seems extraordinary that it is not already, or that anyone would do it. That has the potential to come clearly under the scope of the Bill. It is the sort of specific example that we could begin a conversation on very quickly indeed.
Dean Russell: In the practicalities of this, in three years’ time this Bill comes in in its current form. In a real-world example, if somebody sends flashing images to a child with epilepsy, what would the process be at that point? Would someone report it to Ofcom? Would they report it to the platform? How would you make sure that that bit of content—flashing images as an example—comes off that platform immediately?
Dame Melanie Dawes: First, I hope we would know about it. There are a number of ways in which that can happen. People can tell us. We may not be able to deal with individual complaints, but they can certainly tell us if they have a concern. We can follow that up straightaway with the platform. It is possible that we might find, in this case, maybe the Epilepsy Society or a similar organisation ready to make a super-complaint if things move fast.
If it is illegal content or clearly harmful, particularly if it was in relation to children, the Bill is very clear that that sort of thing is not within what is acceptable, and I think we have the ability to get on the phone very quickly about that. That may sound a bit unspecific, but that is the starting point: “What is going on? What have you done? What have you changed? What are your terms and conditions? Why is this still happening?” We would expect action.
Dean Russell: If they do not take that action, what then?
Dame Melanie Dawes: Then we can go down an enforcement route. As the Bill is coming in, obviously there are a number of bits of it that need to be put in place. One of the reasons why we are very pleased that we have already had a lot of resourcing from the Government to begin our preparation is that we can do as much as possible in advance of Royal Assent and the various stages after that. Until we actually have the formal powers we cannot use them, but we can certainly anticipate them in the way we engage with the platforms on issues that are so problematic—pretty quickly, I hope.
Dean Russell: Is there anything we need to add to the Bill to help that process at your end? What would it look like as it stands versus what we could do to make it better?
Richard Wronka: In the example you are giving, there is a premium on the platform addressing the issue before someone is harmed by the dangerous content—in this case, flashing imagery that has been sent maliciously. To generalise a little bit, we think there is a really important role, as indeed we have seen already, for platforms to use technology to proactively identify harmful content. In some cases, they have done that broadly successfully, and in some cases less successfully.
First, there is a point about how the use of technology powers is currently constructed in the Bill. We think there is a case for extra clarity to describe the situations in which Ofcom might be able to recommend or require the use of technology to identify illegal content with the intention of it being removed. In this instance, there is a separate question about what is technically feasible. In this particular situation, it is probably the only way to avoid the harm happening in the first place, which is what we all ideally want. That is one specific thing that we think could be addressed through the Bill.
In addition, recognising that some of this is technology at the cutting edge, we think there is an important role for Ofcom in researching, potentially with the industry, what is viable today and what the state of the art might be in five or 10 years’ time, and trying to speed up some of the processes and channel investment in the industry into developing technologies that are more effective than the ones that are currently at the disposal of platforms, as well as thinking about how they can be deployed by a wide diversity of services, not just the very biggest services.
Dean Russell: Thank you.
Q257 Baroness Kidron: First, I want to pick up on a couple of things you have said. Melanie, I was struck that you said a couple of times that you may not be able to be very transparent about individual companies, either to a parliamentary scrutiny committee or more broadly in public. I am curious about that. One of the things that people are looking for from the Bill is a more radical version of transparency. What do you feel curtails you? I think there is an expectation that we might know what goes wrong under the hood, rather than not know.
Dame Melanie Dawes: It is a very good challenge. As Mr Collins was hinting earlier, we have to strike a balance between bringing the companies along with us and creating some kind of space whereby they can share with us what is actually going on, safe in the knowledge that it is not immediately going to end up in the public domain. It is that balanced against the need to massively improve transparency for the public and Parliament.
We manage these things at the moment. For example, in consumer views on telecoms providers, we ask through our information powers for consistent information on complaints handling, service quality, et cetera, and we publish that, but that is through a particular gateway where we have asked for specific things. I think there is scope for that kind of transparency reporting from us as a regulator.
We are grappling with the question as we think about what can be put in the video-sharing platforms report next year. I would like that to be meaningful to service users, and therefore not just a general description of the overall industry but something that helps people understand what it is like on TikTok, compared to Snapchat, compared to Twitch, et cetera. It is a good question, and it is not that we are trying to avoid transparency ourselves, but it is a bit of a balancing act, and one that we are giving quite a lot of thought to right now.
Baroness Kidron: It is really interesting. There is the question of how mandatory the risk assessment is, and then how clear it is where the failure is. We have heard some evidence that the failure in your duty of care, as currently drafted, is about failure to give information rather than failure to mitigate or uphold the mitigation or management of risk, so there is a slight feeling that it does not land anywhere. Do you know what I mean?
If you cannot 100% tell us—I accept your answer in part that there must be a dialogue between regulator and company to see whether you can sort out harm, because in the end it is not punishment but removal of risk and harm that one is gearing towards—and if the companies can interpret your risk assessment, and you cannot take action on the mitigation and upholding of terms, I am struggling to see where we actually hold them to account. Do you recognise that problem?
Dame Melanie Dawes: This is partly where the company transparency reports come in. I do not know if Richard wants to pick that one up.
I have asked myself, and we have had a bit of a debate internally, whether the risk assessments from companies should be published, for example. Would that be a positive step? The fear I would have is that they would not be as good. That is the balancing act, I think, but it is a very important set of questions.
Richard Wronka: We certainly do not see it as an either/or issue. We would like to push both as far as possible, but keeping the ability to have a private dialogue with companies where we feel that is the most constructive way of achieving the right outcomes.
It might be worth adding that we anticipate taking a supervisory approach with the largest and riskiest companies, very close engagement not just on current risk but on looking towards the future and horizon scanning, and really getting under the skin of how they take decisions, what their corporate governance systems look like, how they are thinking about harm, and what they are doing to address harm. That supervisory approach is broadly the approach that we are putting in place for the video-sharing platforms that we are regulating today.
Baroness Kidron: At the risk of repeating what Darren Jones said, do you not think that you would be protected by having a mandatory risk assessment process, so that you could not be accused of getting too cosy with the companies? That is the other danger.
Dame Melanie Dawes: Do you mean a mandatory risk assessment process for Ofcom to carry out, or for them to carry out?
Baroness Kidron: No, for them to carry out at the behest of the regulator. You have done your market risk assessment, as it were, and you say, “Here are risk profiles for the different companies”—that gets a mention in the Bill although not a big one—“and you fall into this category, and we expect your risk assessment to answer all these questions, to a certain quality that we determine”. Then if you have private conversations, at least there is some sort of transparency and anchoring of the expectation, and it cannot get too hidden from public view. That seems to me to be a problem not of your making, just to be clear, but perhaps a problem that the Bill is imposing on you.
Dame Melanie Dawes: I certainly think it should be clearer in the Bill that risk assessments need to be of a certain standard, and that it is part of the fulfilment of your duty of care to produce a good one rather than a half-hearted one. That is something we said in our letter, with a couple of recommendations. We can say more about that, and if you have specific questions we can follow up in writing, that would really help. The Bill already gives us a lot to work with, that aside, because it itemises the sorts of things that need to be looked at, the features that need to be considered, and so on. I think this is quite a good section of the Bill in that respect. It is just clarity needed that it is part of the safety duties.
Q258 Baroness Kidron: My other question is a little similar and is about the terms and conditions. Again, you mentioned what the terms and conditions say and whether they are fulfilling them. Is it clear enough in the Bill that if they are not upholding their terms and conditions you can act? I was thinking in particular of the example that has come up several times while we have been taking evidence, which is the 12 people who were creating a vast amount of misinformation. If misinformation is against the terms and conditions, why were they not banned? This is not a question of freedom of speech. It is a question of narrowing the terms and conditions. I am interested to know whether you feel you have powers in the Bill to say, “Here are your terms and conditions. I notice that 12 people are doing this. Why aren’t they dealt with?”
Dame Melanie Dawes: Terms and conditions will be a really important hook for us. As I was saying earlier, this is one of the areas where we think the safety duties could be slightly more specific. It is one of the things you need to have in place, like age verification for adult content, et cetera.
To be clear, we are not talking about terms and conditions as they are today where you scroll through them as fast as you can to click accept. This is about terms and conditions that make sense to the user and are not just about your assent; they are about you being given information that helps you to manage your life online and manage risks to you. They are a commitment from the company to you as to what you can expect. There will be some platforms that offer very low protections in relation to legal but harmful material; they are small platforms, and that is what their users know as they go on those platforms. As long as they meet the other duties, that might be something we are all prepared to live with, but particularly for the larger platforms where it is hard to avoid life online—Facebook, Twitter, YouTube, Google and so on—real clarity about what you can expect is what I mean by terms and conditions, not the sort of thing you see today.
Baroness Kidron: That is very helpful. I want to ask you about the proportionality point, because the Regulators’ Code is very clear about proportionality of action against risk. At the moment, the Bill is drafted around size and reach rather than risk. We have taken evidence from a number of people who said that they do not understand the tier system, that small is not safe, and that, particularly in my area in relation to children, you can have a company that has two employees and it does an awful lot of damage and has an awful lot of reach. Several people have given the same thought to the committee—that we should not have these tiers but we should have proportion to risk. What is your response on that?
Dame Melanie Dawes: Richard might want to come in on this. I personally think that some kind of tier system is quite helpful, because we need more from the bigger platforms where so many people are spending their lives at the moment, particularly children. Enormous amounts of time are spent by children under the age of 13 on TikTok and so on. Our data suggests that half of 10-year-olds in the UK are on TikTok; whether they have accounts or not, they are on TikTok. We have to get real about some of that.
The big platforms are a big part of the problem, but I agree that we must not lose sight of the smaller but extremely risky platforms, particularly when you think about things such as grooming and child sexual exploitation. What we do not want is to create a system whereby all that shifts to a place that is less well regulated. The duties in the Bill cover those smaller platforms, which gives me some assurance that the tier system can work as currently calibrated. My understanding was that risk was part of how the tiers will be constructed, but of course that is yet to be determined. I do not know whether Richard can shed any more light on that.
Richard Wronka: There are almost two separate issues. There is the tiering by which sets of safety duties apply to different services, and there is the definition of proportionality in the Bill, especially in relation to the illegal content duties.
We think that proportionality is an absolutely fundamentally important principle in the regime. We think that the definition of proportionality could be improved. It refers to size and capacity and it feels quite right to me that that should be a consideration. Of course, that cuts both ways, because it means that the services with the largest size and the largest capacity should be held to the highest expectations, which might be quite useful for us. A particular concern is the characterisation of how risk feeds into that, and whether it is bound into what is identified within the risk assessment rather than a reasonable or robust assessment of what risks actually occur on the service. There is potentially a bit of a loophole that might be addressed.
Q259 Baroness Kidron: Thank you for that. The last thing I want to bring up is that parents have been led to expect that this Bill will keep kids safe. That is just the common understanding, yet we in this room know that it is constructed around user to user and search and it does not actually deal with children everywhere.
I have two issues. First, should it not deal with where children are likely to be accessed? Period. It could use a definition around the Children’s Code whereby, if the Children’s Code applies, so should the safety duties. Also, in asking for your view on that question, I would be quite interested to know if you think that you and the ICO could bump into each other very badly if you are not aligned. It is really hard in an attention economy to work out whether something is a contravention of a data regulation or a safety regulation, and unless you are absolutely looped in together and have the same remit for “likely to be accessed”, we might see a little bit of a car crash. That is two separate questions in one, if you do not mind.
Dame Melanie Dawes: We are very keen to avoid that last scenario. In answer to your first question, we think that the Government’s definitions here broadly, in fact more than broadly, capture all the services that children are spending their time on. Our research suggests that is the case. I have already mentioned the figures in relation to TikTok. It is the platforms that we are familiar with, where kids are spending so much of their time online at the moment.
If there are exceptions that either we or the Government have not thought of, we will be very happy to give a view on some of those. For example, we think it includes Zoom and services such as Wikipedia, which, although I suspect it will have a very strong story to tell the regulator about how it runs its content moderation, is probably a user-to-user service and therefore still caught by the Bill. We think it is quite a broad set of definitions, but we are very happy to engage on particular examples if you think we are missing something out in relation to children.
On the ICO, you are right, I do not think that we need to have identical scope to be able to work effectively together. Certainly the Children’s Code, which I think is a very good step forward, has a lot in common with what we require now under the video-sharing platforms regime. We are doing a very intense piece of work with the ICO to see how we can join up on that. What Elizabeth Denham and I would both like to do over the next few years is create one set of requirements that may be operated by two different regulatory systems and sets of legal powers, but that as far as possible are asking the same things. If there were conflicts, for example between privacy and harm, we would resolve what we think they mean and how platforms should approach them, rather than leaving them just to the industry itself. We have a lot of work to do on that, but we are very keen to align as far as we can with the ICO, and with the digital markets unit once it gets up and running.
Baroness Kidron: I will probably write to you about some things that are missed out, but I have to raise this. If it was “likely to be accessed”, pornography would come in scope because they would have to have age assurance, so I think it is game, set and match on that point.
Dame Melanie Dawes: Yes. With commercial porn sites there is clearly a risk that if we do not include them they will push all the porn and porn viewing and all the risk into unregulated areas. On the question of age verification in the context of the VSP regime, verifying users who are posting content is the other angle we are already addressing with the quite large number of user-generated porn sites we already regulate. It is very much a live debate for us.
Baroness Kidron: Do you not feel that it would bring the Bill into disrepute if commercial porn sites were not covered on behalf of children?
Dame Melanie Dawes: I can certainly see that there is a risk that we push the problem into unregulated areas if it is not included, but equally, there are concerns about scope.
Baroness Kidron: Thank you.
Q260 Lord Clement-Jones: Sorry, I was hoping to be here at the very beginning, so excuse me if we go over some of the same ground. I come back to the prescriptive or non-prescriptive nature of risk assessment, basically, because there seems to be some concern, now that we have the experience of the VSP codes and guidance, that the powers you will potentially have under the Bill are not as strong as those you have under the VSP regime, particularly on risk assessment and what you can insist on and so on. Is that the case, and should we align the Bill much more closely with the VSP regime?
Dame Melanie Dawes: There are some things in the VSP regime that are not part of the fuller regime, such as the inclusion of advertising, for example, but broadly our assessment is that it works in the other direction; the fuller Bill that is being discussed is a much more comprehensive system, and I believe requires more of companies in relation to risk assessment.
Richard Wronka: That is absolutely right. Specifically in relation to risk assessment there are no explicit requirements in the legislation that sets up the VSP regime, so we think it is one area where it is very welcome that the Online Safety Bill will go further. We have included recommendations for the video-sharing platforms that we regulate in our guidance on risk assessment, which we published just a few weeks ago, because we think that is the best way for video-sharing platforms to protect their users. It also helps them to prepare for the online safety regime that is coming down the road, but it is not a specific requirement in the legislation that sets up the video-sharing platform regime.
Lord Clement-Jones: That is interesting, because it is not the general perception. To translate it into the Bill, you can require a platform to take specified actions to come into compliance, and you can suspend or restrict a service. You will be able to do under the Bill what you can currently do under the VSP regime.
Dame Melanie Dawes: Yes. To require them to take specific actions we will have to take them through an enforcement process, so it is not an easy thing to do, but it can be done, and business disruption orders are provided for in the Bill. A court will require a pretty high evidence threshold to approve our application for business disruption measures, but they are there, as we would expect.
Richard Wronka: Might I add one point on that? The legislation that sets up the video-sharing regime includes a list of a dozen or so appropriate measures, which are the specific things that the video-sharing platform companies need to consider implementing to protect their users. This links back to the discussions we have been having about the safety duties in the Online Safety Bill and how those safety duties can be made a bit more specific to make it clearer to companies in scope of the online safety regime exactly what they might need to do. We would expect or look for that to be a non-exhaustive list so that there is a bit of flexibility, and that it sets out a clear direction of travel and a clear baseline for companies coming into the scope of regulation.
Q261 Lord Clement-Jones: Thank you. May we come on to the question of algorithms? I know you have talked about that in the context of skilled person’s reports and so on. Do you regard yourself as having sufficient powers under the Bill as regards algorithmic inspection? There appears to be a difference between the powers that you would be given and the powers that, for instance, the ICO has.
Dame Melanie Dawes: Once you allow for the skilled person’s report, which is something we can initiate under the Online Safety Bill, we think we have broadly similar powers to the ICO, and that the skilled person’s report is just the sort of thing that we would use as a regulator in areas where we need a level of expertise and understanding to go in under the bonnet and look at what is going on. The reports are akin, I think, to an audit and rather similar to the ICO’s powers, albeit by a different name.
When we are thinking about algorithms, a lot of it, I think, will be in the risk assessment. For us, it is so central that we understand how algorithms are used that we have to start right at the very beginning, and ask the question all the way along the line. In the end, an algorithm is a set of rules that a computer operates to ask questions or to solve problems, so we need to look at it as a system. What is the data that is going into the algorithms? How are the algorithms directed? Who designs them? How are they traded off against other features of a service? What are the questions they are trying to answer, and, overall, what are the outcomes that you get when you put the algorithms and all the other service features and indeed user behaviour together? What does it then look like from the point of view of the user? There is quite a broad set of questions that algorithms are at the heart of and that we will need to be asking.
Lord Clement-Jones: Do you feel confident that you can get to the bottom of what the algorithm is doing, whatever the platform tells you, so that you can have direct audit or inspection, whatever may be involved? You are going to have to have technical people anyway within Ofcom. Will they have the ability actually to get to grips with the algorithm itself?
Dame Melanie Dawes: There is a question of whether we have the right powers under the Bill. The answer to that is yes, although we are under no illusions about how hard it will be to use them in some circumstances. If we do not get co‑operation, and we have to go through all the layers of information requests and skilled person’s reports, that could take time. We will have to be very determined, I suspect, with some platforms that we will be working with.
Do we have the skills? We are building new teams now. Our new chief technology officer, Sachin Jogia, started with us a month ago. He came from Amazon and has been working on Alexa for a number of years. He has a very good understanding of how algorithms are used to engage with the user, and to build services. We are finding that it is never easy, but there are people who want to join Ofcom right now because they are really excited about this mission, and that has been very encouraging.
Lord Clement-Jones: Before I hand back to the Chair, I must press you, because there has been quite a lot of commentary about this. You are confident that you have enough power within the scope of the Bill to insist on being able to audit platforms’ algorithms.
Dame Melanie Dawes: Yes, we think we do.
The Chair: One question in particular is: do you think your job is to audit the algorithm or just to tell them what not to do? You might say, “I don’t want you to recommend illegal content. I have an example here of a piece of illegal content you’ve recommended, so whatever you’ve done to your algorithm is down to you, but the outcome is not the result we want, and that is what we are regulating on”.
Dame Melanie Dawes: In my mind, an audit is where you find out what is going on. If we get to the stage of a skilled person’s report, it is a compliance intervention, an enforcement intervention; it will happen when something has gone wrong that we are worried about, and a risk is happening that we do not believe is being addressed. In those circumstances, you go in with a very clear purpose and ask what is going on in relation to the risk. What is the evidence of its prevalence? What are the steps that are being taken? What are the steps that are not being taken? What are the recommendations? In my mind, that is what the skilled person’s process aims to achieve in the Bill, and that is how we would use it. Specific, and potentially quite deep, questions are being asked, but in relation to a specific risk, because that is where the provision is in the Bill.
The Chair: For example, the skilled person might say, “The algorithm is taking data feeds on these datasets on these issues, and that is what is contributing to the problem, and you could resolve that. Therefore, our recommendation is that you should do this”. Do you think that is what the skilled person would do as part of a review like that?
Dame Melanie Dawes: Yes, or they might say that algorithms are not being sufficiently assessed, and there is insufficient research about what is going on with users, and what it all looks like when you actually hit real life and real behaviours. It could be all sorts of things that are going wrong, but I would be quite surprised, if we got to the stage of launching a skilled person’s report, to find that there was nothing that came out of that report that needed to change.
The Chair: In some ways, it should not be Ofcom’s job to fix Facebook’s algorithm for it, for example. It is your job to say, “This is the service level we expect from you. We can give you guidance on how you do that, but ultimately we will regulate you based on the outcome rather than the input”.
Dame Melanie Dawes: Exactly. It is an incredibly important point that we are regulating on the outcome, and the systems and processes that contribute to that outcome. In a case such as this, to be really clear about what I am saying, our skilled person might well have a pretty good idea of the broad recommendations they were making, but I would be quite surprised if they were saying very specifically, “For this technology, make that tweak to the code”. It is much more likely to be that the systems and processes are not in place in the correct way or the monitoring is not right. Whatever it is, it will, I expect, be a level above that. Who knows? We will get into this, I suspect, with some companies, at some point.
Lord Clement-Jones: It is worth saying that things like amplification of the wrong kinds of material are what we would be concerned about, so the ability to understand what is actually happening is crucial.
Dame Melanie Dawes: Yes, and that is where the risk assessment comes right up front. Almost the first thing the Bill requires is Ofcom’s own risk assessment. To be honest, if we could have different names for our risk assessment and the platforms’ risk assessments, it would be quite helpful.
The Chair: Noted. We have written that down.
Dame Melanie Dawes: Ours could be a landscape, system-wide assessment, as distinct from any of the platforms’ own assessments. The Bill is clear that algorithms need to be part of that, and that is the starting point. How are you using them? Who is determining them? What decisions are you taking through them? How are you designing your services? All those questions need to be there right at the beginning.
The Chair: The answer back from the companies cannot be that it is too complicated to fix.
Dame Melanie Dawes: Not if it is causing harm, no.
The Chair: Exactly, no. That is the point.
Dame Melanie Dawes: In some cases the technology does not exist yet. In the next few years—potentially there is a real opportunity for UK companies to be at the cutting edge—the development of new tech will solve some of the problems. We know that for things like age verification or age assurance the technology is changing all the time, so we are not going to be saying, “This is the specific answer”, but we hope that the regime will encourage a sense of endeavour across the industry to come up with tech solutions where that is the answer.
The Chair: It is not as if these systems have always existed. They were created by somebody, and if they are causing harm they can be uncreated as well. Facebook did not always run the way it runs today. It runs today because that is the way it has decided for itself, but it could be undone.
Dame Melanie Dawes: Yes, it could, if it decides to.
The Chair: We are moving on to other topics, but, Darren, do you have any other questions on this before we move on?
Q262 Darren Jones: I have a question on the resources you have. This will create a lot of work at law firms, consultancies, the technology companies and the regulator. Have you had a discussion with the Secretary of State about salary bandings?
Dame Melanie Dawes: Salary bandings inside Ofcom? No, we have not, partly because Ofcom is fully independent in how we use our budgets and how we set our salaries. We have always had that flexibility. It is set out in the Communications Act from nearly 20 years ago. We have the flexibility to offer salaries that will attract the right kinds of people.
Darren Jones: Within the envelope of money you have to spend.
Dame Melanie Dawes: Within our overall envelope of money, yes.
Darren Jones: It was reported this morning that Gill Whitehead is to become the CEO of the Digital Regulation Cooperation Forum. I am not a purist and I think it is better to have worked in a regulated business as well as in a regulator, but there is no denying that if you have worked in a regulator your stock price increases with the regulated business when you decide to go back. How will you avoid the criticism of a revolving door between industry and the regulator?
Dame Melanie Dawes: I believe it is essential that we get people in from the industry. Gill is an incredibly senior person. She has worked at Channel 4 as well, at board level, and more recently at Google. We are very pleased to have her running the DRCF. For all four of us as regulators, to have somebody who can come in and take a look at some of the work we are doing, and help us not just to join up our programme but to think about what we are doing as individual regulators, is fantastic. We are very pleased with that hire, and she starts in a couple of weeks.
The revolving door is really important for us to manage. Ofcom has very long‑standing conflict of interest policies. If anybody senior gets a job in a regulated entity, the moment they are offered the job and have accepted it, they go on gardening leave. We are quite tough on that kind of thing; I would observe that in a sense we are even more punctilious about it than I experienced in the Civil Service.
Darren Jones: My last question is on the skilled person’s role. It sounds to me like you are hiring in expertise when there is a particular need for it, as opposed to having continuous in-house expertise. Have I characterised that incorrectly, or is that correct?
Dame Melanie Dawes: I think it is both. We definitely need standing expertise, but there will sometimes be moments when either we do not have quite enough people, or there is a very specific skillset that we need. I think it is good that we have the flexibility to reach out into the market. We definitely need our own expertise, and that is very much what we are building to.
The Chair: Joining us remotely, Jim Knight.
Lord Knight of Weymouth: Let me start by following up briefly on the algorithm discussion you just had with Tim Clement‑Jones and the Chair. When Google, for example, changes its search engine algorithm, it is a big deal. It happens on a regular basis and there is a whole industry of search engine optimisation people who scrutinise all that, and that is their living. When doing the risk assessments, will you also be requiring the platforms to notify you of any significant algorithm changes? Will you have the competence to know what is significant, and will you have the powers to require them to do that?
Dame Melanie Dawes: I will ask Richard to come in on this. I believe we have the ability, or rather there is a requirement on the companies to redo their risk assessments in response to significant changes.
Richard Wronka: Yes. Absolutely. There are a few things to unpack. First, companies have to redo their risk assessments when they have made significant changes to their service. There is no explicit requirement for them to share that proactively with the regulator, although that is where our information-gathering powers kick in and, in practice, we would expect the supervisory approach that I described earlier, which we would have in place with the biggest platforms, to alert us to those changes before they have been put in place. Hopefully, through that package of measures, we would get to the right place.
Q263 Lord Knight of Weymouth: Thank you. Moving on, early in the first set of exchanges with the Chair you talked about being mindful of the whole user experience in relation to harm. Obviously, the business model for most of these platforms is based on advertising, and they have very consciously embedded the advertising into the user experience so as to improve the number of impressions delivered to their advertisers and to make the whole user experience appear relevant. Do you think it is appropriate for you to have within your scope the paid‑for advertising that is part of that whole user experience?
Dame Melanie Dawes: Rather similarly to the conversation I was having earlier with Baroness Kidron, there is a risk that excluding paid‑for advertising means that people will just use that route rather than another route to promulgate harm on services. There is that risk, but, again, it is an expansion in scope, and at some point I think we have to draw a line, at least for the time being, around what we are trying to do, or else we will not be able to get anything meaningful done. I can see the arguments. If we end up with paid‑for advertising coming in, as I think I said in my letter of 13 September, a number of things would then be quite important, including retaining the focus on systems and processes rather than individual fraudulent or other content.
Fraud online, which is a lot of the driver for this shift and for people’s desire to expand the scope, is clearly a huge problem, but it is promulgated by criminals. Regulating criminals is not usually particularly effective, because the whole point is that they are not terribly interested in what is within or without the law. If this comes in, it will require, even more than some other areas of the Bill, the onus and strategy to be clearly owned by the criminal enforcement agencies, and it has to be very cross‑cutting, because the harm moves around to wherever the opportunity is, as we know.
Lord Knight of Weymouth: Those authorities are telling us that they would like to have in scope advertising that leads to online fraud, which raises a similar question to the conversation you had with Baroness Kidron. In that context, it was about the overlap with the ICO, but clearly there is overlap with the Advertising Standards Authority, which is a self-regulator, so that is slightly different and more awkward, and then you have the law enforcement authorities, the financial regulator and the CMA. Should we have a co-regulation concurrence arrangement more clearly put into the Bill to allow you to use each other’s powers and each other’s sanctions by reference, to make all that easier for you and for the public as regards their expectations of the Bill?
Dame Melanie Dawes: That is a very good point. On advertising specifically, I suspect that, if that were to come in, we would almost certainly want to co-designate the ASA in some form to be able to carry out the duties, in the same way as it does right now for broadcasting, and in the same way as we are likely to ask it to do on video‑sharing platforms.
More generally, this is a pretty broad piece of legislation. It is looking at a whole range of different harms—some of them criminal, some of them legal but harmful—so the partnerships Ofcom has will need to be even deeper than usual. We are used to working across boundaries and operating concurrent regimes, particularly with the CMA. We think that we and our fellow regulators could do with a little more by way of legislative support to be able to work together. I am thinking of things such as information powers and requirements to consult each other; they are very regulatory issues, not the sort of thing that is very headline grabbing.
We have done work already with the Government on that, and we think it is potentially quite important. While I have the opportunity I would say that co‑operation with international regulators is where we think the Bill could be slightly improved. We might find we need the ability to share with international regulators in some circumstances.
Lord Knight of Weymouth: If there is anything you can share with the committee about your thinking on how you get more legal strength and support for that co‑operation, I am sure we would be really interested to receive it.
Let me go back to the user experience. If you are a user of, let us say, Facebook, to pick a platform that we seem to pick on quite a lot on this committee for really good reason, you have your newsfeed, and embedded within that will be advertising and comments from other users. There is also quite a lot from private groups, if you are a member of private groups, and it is very easy to find yourself a member of those. Do you feel that you have the power to be able to get into private channels without impinging on user privacy and security?
Dame Melanie Dawes: It is quite a challenging area. Sometimes private channels are absolutely essential for us to understand and be able to regulate, particularly when thinking about the most vulnerable users and issues such as grooming of children. They are a very important part of the chain by which some harms are promulgated. At the same time, privacy is extremely important. We are already working with the ICO on some of the issues. Questions such as end‑to‑end encryption on private messaging services are what our security agencies are particularly concerned about, as is the Home Office.
From our perspective, what is important, particularly thinking about the duties in the Bill, is that where we believe that technology needs to be used for certain reasons to be able to find out what is going on within those messaging channels, as well as on other platforms, we are able to require those technologies, and we have the power to do that. We said in my letter of 13 September that we thought some tightening up was needed around the “use of technology” power to enable us to act quickly if we needed to.
Q264 Lord Knight of Weymouth: Thank you. I agree with you about there being a need for you to be able to find ways into private channels in respect of individual harms, and potential individual harms.
The Bill takes us away from societal harms and then has definitions on journalistic content and content of democratic importance. We have had quite a lot of representation, indeed from the platforms themselves, that it is all a bit vague and there are a lot of loopholes in all that. Do you think we should do something on societal harms? Do you think that those potential loopholes in journalistic and democratic content are things that we need to look to tighten?
Dame Melanie Dawes: When it comes to what we mean by content of democratic importance or journalistic content, I feel as the regulator that these are not technocratic questions. They are questions that are pretty much, I would say, for Parliament to debate. I would prefer it if there was a bit more clarity in the Bill about what those things mean. When it comes to journalists, unless you work for a designated broadcasting authority or a news publisher, we do not regulate journalism. At the moment, we do not have a system that says who is or who is not a journalist. I do not want to create such a system through the back door through this Bill in a way that is not what Parliament intended.
There is a lot we can do to make sure that processes are in place. If somebody believes they are a journalist, a politician or somebody else debating an issue of democratic importance, making sure that the right processes are there for them to be able to expedite a complaint quickly might, in practice, be a big part of the sort of solution that needs to be put in place, rather than spending a long time defining who does or does not fall into that category, and whether or not their content falls into it. When it comes down to practicalities, it may be that we do not need to worry too much about some of the definitions, but I think Parliament needs to be very comfortable that you are giving us clear direction as a regulator.
Lord Knight of Weymouth: But if, for example, someone is posting—to be topical—some antivax content that is amplified by the platform using its systems, perhaps despite its best intentions, you could argue that might be an individual harm or it might be a societal harm. You then approach the platform and say, “We think there is a problem with your systems because you are not picking this up”, and then it says, “Ah, but this happens to have come from the former President of the United States, and therefore he has a democratic get‑out”. What do you say to that?
Dame Melanie Dawes: It is always difficult to get into individual decisions, and it is a good example of what will or will not be in accordance with the Bill. [Inaudible.] What the Bill will require the largest platforms to have—those that have the legal but harmful duty, where I think this is engaged—is clarity in their terms and conditions about how they are not discriminating against people from one side of the political spectrum or another, but they will be free to decide if a particular step they take is appropriate in the circumstances. The key thing is for them to set out in advance what those policies are, and we will hold them to account for delivering against them.
I did not really answer your question earlier about societal harm. Thinking about societal harms created, for example, by large amounts of disinformation that encourages people to ignore coronavirus vaccines, or to drink bleach, or whatever, and, equally, by very widely spread hateful speech online, which undermines our sense of how we behave with one another as well as our democratic discourse, I wonder in practice how different those general issues are from their cumulative effect on a number of individuals, and whether there is that much difference in the sorts of steps we want platforms to take to address legal but harmful material.
Again, it is a really big question of scope and very much one for Parliament—and one that is, to be honest, above our pay grade as a regulator. When I start to think in practical terms about the things we would want them to do, I wonder how much difference there is in those steps.
Lord Knight of Weymouth: Thank you. That is really helpful. I have a final question, which is on the media literacy powers. We have heard from one or two witnesses that they think that Ofcom having those powers is fine and dandy, but there also should be something that has an impact on schools and education, and we should bring the Department for Education in scope. Do you have enough powers to do it on your own, or do you need help from others in order to get the media literacy side right?
Dame Melanie Dawes: We can achieve some things with the powers we have under the Bill and the relatively limited budgets that we will have at our disposal, but I do not think we should confuse what the regulator can achieve by way of research and understanding and setting good practice with the need for publicly funded educational programmes, which are very much for government to oversee and find budgets for. Whether you introduce that kind of consideration to the Bill, I do not know, but it is quite important to be clear what the regulator can achieve.
Lord Knight of Weymouth: Thank you very much.
Q265 The Chair: Going back to the discussion on advertising, presumably the primary benefit of bringing advertising in scope is not trying to regulate criminals to stop them advertising, but regulating the platforms so that they do not run the ads.
Dame Melanie Dawes: Yes, that is right. To be clear, you still need to take care about the impact a regulatory system will have when the problem is led entirely by criminal activity, has no basis in ordinary user behaviour triggering it and is entirely malicious. There are many areas of the Bill as it stands that are like that, such as child sexual exploitation and terrorist content. I still think that questions of fraud are quite different from the sorts of issues that the Bill is trying to address, so I am a little bit worried about focus if we expand it too far.
The Chair: It is simply about the kind of enforcement that exists in other industries. If fraudulent scam ads were running on ITV, and ITV said, “Well, the thing is, they just keep on buying the space, and we just can’t stop them. We just keep taking their money. You’ve got to stop them doing this. It’s terrible”, you would laugh, as we all would, because it is nonsense. It is not allowed in other media, but there is not enough effective enforcement in that space online.
The platforms may not want to do it because they are profiting from it at the moment, but it is easier to define than organic content is because there is a clear transaction and scope. There is guidance from external bodies like the ASA. We could say, “Don’t run ads that are in breach of UK consumer protection law”. The ASA does not have a golden ticket to exempt people from consumer protection legislation. If the ASA finds that an ad should not be running and other media do not run it, they should not run it either. I do not think we would need to ask the regulator to do anything more than that. If that is what you were asked to do, is that something you could do?
Dame Melanie Dawes: Yes, we can certainly do things in that space. As I said earlier, we would almost certainly need a relationship of some form or another with the ASA. I think you are right. The question is whether it comes under this Bill or whether it is part of a wider review of advertising online, which the Government are looking into. I hope you will not mind me wanting to protect, a little, a sense of what Ofcom is able to achieve there. I recognise the reasons why others want it brought in now, with this legislative opportunity.
The Chair: It would seem odd to have a situation where someone sets up a Facebook page and posts stuff that would fall foul of the codes of practice, and the regulator would say that the page should be closed down because what someone is posting on it is illegal, but then it puts a penny of advertising spend behind that post and turns it into a boosted piece of content that is an ad and it is fine. In the user experience, that would seem to be an odd distinction to draw, and it reflects how the online advertising market works in a different way from the way we think of advertising elsewhere.
For the purposes of this Bill, there are categories of illegal content where you expect the platforms to have systems in place to remove illegal content. Fraud is a category of illegal content. Why should we care whether it is a fraud ad or a fraud organic post? For the purposes of this Bill, that is the question we are asking.
Dame Melanie Dawes: That is a perfectly good set of questions for us to be asking in relation to the Bill.
The Chair: Thank you.
Q266 John Nicolson: Good afternoon. Does this Bill give too much power to the Secretary of State?
Dame Melanie Dawes: It gives pretty broad powers to the Secretary of State, and in many ways that is appropriate. I mentioned at the beginning that it is quite right that the Government have given themselves, or are proposing to give themselves, flexibility to adapt the regime. Defining harms through secondary legislation is absolutely an appropriate thing for Ministers to be doing. When it comes to the core tasks of the regulator, it is important that we have sufficient independence. We mentioned in our letter that there are quite significant powers for the Secretary of State to direct us on the codes, and issue broad exemptions.
John Nicolson: Yes. Could you give us a headline description of what the Secretary of State’s powers are?
Dame Melanie Dawes: I am not sure I could do it in one sentence—
John Nicolson: I bet.
Dame Melanie Dawes: —because there are quite a lot of them.
John Nicolson: Look at it. There are two pages of powers. Two pages is absolutely extraordinary, is it not? I think the idea that a Secretary of State from either party could have that number of powers would send a chill through most people. It is worrisome.
I am looking at one of the powers in particular; it is directing “OFCOM to modify a code of practice … to ensure the code of practice reflects government policy”. That could mean anything, could it not? It could be constantly changing. It does not really allow us to have faith in the idea of Ofcom trying to be an honest broker to make us all safe, children in particular, if a Secretary of State can one day just decide to instruct you on the basis of some new government policy.
Dame Melanie Dawes: It is a broad provision. In other areas, it is quite right that the Government can direct us, including on things like national security. That is the sort of thing, on your list of various powers for the Secretary of State, that I absolutely think is appropriate for the Bill.
John Nicolson: Including that one?
Dame Melanie Dawes: Specifically, the ability to direct us on national security—
John Nicolson: No.
Dame Melanie Dawes: —is appropriate.
John Nicolson: No, specifically, to direct you to modify how you behave on the basis of government policy.
Dame Melanie Dawes: I think that is a very broad provision. Yes. The Government—
John Nicolson: Should it not be static? Should there not be a static set of rules that you apply? You should not be shifting the way you exercise your powers on the basis of shifting and changing government policy. How many Secretaries of State have we had since 2019? I have lost track of them. Each of them has a very different set of priorities. I would be particularly worried about this Secretary of State instructing you about anything at all. Do you not need more ongoing stability?
Dame Melanie Dawes: The Bill provides for Ministers to issue a statement of strategic priorities, which happens in other regulatory regimes, and that is quite sensible. The provisions that require us to consult the Government and the Secretary of State when we are setting codes are all very sensible. Power to direct us for reasons of government policy is, as I said, a broad power.
John Nicolson: Okay. What do you make of what Professor Damian Tambini said? He is a policy fellow and associate professor at LSE, and he wrote to the committee: “The approach of the Bill is, all too often, to say that whereas in the first instance defining harm will be done within the Bill, regulatory discretion is a matter for the platforms, or, if not, Ofcom, but that if it is too difficult, then the Minister can decide”. In effect, he is saying, is he not, that it gives the Minister censorship powers?
Dame Melanie Dawes: As I said, there are lots of ways for the Government to give Ofcom statements of strategic priorities for us to consult them, and all of that is appropriate. I do not recognise that characterisation, if I am honest. We have been talking until now about the sorts of things that Ofcom will require of the platforms. I think our powers are sufficiently broad and sufficiently specific. Once we have set the codes—you are talking just about that particular provision on setting the codes—we very much feel that our independence as a regulator will be very clear.
John Nicolson: Is there any power that the Secretary of State has in the proposals that you would prefer not to see?
Dame Melanie Dawes: There is the very broad power to direct us as we set the codes—as you say, it could bring in any area of policy; it is extremely open-ended—and the power to give Ofcom guidance on how we carry out our work. We have to have regard to that guidance. Those two things seem to be beyond what we would normally expect to see in regimes setting up an independent regulatory system.
John Nicolson: What would you like us to recommend replacing that with?
Dame Melanie Dawes: I do not have specific suggestions, but we are very happy to write on that. I am not saying at all that the Government should not direct us in certain areas, that they should not have a view or that we should not be required to consult them. When it comes to questions of national security, I hope the Government will tell us very quickly if we need to change what we are doing because there is something that we were not aware of.
Q267 John Nicolson: You mentioned all the people you are hiring at the moment and the skills base they all have. Presumably, knowledge of social media is a key requirement for anybody who works at Ofcom.
Dame Melanie Dawes: Yes, it is certainly a requirement we need for many people. I would never say absolutely everybody, but certainly working in this policy area.
John Nicolson: What about the chair?
Dame Melanie Dawes: I do not think there is much I can say about that, Mr Nicolson. The Government have launched a process today—
John Nicolson: Yes, I noticed.
Dame Melanie Dawes: We will see how that goes.
John Nicolson: You want somebody, presumably, who is familiar with social media. You do not want an old curmudgeon.
Dame Melanie Dawes: The Government set out the job description. Above all, what we need at Ofcom is somebody who shares the same sense of mission and purpose that we all have in making a difference on this regime above all others over the coming years, but also somebody—
John Nicolson: Who commands all-party respect as well.
Dame Melanie Dawes: The thing I was going to add was somebody who understands the importance of our independence as a regulator. I have said that in a number of places recently. I am very confident, in fact, that whoever the new chair is will feel that sense of independence. It is part of Ofcom’s DNA.
We need that, because our decisions are often challenged in court. That is the reality of life as a regulator. It is entirely appropriate that they are often challenged in court, and you need to be able to show a clear decision-making line if you are to be able to defend them. Making decisions on the evidence, in accordance with our legal powers, and understanding the importance of that and the independence that has to surround it is very important.
John Nicolson: Thank you.
The Chair: Is there a danger that Ofcom will become too big? This has been mooted before. Should we have one regulator that does pipes and signals and regulates the price of that, and another that regulates what flows through them, which is the stuff we are talking about today?
Dame Melanie Dawes: It is a perfectly fair question as we are about to expand. I believe passionately that our converged remit is just as appropriate today and in the next few years as it ever has been.
When we look at the day-to-day reality of the companies we work with, it could be Sky, which provides broadband, is increasingly getting into the value chain on TV sets, and is absolutely a content provider, sitting across the media side and the traditional telecoms side. Or it could be a company like Amazon, which increasingly provides parts of the telecoms value chain through Amazon Web Services, but also has Alexa, which is arguably a telephone-like communications system, commissions TV and runs a video-sharing platform in Twitch. They are operating right across Ofcom’s remits, and we will see more of that in the future.
For us, there is an efficiency and an effectiveness that goes with being able to see across. We will still not be an enormous regulator even after this; as I said, there are questions about our overall budget. We will be significantly smaller than the Financial Conduct Authority, because we are a strategic regulator. We look at systems, processes and outcomes, rather than regulating individual decisions in our regulated companies.
The Chair: Thank you.
Q268 Lord Stevenson of Balmacara: I would like to go back over what you said about the need to work with other regulators. You were very clear about that and have moved on a little from what I have read in the papers. You are saying that the other regulators you signal you want to work with need to be on the same page as you in their powers and objectives. Am I reading that right?
Dame Melanie Dawes: We are suggesting relatively minor changes. We published some advice for the Government in the spring on this, as four regulators, on things that we thought would help us to be able to work together. It was really about clearing away things that would otherwise stop us. It is about being able to share information. It is about being required to have regard to each other’s duties. It is that sort of thing. Since the spring, this Bill has been published and we have more information about the digital markets Bill as well, so we have been going back to that with the Government and fine-tuning those proposals. I am happy to share with the committee where we have got to on that.
Lord Stevenson of Balmacara: I would certainly like to know about the digital markets Bill if there is one. I do not think we have seen that yet.
Dame Melanie Dawes: Perhaps I should have said the preparations for the Bill.
Lord Stevenson of Balmacara: We will not hold you to that. A little bit more on that would be helpful to us as we finalise.
There are two things that I want to ask about in narrower detail. You have two ways of doing co-regulation at the moment. I assume, by what you are saying, that you would rather see a statutory solution, as there was with the VSP—an amendment to the existing Act or something in the current Bill that would reflect what we have just said rather than relying on an order under the Deregulation and Contracting Out Act 1994, which we could do but it seems a bit arcane.
Dame Melanie Dawes: I am not sure whether we have a view on quite how.
Richard Wronka: We have made use of the Deregulation and Contracting Out Act. I believe that is the power we rely on for our co-regulatory relationship with the Advertising Standards Authority, for instance. We think it works. There could be an explicit power in this Bill designed specifically for that purpose. Surprisingly perhaps, we think the Deregulation and Contracting Out Act serves as a proper basis for co-regulatory arrangements.
Lord Stevenson of Balmacara: It is what you say you do with the ASA, but I thought I was picking up other words that you were using earlier, which I think suggested that it would not come under the 1994 Act.
Dame Melanie Dawes: No, it is slightly different. We ask the ASA to discharge duties on our behalf, and we have to do it that way partly because of its own set-up, whereas here we are talking about the ability to work across our boundaries.
The work programme we have developed so far for the Digital Regulation Cooperation Forum is partly about quite strategic work together. For example, we have a project going on in algorithms to see whether we can join up more, because there is a slight risk that we just ask the same platforms, or indeed the same research bodies, the same questions. That is one end of the spectrum. It may also reach into co-regulatory work, and certainly the VSP regime and the age-appropriate design code are a good example of where we need to come together and ideally come up with something that companies can see across rather than seeing all the crevices between them. Sometimes, we might feel that we want to take, or that we could take, joint enforcement action. That is another possibility.
Lord Stevenson of Balmacara: It seems inconceivable that you would not want to do that. The flow of personal information into products that use it to create advertising sales, but also services for customers, suggests that you are going to be bound at many levels to the ICO at least, and, if we are going down the route of bringing in fraud and looking at advertising in more general terms, you are going to go into a different relationship with the FCA, with all that that implies. We all wish it well, and I am sure it works at a personal level.
Given the fact that these things will be voluntary, at least in terms of how the people involved work together, what happens when it goes wrong? Where does that get resolved? Take a hypothetical example, maybe one that is not too far away from where we are. The FCA is about to introduce a duty of care responsibility and rethink how it relates to its financial clients. I am sure you are up to speed on that. Of course, the wording of its duty of care will be different from your duty of care wording, if indeed we get the right wording in the final Bill. Who bottoms that out? We are now talking about different regulators. We are talking about different departments. We are talking about different Ministers. Describe the point when the car crash happens. Who picks up the pieces? Is there a bonnet?
Dame Melanie Dawes: As regulators, we feel that it is our job to pick up the pieces in so far as they relate to the work that we have. There are two questions. First, is the legislation drawn broadly enough to be there for every eventuality? You can never say yes in every respect, but it is a broad piece of legislation with plenty of flexibility around it to address any new things that come up and surprise us.
As regulators, we want to be able to adapt to real-life circumstances and particular issues that we are all experiencing, maybe with one platform, or particular questions that we need to address together. What will make us do that work? We think the suggestions that I was referring to would help, because they give us some statutory underpinnings.
Ultimately, our own sense of wanting to work together as a group is hugely important. We recently got some of our board members together, and there was genuine enthusiasm. It is also something that Parliament will probably want to hold us to account on. If we get into questions of joint committees and supervisory committees in the future, this would be an area where it can only help us to have those sorts of cross-cutting issues on the agenda.
Lord Stevenson of Balmacara: If both of you had any thoughts on that, we would find that quite useful. I think there is a role for Parliament, but it is not clear exactly where it fits. If you had some thoughts on that, it would be quite good. That probably covers my point.
The Chair: Thank you very much.
Q269 Baroness Kidron: I have to declare an interest. I have a Private Member’s Bill about age assurance coming up for Second Reading on the 19th. I will also refer to the code, which I was involved with. I want to make that clear.
When I first introduced the idea of the code, I was told that absolutely everything that happened this summer would be impossible; it would absolutely make the internet fall over, it would mean that all the companies left their headquarters out of Britain, et cetera. Happily, three years on, I think they have made very significant changes and design changes to services, some of which are marvellous.
I say that because I want to raise the question of rules of the road for age assurance. My concern is that, unless there are formal rules, hopefully set up by Ofcom, that really make age assurance privacy-preserving by its very nature and really deal not only with the provider but with how the service uses it, the absence of such rules will undermine the parts of the Bill that are about children.
By way of explanation, I know for example that currently there is technology where you can just give a token that says a child is nine, 13 or 15. In the business arrangements, companies using that system are saying, “Could you please give me their gender, their email, ISDN?” The real issue is not technology; it is governance. I think we are all anxious about the fact that, as you said earlier, we are getting to certain conversations in 2022 and 2023.
My real concern is that, if you do not set out rules of the road right now for age assurance, we will have an absolute smorgasbord of everyone making it up. I would like your reaction to that, and whether you would welcome setting out rules, as I know many businesses and much of the commercial sector in this area would.
Dame Melanie Dawes: This is right at the top of the list of our priorities for video-sharing platforms for the year ahead. It could be age verification and proper age-gating for adult sites and making sure that creators, as they are called on adult sites, are not under 18. That is very clear, and it is something we are working on with those sites right now. It could be about making sure that for all younger users the experience they have is appropriate, and doing things to tackle the epidemic of self-harm and eating disorder content and so on. That is absolutely front of mind for us at the moment. Some of it is about understanding the age of your users, and some of it is not.
What the ICO set out in its latest opinion on that—its description of the market, the different techniques and how they might be appropriate in different circumstances—was very similar, we felt, to how we view things through the video-sharing platforms lens. Indeed, we saw it in advance, and are very aligned on that. The market is still evolving. I do not think there is a difference in approach that I can discern at the moment between us and the ICO, but we have quite a bit of work to do to get into the specifics over the coming months.
Baroness Kidron: I agree with you in general, and I have also read the ICO opinion, but there is nothing that makes it a requirement for TikTok, Facebook, Instagram, Omegle or indeed anyone to follow it. Do you not think that a code of conduct that sets out what the expectation is would, to my first point, drive innovation towards a required and desired outcome that is privacy-preserving and, I might say on the record, privacy-preserving for the child and for adults who may not wish to have their data shared?
Richard Wronka: We certainly agree that privacy-preserving approaches to age assurance and age verification are the objective. This might link back to the conversation we were having earlier about the safety duties and specificity in the safety duties on what is required of companies. Broadly speaking, these are areas that we could recommend in codes of practice, as things stand, in the draft Bill; but as regards requiring companies to engage with it, it is not clear, as the safety duties are currently constructed, that it is something we would be able to require. We certainly see privacy-preserving and effective age verification measures as the kind of thing where we would look for extra specificity in the safety duties. That would appear to be one route to achieving the outcome that you have proposed.
Baroness Kidron: Thank you.
Q270 Lord Clement-Jones: I want to go back briefly to the co-regulation area. Obviously, there are some statutory regulators that you will be very conscious of and will have relations with through the DRCF and so on. There is the question of whether there should be more mention in the Bill about their duty to co-operate and so on. There are others—we have already heard about the ASA—such as the IWF that you could designate under the Deregulation and Contracting Out Act, or you may choose not to. What would the relationship be in that respect? They have a particular role. Will you make it formal, or will it be informal?
Dame Melanie Dawes: It is something that, for us, is a really big part of our strategy—broader partnerships, I would call them. Some of them will be partnerships about research, for example. Some of them could be super-complaints bodies, and in fact the Bill gives the Secretary of State, I believe, the power of final sign-off over who is designated to take super-complaints.
That is a very good potential example of the sort of thing that wider bodies like that may want to do as part of the overall framework. There is nothing that we are seeking in the Bill to clarify that particularly. There is plenty of flexibility for those partnerships. For us, that will be more important with this regime than for any other that we can think of, because of its scale and the expertise we will need to draw on.
Lord Clement-Jones: In a sense, that will be a decision further down the track, when you are actually in operation, you have drawn up the codes of practice, determined the risk assessment process and so on, or will you do that up front?
Dame Melanie Dawes: There is a formal sense in which the process cannot start until the legislation has reached a certain point, but we are starting the conversations now. In fact, a number of bodies such as the NSPCC have already raised with us the fact that they think they could play a role, for example.
Lord Clement-Jones: The flexibility is there to designate them or co-designate them under the Deregulation and Contracting Out Act, as you mentioned.
Richard Wronka: Yes, as we understand it. We have had fairly extensive discussions with the IWF already on its potential role in the new regime. We do not have the full set of answers at this stage. I would add only that it is a practical question from our side rather than a philosophical question. Where the expertise exists, we will do whatever we can to leverage it. We are fairly comfortable at this stage that the Bill facilitates that, on the whole.
Lord Clement-Jones: The conversations have started.
Richard Wronka: Yes.
Dame Melanie Dawes: Yes.
The Chair: Thank you.
Q271 Dean Russell: I have a few questions about future-proofing and timescales. On the future-proofing side of things—I cannot recall whether I said this in a private session or a public session—one of the things that came to mind is that, at the moment, you can screengrab someone writing something on Twitter, but you cannot screengrab something being said on Alexa.
Where in the Bill is the burden of responsibility in the future, and at the start, with regard to proving that something was said that was harmful? If you look at the Facebook metaverse, who knows if you can screengrab a virtual reality conversation that is going on? You definitely cannot screengrab an Alexa announcement. How is that being looked at? Do you have any thoughts on how it would play out?
Dame Melanie Dawes: How does the Bill relate to the metaverse? It is a bit early for me to answer that question. However, if the metaverse is about user-generated content, it is not entirely clear that it falls outside the Bill. It may well be in scope of this regime.
Dean Russell: I hope it is.
Dame Melanie Dawes: I am not sure whether it is something that Meta is factoring into its plans yet.
Dean Russell: I hope it is.
Dame Melanie Dawes: To that extent, the overall framework is quite broad. On the question of screengrabs, perhaps Richard wants to come in.
Dean Russell: Screengrabs are an example. At the moment, it seems to me that the burden of proof and the responsibility is on the user or on a group to say, “This has happened”, and not necessarily on Facebook or Twitter to say, “We’ve found this, and this has happened”, if that makes sense. I am interested in your take on that.
Richard Wronka: The underlying issue is really important; there is a responsibility on the companies in scope to be thinking about how they can provide evidence of wrongdoing, whether it meets the threshold for criminality or not, and to support their users when their users are making a complaint about something that has gone wrong. I do not think in the first instance that we would want to be in the business of setting out very prescriptive steps for services to do that. We would expect them to think about it through their risk assessment and then come up with the right approach that works for their service. There is likely to be a variety of different approaches. We would want platforms to be thinking about it and tailoring it to their particular service. You make a good point about Alexa and the limitations on what a user might do to record abuse or other kinds of harmful interactions that they might experience.
Dame Melanie Dawes: We need to empower the user to have a better ability to tell the platform when something has gone wrong and to be confident that their complaint will be listened to within an appropriate timescale. That will be part of what enables this Bill to take flight and make a difference, but it cannot be an alternative to the platforms themselves knowing what is going on. This is why I am so keen on them being required, rather than just voluntarily able, to co-operate with external researchers. We will need all the scrutiny that we can have on the platforms over the years ahead to hold them to account. Ofcom will play its part, but we cannot be everywhere seeing everything. The more we have a sense of collective endeavour, the better.
Dean Russell: With the announcement of 10,000 new jobs across the EU for people building the metaverse, which seems to be the next evolution of social networking, where you can physically see other people in that virtual reality world, my slight concern is this: if there are children in a virtual space and somebody sends harmful content into that space, or encourages them to watch harmful content, how can that be proven? At the moment, it is difficult enough, but at least on a Facebook post or on Twitter or on TikTok there is an element that you can see. I am very conscious that we do not want this Bill to be relevant for only three years; we want it to be relevant for many years.
Has there been much thought about that side of things—the burden of proof, and how proof can be created in those sorts of environments? I appreciate it is quite early days. Do we need either to be clearer on that in the Bill or to create more flexibility in the Bill to allow for those different types of online environments?
Dame Melanie Dawes: It is important that it is future-proofed. I am not sure whether we or DCMS have particularly thought about the latest announcements and whether anything needs to be done to make the Bill more future-proof in that respect. From my perspective, the services we have today, which we are going to be regulating, are still hugely important and are going to be around, remaining hugely important, for some time, so there is plenty to be getting on with. Maybe that is something we can take away to talk to DCMS about: whether anything can be done to make sure that we are as future-proof as possible, recognising that you can always come back after two years through the review that Ministers have provided for in the Bill, which is another way to deal with that.
Dean Russell: I would like to come to that in a moment, if I may. I believe Mark Zuckerberg said last week that what I think he called the killer use case was somebody being in a meeting with Facebook glasses on and being able to read content coming through while they are in a physical space, because that is the way we have seen the world likes to work now—lots of different inputs. That is all well and good if it is your calendar and your agenda, but imagine if you are a child wearing those glasses—children have iPhones now, so there is no reason why they would not have other types of devices—and there is user-generated content coming through bullying them while they are in a classroom or another environment. My plea to you is to start to think about that now and about whether there is anything you can share with the committee from your thinking on it. I am not trying to put you on the spot now, but please think about that.
I can genuinely see it. Who would have thought 20 years ago, or even 10 years ago, that people would be walking around with phones almost like comfort blankets, adults and children alike, because they cannot separate themselves from them? That may well become the case with other types of devices and therefore other types of inputs. Any extra thoughts or feedback on that would be greatly appreciated. I do not know whether you want to comment on that before I come to my next small point.
Dame Melanie Dawes: We are very happy to give it a bit of thought for you.
Dean Russell: Please.
Richard Wronka: We are already undertaking research on the implications of AR, VR and immersive technology. That is already under way. There is an opportunity there as well. If the current generation of the internet was designed without regulation in place, there is, hopefully, an opportunity for safety to be designed in at a very early stage for these processes. Clearly, time will tell whether that is actually the case, but there is a prize if we get it right.
Dean Russell: Absolutely. If you have any research or initial findings that you can share with the committee, it would be appreciated. You mentioned the two-year timescales and so on. Say this Bill came in on a Wednesday, would people see a difference on Thursday? I do not want to get all Craig David on it. Over the space of a week, if it comes in on the first day, how long before people go, “Oh gosh, we can really see the difference this Bill is making”?
My concern is that people are expecting, at midnight of the night this comes in, that abuse will disappear from the internet, children will be safe, and perhaps even that age verification will be in place. What is the timescale in your mind for when we would see the benefit?
Dame Melanie Dawes: Let me give you two answers. One is an optimistic one and one is an expectation-managing one, if I can put it that way.
The expectation-managing one is that it will take a while to bring in all the architecture of the Bill and see it put in place. It will take a while to get the codes produced and consulted on; that takes a bit of time if it is to stick. So it will take a little while after the Bill is enacted. We will do it as fast as we can, and we want to try to do as much as we can during 2022, partly to support parliamentarians through the passage of the Bill so that you can get a bit more of a sense of how some of our codes might start to look and so on.
The more optimistic answer is that I cannot really think of a piece of public policy-making, a piece of law, where the regulator has been supported to be ready as far in advance as we are on this. We are nowhere near Second Reading, yet we already have the resources and the plans, and we have produced quite a lot of published work. Because of the video-sharing platforms regime, we are already in conversations on age verification and so on. What we are aiming for is that it is not about midnight on day one; long ahead of that, we should be able to start to see some change, particularly working with the ICO on the Children’s Code. At the same time, it is true, as I said in the first half of my answer, that it will take a bit of time to get the architecture in place. As for enforcing against any company, that cannot happen overnight; it can only happen once we have built that architecture and had the conversations.
Dean Russell: Thank you very much.
Q272 The Chair: Once the regime is up and running, would you expect a major new entrant into the market to have worked with Ofcom pre-launch on identifying its response to the legislation and the codes of practice, so that it already has the relevant measures in place when it enters the market? Perhaps more likely, if a very fast-growing service reaches the scale of a category one service, would you expect it to be developing its approach to meet the requirements of being a category one company as set out in the legislation?
Dame Melanie Dawes: The Bill provides for those timescales, does it not, Richard?
Richard Wronka: Yes. The risk assessment duties apply across the whole regime. In that scenario of organic growth, we would certainly expect there to be a starting point. Clearly, where there is very fast growth, first, there is a role for us to make sure that we are spotting that as soon as possible and starting the conversations as early as possible. Some of it might be through our partnership work. Thinking about data-driven approaches such as looking at app store downloads and those kinds of things to give us an early signal will be a really important part of it.
On the transition from a service that has been fairly small, where the risk of harm might have been fairly contained, to a larger service with a large user base over the course of 12 months or so, we would have to act as quickly as possible. We would need to make sure that the company was thinking about how its risk assessment might need to change to reflect its growth, and that, as far as humanly possible, those changes were put in place before harm had occurred rather than after the event.
The Chair: We see at the moment on Facebook people creating groups that have 1 million people in them within a matter of days, sometimes hours. With initiatives like Donald Trump’s social media platform, it is not impossible to envisage a new entrant that, while smaller than the category one companies, could still be operating at significant scale quite quickly.
Dame Melanie Dawes: Yes.
The Chair: There is a final short question from John Nicolson, and then we are done.
Q273 John Nicolson: Thank you very much. I was interested in the evidence given by the Ofcom Advisory Committee for Scotland. It says that there needs to be much greater understanding of how online providers will be expected to respond when there are significant differences in legislation between England and Scotland—for example, the definition of hate crime. Were you concerned about that?
Dame Melanie Dawes: One of the things we will definitely be requiring of platforms that have to take account of legal but harmful material is that they are properly sensitive to national and even local or regional differences, including, in the UK context, differences across the union; that they understand local sensitivities and how language is used; and therefore that they understand what can cause offence and create harm. That will be one of the things that we will be requiring of them.
John Nicolson: In this particular context, you could be breaking the law to a more egregious degree in Scotland than you would be in England because of the concept of aggravation, which is included in a new hate crime Bill that provides for offences to be aggravated if they involve the persecution of particular protected categories. Do you think that both Ofcom and, presumably, the online providers have given sufficient consideration to this thus far? As I said, the Ofcom Advisory Committee for Scotland is concerned about it.
Dame Melanie Dawes: Yes, it is something we need to do some work on. This is one of the reasons why I support the legal but harmful provisions of the Bill. They raise questions of freedom of speech by their very nature, but it is much better for those concerns, which are there already because the platforms are making decisions like that about where they draw the line, to be within the framework, so that we can have some kind of democratic oversight of them, rather than allowing the platforms to do it on their own.
John Nicolson: Thank you.
The Chair: That concludes our questions. Thank you very much for your evidence this afternoon.
Dame Melanie Dawes: Thank you very much.