Select Committee on Democracy and Digital Technologies

Corrected oral evidence: Democracy and Digital Technologies

Monday 16 March 2020

2.05 pm

 


Members present: Lord Puttnam (The Chair); Lord Black of Brentwood; Lord German; Lord Harris of Haringey; Lord Holmes of Richmond; Baroness Kidron; Lord Lipsey; Baroness McGregor-Smith; Baroness Morris of Yardley; Lord Scriven.

Evidence Session No. 22              Heard in Public              Questions 280 – 288

 

Witnesses

I: Kevin Bakhurst, Group Director, Content and Media Policy, Ofcom; Tony Close, Director of Content Standards, Licensing and Enforcement, Ofcom.

 

 



Examination of witnesses

Kevin Bakhurst and Tony Close.

Q280       The Chair: Good afternoon. In all the circumstances, thank you very much for being here; otherwise, it would have been a bit dismal. I will give the police caution. As you know, this session is open to the public. A webcast of it goes out live and is subsequently accessible via the parliamentary website. A verbatim transcript will be taken of your evidence and put on the parliamentary website. You will then have the opportunity to make minor corrections for the purposes of clarification or accuracy. Would you introduce yourselves for the record, and then we will go straight to the first question?

Kevin Bakhurst: I am group director of content and media policy at Ofcom.

Tony Close: I am Ofcom’s director of content standards, licensing and enforcement.

The Chair: The first question is from Lord Lipsey.

Q281       Lord Lipsey: You look likely to get the online harms business. Will that mark a very big change in what we see on our screens, or very little change, or somewhere in between? If it changes things quite a lot, do you think the industry will say, “Hooray, we have a level playing field at last”, or, “Boo, hiss, you are restricting freedom”?

Kevin Bakhurst: As you have seen, the Government say they are minded to give the responsibilities to Ofcom, so everything we say today is in that context. Will there be a significant change? We hope so, over time particularly. I think this will be an iterative process in what we see online. Will the companies and organisations involved welcome it? They may welcome some of it. They certainly will not welcome all of it. That would be my assessment. Some of the media organisations that are currently regulated by us would like to see a more level playing field, with some of the tech companies also being subject to some regulation.

Tony Close: If we were given the role, we would want to make a difference, so hopefully there will be some change. It will be a different regulatory regime to ones that perhaps we are used to. Given the size of the sector, the amount of content and conduct issues online, and the nature of the businesses in the online environment, it is likely that any regulatory regime will have to be new and innovative. It will be a bit different from the broadcasting regulatory regime that we are used to.

Our own research says that the public expect something to change and there is an appetite for greater protections in the online environment. Whether that be content or conduct harms, people want to be protected better. If the system is going to do anything, it should deliver against those things. I would like us to see improved systems, processes and behaviours by the platforms. It should deliver greater transparency, and over time it should continue to deliver more and more protection for consumers.

Q282       Baroness McGregor-Smith: Could you explain to us the skills and areas of expertise that you will need to develop as you begin to regulate our online platforms successfully? I am interested in the real detail of what those skills will be.

Kevin Bakhurst: I can kick off and again Tony can give a little more detail. Clearly, we feel that we already have some relevant skills in the companies, and the size of companies, that we regulate. As Tony explained, this will be a different challenge for us. We have been looking in some detail for a period at how we would need to change as an organisation.

We have been building up our recruitment over the last couple of years in, for example, certain areas of technology and AI. We know that as an organisation we need to build up our knowledge in digital and technology areas, from what we currently regulate. Increasingly, however, as we look likely to take on regulation of video-sharing platforms, and now that the Government have said they are minded to appoint us the regulator with regard to online harms, we have accelerated that process.

We have identified that a lot of the skills we need more of are in understanding technology, algorithms and AI. Those are areas that we have built up. We can only do it at a certain speed because, until we have the funding and confirmation that we are in that zone (if we get confirmation), there is a limit to how quickly we can recruit. We recently went out to recruit for our video-sharing platforms team, as soon as we were confirmed for that, and we had a lot of applications from very relevant candidates. That was positive, and we are quite encouraged in that field.

Tony Close: I would add a couple of things. This is less of a skill area, but it is also important that we build our knowledge around getting a better, more sophisticated understanding of the operational incentives of the platforms themselves; getting underneath the skin of their organisational culture and getting a better understanding of their business models. It is by doing that that we will have the best chance of coming up with effective regulatory interventions.

Baroness McGregor-Smith: How many people are we talking about? Give us an idea of scale. I look at all these huge tech companies with their significant levels of resource, and I look at Ofcom and I think, “Okay, we’ll have limited resources, but how many people will we have looking at this for us?”

Kevin Bakhurst: We have done a lot of modelling of how many people we feel we would need. Part of that depends on the scope and approach of the regime that the Government and Parliament decide on. Within the narrow scope that has been outlined in the White Paper, we might be talking about over 100. If the scope and some of the duties are wider, we could be talking about up to 300. That is our modelling at the moment.

Baroness McGregor-Smith: On the point you made about algorithms and trying to understand what they are doing, does that mean you would have people in Ofcom and then use expertise outside of Ofcom to advise you on top of that?

Tony Close: Yes.

Baroness McGregor-Smith: So you would have your expert advisers sitting alongside you, too.

Tony Close: It is really important, when thinking about the skills and expertise needed, to understand that it is not just about recruitment. It is partly about building up skills internally, but it is also about being really creative about who you engage with and what you can use externally, whether academia or sector knowledge, to help you carry out your role.

However, when thinking about the number of people you might need, it is important to recognise that you should not be duplicating the efforts of the platforms. The regulatory regime will be about focusing on oversight of the systems rather than matching them person to person in their huge content moderation regimes, where they have tens of thousands of people each.

Baroness McGregor-Smith: What can we learn from around the world about what others are doing to help with the way you look at skills?

Kevin Bakhurst: We have had extensive consultation/discussions for the last few years with some of our fellow European regulators. As you know, the German regulator, NetzDG, was one of the earliest to introduce regulation on hate speech in Germany. It has been very generous with discussions about what has gone well and what has gone less well. That has also been the case with the CSA in France, which is now well down the road on introducing legislation itself.

However, we have had discussions not just with those two organisations. We were until recently a member of ERGA, the European regulators, and we have had extensive discussions with a whole range of fellow members of ERGA about the approaches going on in their countries, particularly in Ireland. We have been working closely with the Irish regulator, the BAI, on video-sharing platforms, and for the initial phase of video-sharing platforms they will clearly shoulder quite a lot of the responsibility because some of the big companies are based in Dublin.

Lord Harris of Haringey: You talked about understanding the business models of the platforms. With some regulators there is almost a revolving door between the people they are regulating and the regulators themselves. Is that what you envisage happening, or are you intending to grow your own skills or find people who are genuinely independent? How are you going to manage that?

Tony Close: I do not think we are imagining a revolving door so much. Our recent experience has indicated that when we have advertised for jobs, since we have been publicly identified as the potential online harms regulator and been appointed the video-sharing platform regulator, people from platforms are now applying for roles in Ofcom. They are attracted to the idea of looking at this through the other end of the lens. There is a benefit in taking people from the sector, and people from the regulator going to the sector at different points in their career. I do not know if it is a revolving door, but I think it is more complex than just recruiting or training people. It is all those things in the right mix.

Lord Harris of Haringey: There has been a real problem in the cybersecurity space in that there are simply not enough people to go round, and public sector organisations, even quite well funded ones, cannot engage in the same sort of bidding war that some of the specialist private sector organisations have. Have you discussed this with government?

Tony Close: We have experienced it ourselves as we have attempted to build up a cybersecurity team at Ofcom. We have found that we cannot attract people just with a salary package, because we cannot necessarily compete with the private sector, but (and I hope this does not sound too corny) we can attract people who have values that align with Ofcom’s, and who are keen to align with that noble purpose of protecting people. We have been fairly successful in attracting people.

Kevin Bakhurst: In many ways, we need to be careful. We want enough experience from some of these companies, but we want enough other experience as well. We found this out when we took over regulating the BBC. We needed people who understood how the BBC worked, and we needed people with a whole range of skills to sit alongside that.

We have also found in some of the areas we regulate, including in broadcasting, that it is sometimes a virtuous circle. It makes Ofcom quite an attractive place to work, even if you do not have the noble aspirations Tony talked about, because you might think that it will add to your CV. People come for a couple of years and we get great value out of it. People go on and it works the other way round as well; people are interested in coming to us as our work expands, and there are other interesting parts of it.

Baroness Kidron: I was really interested to hear, when you talked about the business model, that you get under the skin. Do you see any challenge in Ofcom itself as a content regulator? It is a 2D problem. There is the content that you can look at, and the behavioural side, which requires you to think differently about where the harm might lie, who is responsible, and whether more than one player is involved. Do you have any plans within Ofcom to upgrade your own culture and understanding of this issue?

Kevin Bakhurst: The answer is yes. For Ofcom, this is honestly a challenge and an opportunity. If you are regulating the sector, you need to be able to move more quickly, you need to be more agile, and you probably need to work in different ways and recruit different types of people. This includes upskilling in some areas and gaining a greater understanding of the culture of some of the tech companies. We are lucky as an organisation that we have a wide range of skills. We can discuss with some of our colleagues in the competition group, who have worked in some of the big telecoms companies, the mindset, motivations and so on of very big multinational, commercial companies. We have some of the skill set. We are very mindful that it is an area that we would need to grow, of course.

Tony Close: We recognise that this is not a traditional content regulatory scheme and that to do a great and effective job you need to look beyond traditional content regulatory mechanisms.

Before we leave skills, could I offer up something from the team back at Riverside House? We are not starting the skills from scratch here. Over the last year or so, we have been developing new teams and a new data and innovation hub, which, if we are given the task, will help us to get ready, to regulate platforms effectively, to understand how to store and process data better, and to get under the skin of algorithms, AI and machine learning. I am sure I can get the team to provide information about its work once we are back at the office, if you would like that.

Lord Scriven: You have been quite generic about skills. May I press you a little more? What are the skills you seek other than skills in general AI or algorithms? What is the content of the skills required? Which ones are critical and which ones are giving you the most problems in doing the perceived work that you need to be able to do to regulate effectively?

Tony Close: I will try to give as detailed an answer as possible. Some of these things are about getting a better and broader understanding of the areas. To date, because our previous focus was on telecoms and broadcasting, naturally we have attracted people with skills from those sectors. We are not particularly strong on AI and data. Underpinning AI and machine learning is all about data. If there is one area where we can continue to grow, it is in better data analysis and a better understanding of data more broadly from the academic sector and understanding how data is applied to different business models.

Lord Scriven: Which is the critical bit, which is perhaps not causing you sleepless nights but is worrying you, that needs to be in place and which you could struggle with?

Kevin Bakhurst: I would say data analysis. We have built up a good level of knowledge across some of the areas, but data analysis is an area that we need to set up and recruit in. Frankly, if we were given the job, we would need to go into some of the big tech companies (they have already said that they would be open to us doing that) to understand their algorithms and the AI and machine learning they use. We would need people who understand more about how those work and, frankly, ensure that we are asking the right questions.

Tony Close: This is one area where we could usefully rely on third parties as well. I do not think it is just about packing Ofcom full of data analysts.

Baroness Morris of Yardley: That is good and clearly necessary, and will be helpful, but there is something in my mind that still makes me think there is a bit of a gap. The public sector is not good at managing to adjust to rapid change. We know that is a problem. You have had all this time to prepare. You will not get that once it gets going, because the people you are working with, I suspect, will want to stay two steps ahead of you all the time.

I therefore have a question, but I am not sure what the answer could be. For a body like Ofcom, which has to abide by rules, regulations and proper ways of doing things, what change would you need in the organisation to respond to the ongoing issue of other people’s poor behaviour in making rapid changes to leave you behind, if that makes sense?

Kevin Bakhurst: It makes sense. There undoubtedly needs to be some cultural change. We are not going to match-resource it, but we need to be well enough resourced. We need to have enough lawyers, economists and digital experts. We are very mindful of this. We have a new chief executive, Melanie Dawes, who has just joined us, and part of the discussion we have been having with her already in the first couple of weeks is about the organisational behaviours that we as an organisation need to change simply in order to be able to move more quickly and more reactively.

To be fair (and I have only been here three years), the last three years have already seen quite a significant change in Ofcom. We have been changing behaviours, changing the way we do things, and trying to be more agile. We work in a more agile way, which at the moment is quite useful. We have not gone far enough, and we need to accelerate that.

The other discussion is about where, if we were to recruit between 100 and 300 people, we would recruit them from and where they would work. Would some of them be in London and some of them potentially elsewhere in the UK, which is our preferred option at the moment? Ofcom is an organisation of 900 people. If you are recruiting a couple of hundred extra people, this is a great tool to drive change. When you take on these new responsibilities, knowing that you have to do things differently is also a great tool.

Tony Close: There are a couple of aspects I would add to that. Ofcom does not like getting things wrong. However, we may need to be willing to try things, certainly in the first instance, perhaps get them wrong, and then improve them and get them right. Linked to that, Ofcom does not like to take that much risk. However, we may need to balance our risk appetite when dealing with some of these new challenges.

The Chair: I would say as a compliment that my own experience of Ofcom’s findings has been universally excellent in the content area, but we are meeting in interesting times. In the last few days, literally hours almost, there has been a plethora of announcements from the social media companies about the way they will deal with issues responsibly, particularly disinformation, for example.

Are you prepared to set the bar higher and accept the fact that a lot of these things to improve behaviours could have been done a lot earlier than this particular crisis? Will that be your standard, or would you be prepared to accept that perhaps these are exceptional times, and therefore you will let them lapse back to some of their former behaviours?

I will give an example. I did not know this. Did you know that Facebook (it was certainly true a year ago) allows truly anyone to declare anyone else dead? Any Facebook user can pretend to be a relative or a friend and declare any other Facebook user deceased. Would you regard that as falling within acceptable realms of behaviour, or is it one of the things you would like to come down on hard and stop?

Tony Close: As a Facebook user, I would not consider that to be acceptable under the compact that I have as a user with Facebook.

The Chair: But you are the regulator.

Tony Close: It is worth pausing for a moment and recognising some of the good things that the platforms have done recently. The way they have responded to misinformation about the coronavirus has been effective and is to be applauded. It probably provides a decent model for how you would want them to behave more consistently, perhaps all the time, and it is to be welcomed. I was not aware of the specific policy, or lack of policy, that you are talking about. It does not sound like a great thing to me. I answer that as Tony Close.

The Chair: I am just making the point, and hopefully it is helpful, that you have been offered the opportunity by this crisis to set a new set of standards. Would you maintain those standards, or would you be susceptible to the argument that, “Well, that was then and this is now”?

Kevin Bakhurst: First, it is worth pointing out that at the moment we are not clear from the draft White Paper whether disinformation and misinformation will come within the duties of the new regulator. Having said that, we know from the contacts we have had with the big tech companies over the last few years, and from reading the community guidelines as they develop, that they have improved.

We had a very good meeting with Facebook recently. It knows that it has further to go, but there is no doubt that this is a constant period of improvement. Is it good enough so far? Clearly not. There are still things slipping through. It is one of the jobs of a regulator to ensure every time there is a crisis or an event that the tech companies learn from that and do better next time. It is in the very nature of what they do that they will never get to 100%, but they are not doing well enough across a number of areas, and I think they are conscious of that. They are doing better than they were, and they certainly understand that they need to do better still, for sure.

Q283       Lord German: If you are going to be the regulator for online harms, clearly other regulators (the Information Commissioner, the Advertising Standards Authority, the Electoral Commission) have an interest in some aspects of the same area. How do you propose, as the regulator in this specific area, to engage with other regulators? How different will that be from your current arrangement, where other regulators obviously have some interest?

Kevin Bakhurst: I can kick off, and Tony can probably add a lot more detail because he is already involved in some of these relationships more closely than me.

We have a number of existing and very good relationships with the other regulators, including the CMA, where we have concurrent duties in some areas. We have built a really strong, ongoing, working relationship with the ICO, which has resulted in the commissioning of a certain amount of research in this area. We have had very fruitful discussions about the new guidelines which the ICO has brought in with the age-appropriate design code, and about where there might be synergies and overlap, and potential issues with where its duties end and ours begin.

We have good relationships and regular working meetings between the chief executives of our organisations. We have been in quite extensive discussions about how, if we were given this job, we would need to cement that relationship further.

Tony Close: I would add a couple of things, mainly about how the relationships work and how they should change to be effective. At the moment, we have a range of different relationships. We have tried to increase the range of issues that we co-operate with other people on. We recently undertook specific research with the ICO into online harms, which proved to be very successful.

To make this work as well as it can, the relationships have to be a bit more future-looking. Collectively, we have to come up with a way of horizon scanning, so instead of ensuring that there are no gaps between us now, our focus is on ensuring that we can identify gaps before they emerge, or identify gaps that may be caused by the matrix of regulatory relationships. That has to be our focus.

Lord German: Currently, you have no letter of direction that tells you that you have to have a formal relationship with other regulators. Do you think you will need to define that relationship as being not just a meeting of the chief executives but a relationship with defined boundaries, which you will all have to work on, and you will have to work together within those boundaries? It will certainly have to be more formal, and a bit more structured than the current arrangements, which tend to be by agreement between you, because the number of other regulators will be greater than you have been used to in the past.

Tony Close: The short answer is yes. That must be the right answer. It depends on the arrangements that we have. I say that only because we currently have a very structured relationship with the ASA and a less structured relationship with the ICO due to the different regulatory relationships we have. A more structured relationship with all the parties with an interest in this area is likely to be beneficial.

Lord German: You mentioned the Electoral Commission. Have I missed anybody out? You have not really had anything much to do with it, although you have had some electoral work to do in relation to broadcasting. Will it be many more than the three? Are there others that you need to be engaged with?

Tony Close: There are likely to be other organisations that we would want to engage with, like the BBFC, which played an interesting role in things like age verification online, and we could learn lessons from the process it has been through. We have a relationship with the Electoral Commission, you are absolutely right, and with the CMA, the ICO, the ASA and others. It is important that we have a relationship with them all.

Lord Black of Brentwood: Just as a follow-up, managing all those relationships will depend a lot on the nature of the legislation when it eventually comes here. Are you talking to the DCMS about the structure of legislation? Do you see that as a process that you will be closely involved in?

Tony Close: Yes. Before the statement in February announcing that the Government were minded to appoint Ofcom as the online harms regulator, we were one of a number of organisations working with DCMS and the Home Office on what a future regulatory framework might look like, helping them to understand and bringing our expertise and experience to bear.

Since being named in the February statement as the “minded to appoint” regulator, our relationship with DCMS and the Home Office has become more structured. We are working with both departments on understanding better what great legislation would look like in relation to the different component parts of the scheme (how you might define scope, how you might define harms, that kind of thing) and what enforcement powers might look like.

Lord Black of Brentwood: The timing issue is important to this Committee. How far advanced is that work?

Tony Close: The Government have set out their timetable, and I think they intend to stick to it, coronavirus permitting. They will release a statement in the late spring, late May, and produce heads of legislation over the summer for scrutiny.

Kevin Bakhurst: The only thing I would add to that is there is definitely a sense of urgency about this. We have quite a lot of people working on our input and our advice and on the relationship at the moment. It is quite a sharp timetable. The indications are that the Government are definitely trying to push on with it.

The Chair: A couple of years ago, Kevin very kindly hosted a seminar at Ofcom. You may remember that one of the slides was an illustration of how a good multinational would look for weak spots in between the regulators. It was put to us earlier that there might be a SWAT team, made up of the skills of the team of the regulator, that could move in to where those susceptibilities were, if they emerged. Would you be in favour of that, or do you have the same concern that I put to you two years ago?

Kevin Bakhurst: The slide has definitely informed our work in this area. It would definitely be of concern to us. As Tony alluded to, there definitely has to be a degree of horizon scanning within the regulator, or by a separate body, to look at whether there are gaps between us that we are missing. Whether that is through co-operation or whether it is a separate body is a matter for Parliament and the Government. We are certainly trying to do that already in working with the other regulators. Equally, there is an important role in looking for areas of overlap, so that people do not feel they are being regulated twice in the same area. That is almost as important.

Q284       Baroness Morris of Yardley: Could we talk a little about digital literacy, which is one of your responsibilities? What is the state of media literacy in the UK now, following you taking on the responsibility some years ago?

Tony Close: It varies by group. Over the last 15 years, our research indicates a welcome increased level of comfort online and an increased level of awareness of the risks of potential harm online. However, the last couple of years have indicated that there are some significant gaps in people’s digital literacy. What causes me particular alarm is the fact that our research indicates that there is often a gap between stated levels of literacy and competence and actual levels, particularly in two areas that come out of our research. People’s ability to identify paid-for content or advertising is surprisingly low. Their ability or willingness to check or scrutinise the authenticity or truthfulness of statements is surprisingly low. Bearing in mind that when we do this work we are often reliant on people’s own stated willingness, it is probably lower than the research indicates. That is an area of concern for us.

Baroness Morris of Yardley: When you first took on this responsibility, it was a very different world, and I suspect it was about making people skilful enough to access government online and things like that. It is completely different now.

Kevin Bakhurst: Yes, it was very different.

Baroness Morris of Yardley: When you ask, “What is your level of literacy?” they might say, “Yes, I can send an email”, and, “Yes, I can get into the Government Gateway”. Have a lot of people not even got to the point where they know what they do not know?

Kevin Bakhurst: You are quite right

Baroness Morris of Yardley: The nature of the task has changed.

Kevin Bakhurst: Exactly. When we were given the responsibility a number of years ago, it was a very different world. We have been trying to evolve the way we approach media literacy and digital literacy in recent years. One of the first discussions I had with Lord Puttnam when I arrived here was about needing to do more at Ofcom. That was a message that we took on board, and we have stepped up our efforts there.

We have been doing a lot more about children’s media literacy. It is about getting much more into the granularity, as Tony touched on, of exactly what people understand about the type of platforms and search engines they use and the impact that the internet has on people’s lives. We have produced a huge amount of research about it. We are always trying to identify the new gaps, what knowledge would be better and what is the most useful research we can produce which the Government, organisations and schools can use to try to fill this space.

Baroness Morris of Yardley: One thing that people have said in previous evidence sessions is that we have a great media literacy programme. I have no idea whether that is true, because we have no system for evaluating that. When we develop it in that broad sense, whether for adults or children, we do not even have the structure in place whereby we can learn from each other and ensure that we are not just replicating poor practice.

I notice that you have the Making Sense of Media programme, and you talk in your evidence about working with other organisations, as you have just done now. To what extent do you see yourself as co-ordinating the activities of others and putting in some quality control, or leading it, to try to find out which is the best way, or whether it is both, and how do you bring those together?

Tony Close: There is also a question here about how many people are doing the same thing in this space. There is a legitimate and vital role for government in setting out a media literacy strategy and public policy. There is a legitimate and vital role for a regulator like Ofcom, alongside other regulators, to do everything it can to bring various sources of both information and activity together, and there are lots of people out there doing good things.

If we can add anything, it is about being able to provide a consistent degree of world-class evidence to help people, if they undertake initiatives, to understand the basis on which they should understand them, and providing people with a space to come together and discuss these things. That is what we are trying to do with our panel for Making Sense of Media and our broader network for Making Sense of Media, as well as ensuring, hopefully, that there is not too much duplication of effort, that people are putting their efforts in the right place and everyone is not running off and doing exactly the same thing.

Baroness Morris of Yardley: Are people happy to work with you on those terms? Do you find it easy to get them together?

Tony Close: Yes, including people from the industry we would be regulating. One thing that has changed in our own work over the years is that we have moved from a system where we just research how people are behaving, and their concerns, to a system where we also test that with them to ensure that we are getting under the skin and know as well as we can whether they are telling us the truth, or whether we know what they do not know, to quote your earlier comment back to you. Part of that is having a better developed array of research techniques whereby, in addition to asking people what they are up to, we can also passively track their activity so that we can see what they are up to.

Baroness Morris of Yardley: You are a regulator, but your responsibilities in this area are to promote. I am not being pernickety about words, but that is a big difference. You could argue that you do not have a regulatory function for those things, because you are doing the promotion. I suspect, because of the organisation you are, that behaviour goes all the way through. I am trying to test whether people might look to you as a guardian of quality in this area or whether, because your responsibility is to promote, it just does not come through as strongly. Does that make sense? It is the difference between regulating and promoting.

Tony Close: It is a point well made. I am not sure what the answer is, to be honest. We are trying to do as good a job as we can now in relation to media literacy. We think it is vital, not just because Ofcom has a duty but because it is a big component part of us carrying out an online harms regulatory job.

Baroness Morris of Yardley: It is crucial.

The Chair: In fairness to you, the original legislation was flawed, and it was not entirely clear who was supposed to do what, when and how. A number of us can take responsibility for that.

Baroness Kidron: There is a very persistent idea that because children have great facility in using digital tools, they are equally literate about the purposes and outcomes of that use. I believe that Ofcom evidence disproves that to a great degree. Would you say something about that, for the record?

Tony Close: That is right. Our own research, particularly into children’s use and attitudes, indicates a high level of functional savviness. Kids know what they are doing when it comes to navigating or using apps and other online sources. However, they often lack the critical understanding and critical savviness to make the best decisions, or even to understand the impact of their behaviour and their decision-making process.

Q285       Baroness Kidron: Thank you very much. It seems that the Online Harms Bill will include a certain number of codes of conduct. Could you outline what your process will be in developing those codes and where you think there is evidence of best practice in codes being an effective tool for a regulator in this space?

Tony Close: You used a crucial term there; they are a tool. The codes are not the be all and end all. They are not the outcome you are looking for as a regulator. They are a vehicle through which you seek positive and great outcomes for consumers. Our experience of developing many different types of codes over the years, whether they be the really prescriptive type of codes such as the Broadcasting Code, or the less prescriptive codes of guidance that we have for telcos and communications providers, suggests that there are some fundamentals about getting it right.

You have to have a great evidence base. Often that means researching different aspects and different harms or processes, things that might end up in the code. You have to consult widely. You have to consult with the industry that will end up being subject to the code, but you also have to consult more broadly with anyone who is likely to be able to help you understand the matter as well as you can.

You need to create a code that is evolutionary. It needs to be able to change and adapt over time, which often leads you to some broad principles at the outset that you can then flesh out over time. You need to find a way of balancing the often contradictory incentives or factors. There is a great example of that in the Broadcasting Code; we have to balance, at the heart of the code, requirements to secure freedom of expression with requirements to protect people from very bad things.

Baroness Kidron: Kevin, do you want to add anything?

Kevin Bakhurst: No.

Baroness Kidron: What part does enforcement play in relation to codes as a tool? I know you are a regulator, and it will come from the Bill, but do you as the regulator need proper hard enforcement possibilities to make codes matter?

Tony Close: There is a fairly straightforward answer.

Kevin Bakhurst: The answer is yes. There is no point being a regulator or having a code of practice unless you can enforce it properly. You need a range of tools to be able to do that, including strong information-gathering powers to get the information you want, even when people do not want to give it to you. You need a range of tools to enforce the code of practice, which starts at an encouraging level but ultimately needs to be backed up by some sort of statutory enforcement, which matters to big tech companies.

Baroness Kidron: Might I press you on that last sentence? It is increasingly considered that the big fines that we have seen are being included simply as a price of doing business and are absorbed by the big companies. So, it is behavioural change, design change or personal responsibility that needs to be available at the extreme end.

Kevin Bakhurst: Personal responsibility is probably at the slightly extreme end, as are ISP-blocking powers at the extreme end, but you may also need those for some platforms, and not necessarily the big ones.

This is an area we spend a lot of time on. Ultimately, the range of powers is a matter for the Government and for Parliament. The principle requires a range of powers to encourage better behaviour. Some of the tech companies are increasingly concerned about their public image and their image to users. It can be commercially damaging to them if they get criticised too much, particularly by an independent voice.

Ultimately, you need to be able to put sizable financial fines on some of these tech companies. The cumulative effect can be quite meaningful. We know already from our experience with enormous global telecoms companies that you can encourage better behaviour. You need a full tool kit, and in the end, you need a substantial hammer, really.

Baroness Kidron: Is the other end of that that you need to make very clear statements about proportionality and about how that waterfall goes, so that if you have a very extreme hammer it is not used to crack open a nut?

Kevin Bakhurst: That would be our general approach in regulation. Particularly with broadcasting standards, there are some lessons we can draw from that and read across. When a broadcaster makes a mistake, the first thing we do by and large is publish a decision saying that it made a mistake, it should not do it again, and what the problem was.

There are different grades whereby you can start looking at reading out a correction on air, imposing a financial penalty or taking the licence away. In broadcasting, taking a licence away is a really high-level punishment when you think about it in the context of freedom of speech. We try not to use it very often. We move through the early stages, and by and large we see a change of behaviour, but we are not afraid to move to the final stage.

Baroness Kidron: Finally, last week we saw in Washington the introduction of a Bill called EARN IT. There is a very interesting concept in it, which is that if you cannot be seen to earn your lack of liability, you will become liable, so it spoke to that circumstance. Is that the sort of thing we may have to look at?

Tony Close: I do not know enough about the EARN IT provision. I know that at the heart of the European debate about intermediary liability is the trade-off between responsibilities and protections. There is a strong case for maintaining broad intermediary liability protections, assuming that within the regulatory framework you can secure enough oversight and responsibility from the platforms. I like to think of it as a compact between the platforms. They maintain intermediary liability protections by assuring the various member states and the public they serve that they will oversee the content and conduct and the behaviour in a responsible way.

Baroness Kidron: From your point of view, the job of Parliament and the Government right now is to make that compact real, visible and understandable to all parties.

Tony Close: That would be sensible, yes.

The Chair: That is a very encouraging answer, but let me question one thing. There is no question at the moment, if you look at politics in the United States and Europe, that there are different approaches to regulation. Is it your sense from your initial conversations that the view you have just expressed is shared by DCMS and that we will have a more European type of regulation than, let us say, a softer-touch American version?

Kevin Bakhurst: I am not sure that I would characterise it quite in terms of European or American.

The Chair: It is a question of believing or not believing in regulation. My impression is that the European regulators believe absolutely in what they do. I have never gained quite the same impression with the United States.

Kevin Bakhurst: In that respect, we are certainly very encouraged by the approach, and we are working with DCMS on a process of regulation that we hope will work. It is undoubtedly very different from what you would see in America at the moment, although there are some shoots in places like California. We would be encouraged by that.

I have said before, if Parliament and the Government, and the regulator, can get this right, this could be the most comprehensive and effective system of regulation that I am aware of. It is an opportunity for us to try to do that, and in my belief it would deliver for audiences and consumers quite significantly.

The Chair: Thank you very much.

Lord Black of Brentwood: Following on from that point about the comprehensive nature of it, do you see a difference (and, if so, what is the impact of that difference) between protecting individuals and protecting society as a whole from online harms?

Tony Close: I think there is a difference, but it is not a bright line; it is a spectrum. I mentioned the misinformation about coronavirus earlier. That is a great example of something that, if mishandled, could have a detrimental impact on individuals and on society as a whole. When we think about that distinction between protecting individuals and protecting society, we have to be aware that there are blurred lines sometimes.

Kevin Bakhurst: I agree with what Tony says. Our approach to regulation is not focused on individual users per se. The idea is not that Ofcom is going to look at individual complaints; it is going to look at systems that deliver. I agree that it will deliver for individuals and for society as a whole. Whether that is in the area of protection against terrorist material or other illegal material (child abuse material), it will undoubtedly help to protect individuals and society. I do not think they are exclusive. I think it will deliver for both.

Q286       Lord Black of Brentwood: Of course, political disinformation is another example of where individuals can be damaged, but society as a whole can be damaged as well. Assuming that your remit is as set out in the White Paper (I appreciate that there needs to be consultation, and there is a long way to go before we actually see that in legislation), will regulating online ad libraries and imprints be part of the role of the new regulator?

Kevin Bakhurst: We have discussed this quite a lot in the last few days, partly in preparing to come here, because we know this is an area this Committee is interested in.

Lord Black of Brentwood: Select Committees have their uses.

Kevin Bakhurst: Yes, certainly, they focus minds. I am afraid we have to come back to the fact that the scope of this paper will be for the Government and Parliament to decide on. There is no doubt, in our view, that if Parliament or the Government were to decide to regulate this area, there would be value in all the regulatory tools you have just outlined, because they are powerful tools in trying to ensure that political advertising is as accurate and honest as possible. Some of the platforms already do this, but it is very variable across different platforms. Bringing some greater transparency and regularity to this would be a powerful tool, in our view.

Lord Black of Brentwood: As they are powerful tools, presumably you see the importance of having protection of freedom of expression in the legislation.

Tony Close: Absolutely.

Kevin Bakhurst: We have said very publicly, and we have said this all the way along, that we have a statutory responsibility under the Human Rights Act to protect freedom of expression. That reads right across our work. It is at the heart of what we try to do in broadcasting, and it will be at the heart of any approach that we take to this. We very much welcome the fact that it was made so clear in the recent draft White Paper that freedom of expression was a part of the Government’s plans.

The Chair: Thank you very much indeed.

Lord Harris of Haringey: I just want to follow this up. You talked to us about these being very individual harms, and you also talked to us about freedom of expression. In the COVID-19 crisis, there are platforms that seem to be promoting the message that if you eat a bulb of raw garlic you are absolutely protected against COVID-19. Apart from being unpleasant, that may well be dangerous to people. Yet if that is done in the name of freedom of expression, it is okay, is it?

Tony Close: No. Rights are all conditional, and carefully balancing those rights and responsibilities is the key to managing your platform if you are Facebook or managing the regulatory framework if you are Ofcom. In a situation like that, you would expect a platform, in the first instance, or the regulator, to judge the degree of freedom of expression that you need to attribute to a certain type of content, and then balance that against the potential harms and make a sensible balanced judgment about whether the public interest is served by removing that content, or by taking measures to minimise it, or whether the public interest is served by allowing it up. In this case, I would say that the public interest is served by removing it, because the harms outweigh any right to freedom of expression.

Lord Harris of Haringey: That is the sort of answer I like to hear. If the content is designed to undermine people’s faith in democracy or democratic systems of government, and is consistently doing that, is that harm covered, or is that too much freedom of expression?

Kevin Bakhurst: This comes back to the scope of the Bill and whether disinformation or misinformation is within the scope or outside the scope of any future regulators responsibility.

Lord Harris of Haringey: It may not be disinformation. It may simply be a whole series of materials that are designed to undermine people’s faith in the principles of democracy and government.

Kevin Bakhurst: This is the most difficult area for any regulator of any platform. As you have just described it, what is material that serves to undermine the Government? If you are going through a crisis and people question how you are approaching that crisis, is that designed to undermine the Government or society, or is it fair debate? The threshold on removing material, or stopping it, means there has to be absolutely clear harm in it; otherwise, it is classed as robust debate. Robust debate is an essential part of our society, and it should be an essential part of the online community as well.

Tony Close: Because of the inherent complexity of issues like this (I know the ad libraries are intended to deal with a very particular thing), systems that encourage transparency and greater accountability are a great place to start when it comes to this kind of stuff.

Lord Harris of Haringey: Will it be just you the regulator who determines that? You have said that these are very difficult questions, and so on. Would you be the one making the judgment, or would you wish to open that out?

Tony Close: In the first instance, it is for the Government to judge how this fits into any potential regulatory or legislative scheme. Parliament has a crucial role to play when we are dealing with things like political misinformation or disinformation. The work of this Committee will surely end up feeding into this kind of consideration. Ultimately, if any of this were in the scope of regulation, you would expect platforms in the first instance to take significant responsibility for behaving in a manner that was consistent with the public policy objective of increasing authentic political speech and minimising inauthentic political speech.

Kevin Bakhurst: The only thing I would add is that Ofcom makes these judgments a lot of the time in relation to broadcast services. We regulate hundreds of broadcast services, many of them with viewpoints that are very different from core viewpoints. From our point of view, the plurality of viewpoints and freedom of expression are really important. We make these judgments an awful lot in relation to hate speech and illegal content online.

I am not trying to skirt the question, but, as you know, these are really difficult judgments, and they require a lot of thought. I would not say that we get them right all the time. We try our utmost to get these judgments right. We have some experience of making these kinds of judgments, but we have a very vibrant broadcast environment where many viewpoints are expressed, and many of them are not acceptable to lots of people.

Tony Close: There is also a role here for ensuring the promotion of or access to trusted news sources. Part of the answer is not about whether the regulator or someone else takes stuff down, removes it or restricts access to it, but about ensuring that the public genuinely have access to a wide range of authentic truthful news sources.

The Chair: That leads us straight back to digital literacy and understanding truthfulness.

Q287       Lord Holmes of Richmond: Platforms shape the user experience through algorithmic recommendation, content moderation and design of the user interface. In seeking to understand how to get behind the user experience and the discussion online, which of these should Ofcom start looking at?

Tony Close: The pithy answer is all of them. I genuinely think it is difficult to disentangle these things, because they are an end-to-end user experience. Almost all platforms use them all in delivering content to the user and in shaping the user’s experience or journey through that platform.

We have already started undertaking some work. We carried out some research last year looking at the role played by AI in content moderation, and how it shaped not just the content that people receive but how it shaped their next choice and the content that was delivered to them. That has been enormously helpful. If we are to do a great job in this area, we have to look at all the component parts of that content delivery chain.

Q288       Lord Scriven: Finally, a very easy question: the underarm question. If the Government could do one thing to ensure that you are a successful regulator in this new remit, what would it be?

Kevin Bakhurst: May I answer with more than one thing?

Lord Scriven: Let us start with the top priority.

Kevin Bakhurst: Design a framework that is workable and that safeguards the key parts of successful regulation.

Lord Scriven: Your other bits?

Kevin Bakhurst: That leads into what the successful parts of regulation are. They are the independence of the regulator, the powers to do the job properly, and the proper resourcing to do it in a challenging field.

Tony Close: That sounds good to me.

The Chair: All I would say is that it has been a very good session. Our job is to ensure that we hold the Minister’s feet to the fire and try to provide you with the resources to create the regulator of your dreams, and our dreams. Thank you very much.