
 

Communications and Digital Committee

Corrected oral evidence: Digital regulation

Tuesday 16 November 2021

4 pm

 


Members present: Lord Gilbert of Panteg (The Chair); Baroness Bull; Baroness Buscombe; Baroness Featherstone; Lord Foster of Bath; Lord Griffiths of Burry Port; Lord Lipsey; Baroness Rebuck; Lord Stevenson of Balmacara; Baroness Stowell of Beeston; Lord Vaizey of Didcot; The Lord Bishop of Worcester.

Evidence Session No. 3              Heard in Public              Questions 17 - 24

 

Witnesses

I: Benedict Evans, Independent Analyst; Professor Andrew Murray, Associate Dean of LSE Law School and Professor of Law, London School of Economics and Political Science.

 

USE OF THE TRANSCRIPT

This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.

 



 

Examination of Witnesses

Benedict Evans and Professor Andrew Murray.

Q17              The Chair: Our next witnesses are Benedict Evans and Professor Andrew Murray. Benedict has spent 20 years analysing mobile, digital media and technology, has worked in investment banking, industry consulting and venture capital, and is now an independent analyst, trying to work out what is going on and what will happen next. What is happening next and horizon scanning are a big part of our agenda, so we look forward to your wisdom.

Professor Andrew Murray is well known to this committee. He is a former adviser to this committee, a professor of law with particular reference to new media and technology, director of the LSE Law, Technology and Society group and a fellow of the RSA. Thank you both very much indeed for joining us and giving evidence. It is really appreciated. I think we have an hour-plus of your time, so we will get stuck into the questions.

Q18              Baroness Stowell of Beeston: Hi. Thanks very much indeed for joining us. I am starting with a very simple but open question about what you see as the biggest challenges for digital regulation over the next 10 years. How can regulation keep pace with developments in technology?

It is perhaps worth me adding to that. We are very interested in hearing what you think those challenges are, but one thing we see as a big challenge, from the work we have done as a committee, is how we give regulators enough flexibility to respond to the pace of technological developments while Parliament still subjects those same regulators to adequate oversight and accountability. There is that sort of tension there, as we see it.

Benedict Evans: I have three answers, as a former consultant. The first challenge is that, when the car industry or the construction industry talk about 10 years, that is generally the next product cycle. When people in technology say 10 years, that is the edge of science fiction. We have some idea of what we might be doing then, but not really in any meaningful sense.

Something that has been on my mind recently is that Europe had auctions to sell 3G spectrum in 2000, which raised about €110 billion. Mobile internet did not really happen for a decade after that, and when it did it was not in any kind of form that anyone would have predicted. It was not the telcos. It was not AOL. It was not the ISPs. It was not the media companies. It was a has-been PC company from California and a search engine.

So 10 years is a very challenging timeline. Quite often, particularly in competition regulation in the last few years, the pace of industry change has been inside the cycle of regulatory and legislative reaction. It tends to be that the tech industry has these 15-year cycles. IBM was the centre of tech for 15 years and then, from 1980 to 1995, the PC was the centre of tech, which meant Microsoft and Intel. Then it was the web for 15 years and then the smartphone for 15 years. We are now at the end of one of those cycles and wondering what happens next. That question of speed is a challenge.

That feeds into a second answer to your question. Saying that we should regulate technology is very much like saying that we should regulate finance or cars, in that actually we do not. We regulate speed limits, road construction, light rail, and teenage boys getting drunk and driving too quickly. We may feel that General Motors is bullying its dealer network, but that has nothing to do with a congestion zone in London. That is to say that there are 15, 20 or 30 different questions within that. Some of those are tax policy, some are urban planning, some are criminal law, some are competition policy and some have conflicts within them. We are at the stage now where we are in 1975 and saying, “Oh my God, look what cars did. We must regulate these things”, but not quite digesting what kinds of problems they are and in particular what the trade-offs within them are.

Another thing that is on my mind a lot at the moment is the discussion about regulating, for example, Instagram: “We must regulate Instagram”. Okay, you go to the privacy regulator and he says, “You must make it very hard for people to move their data around, to share data and to move data to other places”. Go to the competition regulator and they will say, “You must make it as easy as possible to move data between different companies and different places, because that is how competition is enabled”.

Those kinds of trade-offs exist in every other kind of policy. Do you want to have more light rail or more cars? Do you want housing to be a wealth-building asset class, or do you want cheap houses? Pick one. We understand what those trade-offs are. When it comes to technology, we are often at the stage where my old boss in Silicon Valley, Marc Andreessen, used to say that people would just say, “Nerd harder”. “We don’t have maths to do that”. “Well, invent some”.

You have to understand when technology companies are saying, “We don’t want to do that”, when they are saying, “There are enormous trade-offs within that and we don’t seem to understand what those would be”, and when they are actually saying, “No, we can’t do that”. It is a little like going to General Motors and saying, “We want you to make the car safer”. What do you mean by that? Do you mean that there can be no crashes? They could do that if the car did not go more than five miles an hour and had a mattress on the front. Do you mean that you want to change what the crashes look like? Where would we set an appropriate level for making that decision?

Not only is this happening very quickly, but it will address things that do not exist now and will become huge in five years’ time. It will be in different countries, very often with quite different ideas of how this stuff should work. America has a first amendment that means it cannot do most of the stuff that people in Britain and Europe take for granted should happen. America has very different attitudes to just the mechanics of competition policy, what a competition agency would even look like and how it would work.

We have these tensions where an American company is being told to do this in Europe and that in Britain. The 35-year-old product manager sits in Menlo Park and says, “I can do any of those, but which?” When I was a stockbroker, I used to have a colleague who, when he was talking to a difficult client, would say, “I can buy or I can sell. Which do you want me to do? I’m a broker. I can do both”. That is sometimes the sentiment you hear from Silicon Valley: “I could do any of those, but you don’t seem to know which you want”. 

Baroness Stowell of Beeston: Whose job do you think it is to decide?

Benedict Evans: This comes to my point about regulating cars. Is a payday lending app on a smartphone a digital product or a financial services product? It seems fairly easy to say that that is a financial services product. If you are worried, for example, about AI bias inside image recognition systems or systems that decide whether you should get a mortgage or whether you are a bad credit risk, is that a technology question, a society question or a financial services question?

As a very obvious policy question, should an algorithm that says that people who have this social background seem to be worse credit risks be acted upon? What would the policy questions behind that be? Those are questions we have wrestled with for 50 years, but they get expressed in new ways in software and by new kinds of companies.

Professor Andrew Murray: I am looking at things slightly differently, from the regulator’s perspective, although I am not a regulator obviously. I see two key challenges for the regulation of digital space. The first is what you might call the convergence challenge. This is, again, cyclical, as Benedict said. Media go through cycles of convergence and we are seeing a lot of digital convergence at the moment. The key platforms are becoming gateways to a number of different parts of our lives—advertising, financial services, healthcare, news and information, and various other things.

I think we are going to see greater convergence in the way we access and use digital technologies in the next 10 years, a lot of it driven and powered by AI and so-called smart systems, which are quite expensive to produce and therefore likely to be driven by a few key companies. I think we are going to see this convergence happening.

The last time we had a convergence cycle, we ended up with Ofcom, of course, so we have seen media markets converge and regulators converge before to meet the convergent media market. I am not at all suggesting that we converge all our regulators into one super digital regulator or anything like that, but this is a challenge for regulators. The evidence that some of the regulators gave you last week acknowledged this—that they were all, essentially, stepping on each other’s toes a bit and were in the same fields but doing slightly different things. That is a mark of this convergent market.

The other challenge that they face is a legitimacy challenge. As the convergent market gets thicker, as more and more of our lives are mediated by these digital technologies, the regulators are being asked essentially to take on more and more responsibility. These key regulators, such as Ofcom, are almost operating like mini legal systems these days. They are across a number of areas, but they do not have the same accountability as courts of law do. They have a different type of accountability, because they are statutory bodies.

This legitimacy question may grow, especially if they are doing things such as imposing civil penalties of up to 10% of companies’ global turnover. There is a question of who is checking what they are doing. Who is watching the watchers? There is a legitimacy problem. None of this is a criticism of our regulators, who do a fantastic job. I am looking ahead to the challenges they face.

Going to your other point about the challenge of flexibility, the speed of change and the relative inflexibility of a regulatory framework against the speed of change in technology, the best answer is what you might call flexibility within a framework. That is the use of principles-based regulation and the employment of co-regulation. In the written evidence that we submitted, we pointed out that a relative success story is data protection law. I say “relative”, because it is also now coming under challenge by being asked to do too much, essentially.

Historically, data protection has been quite an effective and functional area of regulation in this field. It is because there is a clear set of data protection principles which the Information Commissioner’s Office employs, and then the Information Commissioner’s Office has a degree of flexibility within that framework to interpret and apply those principles. This is the best hope that regulators have—to have a degree of flexibility but also a degree of responsibility and accountability. We may come back later to the question of how to keep accountability mechanisms robust.

Baroness Stowell of Beeston: I suppose there is just one further point from me. Regulators have a tendency to apply the rules or make sure that the bodies they are regulating stick within the rules. They are less eager to apply judgment, because that makes the accountability on them that much more intense, which they almost have a tendency to want to avoid, particularly in the context of them being accountable to the public at large. How equipped do you think a body such as Ofcom is to be more principles-based?

Professor Andrew Murray: The scope that Ofcom covers makes this much more difficult for it than it is for the ICO, for example. The issue, in part, is that they are statutory regulators, so they are regulators with specific statutory functions. As a result, they are constrained by what the statutes allow them to do. As you say, their natural response is not to act ultra vires, outside their statutory framework.

You could create a slightly more principles-based structure for Ofcom, but the danger is that you would end up actually trying to replicate the common law system, wherein you are trying to give it the flexibility that the courts have but without the prior structure and experience of the courts, in terms of precedent and that kind of thing. Therefore, I am not sure that Ofcom is actually well suited to be flexible in this way.

Perhaps the best solution for Ofcom is to have a more flexible response from Parliament, so that when it needs further capacity or power, Parliament is there to respond to it. I am very wary of giving regulators freedom without accountability for several reasons. It is never a good idea in regulatory theory to have power and authority without accountability. Separately, statutory regulators are designed to respond to a particular statutory issue. The flexibility to pivot to something else without the necessary framework would be quite worrying.

Benedict Evans: To pick up on Andrew’s point, it has been fascinating to look at what is happening in regulation in the US at the moment. The US regulation reminds me of Voltaire’s line about Admiral Byng—that they shoot an admiral from time to time to encourage the others. What tends to happen is that you have a set of fixed regulations and regulatory agencies, such as the EPA, the SEC and the FCC, but everything else is left to a combination of the DOJ and the FTC. They pick and choose, they prosecute, and they have to find a crime.

You get into the situation where you have to prove that this business that did not exist and no one could have conceived of 10 years ago has broken the Sherman Antitrust Act. If it has not, or if a judge decides that it has not, that is just okay then. If it has, you fine them some enormous amount of money and then you are in a situation of shooting the admiral, because everyone else goes, “Oh, shit, better not do that”.

Meanwhile, 10 years later, another company that did not even exist then is doing some stuff that did not exist then, so it does not draw a lesson from it at all, which is why the Microsoft case, the IBM case and so on did not really change anything. By the time Google had come along, 15 years later, most people there had never heard of Microsoft, certainly had not heard of Netscape and had not tracked that case.

I suppose what I am getting at is that each of these models can be taken too far. Yes, of course you can say to Ofcom, “Just do whatever you think is best and you can do whatever you like”, which probably would not be very good. The counterargument is that you have a sort of framework of legislation that gives people a list of boxes to check and loopholes to find. As long as you can find the right loophole or check the right box, you can do whatever you want, which is sort of the situation that you get in the USA.

Baroness Stowell of Beeston: It is the sort of situation you get with regulators generally, really.

Q19              Lord Vaizey of Didcot: This is a fascinating session. I wanted to pick up on what Andrew was saying about the ICO principles, Ofcom and statute. That goes to the heart of what we are discussing in this committee with this inquiry. If you put these regulators together in a more formal forum, can they infect each other in a good way, in the sense that the ICO can say to Ofcom, “Look at us. We’ve got principles-based regulation and we’re much more flexible”?

That relates to the question I meant to ask, which is about horizon scanning. Do you think that, working together, these regulators can be better at, for want of a better phrase, predicting the future? That is actually a terrible phrase, because it is obviously impossible, but can they try to anticipate and, using their different approaches, future-proof themselves against what is coming down the line? That is my question. I might ask a follow-up.

Benedict Evans: There are two slightly different things within that. One of them is how you react when you look at a problem and ask, “Which kind of problem is that? Is that a finance problem? Is that a labour law problem? Is that a consumer protection problem? Is it a competition problem?” Sometimes it may be two of those, so you have two different regulators coming at it with different, maybe conflicting, objectives. That is, frankly, going to get worse rather than better.

If one thinks about the two great obsessions of technology at the moment, for the next 10 years, you could probably say that they are the metaverse and web 3. We could maybe talk about what each of those terms means. Each of those, again, means that many more things overlap. In particular, web 3, which encapsulates a lot of things we used to call cryptocurrencies, is in some ways financial services, but it is also social networks, competition, shareholder structures of organisations and many other things. There is a question of the culture and the organisational structure of these entities, but also the conflict.

Lord Vaizey of Didcot: That is exactly the point I wanted to make. Ofcom has been regulating things up until now. It has been regulating these things called telephones and these things called television channels. Web 3 will change that in the way the ICO already has to deal with this, because everyone uses data. Web 3 is making it harder for Ofcom to regulate things, because everybody will be using this new technology.

Benedict Evans: There is this phrase “software eats the world”, and in the end everything becomes a software company. I talked about cars earlier. It is a bit like saying that we have an electricity regulator, so the electricity regulator looks after what you are charged for your power and regulates what TV shows can be broadcast, because that is electricity. Obviously that is not a great outcome. In the end, most of this stuff will not be digital any more. It will just be payday lending but with an app. It will just be a social network. It will just be a retailer. It will just be some kind of media organisation.

The interesting challenge is always when things do not fit into any of those definitions. That is always a challenge in regulation. The regulation can bake in the existing market structures. You saw this a lot with the American regulators’ attitude to cryptocurrencies. They said, “That is a security”. Looked at from some angles it kind of is, and looked at from other angles it kind of is not. If you just throw the securities regulator at it, they will say that it is a security.

Professor Andrew Murray: There are a couple of very valuable points to pick up here. There is a viewpoint that, in internet or digital regulation and governance, GDPR is the law of everything. As soon as you start working with data and a person is identifiable, GDPR is the law of everything. We could almost hand everything over to the ICO and say, “This is a data question”, but of course that would be completely the wrong way to look at it.

In terms of horizon scanning, there are two issues. First, horizon scanning is a little bit of a fool’s errand because, although you have to do it, you are going to get it wrong. If you had brought all the regulators in here in 2018 and asked them, “What will be the big challenge of 2021?”, digital healthcare and remote access to GPs would not have been at the top of their list. Yet, because of the impact of the pandemic, digital access and digital healthcare have become a real concern for people. You are always going to be knocked by external events wherever you horizon scan. That is not to say that it is not worth doing.

Secondly, if you give someone a hammer, every issue becomes a nail. If you ask Ofcom what the issue is, it will say that it is a communications problem. If you ask the FCA, it will say that it is a financial services problem. If you ask the CMA, it will say that it is a markets problem. If you ask the ICO, it will say that it is a data problem. Things like the Digital Regulation Cooperation Forum help them to see each other’s perspectives, so they can see that it is not always that particular nail that they are trying to hit with the hammer. They can start to see into each other’s areas of responsibility.

I do not think that asking regulators to predict the regulatory challenge in 10 years’ time will produce the best solution. There are much better ways to horizon scan. It seems to me that the DRCF is actively not doing what we want here, because it is asking the regulators themselves what the future challenges are.

I saw from the evidence last week that the FCA has put £120 million into systems and processes to help it do this. For £6.5 million, the ESRC has funded the Digital Futures at Work Research Centre at the University of Sussex. This is funded for four years and is producing a lot of research on what the future of work will look like in the next 10 to 15 years. At LSE, we currently have a bid into Leverhulme for the centre for decision-making in digital systems. This is a 10-year project, which would cost £10 million. There is a lot of horizon scanning being done in universities and academic establishments that nobody is asking us about.

The regulators are spending a lot of money trying to replicate what is already being done by highly skilled people out there, the kind of people who, frankly, will not work for the regulators. If you are an academic, you are not the kind of person who likes working nine to five in an office. I freely admit that. It seems to me that we need to bring these things together more efficiently.

Lord Vaizey of Didcot: That is a really good point. That will form a central part of our report. On the back of that, because it is obvious what your answer is, I want to ask Benedict whether there are lessons from his days at Andreessen Horowitz. There is an argument that says, “Follow the money if you want to know what’s going to happen”.

Benedict Evans: I refer to my earlier point: it seems almost absurd to ask someone in technology what will be happening in 10 years’ time. That comes back to our conversation about flexibility. You have to create some means to respond to a completely different situation.

To go to a very micro point, if one talks to competition people, the great puzzle is that it is very easy to talk about Google buying DoubleClick or Amazon buying Zappos. The puzzle is Facebook buying Instagram. Here is a pre-revenue company with 13 people that, today, lots of people would say was an anti-competitive acquisition and should not have been allowed. Go look up Jon Stewart on the Instagram acquisition. He watches a clip and says, “A billion dollars of money?” At the time, there was an almost universal sense that this was an incredibly stupid thing. “How could they possibly have spent a billion dollars on this?”

If you were deep inside Silicon Valley, you looked at it and thought, “Yes, that makes sense”. If you are the CMA, how do you look at that and say, “A pre-revenue start-up with 13 people is a fundamental competitive threat to Facebook”? There is a whole argument about whether you go ex post or ex ante. You get the more populist voices saying, “You should just ban all acquisitions”, which is profoundly silly.

It gets you to a sense of ongoing conduct regulation and the flexibility to go in and intervene, over and over again, and say, “We’re going to change the way this business works” or “We’re going to require you to do X and Y over time”, as opposed to trying to get everything right at the moment they do the deal and then, five years later, turning round and saying, “We want you to sell that”, when it may be far too late. I began my career as a telecoms analyst and we regulated local loop unbundling and interconnection rates. We did not think that we could break up telcos. We did not break up BT. Instead, we regulated the local loop.

The challenge in that, which is also a challenge for Ofcom, is that most of those questions are relatively static. Broadband does not change that much from year to year. These markets are much less static and change a lot more, which requires that much more flexibility. You have to work out some way of setting a framework of principles that lets them react to something that nobody had thought of four years ago, and that is narrow enough but broad enough, which is obviously an easy thing to do.

Q20              The Chair: Andrew, can I go back to something you mentioned in relation to Ofcom? You said that part of the solution to the problem it has as a statutory regulator might be that it could go to Parliament from time to time, either to ask for specific powers to address an identified emerging issue, or possibly to get a societal steer in those sorts of areas that you described where there is a huge amount of judgment. How would Parliament do that? Is Parliament equipped to do that, because all Parliament does is pass laws?

Professor Andrew Murray: You really are at the root of the problem there. The pace of change outstrips the pace of legislative statutory development these days. Parliament does not actually have time to do all the law-making processes it needs to do already without saying, “Every six months you might need a new Ofcom Bill or ICO Bill to give them the authority”. This is the fundamental challenge at the heart of this.

There is a conflict between needing to have flexible regulation that effectively says, “We entrust you, the regulator, to go ahead and do what needs to be done” and, at the same time, having the required accountability and authority. Ofcom is a statutory body. It draws its authority from the Communications Act and subsequent Acts of Parliament, so the structure of Ofcom is such that that is where it looks for its roots. If you look elsewhere, you find self-regulatory bodies, such as IPSO, IMPRESS or the ASA, which draw their authority much more from their members and their membership, so they can act more quickly.

You want the flexibility of a self-regulatory body, but with the accountability and authority of a statutory body. I have been doing research into regulation and governance in this field for over 20 years and it does not exist. This is the problem. We are looking for a unicorn. The area we are seeking to regulate moves at a different pace from the regulatory framework.

The other problem with this is that regulation historically has mostly been built around sectors. There are relationships between the FCA and the financial sector. There are relationships between Ofcom and the broadcast and telco sector. The problem with digital—we will talk more about the co-operation forum, no doubt—is that it cuts across. It is not sectoral. It is a cultural change, so it affects everything we do. Trying to find that balance is exceedingly problematic. I do not know how to balance these two issues.

Benedict Evans: Picking up on Andrew’s point, one can divide the economy. Everybody is subject to general legislation. Everyone is subject to accounting law, criminal law and so on. Some industries have industry-specific regulation. There are regulated industries. Finance and telecoms are regulated industries. In some senses, what we are talking about is how kind of everything will now be regulated. What would that mean? What would it mean to say that you regulate digital? Where do those thresholds sit?

Say Barcelona has a big argument about Airbnb’s impact on house prices. Is that a digital question or is that a question about Barcelona city planning? Are Uber drivers employees or contractors? Is that a digital question or a labour law question? Part of this question about having a regulator of everything is where you split: “Is this a digital question? No, it’s a labour law question”. Is there some kind of problematic overlap in the middle where neither side quite understands?

The Chair: We may come back to the joining up of Parliament. It seems to me that we are talking about the need for regulators to work more effectively together. There may be a strong argument that Parliament also has to change, move with the digital times and become faster moving and more flexible, while not engaging and interfering. We may come back to that, after we have talked a bit about the regulatory co-operation forum.

Q21              Lord Foster of Bath: This has been an absolutely fascinating session. Thank you both very much for your contribution to it and for your written evidence.

My question is meant to be about how well co-ordinated digital regulation is and how effective the DRCF is in improving co-operation between regulators. I will go a bit further, if I may. I am a relatively new boy on this committee and I am bewildered now by the complexity. We have already been told today that, on the one hand, there is a vacuum in the regulatory space for all things digital, and yet we have just been told that, in trying to do something about it, we are looking for a unicorn. If we look at the issue of the value of the DRCF, we are told that it is a wonderful forum for co-ordinating. It is a useful administrative thing. On the other hand, it tends to deal only with instrumental things.

If we then look at the possibility of expanding the DRCF as a way forward, we are told that the problem with that is that the big four have crowded out all the others. As a result, it does not have the resources and, importantly, does not have accountability, yet we are told on the other side that if we expand it and do it too quickly, it will prevent it being able to do anything.

We are asked, “How do we deal with the conflicts that will arise?” We have had a very good example today. Competition people would want the sharing of data, privacy people would want no sharing of data. How can that possibly be handled? In answering the question of how well co-ordinated it is now and how effective the DRCF has been, can you tell me what the unicorn might look like?

Professor Andrew Murray: I feel I have just been set the most difficult exam question of my life. I will break it down into bite-size chunks so I can at least answer part of it.

Lord Foster of Bath: You are a bear of small brain.

Professor Andrew Murray: Yes. The first thing is where we are now with the DRCF. I may be a cynic, but I do not think the DRCF would be here if it was not for the report of this committee three years ago[1] that recommended the digital authority.

Lord Vaizey of Didcot: You can come again.

Professor Andrew Murray: The DRCF is a defensive response from the key regulators. They saw that if they did not do something, something would be done. I welcome the DRCF. It is a tiny step in the right direction, in that we now have the big four—the fourth was only invited in slightly later—sharing something.

There is a kind of framework that works, slightly. However, it is the worst of the possible frameworks we could have had. As far as I am concerned—this is also in the written evidence from us—it is accreting power in the big four. If you are not at the table, you are not part of the discussion. The big four, in their workplan, are talking about things relating to children and children’s rights, but there is no seat at the table for the Children’s Commissioner, who has a statutory duty to represent the interests of children in England and Wales.

The DRCF is better than nothing, in that we now have a minimal amount of integration, under sufferance I suspect. I cannot prove that, but I suspect it. We really need something much more effective that will work.

What is the unicorn? I absolutely do not know what the unicorn is, but I do know what might be the first step towards setting a unicorn trap, which is to expand the work of co-operation to ensure that all relevant regulators have an invite and a seat. It is also to bring about some form of accountability and representation for Parliament in the role of the DRCF, or whatever takes over from the DRCF, in the way the digital authority had.

The problem is that the DRCF is its own body. This is already proving controversial. The DRCF appointed its chief executive officer a few weeks ago. The person it has appointed is highly capable. There is no issue about that, but the first thing that a large number of people noted about her is that she came from Google. The instant response was, “This is just another circle of big tech, big regulation”.

A properly Parliament-overseen process under the public appointments system would perhaps have been a better process for that and would have said, “Here is somebody truly independent coming in through the public appointments system”. The DRCF is employing people through the members. It does not have its own employment structure or independence. It could be dissolved at any time by the partners, so it has no longevity. It has no roots.

In terms of catching our unicorn, we need to have a body that has a seat at the table available for any relevant regulator. In the written evidence, we made one observation, which was that it would have been very helpful in 2020 if a DRCF body had had the ability to invite in people from Public Health England, because suddenly they were very important in the digital space, and from NHS England and NHSX, on the sharing of NHS data. We need to have a more robust system, with a system of oversight and with all the parties being invited, not a small, self-selecting group.

I am slightly alarmed that, in the evidence last week, the witnesses suggested that what is happening in the Netherlands is a good thing. As far as I can see, that is a systemic poisoning of the system. The system we have here, where the big four get together, has now been exported to the Netherlands, where the big four have got together. The digital space is not just a digital space for the big four, so I feel that it is quite a protectionist move by the big four, because they saw the likelihood of external intervention if they did not do it.

Benedict Evans: You are suggesting that there is a group of large platform companies that are getting together to control the entire regulatory space.

Professor Andrew Murray: Yes, I suppose I am, but they are from the regulatory side, rather than from the tech side.

Lord Foster of Bath: Benedict, do you have anything further to add?

Benedict Evans: Not particularly, no. I would not know anything like as much about the internal mechanics of UK regulators as Andrew, so I will defer to his opinion.

Lord Foster of Bath: Do you have a vision for how we might trap the unicorn, what it might look like and what we need? It was you who said there is a vacuum.

Benedict Evans: I would go back to the comments that I made earlier. There is a multidimensional matrix of who is responsible for that, what the trade-offs are and how you work out who should be deciding what to do about that, and how they should co-ordinate with somebody else.

Then you have a whole other conversation, which is how whoever is deciding that would make that decision. How would they think about what the world will look like in 10 years versus two years? How would they think about the trade-offs involved in that decision? The first of those questions is an institutional question. The second is a process question, regardless of who should have decided on Facebook’s acquisition of Instagram. The other question is how on earth you would have worked out that that was going to be a problem and how you back-test your theory against that. Back-testing is interesting, just because we have not mentioned that. Whatever your proposed theory is, now apply it to 10 case studies from the past and ask, “Would it have worked then?”

The challenge in venture capital is that whenever you make one of these rules and back-test it, you miss half of the great companies. Basically, when you invest, you want a sole founder who is married to their co-founder, who has a PhD from a university and dropped out of their undergraduate degree. That is the way you get all your bases covered.

Q22              The Chair: I should declare an interest that is relevant as the discussion has moved on to whether the DRCF should include other regulators. I am an electoral commissioner, so that could conceivably come into scope in the future, but it is not what we are talking about here.

Andrew, on the basis of what you said, would you urge this committee to double down on its proposal for a digital authority, the attributes that we said it should have and its relationship with Parliament through a Joint Committee of both Houses, to which both the digital authority and the underlying regulators would report?

Professor Andrew Murray: Yes. In fact, I would even possibly encourage the committee to go slightly further. I would double down on the original proposal for the digital authority, whatever its name is. The name does not matter; it is the process that is important. The process has to be one that allows open membership and not an invite from the pre-existing members. It needs to have some form of reporting function and oversight from Parliament. That lends it legitimacy.

I thought the idea in the original proposals of having a channel into the policy-making side of government, in terms of what flexibility develops or is needed, was a very strong one, notwithstanding the problem of Parliament already being very busy with its business. Interestingly enough, since that report came out I might now go even slightly further. The environment is changing, and I do not mean the technological environment here. I mean the legal regulatory environment. It is changing so quickly that I cannot keep up with it and it is supposed to be my day job.

There are reports coming out almost daily in this area. Ofcom was a big beast. It is now going to be a massive beast with the addition of 300 more staff and the responsibility for online safety. It will dominate. It is now much bigger than the ICO or any of the other digital parts of the CMA or the FCA. It will be very large and dominant in this area.

There is also a lot happening on AI. Following on from the future of work report, there is now the proposed AI Act; I cannot remember exactly what it is called. There are going to be new AI policies and, I suspect, new AI regulatory challenges. There is the expansion of the work of the ICO into the Children’s Code and those kinds of things.

The area has become busier and more important, and the regulators are taking on more responsibility. I would possibly go even further now. Before, the idea was for the digital authority to convene the regulators. Now, maybe the digital authority needs to be given a role to oversee and co-ordinate the regulators, and certain specific functions around protection of civil liberties, human rights, market protections and those kinds of things. 

The Chair: On the relationship with policy-making, the online safety Bill, which will be before Parliament and is undergoing scrutiny at the moment, is really an Ofcom Bill. It is about tackling it through what Ofcom would do. It seems to me that there are a number of solutions to the power of the platforms that are not covered by this Bill, because it is an Ofcom Bill. Competition policy could clearly play a really important role in addressing the issues that we are trying to deal with in the online safety Bill, but it is not in there. At the same time, does government need to become more joined up? We join up the regulators, we join up Parliament, but surely policy-making at governmental level must be looking across the piece as well.

Professor Andrew Murray: For me, yes. At the moment, in government and policy more widely, there is the kind of “let a thousand flowers bloom” approach to finding our regulation pathway for digital post-Brexit. As I said, I cannot keep up. There is a lot happening. I do not want to talk specifically about the online safety Bill, because I know that is being scrutinised elsewhere in this building. In fact, the Chair is part of that.

It reflects what you would call a regulatory response to online safety and online harms. It is suggesting that you can use a regulator to control what people say, how people say it and how that is received. Speaking personally, my view is that if something is harmful to the extent that you want to regulate it, that is normally a question for the criminal law or perhaps the civil law, through defamation and things like that. It is rightly for the courts, rather than regulators, normally, to do it that way.

To use a regulatory structure to regulate speech and harm in this way is quite unusual. I would have normally expected Parliament to say, “The following is harmful speech and is made unlawful by an Act of Parliament”, as we have done previously on a variety of speech, and then to put this into the hands of law enforcement. Of course, I also understand that law enforcement is completely overwhelmed. There is a question of resource here, too. Equally, I cannot see how 300 Ofcom people plus the actions of platforms are going to produce the outcome that is wanted either. As I said, I do not want to talk about the detail of the online safety Bill.

The Chair: Benedict, is there room to join up policy-making as well as regulation?

Benedict Evans: The online safety Bill is fascinating. From an institutional point of view, you could say that you are trying to carve out and define a problem. You are not trying to boil the ocean. You are trying to carve out one quite specific problem, define a list of quite specific objectives and then define one regulator that is supposed to solve that.

There are two strands of objection to it. On the one hand, you start with your defined seven harms and you end up with everybody’s hobby horses, as Lord Vaizey suggested. You end up with 50 things in there. You get into these ridiculous arguments that you see with the DSA in the EU. Do you exempt journalists? You have to take down misinformation, except if it comes from Der Spiegel, and then it is okay. You have to take down misinformation unless somebody who got elected says it. This creates terrible confusion inside a social networking company: “What are we supposed to do? What rules are we supposed to follow?”

The other challenge, exactly to Andrew’s point, is that we spent 200 years working out how free speech works. We have an awful lot of mostly implicit ideas about what you can say in a pub, what you can say in the street, what you can say on television and what a newspaper editor can publish. There are not very many laws about what a newspaper editor can publish, but there are an awful lot of rules about that, most of which are implicit.

Then you get to these questions: “What if a venue refuses to rent you a room? What if there is only one company that owns all the venues?” A bookshop can refuse to stock your book, but what if there are only two bookshops? You get to a lot of things that we spent 200 years puzzling about. Then we say, “Now this 35-year-old product manager in Menlo Park is supposed to work it out”. There is this slightly bizarre contrast where people say that they are just a bunch of men-children who do not know anything about the world and have never left Silicon Valley, and that they should solve all these profound philosophical problems we have spent 500 years arguing about.

It is my joke earlier about the broker: “What do you want me to do? I can buy or I can sell”. “I could make it impossible for anybody to type the N-word into Twitter”. They could do that. Do we want them to do that? What does that mean? Who should decide that? I disagree with Andrew: I think that gets you to a sense of flexibility. Is this new product a public forum, a private forum, or not? To what extent is it public or private? Is that more like a restaurant, a street or a TV channel? What is a Facebook group with 10 users or 5,000 users? You want somebody to be able to look at that and take a reasonable view. It is quite difficult to do that in legislation.

Q23              Baroness Buscombe: In a sense, you are saying that digital impacts on everything in our daily lives. If it does not now, it will soon. The pandemic has been ample proof of how things that we perhaps did not think would be important are hugely important in the digital space.

Our previous witnesses this afternoon talked about the importance of having lawyers and regulators work with people who really understand technology. Is that an issue with the DRCF and the way it is set up? You have a bunch of regulators working together. I have to say—I speak as a concerned lawyer—that a group of regulators working together probably equals power creep as well. Should we not be concerned that it could lead to a situation where we just have more rules that will not be sufficiently flexible?

We also heard this afternoon that there is a consultation out there, which is suggesting that possibly the ICO should be given the power to decide whether something is fair. That used to be called the law of equity. That is quite concerning when you have a regulator that is then being given this enormous power, in a sense, to go beyond the traditional role of regulator, if I can put it that way.

This session is hugely helpful in causing us all to really question how this can work. The unicorn becomes even more distant, in a sense, in terms of the possibilities. Maybe it also means that we should not lose sight of the common law and the law of equity. At the end of the day, what you are talking about we have been thinking about for 200 years. Those are spaces within which our cultures have developed in a sort of nuanced way, and regulation is not often nuanced.

Benedict Evans: There may be three answers to that. Coming back to my speech point, there was a famous incident maybe five years ago where Facebook took down a post of the famous photograph from Vietnam of the young girl who had been napalmed. Somebody sitting in a business process outsourcing group run by Accenture or Cognizant in India had 150 pictures to look at and saw a naked girl. That gets you to the problem of false positives, false negatives and rigid rules. You can guarantee that whoever it was, probably in a room in Hyderabad, had never seen that picture before and had no clue what it was. It is a naked girl. It is a very easy decision.

That is the challenge where you have these sorts of statements, particularly from the EU: “You have to take down all terrorist content within one hour”. Okay, you might have five people who actually speak Arabic looking at that and spending a day trying to work out whether that is terrorist or not. Of course, what will happen is that they will just take everything and anything down. Everything will come down, particularly if you are going to fine them 10% of revenue where they have not caught it within an hour.

Maybe the general point is that you have to have people, not necessarily technologists but people who have actually spent a lot of time thinking about what those questions might be, how content moderation might work or what the issues might be with payday lending. Back to my analogy with cars, it took 75 years to make seatbelts compulsory. If I was to say to you, “General Motors should make a car that cannot crash”, everyone would kind of understand what the problems with that might be. If you say to Facebook or Apple, “You have to make encryption that is completely secure and that law enforcement can access”, far too many people in this building do not understand that that is rather like asking General Motors to make a car that cannot crash.

We did not grow up with this. We do not have that sort of innate understanding of what the issues might be. That gets you to the question of how you understand what the trade-offs might be and what is possible, as well as the question of the institutional structure. Not to speak for Andrew, but quite a lot of things coming from Andrew’s observations seem to suggest that you need some sort of third-party path of appeal. It seems to me that he is suggesting that you should have a court of appeal that applies to all those regulators. I do not know whether you agree with that, but that seems like a conclusion.

Professor Andrew Murray: In part, yes. Putting my lawyer hat on for a second, it strikes me that regulators almost sit outside the Article 6 process, the normal process of review and due process of laws. If we think of the common law and the law of equity, there are over 1,000 years of history of getting the process right. If a judge gets a decision wrong, there is an appeals process and we work all the way up to the Supreme Court. We trust in that process, and that is how we comply and accept the rule of law.

The problem is that the regulatory state is a much newer invention. It does not have 1,000 years of history that the common law and the law of equity have. The regulatory state is a little less than 40 years old. We are still getting to grips with how to make regulators responsive to the decisions they are making. It is actually extremely difficult to review or overturn a decision of a regulator. Most of them are protected by quite strict statutory systems that have very limited lines of appeal or review.

One thing that is missing is this kind of right to be heard in open court. I do not know where this fits into the structure of accountability. If you had some kind of tribunal of appeal for all the regulators that had a pathway into the court system, it would alleviate some of the concerns I have about regulatory accountability, especially given that the key regulators now, such as ICO and Ofcom, will be able to levy civil penalties that far exceed the ordinary fines that courts would apply day to day. Courts have unlimited fining power in theory, but in practice the first judge who fined £5.5 billion would be on page 1 of every newspaper in the UK. The truth is that the ICO and Ofcom probably have more effective fining powers than your average High Court judge has.

Q24              Lord Griffiths of Burry Port: It is the world wide web and it is an international order that we live in. We must always remember that anything we talk about here is set in a global context. I would simply like you to comment on the extent to which international co-operation is necessary or achievable to ensure effective regulation. What are the challenges that you identify to achieve this or to get to this? It seems to me that it is like a chain that is as weak as its weakest link.

Benedict Evans: First, I am sure there is a regulatory term for this, but there is a tendency to go to the strongest common denominator. If an economy that is sufficiently large that you have to operate there has set a strong rule, you sort of have to apply that everywhere. Even if notionally you do not, you will probably have to build the internal processes and thinking about the way your product operates in order to apply it everywhere. That gets you to a situation where the EU is, say, 10% of Facebook’s revenue. That does not mean that it makes the product only for the EU and ignores it everywhere else. It sort of gets applied everywhere else, so you have that factor.

Secondly, you have these very different attitudes to what the law should even be. I hear suggestions from the US that regulators there are happy for the UK and the EU to do stuff that the US constitution does not let them do. They cannot pass an online harms Bill. They just could not do it, at least not in any kind of recognisable form, which is why you get this endless circling around Section 230 as a displacement activity. We can. We and the EU can just pass that law.

Thirdly, everybody is entirely happy to have regulatory co-operation as long as everybody follows what we want to do. You can certainly see that in the EU. The EU clearly thought that everybody else would just do GDPR 2, so everyone would just have equivalency and there would be no problem. It was rather put out to discover that other people had different ideas about how data regulation should work.

How you align different incentives or cultural ideas of how it should work and what the objective should be is a very difficult challenge. I understand that there is a UN treaty on regulating car design. You try to line up all your emission and safety standards so every country does not have its own seatbelt laws, for example, but cars do not change very quickly, at least not any more. I do not know how you would do an equivalent of that, given how quickly all the stuff we are talking about is changing.

Professor Andrew Murray: There is a term to describe what you are saying. It is usually called the Brussels effect. Due to the centre of gravity that is the 400 million-plus consumers in the European Union, who are generally quite wealthy, most of the technology companies will align with EU and Brussels-based regulation. The Brussels effect is real and GDPR is an example of it. As you also point out, it does not always have the effect that the legislators intend.

It is clear that certain markets provide leadership because they provide access to commercially valuable markets. The EU is clearly one. The United States is clearly one. The UK is clearly one, so we can provide global leadership in this area.

It is also true to say that the internet is global but also slightly local. In the UK, we tend to forget how important language is, because we are used to the English language internet. In many other parts of the world, the internet is split into what they would call the wider English language internet and the local internet. They will have a locally regulated internet in a language that is used domestically, but then will also be aware of the wider English-language internet. We do not experience that in the same way, because we are English-language.

In terms of producing co-operation, the UK, the EU, the US and a few other English-language leading states are in a position to give leadership where we have agreement. The problem is that we sometimes fall into cultural or legal disagreements. As Benedict says, one key issue for the United States is that Section 230 of the CDA, as ruled by the Supreme Court, says that it is not for government to intervene in speech acts in the digital sphere. As a result, the federal Government have to be very careful about any attempts to intervene in anything that would interfere with speech acts. However, they can look to the EU, the UK or elsewhere to give leadership in this area.

As Benedict says, the major platforms do not like to build 20 versions of their product for 20 different places. They are going to build a version of the product that complies with the highest level of regulation and then roll it out across all the other places. In this sense, leadership can be given by setting the standards that we want the companies to have and then exporting them to other parts of the world. This is where regulatory co-operation is important. This is where, essentially, US regulators can backchannel to UK or EU regulators and say, “We would love to do this, but we can’t. If you can give leadership on this, we get the same result without us having to intervene”.

The Chair: Benedict Evans and Professor Murray, thank you very much indeed. That was a very illuminating session. Thank you for the written evidence that you sent us as well, which has been very useful to the committee. Have a good afternoon.


[1]              Amended by witness: I may have said “three years ago” in the moment in oral evidence. The actual number of course is “two years”.