Joint Committee on the Draft Online Safety Bill
Corrected oral evidence: Consideration of government’s draft Online Safety Bill
Monday 18 October 2021
3.30 pm
Watch the meeting: https://parliamentlive.tv/event/index/348d8d04-3876-4220-a40f-906f210686fe
Members present: Damian Collins MP (The Chair); Debbie Abrahams MP; Lord Clement-Jones; Baroness Kidron; Darren Jones MP; Lord Knight of Weymouth; John Nicolson MP; Dean Russell MP; Lord Stevenson of Balmacara; Suzanne Webb MP.
Evidence Session No. 6 Heard in Public Questions 118 - 127
Witnesses
I: Mark Steward, Executive Director of Enforcement and Market Oversight, Financial Conduct Authority; Guy Parker, Chief Executive, Advertising Standards Authority; Michael Grenfell, Executive Director for Enforcement, Competition and Markets Authority; T/Commander Clinton Blackburn, National Economic Crime Coordinator, City of London Police.
Mark Steward, Guy Parker, Michael Grenfell and T/Commander Clinton Blackburn.
Q118 The Chair: We would like to start the second panel with some of the agencies that work on looking at the advertising sector and financial scams. Could I ask you the first question, Guy Parker? I do not know if you heard what Martin Lewis and Rocio said in the first panel, but I am sure you are familiar with the arguments they make. From your point of view, why do we find it so difficult to take effective enforcement action against scam ads that run, particularly on social media platforms?
Guy Parker: I think the scale of online fraud makes it extremely challenging for regulators; clearly more needs to be done to tackle online fraud across a broad array of fronts. Perhaps it would help if I put in context what we do and what our part is in tackling online paid scam ads.
We are the front-line regulator of advertising by legitimate advertisers. We think we deliver significant benefits to the public and to responsible businesses. There are one or two exceptions to our coverage, including financial ads by FCA‑authorised businesses, where the FCA is the regulator when it comes to issues such as misleadingness.
The ASA system works in significant part because of the buy‑in of the advertising industry, which is obviously not the case when it comes to the criminals behind scams. Although we play a role in regulating online paid scam ads, we do not see ourselves as the front-line regulator of scams. Obviously, we are not the right body to go after the criminals behind them.
In the limited role that we play, we operate a scam ad alert system in partnership with the online platforms and networks. That helps to disrupt some obvious paid scam ads online. We encourage the public to alert us to ads that might be scams, through a quick report form on our website. We also use a machine-learning model to help predict whether online ads that we have scraped from the web are scam ads, and if they are, we issue scam ad alerts. We find that around 90% of the time the platforms respond to our alerts by taking down the scam ads and, where they can, by suspending the scammer’s accounts. Our scam ad alert system is only one piece of the puzzle. Tackling scams is a global as well as a domestic challenge, for us and other regulators.
The Chair: Can I get one thing clear from the consumer’s point of view? How does it work? I have seen this before when there is a scam alert. Someone searches for a product, let us say on Google, and they discover it. Then an alert may come up on the search page next to the organisation or company they were looking for that would alert them to the fact that a complaint had been made and that the ASA had found that this was not a trustworthy product. Is that correct?
Guy Parker: No, that is not the scam ad alert system. That is one of our sanctions against companies that refuse to comply. Our scam ad alert system is an alert sent to the platforms and the ad networks telling them to take down scam ads or not to distribute them if they are asked to. It is behind the scenes within the digital advertising ecosystem; it is not consumer-facing.
The Chair: What triggers that? You receive a complaint and determine that the advert in question is in breach of the existing consumer protection legislation and, therefore, should not be able to run as an ad. It is the sort of advert where if it was running on television or in print you would warn the media owner not to accept any more versions of that ad.
Guy Parker: Exactly. A lot of the scam ad alerts are sent out as a result of reports from members of the public who have alerted us to scam ads that they have seen and screen‑grabbed. They alert us via a form we have on our website. We look at them. If we think it is an obvious paid scam ad, we send out a scam ad alert.
The Chair: How many scam ad alerts do you think you send out in a typical month?
Guy Parker: At our six‑month review, which was some time ago, we had sent out over 120 unique scam ad alerts based on about 1,200 to 1,300 quick reports that we had received from members of the public. Those figures will have increased, and I am happy to write to the committee afterwards to update you.
The Chair: You are saying that the companies remove about 90% of those scam ads.
Guy Parker: Yes.
The Chair: Have they confirmed that? They have demonstrated that to you.
Guy Parker: Yes.
The Chair: If they take an ad down when there has been a scam alert, will they proactively remove any copies of that advert presented in the same context?
Guy Parker: That is one of the things we ask them to do. I do not have data on how successful they are at that. We are in the process of looking to expand the scam ad alert system to cover other types of scams to try to increase its reach, but I go back to the point I made, or was beginning to make earlier. This is only one piece of the puzzle of tackling online scam ads. Scammers are very sophisticated technologically. They are often run by organised criminals, quite often based in jurisdictions with which the UK does not have very good relations. They are a challenge to tackle.
That said, there are also domestic criminals involved in scam advertising, and opportunistic criminals, and our law enforcement bodies ought to be able to tackle those. I know there is a big focus on that at the moment, but it will continue to take collaboration between us, statutory regulators, Ofcom, if it is given new powers under an extended Online Safety Bill, other players in the consumer protection space, and the industry and platforms and parts of the digital advertising ecosystem to make substantial inroads into this scourge.
The Chair: You are right to use the word “scourge”. As to what you said about the scam ad alerts, while being welcome, those numbers would suggest that it is a tiny, tiny proportion of the number of scam ads out there that are reported back to you. The Advertising Standards Authority really works on the basis of media owners, brands and advertisers willingly complying with the code. As you say, you have a group of advertisers, many of whom are criminals and many of whom are hard to trace, who have no intention of complying with the code. They will not self‑enforce, and there is no built‑in sanction against them. They are advertising in a way that you could not do in any other media because the media owners would stop you doing it, but here the media owners are not stopping them doing it. Is that not the problem?
Guy Parker: There have always been scams. There were scams before the internet, and scams that used advertising to lure people in, but you are right. The role of media owners is crucial, because they are a gateway, and platforms and other parts of the digital advertising ecosystem are the gateways for a lot of online scams, so more has to be done on that front to try to stop this at source, as well as going after the criminals behind the scams.
The Chair: The committee has heard quite compelling evidence about the problem of scam ads. You could say that scam ads are already a form of illegal content. They are fraudulent, and they are being run by criminal organisations, as you have said. The Online Safety Bill is in part to deal with illegal content online, to make a system whereby the companies have to remove illegal content. We have also heard evidence that on social media it is very difficult to distinguish between organic posting and advertising, because, as discussed in the previous panel, you just have to put a penny of investment behind your organic post and it becomes an ad, which, under the current legislation, would then be out of scope of the regulator.
In your view, why should the regulator—in this case Ofcom—be asked to distinguish forms of illegal and harmful content that, if posted organically, would require removal, but, with the minimal investment behind them to boost them to a bigger audience and therefore become an ad, should be outside the scope of the legislation?
Guy Parker: That is probably a question for Ofcom to answer, but you are asking it of me. I think there are good arguments for extending the scope of the Online Safety Bill to cover financial scams. You will be hearing those from the statutory regulators who are sitting alongside me. We are a self‑ and co‑regulatory body; we do not derive our authority from legislation. We do not lobby for changes to legislation such as this, but clearly there are good arguments.
I have a couple of comments on the idea of extending the scope of the Bill. If that is done, it will be important to do it carefully and to make sure that the definition of scam ads is got right; otherwise, there is the risk of opening the door to all paid ads that might be potentially misleading. That is being looked at, as you know, by the online advertising programme, and that may be a good idea too. It is an area where we have a very direct interest, because, as I said earlier, we see ourselves as the front-line regulator for advertising by legitimate businesses in all media, including online, with the one or two exceptions I mentioned, including financial promotions by authorised businesses. One of the things we should be aware of is the unintended consequences of not carefully extending the scope, if that is what the Government decide to do as part of—
The Chair: I am sorry to interrupt, but as to unintended consequences, we are talking about advertisers that are not FCA registered, in which case the complaint would go to the FCA and would be advertising in breach of UK consumer protection law. If they tried to do it on television, you would probably find against them and tell the broadcasters, or advise them, that they should not be running the ads. Indeed, Ofcom could even consider whether it was a breach of their broadcasting licence to run those ads.
I do not think the issue is not being clear about what we are talking about. There is an existing system that determines that ads that are fraudulent and ads that are scams should not run in other media; and it sounds like you can do it online as well, and you do. The issue is that the social media companies do not put in the effort or resources that are needed to be effective at removing these ads, which I appreciate appear on a much bigger scale than they do in other forms of media.
It seems that we have a really clear programme; we know what we are talking about, and you know what you are talking about, but it is not being applied online. In that case, if it was determined that any illegal content, be it an ad or organic posting, should be removed from social media and should be within the scope of the regulator’s remit, is that something the ASA would object to?
Guy Parker: No. We have 17 years’ experience co‑regulating TV and radio ads with Ofcom. If that is the way this goes, we know that the ASA system can complement and help out Ofcom, and Ofcom will need help if it is taking on more duties than it has in the current scope of the Online Safety Bill.
The Chair: Thank you.
Lord Knight of Weymouth: I want to be clear on this. We just heard Martin Lewis basically saying, “Don’t trust online ads”. Is it not in the interests of the advertising industry for this to be included in the Bill so that people can re‑establish trust in online advertising?
Guy Parker: Trust in all advertising is an important thing to the advertising industry, and it is obviously important to us, because we regulate UK advertising. We know from recent research that increasing concerns about scams are influencing the public’s trust in online ads, so, yes, tackling scam ads is an important part of maintaining trust in online ads.
Online advertising is a very broad area. It is not just the paid ads that we see; it is also the influencer ads that people see. It is companies’ own advertising claims on their own websites and social media channels. It is ads by big brands that we may be more likely to trust, because they are big companies and we are familiar with them. It is advertising by much smaller businesses, including more fly‑by‑night businesses that might be more inclined to push the envelope. It is a whole host. Online advertising is a very simple term for a very broad array of different types of communication to the public.
We have to get better at tackling scams, because they have clearly increased hugely in the last 18 months. Which? came out with its report this morning about the cost of the harm of scams, which I think it put at £9.3 billion a year. This is a problem that is getting worse, not better, so we have to look to do everything that we can to try to get better at tackling it.
Q119 The Chair: Thank you. Michael Grenfell, you said in your evidence to the committee that you were concerned not only that the Bill as it currently stands does not address this issue directly but that it may even undermine existing consumer protection legislation. Could you perhaps expand on that for the committee?
Michael Grenfell: Let me talk you through that. When we talk about that, we are talking not only about the very blatant advertising scams that you have been talking about, that Guy was talking about and what the very powerful testimony from Martin Lewis and Which? was about, but about having content on platforms that abuses consumers’ rights—for example, online reviews that are not wholly honest, or the suppression of negative online reviews, or, something Lord Stevenson will know about, the selling of tickets on platforms and not giving the right information, or celebrity endorsement where you do not disclose that it is paid for. It is not just the big, blatant, appalling things that we heard about before, but, on a slightly lower level, things that are really harmful for consumers.
We think that, as a matter of current law, the platforms have a liability to get rid of those. The term is professional diligence, and that is a concept that comes through the unfair trading regulations of 2008. What worries us is this, and I am sorry if I get a bit boring. Can we look at the text of the Bill for a second? Under Clause 9(3), if the content is just legal content rather than priority legal content, the obligation on the platform under this proposed new Bill, if the platform is alerted by someone that there is bad content, is just to take down that content. In our view, the current law is that they have a much bigger duty; they have a duty actually to monitor the content and on their own initiative to take it down. We worry that people will see that slightly narrower duty and think that it supersedes the existing law, supplants it and therefore weakens it. That is a bit convoluted, but was that reasonably clear?
The Chair: On the basis of existing law, which sets a requirement for proactive removal or not accepting, yes, exactly. I understand.
Michael Grenfell: That, as it were, is the minimalist point. There is a slightly larger point, which is this. What I have just said is our understanding of the existing law; it is based on the doctrine of professional diligence. We think it is right. Many of the platforms that we have dealt with have agreed with and accepted that. Others dispute it. We think it is important, without these things being pushed to litigation, that it should be really clear on the face of legislation. What we would like to see, either in this legislation or in a revision to the unfair trading regulations, or elsewhere, is that it be made clear beyond doubt that online platforms have a duty to monitor, and where necessary remove, content that would normally be illegal under consumer protection law.
Q120 The Chair: Thank you. Lord Clement‑Jones talked in the previous session about the screening of adverts. Mark Steward, would it not be simpler to say that unless you are FCA registered you cannot run ads in the UK targeting UK citizens with financial products?
Mark Steward: In effect, that is the position under the Financial Services and Markets Act, but that does not stop advertisers doing it, nor does it stop social media firms permitting that to happen. During the last 18 months, the period of the pandemic, we have seen the number of these ads increase markedly. We detect them, we warn about them and we speak to social media about them, and sometimes we are successful in getting the sites taken down.
But it is a bit like whack‑a‑mole, because it is apparent that what lies behind this phenomenon is well organised and industrialised and has created a means of getting the attention of online consumers at volume in a very cheap and efficient way. The pursuit of the perpetrators, which often leads us into places overseas, as Guy has mentioned already, is not the answer. The answer is increased prevention to stop it systemically happening.
We are very strong supporters of the Bill and strong supporters of an extension to paid‑for advertising, because the problem is most manifest in the paid‑for space, so it does not make sense for the Bill not to deal with the very heart of the problem, which is the paid‑for advertising space. It does not make sense for people who are not paying for advertising to be regulated by the Bill and for scammers to be able to pay for advertising and escape its clutches.
We are also very strong supporters of an approach that would obligate social media firms to create systems and controls, because we know from our financial services experience how valuable regulated systems and controls can be in preventing harm. We think that is an area where prevention is much more preferable than cure. The cure, unfortunately, is very difficult, expensive and frustrating, because it leads to perpetrators who are out of our reach overseas.
The Chair: It would seem similar to what Mr Grenfell said. Your principal call is for the enforcement of existing legislation in the online advertising space.
Mark Steward: Not exactly. There is certainly a role for downstream offence provisions, such as Section 21 of the Financial Services and Markets Act. There is certainly a strong role for offences under the Fraud Act. They have a very important role to play, but they are not the answer, because they require the breach to have already occurred. The answer must be prevention rather than simply the pursuit of wrongdoers. We can do the pursuit of wrongdoers, and we can win or fail in doing that, but it is much more efficient if systems and controls can be created to prevent it happening in the first place, and systems and controls at the gateway of social media seem to us to be the right answer.
Q121 The Chair: I know other Members wish to come in, but could I briefly ask Clinton Blackburn this? A lot of the enforcement work falls into the hands of the City of London Police. What is your assessment not just of the scale of the problem but of the growth of the need to investigate these crimes?
T/Commander Clinton Blackburn: Thank you. Again, I am a strong advocate of including paid‑for advertising, and taking it further by naming fraud as a priority. Certainly, since April 2019, and with Covid happening, more people have been going online and we have seen a real drive into the online space. We have seen a 16% increase in cases where the fraud is absolutely attributed to online advertising on communications platforms.
To give you some figures, online shopping at auction, for example, has increased in that time by 43%, and romance fraud by 15%, and again it increases year on year, with the majority being online; and in investment fraud we have seen a 16% increase. We are seeing a real drive to this area of criminality. As banks have tightened up methods of payments, the whole advertising piece online, on social media sites, is where people are propagating fraud. Between this year and the last reporting year, we have seen a 4% increase just in that short period alone.
Michael Grenfell: May I briefly make a point that follows from what we have heard? If one can go after the actual wrongdoers, that is great, but there are problems. One is the problem that some of them are outside the jurisdiction, but the other is what Mark called the whack‑a‑mole problem. We did that with fake online reviews; we went after individual perpetrators, but then others popped up. The real thing has to be that the platforms themselves take responsibility. They are capable of it; they have systems. We took an enforcement action. Believe it or not, you can trade in fake reviews on these platforms. Facebook agreed, and undertook to us that it would put in place an algorithm that detected that and stopped it. The platforms have the capability to devise methods for controlling it, and, as Mark said, the most efficient and effective way is to give them the responsibility.
Q122 Lord Clement-Jones: I want to follow up the CMA evidence. You gave us option 1 and option 2 in the evidence. You said earlier that you thought that it would be only with priority illegal content that there would be the duty to minimise. In a sense, I assume that that is what is covered by your option 1 whereby we could say that consumer protection—illegal actions under consumer law—should be treated as priority illegal content, and, lo and behold, that would align the Bill perfectly with existing consumer legislation. Is that what you are saying?
Michael Grenfell: As a minimal thing, we would like it said that the legislation is without prejudice to existing consumer protection law rights so that the problem of someone saying that it supersedes existing consumer protection law would go away. That is the very least we need.
We would like it to be on the face of legislation. One way we could do it is, as you say, to designate these breaches of consumer protection law as priority illegal content. Another way would be to amend another piece of legislation, such as the Consumer Protection from Unfair Trading Regulations 2008. Either would be good. A difficulty of using this piece of legislation and designating it as priority illegal content is that, at present, Ofcom is to be the enforcement authority. Although one would completely trust it to do that, it will have lots and lots of things to do, some of them bigger than the kinds of things the CMA looks at, and it might not prioritise them. Either you could make other agencies such as us and trading standards have concurrent powers in respect of priority illegal contents that were breaches of consumer protection law, or you could put it into another piece of legislation.
Lord Clement-Jones: That is extremely interesting and salient, because something that comes up time and again under this Bill is the crossover between regulators, potentially, in the digital area; of course, we have the Digital Regulation Cooperation Forum and so on, but I will come back to that. Does the FCA have similar nervousness about the impact of the Bill on duties and illegal aspects of the financial services field, so to speak?
Mark Steward: No, we do not see potential for conflict. In fact, if anything, we see how it can work in a very complementary way to the obligations that we enforce under FSMA, particularly—I suppose especially—if the Bill is amended to include paid‑for advertising, and some of the offence provisions that we deal with in FSMA are designated as priority illegal content for the purposes of the Bill.
Lord Clement-Jones: How are we going to deal with crossover between the regulators? Do we need a duty of co‑operation? Do we need to include other regulators within the scope of the Bill in terms of concurrent jurisdiction, if you like? Michael, do you want to start by answering that?
Michael Grenfell: There is never any harm in a duty of co‑operation. As you said, there is now the Digital Regulation Cooperation Forum. There is the UK Regulators Network. There are lots of mechanisms for co‑operation. I have to say that I do not think duplication is really the problem. The problem is things falling between the cracks. A few moments ago, I made the slightly narrower point that if you load platform liability for the smaller kind of consumer protection breach—not the big scam adverts that we were talking about earlier but the fake reviews, the misleading information and the celebrity endorsements that are hidden but are really paid for; those kinds of harmful things—put it into this Bill and load it all on to Ofcom, with the best will in the world, Ofcom, which will have loads and loads to do, will not necessarily regard that as the No. 1 priority. You might want to give parallel concurrent powers to other regulators too, to consumer protection agencies including us, to enforce those bits.
Lord Clement-Jones: Is that your perspective as well, Mark?
Mark Steward: Certainly in relation to the areas that we have jurisdiction over—financial promotions—we will do what we are required to do, because we want that to be a safe place for UK consumers to operate and invest in.
I also agree with Michael that, as economic regulators, we are used to working together. We work together, and we work very well too with the City of London Police and other law enforcement agencies in the country. In the particular context of the Bill, there are provisions that relate to co‑operation between the regulators. We would want to make sure, and check, that the Bill was really clear in allowing information and intelligence to be shared between all the regulators on a mutual basis, because restrictive gateways can sometimes prevent us doing what we all want to do: we cannot supply the information that would allow it to happen, and we cannot receive the information we need that would allow us to act. It is really important that gateways consistent with the purposes of the Bill are completely open for the regulators.
We would also be happy to be consulted by Ofcom on the creation of codes that apply to the work we do in our jurisdiction. I think that would be enormously helpful in binding the regulators together and avoiding the trap that Michael mentioned—that things fall between the gaps.
Michael Grenfell: I second that. The gateway to information exchange is absolutely critical.
Lord Clement-Jones: Can I bring in Guy Parker? What would be the impact of the Bill on what the ASA does in a non‑statutory way, so to speak?
Guy Parker: It would depend what it said if it was extended, but we would want to carry on operating our scam ad alert system, and we think that we could under the Bill, even if it was extended. Michael is exactly right that, even as currently drafted, the Bill gives a lot of duties to Ofcom and it is going to need help.
We want to carry on being the front-line regulator of advertising by legitimate businesses, so to the extent that the Bill as is, or as amended with an extended scope, covers that type of advertising, we want to carry on helping. We already do a lot of that. We have worked with the CMA on fake endorsements, influencer ads that are not properly disclosed and secondary tickets, and obviously we co‑regulate TV and radio ads and VOD, and soon video‑sharing platform ads, with Ofcom. We know we can work very well with the regulators. We increasingly work closely with the FCA on financial‑related issues that sit just on the edge or just outside its perimeter.
Frankly, whatever happens with the Bill, we want to carry on doing that; we are going to carry on doing it. We are doing it way more now than we were five years ago, and, whatever happens with this Bill, we are going to be doing it way more in five years’ time, because that is just the way of the world, particularly with the challenges of the information revolution that we are going through at the moment.
Q123 Lord Stevenson of Balmacara: Thank you very much. First of all, could I thank you, Michael, for the name check in relation to ticket scams and for reminding us that we need to think very widely and imaginatively about this area? I think you make a very good point about that.
My primary question is for Mark. In the Financial Services Bill recently, we spent a lot of time talking about and debating an amendment to the powers of the FCA in relation to a duty of care. We also have the possibility with this Bill of having a duty of care. Do you see us running on parallel tracks on that? Can you perhaps set out for us where you have got to on where the FCA is in trying to make that work in practice, and do you think it will have an impact on the arguments that we have been having today?
Mark Steward: The way I put it is that they are slightly different things. What will be most practical and useful in this context is the power that the regulator will have, perhaps in combination with other regulators such as the FCA, to create standing obligations that require gateway checks and other systems and controls over online advertising, specifying what the standards are, so that it is very clear for all the different platforms and social media companies exactly what they need to do, in a way that is enforceable.
The challenge of something that simply requires firms in this space to exercise a duty of care is what that really means in particular circumstances. The value of the Bill as it is framed at present, especially if it is extended in the way we think it should be, is that it will provide significant content for the platforms themselves, so that they know what they need to do and what standards they need to meet, and that those obligations will be enforceable. That is a much more useful and practical way of addressing the problems we are trying to fix than something that might be open-ended and not sufficiently clear.
Lord Stevenson of Balmacara: It is good to hear that. I recognise what you are saying and I see where you are coming from, although maybe that is because, in a sense, you have been thinking about this for so long that you have been able to work out in advance where you would go when you finally have the powers. Does that not lead you back to the same problem that Michael Grenfell raises, which is that unless you are able to make the social media companies themselves pick up the pass, the ball will be dropped at that point, because—excuse the metaphor—they are going to say, “We are merely conduits”?
Mark Steward: That is an issue that the Bill needs to overcome. If the Bill is enacted and imposes powers on regulators to create codes that can be enforced, they will have to find a way to comply. The consequences of non‑compliance, though, are really interesting questions.
Lord Stevenson of Balmacara: You say “interesting” in that curious way that people do. You mean in fact that we have to face up fairly and squarely to the question that the social media companies are responsible for the content they contain if that content is illegal under the Act or under a code that is sufficiently robust to require sanctions to be taken.
Mark Steward: I would put it slightly differently. At the moment, print media has obligations that it complies with that are not being complied with by social media companies and platforms. That does not seem like the right outcome.
Lord Stevenson of Balmacara: Michael, would you like to come back on that?
Michael Grenfell: I completely agree. That is exactly right.
Lord Stevenson of Balmacara: Thank you, Chair.
The Chair: Thank you.
Q124 Baroness Kidron: Thank you all. I want to pursue what you just said, Mark. I am sorry to focus on you. Earlier, you spoke about systems and controls, and just now about a code of conduct under whatever regulatory regime, whoever has the powers, whoever it lands with. Can you extrapolate a couple of things that you would like to see in that code of conduct or in what those systems and processes would deliver, without going into the internecine details of the codes, but some of the principles that you are looking to get through that process?
Mark Steward: I did not mean to say anything inconsistent; I am sorry. I was trying to articulate the way the Bill talks about the development of codes, which appears to be a way in which the regulators could identify areas where social media firms and platforms need to design their own systems and controls that will avoid harm occurring. Those systems and controls might include something like an obligation to ensure that anyone advertising a financial promotion is authorised by the Financial Conduct Authority, or it might involve an obligation to do some checks at the gateway where you have information that may be inconsistent or contradictory.
We have loads of examples of cases that we followed through. One recently involved a company registered with Companies House. It was in strike‑off action, so it was about to be struck off by Companies House. That company was advertising a financial promotion on a social media platform; it was searchable. The person who placed the ad appeared, from the work we did, to be situated in Dubai, but all the contacts for that person referenced back to Panamanian telephone numbers that were registered with a virtual private network, so you could not go behind it. On its face, at the gateway, when you were processing that ad through your system, you would say that the information did not add up.
Baroness Kidron: Bearing in mind that some of these companies derive their revenue primarily from advertising, would you say that is little more than product safety?
Mark Steward: I think it is trying to design a way in which you are filtering the volume of material that appears online and on people’s searches in a way that promotes safety for users. That is how I would characterise it. That obligation has to sit somewhere in the system, because at the moment it is unfiltered. At the moment, the only filter for the industrialised production line of these ads is whether they are paying the fee to the social media company. That does not appear to be the right filter, given the amount of harm this can cause and is causing.
Baroness Kidron: Thank you. Commander Blackburn, I want to ask you something about resources at your end. I agree with Michael about the safety‑by‑design element and dealing with it upstream, but ultimately you will always end up with more than you can manage. Do you have sufficient resources, and have we even considered the enforcement resources in the Bill?
T/Commander Clinton Blackburn: That is a really good question. The struggles in policing with investigating and dealing with economic crime are well documented. As mentioned earlier, economic crime is estimated to account for 40% of all crime, and we are looking at a potential growth rate of 25% by 2025. At the moment, only around 1% of policing resource is matched to economic crime. Obviously, there is a big spending review bid going in to try to meet that demand and cut off that growth. It will be dependent on the spending review bid.
Whether we would have extra resource to deal with what we are dealing with here, which is what the Bill is suggesting, would depend on the process that was to be established as a result. In some of the work that we are already doing, certainly as the national lead force, we are taking down some of the websites, hundreds of thousands of them at a time, but that is a small dent in a massive ocean. Resourcing is a very good question. Policing is under-resourced in economic crime investigations per se, and obviously as part of the spending review bid we want to try to turn that curve.
Q125 Darren Jones: Mr Grenfell, I think I understood you correctly earlier when you said that there is already existing provision in consumer law for some of these scam ads to be monitored and removed. If that is the case, why is it not happening?
Michael Grenfell: What I said was that there is a concept of professional diligence in existing consumer law. What that means remains to be defined. Our strong view is that it means that platforms are liable for online content. What does liable mean? The exercise of professional diligence means that, to the extent that they can reasonably monitor content, they should look for it and root it out.
Without being a great technical expert, we know that the platforms are inventive, creative and good at devising algorithms and other mechanisms. They showed that they were able to do it in the example I gave earlier about Facebook and the online trading of fake reviews, where they put in an algorithm. Why they do not do it at present is not really for me to answer; it is for them. I guess, as with all these things, there is a bit of expense and effort in all that, so there needs to be pressure, including legislative pressure, to ensure it.
Darren Jones: I think you said that provisions were already in the unfair trading regulations. The CMA is the ultimate enforcer of that. You said that it is for the technology companies to do what they are not already doing. In bringing forward this new legislation, what lessons are to be learned if there is already existing provision that is not actually resulting in a change?
Michael Grenfell: The CMA takes enforcement action applying the duty of professional diligence, and we have done so across a range of things, including false sales of tickets, disguised celebrity endorsements that are really advertising, and so on. That is our view of what constitutes professional diligence. Most people would accept that we are correct, but there are some who would push back and say that it is an over-expansive view. We want two things. The positive aspect is that we want the legislation, on its face, to put it beyond doubt. The negative aspect is that we are worried that the way Clause 9(3) is drafted might even diminish what is already there, and it needs to be made clear that it is without prejudice to existing provision.
Darren Jones: Understood. Who is pushing back on the definition of professional diligence?
Michael Grenfell: It would be inappropriate for me to discuss individual cases that are not in the public domain. Most accept, but there are some who dispute.
Darren Jones: There has also been a bit of narrative around building a regulatory system that might entrench the monopoly status of the big technology companies, and that that would result in it being anticompetitive for challenger brands. Does the CMA share that view, or are you not concerned about that?
Michael Grenfell: I am interested to understand that point fully. The CMA in its competition role is concerned about the market power of some of the big platforms. We have done a market study. We are taking various pieces of competition enforcement action involving Facebook, Google and Apple. We have recommended to the Government, and I think the Government are proposing, to legislate for a digital markets unit that will have a pro-competition regulatory role to ensure that platforms that have market power do not exploit that market power, but I am not—
Darren Jones: I will explain. The concern is that if you build up a system that requires lots of additional staff or expensive lawyers and compliance people, or whatever it might be, and you are a start-up tech company trying to challenge some of the big companies and the regulation may also have to apply to you, it might be anticompetitive, because you cannot get off the ground. Has the CMA taken a view on that or not?
Michael Grenfell: It is a good and interesting point. Of course, you can have de minimis thresholds. You can make it apply only to those that have market power. For example, the—
Darren Jones: I do not mean to interrupt. I am just asking whether the CMA has taken a view on that in respect of this Bill. Has it thought about that?
Michael Grenfell: No, we have not taken a view, but I will offer my view now, if I may. In the context of the digital markets unit, there are more onerous obligations on those that are to have strategic market status, which is broadly analogous to market power, and I can see a very credible case that the onerous obligations apply to those with a strong position in the market. The thinking behind that is that more people are harmed by entities that have a larger share of the market.
Q126 Darren Jones: Thank you. My last question to each of our regulators is about the powers for the Secretary of State to designate priority content through statutory instrument, and to tell Ofcom that it therefore needs to do something. The argument from the Government is, I think, that in respect of health and safety at work regulations, Ministers can do that through the Health and Safety Executive, but, of course, as you all know, given the independence of the established regulators—Ofcom being one of them—it is very unusual to give the Secretary of State powers to tell an independent regulator what he or she thinks it should be doing in respect of particular aspects of its remit. I will take each of you in turn for your view on that. Mr Parker, first, please.
Guy Parker: It is a good question. I am not sure I know the answer to it. We are independent of government, with one exception, when it comes to Secretary of State powers. Under the Communications Act, the Secretary of State can require changes to be made to the broadcast code that we co-regulate with Ofcom. I think it has happened once in 17 years, and actually it was quite a helpful intervention, because it delivered a relatively technical change to the code quickly. I can see the sense in the Bill giving the Secretary of State powers to include newly emerging harms within the scope of the Bill, because we know that legislation is slow-moving and harms will emerge that we have not thought of yet. But it is also important that Ofcom has sufficient autonomy to regulate effectively without having to second-guess what a Secretary of State might want it to do. I am no expert on how to strike that balance.
Darren Jones: But the ASA’s view is sitting on the fence.
Guy Parker: With that one exception, we do not really have a lot of direct experience of it.
Darren Jones: Okay, that is fine. Mr Grenfell?
Michael Grenfell: I hope this will not be regarded as sitting on the fence, but I have a very similar view to Guy’s. For us, and for all public authorities in the regulatory field, independence from persistent government and political interference is very important. It is very important for investors in the country and investors in industry to know that there is predictability.
That said, it does not seem obvious to me that the Secretary of State merely designating which types of harm are to be included or not included in certain powers is that kind of interference. The kind of interference one ought to guard against is the Secretary of State saying that they do not want us to go against a particular company, or wanting us to hold back, or saying that we have our particular priorities wrong, and I think that would be true of Ofcom, or the regulator, under this legislation. The mere fact that the Secretary of State designates a category that is subject to a higher power does not seem to me inconsistent with regulatory independence.
Darren Jones: That is not sitting on the fence. Mr Steward?
Mark Steward: One of the concerns we have with the Bill, even if it is amended in the way we have suggested, is that it will not be a panacea for all the problems that arise. We know from what we have seen over the last 18 months to two years in particular how quickly the entrepreneurs who are scamming UK investors can operate. The categories of harm that we might see in the future, particularly in the world that the FCA is concerned about, cannot be limited or circumscribed.
There needs to be a mechanism that allows the Bill to travel as the market becomes more sophisticated, and perhaps more complicated, and as new harms arise. It does not really matter to us what the mechanism is for the Bill to have a greater say in this space, as long as there is a way in which Ofcom and the regulators can act flexibly with an evolving market. As a regulator with a fixed statutory perimeter, we find that enormously difficult; there needs to be some flexibility, in some way, shape or form, for the regulator.
Darren Jones: That is useful. Thank you.
Q127 Lord Black of Brentwood: My interests were all declared at the start of this inquiry, but specifically, because Guy is here, I declare that I am a director of the Advertising Standards Board of Finance, which raises the levy for the work of the ASA.
Picking up the point you made about legislation: it is slow-moving. We know that from the history of this particular Bill, which was first mooted, I think, in 2017, and it will still be some while until it is on the statute book. There is also a danger, of course, of there being large numbers of consultations on it. If we talk to DCMS, it will say, “Park a lot of this stuff for the online advertising review that we are undertaking”.
Is there an extent to which tricks are being missed with this legislation—things we should be putting in that are not there? Mr Grenfell mentioned the digital markets unit. Could the Bill be used to give that the statutory powers it needs to start work rather than waiting for another two or three years for legislation to come through?
Guy Parker: Is that to me or to Michael?
Lord Black of Brentwood: Either of you.
Michael Grenfell: Shall I have a go? The Government have said that the digital markets unit is to reside within the CMA, so we obviously have an interest in that. We think it important that that be done sooner rather than later, because there are a lot of abuses out there. I am not talking so much about the scam-type abuses we have been discussing today, but about economic abuses, the abuses of market power, and the longer they go on the harder it is for challengers to enter markets and the worse off we will all be as consumers and citizens.
Whether this piece of legislation or a separate piece of legislation is the right vehicle is not really for me to say, but I could see coherence in the view that this should be devoted to the particular harms it needs to address, and the competition/market power issues could be in a separate, discrete piece of legislation. We would obviously like to see that legislation sooner rather than later, recognising, as we do, that the Government have to juggle lots of demands on legislative time.
Guy Parker: I do not know how to answer your question. It is beyond my area of expertise. That is partly because we are a self- and co-regulatory body, so we do not rely on legislation for our powers, our perimeters and so on.
This is focused on advertising by legitimate businesses. We are working at the moment on a framework with platforms and networks that, if we are successful—I hope we will be; we are making good progress—will deliver principles that we will put into the advertising code, holding platforms and online networks to account for their role in promoting and enforcing the advertising code that we police. Those principles will require them to produce reports on how they are doing at promoting and enforcing the systems and processes that result in compliance with the advertising code, and we, the ASA, will routinely, probably on an annual basis, assess their performance against those principles. We call it online platform and network standards—OPNS. We are building that now. If we are successful in building it, we will begin implementing it, hopefully next year. It might take a little longer than that, but, with a fair wind, it could be next year.
It will be a significant change in our regulation of online advertising. The vast majority of the rules in our advertising code apply primarily to advertisers, and media have a secondary responsibility. These would be principles that applied primarily to the platforms and networks. We are going to build this, and I hope we will implement it soon. Whatever happens with the Bill, or any legislation that comes out of the online advertising programme, we hope OPNS will have a long operational life and we can slot it in. If Ofcom ends up taking on more duties over all paid advertising online, for example, we hope this will be useful and will slot into a more statutory regulatory ecosystem.
Lord Black of Brentwood: How much co-operation are you getting from the platforms in building that piece of work?
Guy Parker: Good co-operation so far, but we have to get it over the line. The advantage if we are successful is that it will bind in all the big platforms and networks. One of the benefits of our system is that, if we get sufficient buy-in from the industry and we reach a critical mass of buy-in, we apply it comprehensively to the whole market. That is what we are looking to do with this as well.
Lord Black of Brentwood: Thank you.
The Chair: Thank you very much. That concludes our questions for this evidence session. We are grateful to all of you for your time and your very clear evidence this afternoon. Thank you.