Joint Committee on the Draft Online Safety Bill
Corrected oral evidence: Consideration of the Government’s draft Online Safety Bill
Monday 18 October 2021
2.35 pm
Watch the meeting: https://parliamentlive.tv/event/index/348d8d04-3876-4220-a40f-906f210686fe
Members present: Damian Collins MP (The Chair); Debbie Abrahams MP; Lord Clement-Jones; Baroness Kidron; Darren Jones MP; Lord Knight of Weymouth; John Nicolson MP; Dean Russell MP; Lord Stevenson of Balmacara; Suzanne Webb MP.
Evidence Session No. 5 Heard in Public Questions 110 - 117
Witnesses
I: Martin Lewis, Founder and Chair of MoneySavingExpert.com and the Money and Mental Health Policy Institute; Rocio Concha, Director of Policy and Advocacy and Chief Economist, Which?
Martin Lewis and Rocio Concha.
A two-minute silence was observed.
Q110 The Chair: We held those moments of silence to commemorate the life of our former colleague, David Amess, who was murdered on Friday in his constituency while going about his duties as a constituency Member of Parliament. His death is a tragedy for his family and friends, and for all of us here in Westminster and in the Houses of Parliament who knew him, liked him and worked with him, and who will all greatly miss the company, advice and counsel we enjoyed in the past. David Amess’s death has once again brought to the centre of our debate the question of the threats to life that Members of Parliament often face, often directed at them through social media.
We, of course, are mindful of the fact that Members of Parliament are not alone in receiving abuse and death threats. Sadly, as we have heard through this inquiry, too often people are treated and abused in that way online and on social media. We have heard evidence in this inquiry about how social media have normalised aggressive and hateful racism, misogyny, homophobia and other attacks on people. That is why it is so important that the work of this committee and of Parliament and the Government is focused on making the internet safer and delivering an Online Safety Bill that can do that. That is why we have taken the decision to continue our work and to meet today to hold our evidence sessions. Thank you.
Welcome to our witnesses for the first panel. Martin Lewis is joining us remotely, and Rocio Concha is joining us here in the room today. The purpose of this panel and the next panel is to discuss something that has been an important recurring issue in our work so far: fraudulent and scam ads directed at members of the public through social media, and whether they should be within the scope of the Bill.
Rocio, the key question seems to be that if a piece of content is in breach of existing UK consumer protection legislation that in other media would require some action being taken against it, why is that legislation not enforced effectively online? Why has it become a gap, and a gap not in niche media but in the media that people are most exposed to, often throughout the day, and the media they go to most frequently? Why are we unable to effectively enforce the consumer protection legislation that exists?
Rocio Concha: That is exactly the question. A big part of the answer is the role that the online platforms have to play in making sure that criminal activity is not happening on their sites. That is why this Bill is so important. The Bill is a major step to make sure that the online platforms take responsibility for the criminal activities that are happening on their sites. That should include fraud.
Hopefully, we will be able to discuss today that it is not only about user-generated fraud. It is also about paid-for advertising, which is where a lot of the fraud is happening.
The Chair: Martin, we would welcome your views on that as well. I know that it is an issue that you have taken up directly with the social media companies on many occasions, particularly relating to scam ads using your image. We would welcome your view as to why you think that legislation is not effectively enforced now, and what sort of response you have had from social media companies when you have discussed it with them.
Martin Lewis: I will take you back a few years, if I may. Probably the worst compliment of my professional career is that, alongside Richard Branson, my face is used in more scam ads than any other.
I was first notified of it by an elderly gentleman, who got in touch with the website, in a very unpleasant email, calling me all the names under the sun because, as I came to understand, he had lost £19,000. He had trusted me and put money into an investment that I had recommended. We tried to tell him that it was nothing to do with me and that it was a scam. We could not even get to the point of persuading him to let us help him—this was me through MoneySavingExpert—and see what was going on because I, of course, had stolen his money and all he wanted to know was, “When are you giving me my money back? You promised me that this would be safe. It wasn’t safe”.
That was the start. From that point, three or four years ago, it exploded. We know from past inquiries of your own committee in the Commons that over 1,000 of those adverts with me in went up in one year on Facebook alone.
Of course, I tried the police, but I had not been defrauded so I could not report it to the police. It was the same with Action Fraud. We should remember that Action Fraud is an intelligence-gathering agency and not an enforcement agency. This is a crime that you can get away with in this country virtually with impunity. Very few people are prosecuted for this type of fraud. It is an easy way to make money. That is not a recommendation to anyone, but that is the truth of what is going on out there.
I tried the Advertising Standards Authority, which said, “Well, the scammers won’t reply to us, so we can’t govern them”. I believe it has improved its system since then, but I was told, “Sorry, we can only deal with people who will deal with us”. That does not really help when you are looking at what is potentially organised crime, based both inside and outside the UK.
We went through every single agency we could. I got to the point where I could not do anything. I was chatting to my cousin, Mark Lewis, who is a renowned media lawyer, and we cooked up a plan. I want you to listen to the perversity of the plan.
The only law that we could come up with that had anything on this was suing Facebook for defamation, because these get-rich-quick schemes were impinging on my reputation. We went ahead with defamation, but the fact is that that was the only enforcement action I could take as an individual whose face was being used daily by scammers. What the companies say, and still say to an extent, is, “Please report when you see any scam ads with you in”.
There are thousands of them. I do not have a full-time job reporting scam ads. Because of the way the dark advertising system works, where they hide them and only the people who see the ads see those specific ads, I do not know of the vast majority of scam ads that are published with my face on them, so I have no way of reporting them. It is not my job. I am not the one being paid to publicise these adverts.
I took a defamation lawsuit, as I know you are aware. Having been advised that if I won in court I might get £50,000, and because it was a campaigning lawsuit, I settled in the end for £3 million to set up Citizens Advice Scam Action and for Facebook to launch a scam ads reporting button, which is unique to the UK. It is still in place, which has helped, but it is far from a solution, although it has certainly improved the situation.
I am sitting here and saying this vociferously to you, because it should not take someone like me to have to sue for defamation to stop thousands or hundreds of thousands of people, many of whom are vulnerable, seeing their lives, their wealth and their mental health destroyed by scams. It should not be via the route of defamation. There needs to be a duty of care on social media and other online advertisers to ensure that these scam adverts do not run and that they take responsibility when they do. That is why we are calling for scam adverts to be put in the Online Safety Bill.
The Chair: In the case of the adverts that featured your image, Mr Lewis—
Martin Lewis: Feature, if you will forgive me. This has not stopped. They still happen all the time.
The Chair: As we have heard in this inquiry, the social media companies are pretty good at removing Premier League footage within seconds of it being posted if it is in breach of copyright. Clearly, therefore, they have the technology to identify images that they know are being shared illegally and to remove them almost in real time. I think I am right in saying that you do not allow your image to be used in ads. Therefore, if it has your face on it, it should be taken down. It should be pretty simple.
Martin Lewis: It is pretty binary with me. Anyone who has seen my social media feed will know that I have a pretend tattoo that says, “I don’t do ads”. I do not do any adverts. If an advert has me in, it is a scam advert.
There are other people in the public arena who do adverts, so it may be a little bit more difficult to distinguish, but with me it is pretty plain. It should be very simple. When I have had meetings with the companies, they often tell me the technologically difficult reasons why it cannot be done, to which my answer is, “Who said you need to do this via technology?”
I have a website, and nothing goes on my website that has not been put on by a human being. It is editorial. We have some tools, but we monitor them. If you are making billions of pounds from advertising and you do not have good enough technology to stop scam adverts that destroy people’s lives, and potentially their health with the diet pills, then you’re going to have to pay human beings to pre-moderate them. We should be very careful not to allow them to set the narrative that this must be a technological solution.
I do not give two hoots whether it is a technological solution or a manual solution. I would just like to see a solution. I had a very interesting tweet a week ago from a tech consultant called Paul Moore. I will ask my team to forward it to you, if that would be okay. He wrote about how he had seen an advert on Facebook. He tracked down the fact that, in the underlying code, the scammers make no attempt to obfuscate the text. It is trivial to find “Martin Lewis”, “Quit jobs” and “bitcoin” in the advertising text. The underlying URL was one of those quick-burner URLs going to a burner ISP, which is an instant indication that it is probably a scam. In his view, it would have taken him less than five minutes to write code that would have stopped it. Yet even after the scam ads button, that is still happening.
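To illustrate how simple the check Moore described could be, here is a minimal sketch of that kind of filter, in Python. The keyword lists, the burner-host examples, the 30-day registration window and the two-signal threshold are all illustrative assumptions; this is not Facebook’s system or Moore’s actual code.

```python
from datetime import datetime, timezone
from typing import Optional

# Illustrative lists only; a real system would maintain and tune these continuously.
IMPERSONATED_NAMES = ["martin lewis", "richard branson", "deborah meaden"]
GET_RICH_PHRASES = ["quit jobs", "quit your job", "bitcoin", "guaranteed returns"]

# Hypothetical examples of throwaway "burner" hosts; not a real blocklist.
BURNER_HOSTS = {"example-burner-host.com", "cheap-burner-pages.net"}


def looks_like_scam_ad(ad_text: str,
                       landing_domain: str,
                       domain_registered: Optional[datetime] = None) -> bool:
    """Hold an advert for human review if it shows the simple signals described
    in evidence: an un-obfuscated celebrity name plus get-rich-quick phrasing in
    the ad text, and a burner or freshly registered landing domain."""
    text = ad_text.lower()

    name_hit = any(name in text for name in IMPERSONATED_NAMES)
    phrase_hit = any(phrase in text for phrase in GET_RICH_PHRASES)
    burner_hit = landing_domain.lower() in BURNER_HOSTS
    newly_registered = (
        domain_registered is not None
        and (datetime.now(timezone.utc) - domain_registered).days < 30
    )

    # Each signal is weak on its own, so only flag when two or more coincide.
    return sum([name_hit, phrase_hit, burner_hit, newly_registered]) >= 2


# The kind of advert described above would be caught before it ever ran.
print(looks_like_scam_ad(
    "Martin Lewis: quit your job and get rich with Bitcoin",
    "example-burner-host.com",
))  # True
```

The point of the sketch is only that the signals Moore identified are trivially machine-readable; whether a platform acts on them automatically or routes the advert to a human pre-moderator is a separate choice.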
One of the reasons for me to be here, and one of the reasons for me to push with passion and with verve, having done this for years and having risked my own reputation by suing one of the biggest companies in the world, is to ask legislators, “Don’t let them off the hook. We need to make big tech responsible”.
I would like to make my own definition, not a legal one. It is fine to argue that you are a platform when you are publishing social media posts. It is a bit like saying that you are a pub, and people are talking in the pub. But you are responsible for what you decide to serve in the pub. When you are serving adverts and you are being paid money to publish those adverts, you are a publisher, and you should take responsibility as a publisher. It is about time we cleaned up the law on that.
Q111 The Chair: I think that is right. If you are making money out of it, you are responsible for it. Rocio Concha, what is the view of Which? on the failure of the system here, in particular the failure of the existing bodies? We are going to hear later on from the Advertising Standards Authority, but we are often told that we do not need a statutory regime for advertising because they can do it. Clearly, it does not work here.
Rocio Concha: It definitely does not work. I completely agree with Martin. We continuously find things. I will give you an example of one investigation that we did recently. We found companies that were already on the FCA warning list advertising on both Google and Bing, so it is quite clear that this is not a priority for the online platforms. They are not taking it seriously or being proactive in identifying that content. With paid-for advertising, it is relatively easy because there is a transaction with the platform. There is an opportunity for the platform to ask questions about that advertising, but at the moment those questions are not being asked.
We have tested the system with an investigation. I can give you an example of the work we did. We created a fake water brand with a fake hydration service. We contacted Google and Facebook, and they advertised it. There were minimal checks. Obviously, it was clearly fake. In the first week, we got 500 likes. In the first month, we got 100,000 impressions. We paid Google so that, when people searched for hydration advice, our website was top, above the NHS. That is how easy it is for criminals to put information out there. There are minimal or almost no checks from the online platforms.
Why is that? That is surely something they can do, but it is not on their list of priorities. That is one of the things that we are arguing in relation to the Bill. Such illegal content should be a priority. It cannot rely on Martin, Which? or other people reporting that they have been the victims of a scam. That is not how it should work. The companies should be proactive in looking for this content and taking it down. With paid-for advertising, as I said, there is a transaction. There is nothing stopping them asking questions and making sure that ads are not fake. The current system does not work.
I will give you some statistics, because it is useful to understand the scale of the harm to show that the current system is not working. Between April 2020 and March 2021, consumers lost £535 million just on investment scams. The majority happened online, but this is a very small part of the harm. I do not know if you have seen that today we released research that we have done where we show the impact on the well-being of the victims of scams. The value for online scams is £7.2 billion per year, which is £3,684 per year per person. The scale of the harm is massive. We have told the platforms about it. We have told them what we found, but there is no real action. It is not a priority for them. That is why it is important that there is an obligation on them to take responsibility for this.
Q112 Lord Knight of Weymouth: Rocio Concha, clearly you would not be here if you did not believe that paid-for advertising should be included in the scope of the Bill. I am interested in whether user-generated content generates scams. Is there a difference between the sorts of people who would be placing the ads and those who are generating user-generated content-based scams?
Rocio Concha: Both should be included in the scope of the Bill. Obviously, paid-for advertising is a more efficient way for criminals to reach a lot of people in one move. That is the main differentiating factor. If the Bill does not include paid-for advertising, all that will happen is that criminals who are currently using user-generated content will move to paid-for advertising.
You can see how illogical it is. If I am a criminal and I contact you directly, I will be within the scope of the Bill, but if I actually put out a fake ad, which is very easy to do as I told you, and reach out to a lot of people, I will not be covered by the scope of the Bill. That does not really make sense.
That is not just the view of Which?. It is the view of Martin. It is the view of the FCA. It is the view of the London police. It is the view of the Governor of the Bank of England. A lot of people are saying that it does not make sense to exclude paid-for advertising from the Bill.
Lord Knight of Weymouth: The impression I get from the Government, as we understand it from DCMS, is that they are going to do some work on draft online advertising regulation that the Advertising Standards Authority would then be responsible for, rather than it going to Ofcom as the regulator. How do you see the two regulators working together in your ideal scenario?
Rocio Concha: The first thing to say about the advertising programme is that, as you know, in the most optimistic scenario it is very far from legislation. Basically, at the moment we do not know what protections will be there. We do not know when the legislation will happen. As you know, the reality is that even in the most optimistic scenario we will be waiting for at least two years. In the meantime, all the harms that I am telling you about will continue to happen. Will we be saying to people that we are happy for these harms to continue happening?
In relation to the regulators, it is quite clear that the regulators have to work together: the FCA, the CMA and Ofcom. As you know, there is the Digital Regulation Cooperation Forum, which creates the opportunity for the regulators to exchange intelligence, make sure they have the right skills and support each other in enforcing the law. There is a clear case for collaboration between the regulators.
Lord Knight of Weymouth: Martin Lewis, do you have anything to add to any of that?
Martin Lewis: Oh yes. Look, we put the Government on notice four years ago that this country faces an epidemic of scams that is damaging people’s wealth and mental health. I chair the Money and Mental Health Policy Institute. One in three people suffers depression after being scammed, never mind the hideous impact on those with existing mental health problems.
The idea of waiting another two years, when that epidemic exploded through the pandemic and things got even worse, is hideous. Lives are being absolutely destroyed. Rocio has said it, but I want to make it plainer. The way this Bill is currently structured incentivises scammers to create more scam ads, because that type of scam is not covered by it while user-generated content is. It is a farcical distinction that harks back to the 1980s, when content and advertising on television and in newspapers were distinctly separate.
You tell me: if I do a post and then pay to promote it, is it an ad or is it user-generated? If I do a dating profile and then pay to promote it, is it an ad or is it user-generated? Where do I cross the line? If the answer is that as soon as I pay it is an advert, all I have to do is pay a penny and my user-generated post is no longer covered by this Bill. It is ridiculous. It is farcical. To add to what Rocio said, she missed out the ABI. She missed out most big banks. I am on her side and we are together on this, but she missed them out.
When you consider that the consumer groups, the charities, the banks and the insurers are all saying, “Put this in the bloody Bill”, you have to get close to thinking that it almost seems like a conspiracy that it is not being put in the Bill. I do not believe in conspiracies, but if anything were to persuade me, it is the blank reaction I get from the Government, who say “We need to look at it more carefully”. I sued Facebook three years ago. This has been going on for years. We have had meeting after meeting after meeting. No. Enough is enough. This Bill is going through Parliament. It is the Online Safety Bill. Scam ads are destroying people’s lives. People take their own lives on the back of being scammed, and it should go in the Bill.
Lord Knight of Weymouth: I think that was clear. Thank you.
Rocio Concha: May I add to that? The case for the harm has been made. There is plenty of evidence. We have submitted a lot of evidence. We have provided evidence on the platforms, and we have provided that evidence to DCMS. It is clear that this is a big area of harm. Why do we want to wait at least two years? I stress that it is at least two years. In the meantime, people will be affected by this, and their lives are being destroyed.
Lord Knight of Weymouth: Finally, when you have spoken to officials, have they told you why they prefer not to include it?
Rocio Concha: I am puzzled. I just feel that they have realised the level of the harm a bit late in the process, despite Martin, us and others having talked about it. For some reason, they have not taken it seriously. We have provided plenty of case studies. We are not talking about hypothetical harms. We have the evidence. The analysis that we released today is not hypothetical harm. We used the Crime Survey for England and Wales, where victims of scams report on their well-being. These are real people. We have shared that evidence with officials, but for some reason it seems that they do not feel it is a priority. How can it not be a priority? We are talking about billions of pounds of harm. I am puzzled.
The Chair: Fraud is against the law and there is a general presumption against illegal content in the Bill.
Q113 Dean Russell: Mr Lewis, you talked about mental health earlier, which is an area I am incredibly passionate about. I wanted to get a sense of the actual impact of this on real people. Often, as has been said just now in the evidence, it is not just a financial harm. It is a harm to people’s lives. Could you give us some examples, please, of the damage this actually does?
Martin Lewis: It is probably worth starting, if you will forgive me, by explaining the way they tend to operate before I talk about the mental health harm. We are talking about online safety, but this is not only an online issue. It is an offline issue, too.
The online scam ads are the departure point. They are not the destination. What tends to happen is that you respond to one of these adverts, and then you get somebody calling you up who says, “Hi, this is a great investment. It’s recommended by me or Richard Branson, or Deborah Meaden or whomever else, and they absolutely guarantee that you will not lose money in it. This is fantastic. All we need to start is £250”. They get the £250. Sometimes, there is a fake web portal showing people how well their investment is doing.
By the way, something else that is used in these scams, it is worth noting, is bitcoin. Bitcoin is used in the same way that my name is used. This is not about bitcoin. There is no underlying bitcoin investment. Bitcoin is “get rich quick” to some people, so they think that they are getting on the bitcoin train. It is not about bitcoin.
Then you get another call, “It’s going so well. Would you like to put some more in? You see how well it’s done. You’ve now got £1,000. How about giving us £1,000 this time?” That train continues. Then you get to the point when you start to get worried and you want your money out. How do you get your money out? “You’re going to need to give us £10,000 to get your money out”.
People become brainwashed and trapped by these forms of scams. You have to add to the mental harm the feeling of being duped. This is not all vulnerable people. Solicitors, university lecturers and accountants get trapped into it.
I will give you one case study. A lady who had bladder cancer invested money earmarked for her granddaughter’s wedding because she wanted it to go a bit further. She said, when she got in touch with me, “If Martin is sponsoring it, it must be all right”. It was a scam and she lost tens of thousands, and then she lost a further £15,000 trying to get back the money she had initially lost.
I heard the story of a grandmother whose grandchild’s parents had died; she put the money that the parents had put aside for the grandchild into a scam because she trusted me. If you wonder why I get so passionate, it is because I have spent 20 years trying to do consumer protection work. I see people’s lives being destroyed. You cannot factor that in. It is not just people losing money. If you have just given away the retirement fund that you worked 30 years for, and you feel stupid (they are not stupid, but they feel stupid, because the scammers are really sophisticated and clever people; if only they used their talents for better purposes), you do not recover from that in years. You do not recover from that in life. You spend your life walking along the road kicking yourself up the backside for how you have destroyed your life.
I find it deeply frustrating to be told, “Well, we need to do some research on this, and we might get a fix in a couple of years”. We have copious research at the Money and Mental Health Policy Institute. We will send you the research on that, the numbers and the cases of difficulty.
I mentioned before—it was not flippant—that people take their own life, or consider it, because they blame themselves for the scam that they have fallen for. I do not blame them. You will forgive me, but I blame legislators and regulators who do not put enough practices in place. I blame online platforms who do not deny the oxygen of publicity to the people who are doing the adverts. Of course, I blame the criminals themselves. They are criminals and I have given up on the idea that we are going to get the police to catch those criminals. That would be the best solution, but, if nothing else, we need to try to stop them getting access to vulnerable people. That is what this Bill can do.
Dean Russell: Thank you, Mr Lewis. That is incredibly powerful testimony. I appreciate you sharing it. I can tell that it is very hard to do so as well.
Rocio, would you mind sharing your experiences of the real, actual impact of this and the harm it does?
Rocio Concha: We have a number of case studies. I can read you some of them. Andrew was a retired social worker. He saw an investment which, when he did his research, appeared to be endorsed by Piers Morgan. He lost £100,000 of his life savings. He attended a suicide prevention appointment because of the stress he was facing. He was prescribed antidepressants, beta blockers and tranquillisers.
Let me tell you about another case. A man who was retiring in his 70s lost £100,000 to a bitcoin scam. He said to us, “Being scammed in this way was utterly devastating. I think about it virtually every day, and it has really affected my confidence and my ability to make decisions. It has changed the person that I am”.
I want to share another case. This is a lady, a sound engineer in her 40s. She was searching for an investment and advice on Google. She ended up losing £30,000. She said, “It has been really traumatic. At the time it felt like no one cared or wanted to discuss my case with me. It breaks you as a human being and leaves you scared of the outside world”.
As Martin was saying, it is clear that it is not only about the financial impact. It is also about the emotional impact that people face because of these scams. They blame themselves. I completely agree with Martin that the criminals are very sophisticated; it is easy to see why people believe them, because these things look so real. You cannot blame the victims.
Dean Russell: Can I assume that in those cases they never recovered their money? Would that be a correct assumption?
Rocio Concha: I would need to check whether they recovered it, but if they did, it was not from the criminals or from the online platforms. If anything, it would have been the banks reimbursing them under the rules on APP scams; the rules are there, and the banks may have paid some of them, but I will check. It is not because the online platforms are returning the money. It is not because the criminals are caught and the money returned. As Martin says, it is very difficult to catch these criminals.
Dean Russell: Thank you for your testimony.
Q114 Darren Jones: Rocio, in the past, Which? has been given statutory status to receive and act on consumer complaints alongside the regulators. There is an interesting debate, given how many financial scams there are, about whether consumers are going to have somewhere to go. We heard from Martin Lewis earlier that at the moment they do not really have anywhere to go. They need to go somewhere with the capacity to manage those complaints on behalf of individuals. Do you think that Which? or other organisations need to play a role in that, or do you think the existing regulators and maybe the ombudsman set-up is appropriate enough?
Rocio Concha: At the moment, there are 4.6 million fraud offences in England and Wales. It is unrealistic to say that any organisation will be able to deal with that level of fraud. There is a clear case for the online platforms to take responsibility to avoid that fraud happening in the first instance and to deal proactively with it. Once that is in place, maybe there is a role for helping victims, but I do not think the solution is to give it to Which? or to ask the regulators to do more. With that volume, it is impossible. You cannot rely on law enforcement to solve this issue.
Darren Jones: Thank you. Martin Lewis, you talked earlier about the scam ads button that you negotiated with Facebook. Do you have any reflections from engaging with the technology companies about how they can better remove scammers from their platforms in the first place, and what are your answers to some of their responses about it being technically too difficult?
Martin Lewis: My answer is, just do it. You are being paid to publish adverts, so have someone who checks the advert before they put it up. It is pretty obvious in most cases. The theory behind the scam ads button, which I call social policing, is that many people can spot a scam ad—certainly people who have had experience of it before. There is the idea that everybody can press a button, and I would love it to be on every platform. I give Facebook some plaudits. It still has a lot of problems and it is nowhere near doing this properly, but it has improved more than others have improved over the years. The idea of saying, “Well, if we get a rapid number of people pressing the report button, we’ll look at that advert as a priority and take it down”, is at least a form of help.
It would be good to see a standardised scam ads icon that people can press so that the ad is reported. I would prefer it to work in such a way that the ad is taken down automatically. It is fascinating to me that, when the companies put the ads up, it can be done on an automated basis, but taking them down is something they have to consider. I would rather it were the other way round. I would rather the regulations made it so painful for them to publish scam ads that the ads never went up in the first place, and that you changed the profit equation so that it was not profitable for them to publish them.
By the way, I am willing to accept that some scam ads will get through. Some very sophisticated scammers will, on occasion, create new technology that defeats the existing technology, and then the tech firms will play catch-up. If that was the situation we were in, I would say, “Fair dos, they’re trying their best and they now have to adapt”, in the same way as hacking adapts all over the world. There is a new hack and we move on. But we ain’t close to that. We are nowhere near even the beginning of that.
I go back again to this: if you are being paid to publish something, you have to take responsibility as a publisher. There is a huge irony that this Bill is willing to make social media companies responsible for user-generated scams that they are not paid to publish, but not make them responsible for advertised scams that they make their money from. I dread to think what happens to the cash. When these companies take down a scam, what do they do? Do they give the money back? Do they put it into Citizens Advice Scam Action? That is a charity and it will run out of its funds from my Facebook case quite soon. I would love to see the big social media companies, every time they find that they have taken money from a scam, saying, “We won’t touch that money. We don’t want that money. We will give it to charity”. I have never heard that. It would be a lot of money. We might even have the resources to be able to deal with this properly if they did.
Darren Jones: Thank you. You are obviously a very accomplished campaigner on these issues. Have you had a chance to speak to current or former Secretaries of State about the inclusion of financial scams, and have they given you any hope about that?
Martin Lewis: I had a meeting with the Secretary of State at DCMS a few years ago when I had taken the lawsuit against Facebook. It was a private meeting, but I think as I am giving evidence to Parliament at this point I can probably give you the tone of what was said to me. I cannot remember the exact words.
The key was whether we could prove that these were publishers and not platforms. The Secretary of State at the time said, effectively—I paraphrase somewhat—“Legislating to do this is too difficult for us. I would really rather that you took them to court and did not settle, so that we can get a definitive answer”. At that point, I said, “Thank you very much for doing your job at my risk. It’s your job to deal with this issue, not mine. I’m dealing specifically with an issue of scam adverts, and if I get an offer that’s better than my lawyers tell me I will win in court, I will settle because it is a campaigning lawsuit”. I left pretty bloody furious, if you will forgive the language, but I had just met a Secretary of State to campaign on an issue and he told me that I should solve it in court.
This is an issue of regulating some of the biggest firms in the world, in the UK, a G7 economy and a sovereign state that should have the ability to take back control. Maybe it is time that we took control of the big tech companies and the way they advertise. I know it is flippant to say this, and I promised myself I would not say it, but I am going to anyway. If Boris Johnson was as trusted as me, so that he appeared in scam adverts as often as I do, we would not be having this conversation, because the first thing that would have been in this Bill would be regulation of scam adverts. Westminster society has been a little bit too keen to look at fake news and anti-democratic news, which absolutely should be regulated and I support that, but it has not been vociferous or protective enough of the general population on the more prosaic issue of protecting people from scams.
I call on the Prime Minister and the Government. Just put this in the Bill. You do not need this fight. You can protect people. You have an opportunity to do good. Please do good. Rocio agrees, I agree, the FCA agrees, the ABI agrees, the banks agree, the charities agree, and the consumer groups agree. The only people who do not seem to are the Government and big tech. It is time to change it.
Darren Jones: I agree, so thank you for that.
The Chair: That is why we are discussing it today.
Q115 John Nicolson: I was going to ask for specific examples, but Dean has already done that. I think some of those we heard have been incredibly moving. I had a constituent, a retired nurse, who lost £30,000—her life savings. It was exactly as Martin says. Her primary concern was that it made her seem foolish. She was embarrassed that she had fallen for the scam. As it happens, we managed to get her £30,000 back. I am not an ex-“Watchdog” presenter for nothing. It is one of the things I am most proud of. We managed to achieve that. We got her the money back, but it was from the bank. It was the bank that refunded the money. The scammers were never caught.
Martin, you have made your position fairly clear. It is obvious what you want the legislation to include. It might be useful for all the people who are sitting watching this now if you could give some consumer advice and tell people, pending changes in the legislation, how they can protect themselves from scams.
Martin Lewis: I have thought about this long and hard, because we do not have the protections in place that allow people to protect themselves. One of the things to say, and perhaps if we do not get it in the Bill I will need to say it more loudly and in more places, is: do not trust online advertising.
John Nicolson: Any online advertising?
Martin Lewis: If we cannot differentiate and if there are no rules to stop scam adverts, the easiest way to protect yourself is not to trust any adverts that are online, because there are no rules to stop them being published. If we don’t have rules, it undermines the advertising industry. It undermines commerce. It undermines the economy of the country. Ultimately, if the big tech companies will not do enough to prevent scams going on their sites, we have to say, “Don’t trust any adverts that are placed, because you do not have an easy way of distinguishing between what are scams and what are not”.
Within that, I would be very careful of any celebrity-name endorsements. Do a Google check. The first thing to say is that on Google, over the years, there has been homogenisation of the look between adverts and natural search. There used to be big differences between them but now there is just the little “Ad” word that goes at the top. Do a search and scroll down to something that does not have “Ad” by it. See if it is a legit source, whether it is a newspaper website that you trust, my site, Which?, the BBC, or Parliament—you never know—and go through that. See whether there is a record of the company that is being talked about having a commercial relationship with the individual who is on there.
I was doing a financial education class in schools on this the other day. Interestingly, we asked the kids—the teacher set it up—whose face they would trust most. They go for the most trusted person. Funnily enough, when it comes to adverts, trust the most trusted least because they are less likely to be doing adverts in the first place. Be incredibly sceptical. Do not part with your money. If you want to do an investment, search out advice yourself. It is far better to buy an investment than it is to be sold one because of the way this advertising happens.
If there is economic damage to so many industries on the back of what I am saying, I do not want that damage, but ultimately we are going to have to inject a large amount of scepticism in everybody if we do not have effective rules that stop scam adverts. Do not part with your money. Use the 159 number to call your bank. If your bank gives you a warning about something and says, “Are you sure this is right?”, make sure that you have actually checked it. There is a reason they are asking whether you are sure it is right. Banks are the ones who are liable. It is farcical that the banks are liable, yet we do not make big tech liable for this. It is ridiculous.
I do not have much advice, because there is not that much you can do. Once you have been scammed, getting it back is an absolute nightmare. Most people do not get it back and many people do not report it for the exact reason you say: they are too embarrassed to do so, because they feel stupid. They are not stupid. We are stupid and our regulations at the moment are stupid.
Q116 Lord Knight of Weymouth: Rocio, are you aware of what proportion of these ads is placed through programmatic means, and how much is placed directly with social media platforms such as Facebook?
Rocio Concha: I do not have that information. Are you talking about the system to put them online?
Lord Knight of Weymouth: Yes. A lot of online ads, as I understand it, are placed using programmatic advertising agencies, so the platforms themselves might not be aware of what advertising is coming on to the platforms until the ads have appeared, because of their nature. Martin, you are nodding. Does that imply that you might have some insight?
Martin Lewis: I do not know the statistics, but it is one of the great problems that we have with this. Obviously, because of the nature of the way I operate, lots of people get in touch with me when they see an ad with me, or with many other people, in it. We will go to a website and it is often very difficult to find out who the programmatic agency is, which is why I talked about a standardised button and a standardised approach that I would like to see. It does not matter who it is; you do one click. It could be Google or it could be another provider that is on a website.
The irony, and we may as well say it, is that I have seen newspapers reporting my campaign against scam ads with one of those very scam ads next to the report on the page, because the programmatic advertising sees that the article is about Martin Lewis and so puts a Martin Lewis scam ad next to it, even as they are writing with outrage. This stretches far beyond the platforms of the Googles, the Facebooks and others. It stretches to all forms of websites that take advertising, and it is very difficult to spot. Some of it is programmatic.
I do not know the exact stats, but scam adverts are a disease on the web. I understand that regulating the internet, social media and online platforms is very difficult, but if you start to make people responsible for what they publish, they start to take it more seriously and put more resources into it. These people are very clever. When I have met the big tech companies, they have some genius people out there, but I simply do not believe that their laser focus is on stopping scam ads. That is what I am asking you to help point them towards.
Rocio Concha: There is clear evidence that they could be doing something that they are not doing. Martin referred to the reporting tools—for example, the Facebook reporting tool. When we did research on that, only 30% of Facebook users knew about it, so the first thing for Facebook is, “Tell your users about your reporting tool”. Not only that, we found that over a third of the victims of any scam who reported to Google, and over a quarter who reported to Facebook, said that the advert was not removed.
Even when you have that and people report it, the platforms are not removing the ads. In paid-for advertising, there is a transaction. For example, Google has something called its verification programme, which it introduced very recently. Do you know what that programme does? Basically, you have a verification period of 30 days. During those 30 days your ad is already online, so in those 30 days criminals could already be getting in contact with a lot of victims. Why give them 30 days? Why put up an ad before it has been verified, for example?
The Chair: Thank you.
Q117 Lord Clement-Jones: You have both very graphically illustrated the impact on the consumer and the way the current system does not adequately deal with complaints or harms caused. Martin, you used the word “undermine” when you talked about the commercial side of it. What has been the actual impact of where we are on the financial services sector? There seems to be quite a strong consensus across the board that online scams, this kind of advertising, should be included in the Bill. Is that because of the impact on the financial services sector?
Martin Lewis: I never speak for banks. Normally, I am sitting opposite the table from them, not on the same side as them. We have set up a limited system of protection for people to get their own money back and it is the banks that fund that. Quite clearly, they would prefer not to be funding that, and of course if they fund it, some of it reduces shareholder money and some of it goes on to the costs of the products that we all have to pay for.
As to the impact on financial services itself, you start to diminish trust, whether it is in me or in the financial advice that is given, when you have people trying to churn pensions into unregulated investments while calling themselves financial advisers. Why would they not call themselves that? It often amuses me that people say, “Can’t you sue them for intellectual property?” These are criminals; they do not give a monkey’s about intellectual property rights. If they want to call themselves financial advisers, whether they are or not, they will call themselves financial advisers.
We are starting to degrade trust. It is very difficult to differentiate who the honest players are, who the crooked players are, and who the crooked players who say they are honest players are. Of course many scams impersonate banks, the police, the emergency services or the NHS—all those trusted agencies, trusted both by the state and by individuals. Once you move into that difficulty, we all see our own trust levels start to diminish.
I go back to the very first person who complained to me who would not let me help him because I had scammed him. There is a vicious circle, and if this carries on much longer, the questions will come. Who do you trust? How do you know it is real? How do you check it is regulated?
Remember that when you click through to these things from an online advert, you often get what looks like a BBC web page. The Mirror is very commonly used for some reason. They are well written. These people could work as content‑writing journalists in some cases. You will see a story, and it is a fake BBC web page or a fake Daily Mirror web page.
Going back to the earlier question, “What do you advise people?”, I would say that you have to be pretty sophisticated. I could teach people how to spot an obvious URL—that it is the first bit, not the bit before the dot and all that type of stuff—but, ultimately, there are some people with limited mental capacity, some people with learning difficulties, some people with onset dementia, some people who are just tired because they have young children, some people who do not pay enough attention, and some people who do not read all the terms and conditions. All of them are potentially victims of the scams that are going on out there, and I do not think we can protect them by instructing them in every way. The best way to protect them is to stop running the adverts.
Lord Clement-Jones: I will come back to you on the rebuilding of trust in a second. Rocio, do you have a take on that?
Rocio Concha: I completely agree with what Martin is saying. This is undermining trust in genuine companies. Often, what these criminals do is clone and pretend that they are a particular company, a big investment company. We have cases of that. Therefore, you end up with people unable to trust anything that they find on the internet. Obviously, that is hurting genuine businesses that are being cloned by the criminals.
Lord Clement-Jones: On the point about building trust, will it be enough simply to include this kind of scam advertising in the Bill? I ask, because if you look at the way Google, for instance, has insisted recently that the FCA needs to have approved a financial services ad or an investment ad, would you not go further and say that what you really have to do is screen this kind of advertising? It is not simply a question of takedown or prevention; you have to have actually built in harder duties in all this.
Martin Lewis: Yes. If I were doing it, that is what I would say. Arguably, we do not need to be that prescriptive. We need to be results driven, not methodology driven. If the tech firms are capable of coming up with a technological solution that stops scam adverts, or reduces them by 99%, let us say—we will accept a little bit of leakage, as you do in everything—I do not really care how they do it, although I do not really want to say that.
Pre‑pandemic, we were talking about getting a round‑table meeting together with all of them. I think some co‑operation would be useful; standardised reporting models and sharing information with each other would be useful. I would be happy to see a report: “This scammer is doing this on our platform. Do you want to put it on your platform?” As long as we can get the competition laws to work a bit, which I should not think would be a problem, we could be sharing it with the enforcement agencies and the consumer bodies.
If this Bill pushed hard enough that there was a duty to stop this, it is possible that we could all work in concert to make it happen. I would be delighted to work with the firms, to sit there and help communicate their new systems for reporting a scam if you see one, provided they had taken enough preventive action that such ads were very rare. I would be delighted if we had a uniform reporting system and we built a website where anybody can google an advert and check whether or not it is a scam, to work out what is legitimate and what is not.
All those things are possible, but they have to be willing to put the resources in. Requiring them to pre-screen is probably the easiest thing, but if they say, “We have another way of doing it that is cheaper and more effective and will reduce all the adverts”, I would say, “Great. Do it”.
Lord Clement-Jones: Thank you. Rocio, at the same time, could you add another answer to the question? Do we need to build individual redress into all this on the liability of a platform for someone who has experienced a scam or been the victim of a scam of that sort?
Rocio Concha: Yes, absolutely. Redress should always be available. Obviously, online platforms have to have the legal responsibility for prevention and for taking down illegal content. Then you should allow for redress when they are not doing what they should be doing under that duty.
Lord Clement-Jones: Martin, do you have a view about that?
Martin Lewis: Yes. The one thing you can do to make big firms listen is hit them in the pocket. The quickest way—I am not actually suggesting that this happens—and the obvious, easiest way to do it would simply be to say that if someone gets scammed having seen a scam advert on a platform, the platform is legally responsible for giving them their money back. I think you would find that there would be no more scam adverts after that, because then it would be worth pre‑moderating. That is probably a bit extreme because the causal link is not that defined, and it is ultimately the scammers who do it, but we should start to think a little bit into that route.
I set up one of the country’s biggest websites. I am not anti the web. We have a social media platform that at some points has been in the top 10 in the country, with the MoneySavingExpert forums. Forums, communities, social media and the internet have provided a communications revolution that has done a huge amount of good. I use Facebook, Twitter and social media. I think they are beneficial when used right, but they have some demons among them that they like to push off on to other people and say, “It’s not my fault”.
I would like to see them take a bit more responsibility. If that means reducing their profitability by 10%, in the long run they would be better off doing so and we would all be better off if they did so. If they do not have the long‑sightedness to do it for themselves, my hope is that legislators will do it for them. I would have preferred them to do it themselves when we notified them of it in the first place, but they do not.
Rocio Concha: Given the scale of these harms, we cannot rely on convincing the platform and negotiating with individual platforms, “Please do this”. That is not acceptable, given the level of harm. They have to be required to do it. I know that the regulators are already trying to get agreements with the platforms. It is taking years and it will also create a patchy system, where some of them will do it and some will not, which is very confusing for the consumer because they do not know who to trust. Given the scale of the harm, it has to be part of the regulation; they have to be required to do it.
The Chair: Thank you very much for your clear and powerful testimony. We have to draw this panel to a close at this point, but thank you for taking the time to give evidence to us today.
Rocio Concha: You have a very important job on your hands, so if there is anything we can do by giving you additional evidence—Martin and I, and all the bodies working with us—we will be more than happy to help and give you more details on anything.
The Chair: Thank you for that. We appreciate it, thank you.