Google Supplementary Written Evidence (FDF0072)

 

As you know, digital fraud is a complex issue, which requires careful consideration of all parts of the ecosystem. We welcome your leadership in this area and share the priority you place on protecting UK users. Tackling fraudulent criminal activity is a cross-industry challenge and necessitates strong and sustained cooperation from a range of actors. We take our responsibility in this space incredibly seriously and have not waited to act.

 

Ads Safety Report

Our business is heavily dependent on the proper functioning of the ad-supported ecosystem, and on the continued trust of users in that ecosystem. If bad experiences online drive consumers away, the long-term viability of Google’s core business is at stake.

Our latest Ads Safety Report, covering 2021, was published in May 2022 with an accompanying blog post detailing our policies and enforcement strategies. Our policies cover content including financial services, gambling and gaming, ads promoting dangerous behaviour, and counterfeit goods. We publish this information so that users and stakeholders can see what action we are taking against those who try to misuse our ads programme. Alongside the features we provide for users to learn more about the ads they see and to report concerns, the report sets out the policy enforcement work we undertake to keep them safe online.

As in other sectors, we have observed bad actors becoming increasingly sophisticated and deploying a wider range of tactics, which is reflected in the number of ads we blocked and restricted on our services. In 2021, we suspended four million accounts for phishing, cloaking and misleading users. We also added and updated 30 policies, and we have in place a multi-pronged approach to combat repeat or serious offenders. As a result of these actions, between 2020 and 2021 we tripled the number of account-level suspensions for advertisers. In addition to verifying advertisers’ identities, we identify coordinated activity between accounts using signals in our network, such as IP addresses, billing information and traffic patterns.

 

Collaboration with the Financial Conduct Authority

Google recognises the responsibilities platforms have towards their users and we have not waited for legislation to act. We do not tolerate ads for fraudulent financial services, and we have invested significantly to prevent them appearing on our platforms while working closely with the Financial Conduct Authority (FCA) to meet evolving threats. We are confident that these measures have reduced exposure to harmful advertising and improved trust between users, platforms and advertisers.

In September 2021, our updated Google Ads Financial Products and Services policy came into effect, following an iterative process and many months of work with the FCA. To show financial services ads to UK users, advertisers must demonstrate that they have authorisation from the FCA. Once an advertiser has completed an identity verification process and we have confirmed their entry on the FCA register, we issue them with a certificate allowing them to advertise financial services in the UK. We also provide a means by which FCA-authorised advertisers can identify the domains of companies whose marketing they approve. Advertisers are unable to advertise financial services in the UK unless they have been issued a certificate or have been deemed exempt under a strict set of criteria.

We were pleased that the Chair of the FCA recognised these measures as industry-leading, and that TSB acknowledged that these changes had ‘all but eliminated scam ads’ on Google. These steps built on existing work with the FCA. Over the last two years, we have also:

     Integrated & automated the FCA Alert List, preventing ads linking to the more than 5,000 websites featured on the FCA’s Warning List, despite the vast majority not using Google Ads.

     Introduced two new verification checks for financial services advertisers, Business Operations Verification and Advertiser Identity Verification, under which we can ask advertisers for more information about their business.

     Updated our financial advertising policies to restrict the rates of return a firm can advertise and to ban terms that make unrealistic claims, so that the ads we serve are fair for everyone.

 

We also work closely with a range of other organisations. For example, we work with the Advertising Standards Authority (ASA) to react quickly to any complaints received about ads on our platforms and others. We have been receiving and actioning alerts from its “Scam Ads Alert System” since its inception in June 2020.

We continue to invest significantly in measures to prevent scams from taking place on our platform and have worked closely with the FCA over the last two years. To support the FCA in mounting the most effective response, and in its future work in this area, we have provided it with £2.3m in ad credits, for use on our platform, to help amplify its messaging on protecting consumers from scams.

We have pledged a further £1.6m in ad credits to support industry scam awareness campaigns. As part of this commitment, alongside other tech platforms in the Online Fraud Steering Group, we have offered credits to support “Take Five”, the scam awareness campaign run by UK Finance. We have also donated ad credits to the ASA to drive awareness of its Scam Ads Alert System.

We believe that the ad credits we have offered to the FCA and others will play a significant role in amplifying these important messages and educating consumers on how to stay safe online.

However, we know that advertising alone will not be sufficient to tackle this problem, which is why we continue to invest in new policies and measures to keep users safe on our services.

 

Draft Digital Markets Bill

I also wanted to follow up on a question I received from Lord Colville regarding the Government’s draft digital markets bill. We welcome the UK’s ambition to establish a fast and flexible Digital Markets Unit (DMU). Our experience working with the CMA and the ICO on the Privacy Sandbox offers an interesting model for how this “participative approach” can work. It recognises the many trade-offs that need to be weighed in digital issues, in this case between competition and privacy benefits, which is a critical insight for the new regime.

We think well-designed regulation should be proportionate and targeted, and should take account of the dynamic and innovative nature of digital markets. The DMU will be given extensive and largely discretionary powers. As such, the regime must be accompanied by the most robust procedural and legal safeguards to ensure it remains proportionate. We would be happy to discuss this in more detail if it would be of interest to the Committee.

 

30 May 2022