Written evidence submitted by the Investment Association (OSB0162)
About the Investment Association
- The Investment Association (IA) champions UK investment management, a world-leading industry which helps millions of households save for the future while supporting businesses and economic growth in the UK and abroad. Our 270 members range from smaller, specialist UK firms to European and global investment managers with a UK base. Collectively, they manage £9.4 trillion for savers and institutions, such as pension schemes and insurance companies, in the UK and beyond.
Summary
- In its current form, the Online Safety Bill does not meet its stated objective to ‘make the UK the safest place in the world to be online’. As the draft Bill stands, crime facilitated through fraudulent online advertising is excluded from its scope, and the platforms which profit from facilitating this criminal activity face no consequences.
- The Online Safety Bill must be broadened in scope to tackle the growing problem of scam advertising and include paid-for adverts in all forms, not just user-to-user content. Currently, the draft Bill explicitly excludes paid-for advertising from the scope of ‘regulated content’ in relation to a regulated user-to-user service in s39(2).
- Extending the Bill to cover this content could be achieved easily. At the end of our submission we set out detailed proposed amendments to the Bill which would bring fraudulent online advertisements within scope.
- By including paid-for advertisements across all platforms, the Online Safety Bill will help to significantly reduce the financial and emotional harm caused by online scams in the UK. In 2020, 9,000 people lost £135m to investment scams, and the Financial Conduct Authority estimates that 86% of fraud is committed online[1]. Over £63m was lost by victims who referenced social media in their reports to law enforcement.[2]
- Search engines and social media companies should be legally required to conduct a sufficient level of due diligence on the advertisers using their platforms to ensure they are who they claim to be. When they fail, there should be robust legal sanctions in place to incentivise a high standard of due diligence. A broad coalition of regulators, consumer groups, law enforcement and the financial services industry believes the natural place for such requirements to become law is the Online Safety Bill.
- As we go on to explain in more detail, paid-for scam advertising is causing financial, emotional and life-altering trauma for thousands of people every year. Moreover, social media platforms and search engines profit financially from hosting scam adverts. The Online Safety Bill provides a timely and appropriate vehicle to legislate for change and should ensure all websites within its scope are legally bound to roll out robust advert verification.
- At the conclusion of our evidence we have included a summary of the legal recommendation commissioned by UK Finance, which outlines how the draft Bill should be amended to bring paid-for advertising into scope.
Question 1. Will the proposed legislation effectively deliver the policy aim of making the UK the safest place to be online?
- No. As the Bill currently stands, it does not deliver on the aim to make the UK the safest place to be online.
- There is a large and growing problem of organised criminal groups cloning the websites and branding of legitimate financial services firms, falsely claiming to offer high rates of return, and then paying to advertise these sites prominently on social media websites and search engines.
- Members of the public using the internet to find investment products are drawn in by these adverts, and are then coerced by fraudsters posing as real firms into sending their money to criminals rather than investing it legitimately.
- As the legislation stands, due to the exclusion of paid-for advertisements in s39(2), platforms are not required to verify the identity of companies or individuals placing online advertisements. As a result, the Bill will not address the increasing financial and emotional harm caused by online scams, and as such the legislation cannot deliver on its policy aim.
Question 2. Does the draft Bill make adequate provisions for people who are more likely to experience harm online or who may be more vulnerable to exploitation?
- Organised criminals target retail investors looking for investment opportunities online. Members of the public are initially taken in by sponsored adverts promoted by trusted tech platforms such as Google and Facebook. The current scope of the Bill fails to protect the most vulnerable members of society due to the onus it places on the individual to spot fraudulent advertisements on globally recognised platforms.
- Action Fraud figures show that brand cloning scams result in an average loss of £45,242 per victim[3]. This is a life-changing amount of money for any individual; for someone at the end of their career, such a loss would likely affect their ability to retire.
- Financial harm is only part of the trauma caused. Victims also suffer significant, long-lasting, and traumatic emotional impacts. These include strains on personal relationships, feelings of shame and embarrassment, as well as a deep-rooted and long-term fear of contact from strangers and the financial system. Four in ten (42%) Money and Mental Health Research Community respondents who had fallen victim to an online scam experienced a major negative impact on their mental health.
- The Money and Mental Health Institute has also found that people who have experienced mental health problems are three times more likely than the rest of the population (23% versus 8%) to have been a victim of an online scam.[4] Requiring individuals to spot fraudulent sponsored adverts places the most vulnerable members of society in harm’s way.
- Brand cloning investment scams are also increasingly skewed to target older people with more savings, or those hoping to increase returns on their pensions. Action Fraud statistics show that the average age of a brand cloning scam victim is 60[5]. Moreover, AgeUK analysis of the 2017-18 Crime Survey for England and Wales found that 800,000 older people a year experience fraud[6].
- Consumer group Which? has published research showing that fraud reporting agency Action Fraud identifies 300 to 350 reports a week in which victims show signs of severe emotional stress. This represents around two people every hour[7].
- The Daily Mail also recently set up a decoy fake investment website advertised through Google, demonstrating the ease with which such scams can be created. Over 3,500 potential victims viewed their advert in less than a week.[8]
Question 3. Will the proposed legislation help to deliver the policy aim of using digital technologies and services to support the UK’s economic growth? Will it support a more inclusive, competitive, and innovative future digital economy?
- Investment scams facilitated through fake advertisements on search engines and social media channel significant amounts of money out of the mainstream UK economy and into organised crime.
- The FCA estimates that fraud costs the UK up to £190bn per year, with 86% of this committed online. Investment-related scams, and specifically brand cloning scams, are growing in prevalence, with the FCA more than doubling the number of scam warnings it issued between 2019 and 2020[9]. Without changes to the Online Safety Bill, the UK will continue to lose vital economic activity to criminal gangs and organised crime.
- Building on this, brand cloning investment scams are increasingly skewed to target older people with more savings, or those hoping to increase returns on their pensions. Action Fraud statistics show that the average age of a brand cloning scam victim is 60[10]. Moreover, AgeUK analysis of the 2017-18 Crime Survey for England and Wales found that 800,000 older people a year experience fraud[11].
- Action Fraud figures show that brand cloning scams result in an average loss of £45,242 per victim[12]. This is a life-changing amount of money for any individual; for someone at the end of their career, such a loss affects their ability to retire and puts undue stress on our pension system. Those who had hoped to fund their retirement and necessary later-life social care privately are finding themselves without the means after falling victim to scams. With thousands of older people in this position, the NHS and the taxpayer are having to meet the shortfall.
Question 5. Are there any types of content omitted from the scope of the Bill that you consider significant e.g. commercial pornography or the promotion of financial scams? How should they be covered if so?
- The Online Safety Bill must be broadened in scope to tackle the growing problem of scam advertising and include paid-for adverts in all forms, not just user-to-user content. Currently, the draft Bill explicitly excludes paid-for advertising from the scope of ‘regulated content’ in relation to a regulated user-to-user service in s39(2).
- Whilst it is welcome that the draft Bill includes some online scams and introduces a ‘duty of care’ toward platform users, it specifically excludes fraudulent adverts because paid-for content is not classed as ‘user-generated’.
- This situation can be rectified by small amendments to the Bill in s39(2), s41(4) and ss9 and 21, together with a proper clarification of what is meant by fraud offences. We would support the UK Finance suggestion of a new s44 and Schedule 4 to make this clear.
- Currently, s39(2) excludes paid-for advertising from the Bill’s definition of ‘regulated content’ and should be amended to ensure advertisements fall within the scope of the Bill. To strengthen this further, an additional clause should be added requiring websites which host paid-for advertising to roll out advert verification with identity checks; this could be achieved in s41(4).
- Finally, to make the most of the opportunity the Bill presents, the safety duties (ss9 and 21) should include specific provisions on how to identify and prevent fraudulent adverts.
Question 6: The draft Bill applies to providers of user-to-user services and search services. Will this achieve the Government's policy aims? Should other types of services be included in the scope of the Bill?
- Whilst it is welcome that the draft Bill includes some online scams and introduces a ‘duty of care’ toward platform users, it specifically excludes fraudulent adverts because paid-for content is not classed as ‘user-generated’. The decision not to require search services to verify advertisements therefore creates a significant loophole, which leaves the most harmful online scams untouched by legislation.
- Through this loophole, the Bill in its current form encourages fraudsters to pay for their content to ensure it is not ‘user-generated’. This would have the perverse effect of such content reaching more people, as advertising benefits from a platform’s algorithms and targeting. As a consequence, the draft Bill would not prevent internet users from being directed to fraudulent websites by scammers advertising on online platforms and search engines, and may unintentionally increase this. This is in direct conflict with the aims of the Bill.
- To raise an additional point, platforms will continue to profit both from criminals paying to advertise scams and from organisations paying to advertise warnings against these very scam adverts. In 2020, the FCA spent close to £500,000 on Google Adwords, warning consumers about high-risk investments[13].
Proposed Amendments to Bill
- As put forward by UK Finance and supported by the Investment Association, the following amendments to the draft Bill should be sufficient to bring paid-for advertising within its scope.
- The Government should reconsider the exclusion of paid-for advertising and cloned websites in relation to their role in facilitating fraud, and instead include the necessary mechanisms to tackle all online fraud within the Bill. To achieve this, Section 39(2)(f) of the draft Bill, which currently excludes paid-for advertisements from the scope of regulated content, should be removed.
- Amend Section 41 (4) of the draft Bill, and other related references, to explicitly include all fraud offences as illegal content within the Bill. At present the text of the draft Bill makes no mention of fraud, scams or economic crime. Explicitly defining fraud offences as illegal content will recognise the significant prevalence, risk and severity of harm caused by online fraud and will help put in place proper protections within the framework set out by the draft Bill to tackle all online fraud.
- Section 41 (4) of the draft Bill should be amended by inserting the following line:
“41 Meaning of “illegal content” etc
(4) “Relevant offence” means—
(a) a terrorism offence (see section 42),
(b) a CSEA offence (see section 43),
(c) a fraud offence (see [new] section 44),
(d) an offence that is specified in, or is of a description specified in, regulations made by the Secretary of State (see section 44), or
(e) an offence, not within paragraph (a), (b), (c) or (d), of which the victim or intended victim is an individual (or individuals).”
- Building on this, a new Section 44 should be inserted which refers to a new Schedule 4 within the Bill defining the offences that constitute fraud offences, including fraud facilitated via paid-for adverts and cloned websites. This will be based on existing domestic legislation, including the Fraud Act 2006, and should not introduce any new offences. As with Section 42 and Section 43 in the draft Bill, for terrorism and CSEA activity respectively, the Secretary of State will have powers to amend this new Schedule through subsequent regulations.
- A new Section 44 as follows:
“[NEW] 44 Offences relating to fraud
(1) In this Part “fraud offence” means an offence specified in [NEW] Schedule 4.
(2) The Secretary of State may by regulations amend [NEW] Schedule 4.”
- Building on this, a new Schedule 4 as follows:
“SCHEDULE 4
[NEW] Section 44
FRAUD OFFENCES
- An offence under any of the following provisions of the Fraud Act 2006 —
- Section 2 – Fraud by false representation
- Section 3 – Fraud by failing to disclose information
- Section 4 – Fraud by abuse of position
- Section 6 – Possession or control of articles for use in fraud
- Section 7 – Making or supplying articles for use in fraud
- Section 9 – Participating in fraudulent business
- Section 11 – Obtaining services dishonestly
- An offence under section 5(2) of the Criminal Law Act 1977 (conspiracy to defraud).
- A money laundering offence under Part 7 of the Proceeds of Crime Act 2002.”
- Amend Section 41 (5) of the Bill on the meaning of illegal content by inserting the following line, clarifying that, where the relevant offence is a fraud offence, the content is described as fraudulent content:
“Illegal content—
(a) is “terrorism content” if the relevant offence is a terrorism offence;
(b) is “CSEA content” if the relevant offence is a CSEA offence;
(c) is “fraudulent content” if the relevant offence is a fraud offence;
(d) is “priority illegal content” if the relevant offence is an offence that is specified in, or is of a description specified in, regulations under subsection (4)(d).”
- From this, new Sections 7 (8) (b) (iii) and 19 (3) (a) (iii) of the draft Bill are required, inserting the following line to require service providers to identify, assess and understand the risks of fraudulent content on their platforms:
“Definitions
(8) An “illegal content risk assessment” of a service of a particular kind means an assessment to identify, assess and understand such of the following as appear to be appropriate, taking into account the risk profile that relates to services of that kind –
(a) the user base;
(b) the level of risk of individuals who are users of the service encountering the following by means of the service—
(i) terrorism content,
(ii) CSEA content,
(iii) fraudulent content,
(iv) priority illegal content, and
(v) other illegal content,”
- The safety duties about illegal content (Section 9 and Section 21) should require that providers take further proactive measures to minimise the presence and dissemination of fraudulent content, and the length of time it remains online. It is not enough for providers simply to undertake a risk assessment for fraudulent content and take down this content when reported.
- This would be best achieved either through a commitment from Government that fraud offences will be treated as priority illegal content through later regulations referred to under Section 41 and Section 44 as it stands in the draft Bill, or by a specific reference to fraudulent content in Section 9 and Section 21.
- Additionally, a new Section 29 (3) should be inserted requiring OFCOM to prepare a code of practice specifically relating to fraudulent content, as follows:
“29 Codes of practice about duties
(3) OFCOM must prepare a code of practice for providers of regulated services describing recommended steps for the purposes of compliance with duties set out in section 9 or 21 (safety duties about illegal content) so far as relating to fraudulent content.”
- Alongside this, Section 63 of the Bill should be amended to include fraudulent content to provide the statutory basis for OFCOM’s power to require a service provider to use accredited technology to identify and remove fraudulent content on private and public channels.
- The harmful content provisions should be expanded to include financial impacts on individuals in any assessment of harmful content, alongside psychological and physical impacts on individuals which are already included in the draft Bill’s harmful content provisions (Section 45 and Section 46).
- The Bill should adopt the FCA’s recommendation[14] of expanding the Duties of Care (Chapter 2 and Chapter 3 of Part 2) to encompass an obligation to prevent the communication of financial promotions which have not been approved for communication by an FCA-authorised firm. To fulfil this duty, online platforms and their senior managers should be required to implement measures including:
- Appropriate gateway systems and controls to prevent the publication of unapproved financial promotions.
- Steps to ensure fraudulent and misleading financial promotions are dealt with rapidly.
- Processes that allow authorities to share intelligence on non-compliant financial promotions.
- Fraud facilitated via cloned websites should be fully included within the scope of the Bill by amending Section 41 (6) (a) to ensure that it does not unintentionally exclude such fraud.
28 September 2021