Written evidence submitted by the Financial Conduct Authority (ADM0030)
The Financial Conduct Authority (FCA) is the conduct regulator for approximately 56,000 financial services firms and financial markets in the UK. We aim to make markets work well for individuals, businesses and the economy as a whole. Our work and purpose are defined by the Financial Services and Markets Act 2000, and we are accountable to Parliament.
What policy, if any, you have in regard to the use of algorithms in your sector
Algorithms play a significant role in financial services. They can deliver benefits to firms and consumers, such as reduced costs or increased speed of service. If poorly controlled, however, they can also amplify harm: the increased speed and complexity of financial markets can turn otherwise manageable errors into extreme events with potentially widespread implications.
Algorithms have been used in some financial sectors for some time, while other sectors are beginning to make increasing use of them. The applications a trading firm has for algorithms will be very different to those of an insurance firm. As the Committee will be aware, the use of algorithms to make decisions is not unique to the financial services sector.
We do not have a policy for or against the use of algorithms. As a regulator, we are technology neutral. Our focus is on ensuring that they are properly overseen and controlled by firms which use them and on the outcomes they deliver.
What rules or guidance you have issued about algorithms in your sector
Our Handbook sets out a range of rules and guidance. It includes high level standards which would apply to a UK firm using algorithms. For example:
▪ Our Principles for Businesses must be followed by all FCA-regulated firms. These include requirements that a firm must conduct its business with integrity (Principle 1), conduct its business with due skill, care and diligence (Principle 2) and pay due regard to the interests of its customers and treat them fairly (Principle 6).
▪ The Senior Managers and Certification Regime (SM&CR) has been in place for banks since March 2016 and is being rolled out to other sectors. It aims to raise the standards of conduct for everyone who works in financial services by making senior managers in firms more responsible and accountable for their actions. Within wholesale financial markets, those individuals within firms who are responsible for approving the deployment of a trading algorithm, or who have significant responsibility for monitoring one, are covered by the SM&CR, including a requirement to certify on an annual basis that they are ‘fit and proper’.
Our Handbook also has rules on the systems and controls (SYSC) that a firm must have in place to ensure that its algorithmic trading systems are resilient, have appropriate thresholds and limits and cannot be used for any purpose contrary to the Market Abuse Regulation. This includes a requirement that a firm engaged in algorithmic trading must be able to provide, at the FCA’s request, a description of the nature of its algorithmic trading strategy, details of the trading parameters or limits which apply to its systems and any further information about the firm’s algorithmic trading and the systems used for that trading.
In addition, some specific requirements apply to certain activities that use algorithms. For example, the Markets in Financial Instruments Directive II (MiFID II) came into force on 3 January 2018, aiming to strengthen consumer protection and improve the functioning of financial markets. Part of its scope sets out the capacity and arrangements that trading venues must have in place to enable algorithmic trading to take place. MiFID II also imposes obligations, systems and controls on investment firms engaged in algorithmic trading to mitigate the risks arising from it. Trading systems and algorithms must be fully tested before deployment, and deployed or substantially updated only with the approval of senior management. A firm must carry out an annual self-assessment and issue a validation report covering its algorithmic systems and strategies, its governance and control framework, its business continuity arrangements and its stress testing. Firms must also assess their overall compliance with the other MiFID II requirements. Venues must provide testing environments for users of algorithmic trading strategies. Finally, MiFID II introduced a requirement for investment firms trading on their own account using a high-frequency trading strategy, and not otherwise within the regulatory perimeter, to be authorised for the first time.
Any arrangements for bodies in the sector to make available, to you or the public, (i) the details of any algorithms used and/or (ii) an explanation of the way any algorithm functions, to aid understanding.
The FCA captures some of this information through its supervisory work and the information we receive from firms and other sources. MiFID II introduces an obligation for firms to hold an inventory of the algorithms they use, which the FCA can obtain on request.
MiFID II also introduces a requirement for any investment firm engaging in algorithmic trading to notify the FCA of this fact and of the venues on which that trading takes place.
We do not intend to share the details of the algorithms used publicly; however, the FCA intends to publish the findings of our recent supervisory work in the coming weeks.
What arrangements are in place in the sector to monitor the development and use of algorithms?
The FCA supervises firms against our rules through a range of methods. One is to carry out multi-firm work, looking at a specific issue across a sample of firms. The benefit of this approach is that it enables us to compare firms and highlight outliers.
We have undertaken a number of reviews of algorithmic trading covering areas such as pre-trade risk controls, development and testing procedures, and the surrounding governance framework. We have recently completed two supervisory projects looking at the development and use of algorithms for decision-making and execution purposes. We hope to publish the key findings of these reviews, as well as highlighting the relevant MiFID II requirements, in the next few weeks.
We also carry out work on individual firms, particularly where our analysis of a firm’s business model shows that algorithmic activity delivers substantial revenues, either in a material business unit or across the group as a whole. For example, for firms carrying out high-frequency trading activity, we might test a firm’s resilience and its capability to deal with peak order volumes.
We monitor markets closely in order to assess the causes of any unusual activity. On 7 October 2016, Sterling suffered a significant drop in value, described as a ‘flash crash’. While the market recovered relatively quickly, we worked to identify the possible causes of the flash crash. We spoke to firms and issued a Dear CEO letter reminding them of the importance of adequate controls around algorithms.
We also undertake significant surveillance (for the detection of market abuse) of the financial markets where algorithms are particularly active. This gives the FCA oversight of the conduct implications of algorithmic trading.
We can also collect intelligence on the use of algorithms through our Market Intelligence Data Unit, our assessment of individual sectors through our sector views and through initiatives like Innovate, via the Regulatory Sandbox (“the Sandbox”) and Advice Unit.
The Sandbox is an environment where firms can test innovative new ideas which could benefit consumers. To date, we have seen a number of firms test robo-advice propositions as part of the Sandbox. Firms have also benefited from support from our Advice Unit, which provides regulatory feedback to firms looking to deliver automated advice and guidance. To mitigate the risk that these models deliver unsuitable advice, we have ensured that firms build in additional safeguards before they begin testing. In most cases, this has involved qualified financial advisers checking the automated advice outputs generated by the underlying algorithms.
We do not assess individual algorithms. Doing so for the sheer number in use would require significant resources and, more importantly, may not deliver the best outcome. Instead, our supervisory focus is on testing that firms have the right processes in place when developing and implementing an algorithm and that the governance around it is effective. We would also seek to test that firms have suitable controls in place to deal with any issues which could arise from the use of algorithms.
Ultimately, it is the outcome delivered by the algorithm which is crucial. The way a firm makes a decision does not change its obligations under our rules and regulations. For example, in financial advice, algorithms are being used to provide automated advice to consumers, making the provision of advice more accessible. As our approach to regulation and supervision is technology neutral, the method of provision does not change the requirement that firms provide consumers with suitable advice.
While we remain technology neutral, we recognise that new technology can generally help deliver improvements in financial services, and so we encourage firms to use technology that works for them and for consumers. We expect the highest conduct standards to be maintained regardless of the technology which underpins decision making. Should it become clear that a firm is not adhering to what we expect, we would intervene, regardless of the method being used to take decisions.
The accountability that bodies in the sector have to you for their use of algorithms
This is covered above under question 2.
What assessments have been made of the impact that the Data Protection Bill, and the EU General Data Protection Regulation (GDPR), will have in your sector in regard to the development and use of algorithms.
An internal working group is monitoring the impact of the GDPR as we move towards implementation. We maintain a record of issues highlighted to us by firms but, to date, none have raised algorithms as a concern. As some sectors, such as general insurance and asset management, are likely to be affected by the GDPR, this is something we are monitoring closely.