Written evidence submitted by the
Information Commissioner’s Office (ADM0033)

 

 

Re: Algorithms and Decision Making report

Thank you for your letter of 9 September 2018 inviting me to set out the Information Commissioner’s Office’s (ICO) views on the Committee’s report and its recommendations.

 

Overall, we welcome the Committee’s report on an area that is already extremely important in people’s lives and will only grow in prominence.

 

Events have progressed since I gave evidence to the Committee in January. Most notably, the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 are now in effect. As a result, the ICO has significant new powers to help ensure greater transparency in algorithmic decision-making.

 

During my evidence session we discussed in some detail the new Centre for Data Ethics and Innovation, which will have much to say in this area; that body is now starting to take shape.

 

On the four specific recommendations you highlighted, the attached table sets out the ICO’s position in light of both the Committee’s conclusions and the government response.

 

I hope this reply answers the points raised by the Committee. If I can be of further assistance to the Committee’s work in this area, please do not hesitate to get in touch.

 

 

Elizabeth Denham

Information Commissioner

 

10 October 2018


House of Commons Science & Technology Committee Recommendations and Government Response

 

Committee Recommendation

ICO Response

Recommendation Paragraph 67: The “right to explanation” is a key part of achieving accountability.  We note that the Government has not gone beyond the GDPR’s non-binding provisions and that individuals are not currently able to formally challenge the results of all algorithm decisions or where appropriate to seek redress for the impacts of such decisions.  The scope for such safeguards should be considered by the Centre for Data Ethics & Innovation (CDEI) and the ICO in the review of the operation of the GDPR that we advocate in Chapter 4.

 

The recommendation raises two issues in relation to the applicable legislation.

 

Firstly, it is the case that in the GDPR the reference to providing an explanation after an automated decision has been made appears in a Recital (Recital 71) rather than as a specific requirement of Article 22.

 

Nevertheless, the Article does require that, where the processing is based on consent or contract, individuals have the right to obtain human intervention, to express their view and to contest the decision. There are equivalent provisions in the Data Protection Act 2018 (DPA) where automated decisions are required or authorised by law. Furthermore, consent, contract and law are the only conditions that legitimise such decision-making.

Responding to such a challenge is likely to involve providing an explanation of how the decision was reached, as the Recital implies, and we would expect controllers to provide one wherever possible. We recognise that the nature of algorithmic decisions can raise technical and conceptual issues here, which is why we are working to produce a framework for explanation, as proposed in the AI Sector Deal.

 

Moreover, where a controller intends to use automated decision-making, Articles 13 and 14 impose a specific obligation to provide the data subject, in advance, with meaningful information about the logic involved. These transparency provisions are particularly important given the invisibility of much algorithmic decision-making.

 

In relation to the absence of a right to challenge all automated decisions, it should be noted that there is a general prohibition on solely automated decision-making that has legal or similarly significant effects; it is only permitted on the basis of consent, contract or law, as mentioned above. Each of those permitted exceptions carries with it a right to contest the decision. It is true that the prohibition does not extend to decisions that are only partly automated, i.e. where a human makes the final decision, but the legislation is intended to address the specific problem of wholly automated decisions.

 

We look forward to working with the Centre for Data Ethics and Innovation to consider the implications of this, and we will use our powers and duties under the DPA to advise Parliament as appropriate.

 

Recommendation Paragraph 91: The CDEI and the ICO should keep the operation of the GDPR under review as far as it governs algorithms, and report to Government by May 2019 on areas where the UK’s data protection legislation might need further refinement.  They should start with a more immediate review of the lessons of the Cambridge Analytica case.  We welcome the amendments made to the DP bill which give the ICO the powers it sought in relation to its Information Notices, avoiding the delays it experienced in investigating the Cambridge Analytica case.  The Government should also ensure that the ICO is adequately funded to carry out these new powers.  The Government, along with the ICO and the CDEI should continue to monitor how terms and conditions rules under the GDPR are being applied to ensure that personal data is protected and that consumers are effectively informed, acknowledging that it is predominantly algorithms that use those data.

We welcome the government’s commitment to ensuring that the ICO is sufficiently resourced to carry out its powers and duties, and we also welcome the investigative and enforcement powers given to the ICO in the DPA and the GDPR. Our recently published Regulatory Action Policy explains how we will use these in a proportionate and risk-based way.

 

In relation to our investigation into the use of data analytics for political purposes, once the investigation has concluded we will carry out an internal exercise examining all aspects of the way it was undertaken, including the effectiveness of the legislation. From this we hope to identify any lessons for future high-profile investigations.

The majority of the processing under investigation predated the GDPR, so we were not able to utilise the full range of new powers. One of the main challenges we have faced is the inability to compel individuals to attend an interview; this may be something we will want to revisit in the future.

 

Automated decision-making is a rapidly evolving area in which the application of the GDPR is relatively untried. It is therefore a particular focus of interest for us. We will keep the operation of the new legislation under review, mindful that the DPA gives us a duty to advise Parliament and government, and the power to issue opinions to them, on data protection matters.

 

Recommendation Paragraph 92: “Data Protection impact assessments”, required under the GDPR, will be an essential safeguard.  The ICO and the CDEI should encourage the publication of the assessments (in summary form if needed to avoid any commercial confidentiality issues).  They should also consider whether the legislation provides sufficient powers to compel data controllers to prepare impact assessments, and to improve them if the ICO and the CDEI believe the assessments to be inadequate.

The ICO has published detailed guidance on DPIAs under the GDPR. Publishing a DPIA is not a legal requirement under the GDPR; the decision rests with the controller. However, in our view it is good practice for organisations to publish their DPIAs in order to aid transparency and accountability, to foster trust in their processing operations and to help individuals to exercise their rights. This position is also consistent with DPIA guidelines from the European Data Protection Board.

 

We accept that in some cases it may be appropriate to publish a redacted version or a summary, for reasons of commercial confidentiality or security. Public authorities in particular may be required to publish DPIAs under their Freedom of Information (FOI) publication schemes; Privacy Impact Assessments (PIAs) were included in the ICO definition documents setting out the classes of information that FOI publication schemes should cover. In any case, DPIAs may be requested under the FOI Act.

 

We continue to engage with controllers around DPIAs, both under our statutory role in relation to Prior Consultation and as a critical friend to key stakeholders. As with other requirements in the GDPR, we continue to review the controller obligations around DPIAs, and our powers in relation to them. We will, of course, advise Government and Parliament if we have concerns about how the legislation is working in practice.

 

Recommendation Paragraph 97: The CDEI and the Information Commissioner should review the extent of algorithms oversight by each of the main sector-specific regulators, and use the results to guide those regulators to extend their work in this area as appropriate.  The Information Commissioner should also make an assessment, on the back of that work, of whether it needs greater powers to perform its regulatory oversight role where sector regulators do not see this as a priority.

The ICO welcomes the role of the CDEI in reviewing the state of play on algorithmic decision-making, gathering evidence and making recommendations as to the ethical application of this technology. We look forward to working closely with the CDEI in this area, building on the co-operation we have with other expert bodies such as the Alan Turing Institute.

 

We are currently increasing our own resources in this area and strengthening the capabilities of our Technology department.