Committee sets the agenda for new algorithmic ethics agency
23 May 2018
The Science and Technology Committee report acknowledges the huge opportunities presented by algorithms to the public sector and wider society, but also the potential for their decisions to disproportionately affect certain groups.
- Read the report summary
- Read the report conclusions and recommendations
- Read the full report: Algorithms in Decision Making
The report is released as the GDPR becomes effective, and in the wake of the recent controversy centred on the algorithm used by Cambridge Analytica.
Centre for Data Ethics & Innovation
The report calls on the 'Centre for Data Ethics & Innovation' – being set up by the Government – to examine algorithm biases and transparency tools, to determine the scope for individuals to challenge the results of significant algorithmic decisions that affect them (such as mortgages and loans), and, where appropriate, to seek redress for the impacts of such decisions.
Where algorithms significantly and adversely affect the public or their rights, the Committee stresses that explanation of the algorithm's decisions, combined with as much transparency as possible, is needed.
It also calls on the Government to provide better oversight of private sector algorithms that use public sector datasets, and to examine how best to monetise these datasets to improve outcomes across Government.
Norman Lamb, Chair of the Science and Technology Committee, said:
"Algorithms present the Government with a huge opportunity to improve public services and outcomes, particularly in the NHS. They also provide commercial opportunities to the private sector in industries such as insurance, banking and advertising. But they can also make flawed decisions which may disproportionately affect some people and groups.
The Centre for Data Ethics & Innovation should review the operation of the GDPR, but more immediately learn lessons from the Cambridge Analytica case about the way algorithms are governed when used commercially.
The Government must urgently produce a model that demonstrates how public data can be responsibly used by the private sector, to benefit public services such as the NHS. Only then will we benefit from the enormous value of our health data. Deals are already being struck without the required partnership models we need."
The Committee also recommends that the Government should:
- Continue to make public sector datasets available to both 'big data' developers and algorithm developers through new 'data trusts', and make better use of its databases to improve public service delivery.
- Produce, maintain and publish a list of where algorithms are being used within Central Government, or are planned to be used, to aid transparency, and identify a ministerial champion with oversight of public sector algorithm use.
- Commission a review from the Crown Commercial Service which sets out a model for private/public sector involvement in developing algorithms.
Notes to editors:
The Report responds to a suggestion for an inquiry into algorithms made to the previous Committee's 'My Science Inquiry' by Dr Stephanie Mathisen. She raised the question of "the extent to which algorithms can exacerbate or reduce biases" as well as "the need for decisions made by algorithms to be challenged, understood and regulated".