final logo red (RGB)

 

Justice and Home Affairs Committee

Corrected oral evidence: New technologies and the application of the law

Tuesday 23 November 2021

11.30 am

 

Watch the meeting

Members present: Baroness Hamwee (The Chair); Lord Blunkett; Baroness Chakrabarti; Lord Dholakia; Lord Hunt of Wirral; Baroness Pidding; Baroness Primarolo; Lord Ricketts; Baroness Sanderson of Welton; Baroness Shackleton of Belgravia.

Evidence Session No. 6              Heard in Public              Questions 83 - 98

 

Witnesses

I: Professor Paul Taylor, Chief Scientific Adviser, National Police Chiefs’ Council; Alun Michael, Police and Crime Commissioner for South Wales and Joint Lead for Data and Bioethics, Association of Police and Crime Commissioners; Darryl Preston, Police and Crime Commissioner for Cambridgeshire and Peterborough and Joint Lead for Data and Bioethics, Association of Police and Crime Commissioners; David Tucker, Faculty Lead on Crime and Criminal Justice, College of Policing.

 

USE OF THE TRANSCRIPT

  1. This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.

22

 

Examination of witnesses

Professor Paul Taylor, Alun Michael, Darryl Preston and David Tucker.

Q83             The Chair: Good morning, everyone. Welcome to our witnesses, Professor Paul Taylor and David Tucker, who are with us in person, and Alun Michael and Darryl Preston, who are joining us online. One of our members, Lord Dholakia, is also online. Other members may arrive during the meeting. I will ask each of the witnesses to introduce themselves and their roles. Let us start with Professor Taylor.

Professor Paul Taylor: Thank you very much for the invitation. I am a professor of psychology at Lancaster University, but on 1 May I became the first Policing Chief Scientific Adviser.

David Tucker: I am head of crime and criminal justice at the College of Policing.

Alun Michael: I am the police and crime commissioner for South Wales, and currently also the vice-chair of the Association of Police and Crime Commissioners. I am chairing a group, which is a joint activity with Darryl Preston, looking at the ethics of biometrics and data as used by the police, which complements our portfolios looking at forensics and other applications of science in the course of policing.

Darryl Preston: Good morning. I am the police and crime commissioner for Cambridgeshire and Peterborough. I am the lead for forensics and biometrics with the Association of Police and Crime Commissioners. As Alun Michael said, we work closely together, particularly on some of the ethical issues of biometrics.

Q84             The Chair: Thank you. I will start by asking the witnesses who are with us in the room, who both have a more overarching role, to give us a very quick rundown of the organisations working in the area.

Professor Paul Taylor: I can try. As you will appreciate, it is a complex picture. At the top, with ultimate operational responsibility, are the chief constables and the police and crime commissioners making decisions about how we will use new technology. The NPCC has eleven co-ordination committees that have a responsibility for particular areas of new technology. The IMORCC[1], which you will heard about, is particularly pertinent in this space, but underneath that there are lots of national leads for a variety of different new technologies. As the CSA, I work very closely with the national leads to ensure that we have best practice on each of those technologies.

I am afraid the list does not stop there. Policing has invested in national structures for particular areas of use and value for new technology policing. You will have heard of the Police Digital Service; that is one example of that. There is also the National Data Analytics Service; we felt that there is a need to accelerate, and drive forward and give national oversight on those meetings. Then we have the College of Policing, which has a role in ensuring quality and best practice, including through its authorised professional practice doctrines.

It is important to note that policing will never achieve what it needs to achieve with new technologies on its own, so we have partnerships across government, and with academia and industry. My role as the Chief Scientific Adviser, which was introduced in May, is to provide oversight and advice to all those moving parts in the hope that they are moving in the same direction in a way that benefits and supports chief constables and PCCs in their ultimate decision-making.

The Chair: Mr Tucker, are you going to say “I agree”, or are you going to add to the list?

David Tucker: I am afraid that I am going to add to the list slightly and just pick up on the partnership point. Of course, law enforcement does not stop with the police, so we have very productive relationships with the NCA and the security services, which also have a role in using technology. When they use technology, there is very often an implication for policing.

The Chair: It would be helpful for us to see on paper who all these organisations are and what their different roles are.

Professor Paul Taylor: We are very happy to provide something.

Q85             The Chair: That would be very helpful. Thank you very much. I will turn now to the two commissioners to ask what guidance and support a police force would need from government to help in procuring the tools that are available. Procurement is something that concerns us.

Alun Michael: Support from government for resources is absolutely crucial, and choices have to be made as to how those resources are deployed. The most important thing, whether it is in legislation or in guidance, is that the principles are set down at a national level, but the application of those principles has to be very much at the local level. Part of the responsibility of the police and crime commissioner, following the Police Reform and Social Responsibility Act 2011, is to set down the expectations in the police and crime plan.

The original cartoon depiction of the commissioner drawing up a blank sheet paper, writing down requirements and passing it to the chief constable saying, “There you go, deliver that” is, like most cartoons, a caricature. As far as I am concerned, and I think many of my colleagues would say the same, the police and crime plan, for which I take responsibility, is very much drawn up in conjunction with the chief constable and his team and is a careful analysis of the requirements of our local areas.

There is also the importance of having regard to the requirements of central government. The strategic policing requirement sets out important elements of priority. At the local level, the delivery of the police and crime planthe way in which the chief constable exercises his operational discretion, which is protected in law—is very important indeed.

The application of technology, the subject of today’s hearing, is very much about working out together how you balance—using whatever tools are available—protecting the public, catching bad people and ensuring the safety of the local community with the need to protect human rights and civil liberties and have respect for the central values for policing that are set down in law. We go back to the principles that were set down by Sir Robert Peel with the two elements of engagement and being embedded in the local community, and prevention as the first responsibility of the police.

We are probably doing better today and more collectively between the commissioners and chief constables, the APCC and NPCC, than in the 50 years I have been following what happens in policing. In many ways, the current arrangement, including the consideration of the part that technology and science can play, is very balanced. There is a lot of sharing and discussion. We had a two-day conference last week that was quite outstanding in being particularly challenging to all of us.

The Chair: Mr Michael, I will ask you to pause there, because we need to come back to some of the issues. Mr Preston, do you want to add anything? We are interested particularly in how you balance national or regional standards, arrangements, and local pressures and requirements.

Darryl Preston: I agree with Alun Michael. It is for government to set down the guiding principles enabling the use of technologies that are there to keep our communities safe. We have other areas; we have the Regulation of Investigatory Powers Act on covert policing, which set down principles. Therefore, it is still for the chief constable to make the local operational decisions within the parameters of those principles. Alun has described very well the role of the police and crime commissioner holding the chief constable to account for an effective and efficient police service.

On the question you posed about government, for me that is about the setting of principles.

On the regional compared to the local, we all collaborate regionally, particularly on serious crime. We have governance in place for that, and chief constables and police and crime commissioners are deeply involved in that. They represent the force areas across our regions. Locally, when it comes to the scrutiny of the decisions of the chief constable, I have an independent scrutiny panel where we look at decisions on stop and search, for example. In my policing area, we have not deployed some of the biometric technologies, but going forward I see that as a vehicle locally to assist me in my role.

The Chair: Thank you. We are all working in an area where the technology is new and lay people, like most of us, do not understand what it can do and what it might be doing.

Q86             Baroness Chakrabarti: I should probably say, for the sake of transparency, that in about 1997 I had the pleasure of working for Mr Michael when he was Minister of State and I was a lawyer in the Home Office. The current Home Secretary told this committee that “decisions about people will always be made by people”. How do you ensure that this happens in practice?

Professor Paul Taylor: I suggest that that is almost entirely the case. Our algorithms work in tandem with our officers to support their decision-making and not to replace it. If you have had evidence contrary to that, I particularly want to know about it as CSA.

One of the biggest benefits of technologies is the efficiency they provide, and one might want to think of areas where you do want to use the technology to reduce the human engagement in decision-making. It is important here to understand the difference between the different types of new technology. A new technology that is being used in an operational or investigative context is very much supporting the officer who will make a decision. There might be more back-room opportunities to use a technology to support our demand management, for example. As you will have read from the Lancashire Voice example that we provided in our evidence, the technology is being used to allow police to better understand the demand that it is having, and one might want to say that that can be given a little more free rein in how it works.

Even in those circumstances, it is our contention and our working practice that we do a number of things to ensure that the human is in the loop. First, you will have heard about the ethics committees that all forces have, and I hope you have heard that many of our forcesthe Met, Lancashire, Thames Valley and West Midlands are examples—have set up specific ethics committees to deal with new technologies. They provide scrutiny and opportunity to understand whether there are gaps where the technology may be making an implicit, as well as explicit, decision on things.

Secondly, when we deploy new technologies there is a period of what I call manual checking. We look at the answer the technology is giving to our problem and compare it to the existing way in which we solve the problem to check that no hidden issues arise that we would not have thought of. The third thing, and this is part of my office, is to give subsequent regular review, including sampling the results to ensure that the technology is delivering fairly and as expected.

Baroness Chakrabarti: I note that you said “almost always the case” and “human in the loop” rather than “human in control”, but perhaps Mr Tucker wants to follow up.

David Tucker: I will pick up on something that Professor Taylor said. There are beneficial sides to the technology. For example, a lot of decisions are made to let a person pass through an airport without human interaction. It is normally where some form of intervention is required, and that is the point at which the human is in the loop or, in most cases, is the final decision-maker.

I want to come back on one aspect of your previous question about the setting of principles. There is another layer to this that comes from the College of Policing. We try to interpret legislation and make it operationally applicable, and often the principles—for example, on live facial recognition—of how you use that come from a number of different places. Some will be from an Appeal Court decision and some from other areas of principle. We try to bring that together either in authorised professional practice or in a code of practice.

Alun Michael: Most policing involves judgments being made. When I first came into this role, a screensaver was used which I thought had a very good message. It said, “You are a leader”. That was addressed to every police officer, not just senior officers, and every member of police staff. The training and values that sit behind the decisions being made are important, so that the basis for making mature and balanced judgments is in place. Technology covers a range of things. For example, the use of a taser is a judgment that—

The Chair: I am sorry. It always seems so rude to be interrupting people, particularly on the screen. We are focused on new technology.

Alun Michael: Yes, I was going on to that. The point I am making is that it is no different from the rest of policing. On the use of technology—we were the first force to deploy automatic facial recognition, for example—the important bit is that there is never automatic identification of an individual. There is the identification of a match, which then has to be looked at by an individual, so the human element has to be there all the time. The selection of the watchlist has to be a matter of judgment as to whether it is an appropriate identification of who we are looking for.

There is more than one element of scrutiny. One is scrutiny by the police and crime commissioner and my team of what is being done by the force. Then there is the ethics committee, as has been mentioned. In our case, a joint ethics committee has been appointed by me and the chief constable to be an independent check, and we refer issues of new deployment of technology to them.

Q87             Baroness Chakrabarti: Thank you so much. Mr Preston, to follow up, would you focus on the training element that Mr Michael mentioned and deal with what kind of training is mandatory for, or at least available to, officers working with these new and experimental technologies?

Darryl Preston: On the Home Secretary’s comment that people make decisions, I see this use of technology as an intelligence tool, another tool in the toolbox to deal with serious crime and community safety.

On training, I think it would be best to hand over to our colleagues. That is a very operational matter. If we are talking about some of the ethical issues here, I recently attended a stop and search training session for new recruits, and there is significant training for them. I think others are better placed to talk about that, because that is an operational question.

Baroness Chakrabarti: I will turn to those who want to talk about operational matters in a moment. I will just point out that stop and search is not a new technology and it is heavily prescribed by law, unlike new technologies, which you can imagine may be my concern.

The Chair: Before you answer, could you add in what you know about how many forces have ethics committees that are focused on the technologies rather than just broadly?

David Tucker: I will start with the training. There is very little mandatory training in policing. The use of the term “training” can be slightly misleading, because in these circumstances we are not talking about how you operate a piece of kit. What is more important is the accountability, the transparency and that we are operating with the consent of the public. Policing needs to operate with the consent of the public.

Through all our learning for police officers, which is the journey from the day you start to the day you retire, the messages about accountability, transparency, ethics and legality are reinforced time and time again. They underpin every piece of training that is designed by the College or designed and delivered locally by forces. They will all be underpinned particularly with human rights. Almost all decisions in policing, as Mr Michael has indicated, involve making fine judgments about whose rights will be infringed and getting the balance right. That appears in all our learning product, from recruit to the senior command course. Does that answer your question, before I move on?

Baroness Chakrabarti: Not really, other than I think you are saying that there is very little mandatory training.

David Tucker: The difficulty is that some national technologies are available such as PNC[2]—which is not a new technology, obviously—PND[3] and some of the work that Mr Preston alluded to that is covered under the Regulation of Investigatory Powers Act. There are national training programmes for those, because they are national products. Most of the other technologies are locally procured, so the College, as a national training provider, cannot do that. I want to reinforce the point that the pressing of the buttons, how you use the kit, is one aspect of it. The underpinning principles of openness, transparency and answerability back to the public are key.

The Chair: One needs to be trained to interrogate the system, I would have thought, and that is very technical. I cannot imagine myself doing it.

Baroness Chakrabarti: I think the question has been answered.

Q88             The Chair: I think it has. We will come on to procurement, but we have heard tales outside this country of relationships between the provider, the seller of equipment, and the customer, the police, which of course raises questions about how all that works. Do you know whether training is part of the package when one buys the kit, normally?

David Tucker: I can only talk about the programmes that we are involved in at the College, which are national provision. We are trying to move over into the successor of PNC, the police national computer, and training is a key part of that. It will be provided by the company that eventually creates the product, but it will be overseen by and linked into the College of Policing. We will make sure that all the issues of transparency, openness and accountability are hard-wired into this, because we recognise the potential for misuse of technology.

Professor Paul Taylor: Most products come with a training package, but a larger issue underpins your question. Often we will take a fairly mature technology and try to implement it in policing, knock it into place and hope that it fits the hole. Procurement needs to go much more upstream and work with providers, academia or industry, so that when they develop their products they think at a much earlier stage about issues of fairness and how it will work in policing, and they design things that fit so that they work hand in hand. The officers and teams who work with them during that journey ought then to have a very mature understanding of the opportunities and challenges that the systems provide.

Q89             Baroness Primarolo: You will know that the evidence to the committee has made clear the concerns about bias, the black box functions—I am talking about AI and advanced algorithms—and the propensity for technology to be used predictively instead of adding to decision-making. I will turn first to Professor Taylor and Mr Tucker and then to Mr Michael and Mr Preston. You have not described yet a clear line of accountability. You have used the word a lot. Can you explain to us simply who is accountable when it goes wrong or is misused, or there is something fundamentally wrong in the design so that it is biased? Where is the accountability?

Professor Paul Taylor: The accountability rests with the chief constable who will work, as you have heard, with the PCCs. They have the ultimate responsibility for using a technology. I feel it is my role to ensure that they are able to use that technology fully informed of the risks and the challenges for what will happen. That is the ultimate structure.

I will want to rehearse with them the false-alarm rates. For a chief constable, that might be a question of how many victims I am willing not to support because of the issue of the technology capturing everybody but having a very high false-alarm rate; versus a balanced position. There are things like making them understand the difference between the front-stage and the back-stage applications and ensuring that the decision-maker is always the officer, and talking about areas where sometimes that is not the case and how they might become overreliant on technology, to give one example. There is also ensuring that we have systems and processes in place so that does not occur.

From a chief scientist’s point of view, it is about ensuring that we have the best practice, and the best practice is the best science. That is the pre-registration of analyses, so that we are not cheating at the back. It is the open publication of that science so that the community can scrutinise what we have done. It is, where possible—I have tried to take a number of steps in this direction—allowing datasets to be available so that other independent scrutiny can occur. All of that feeds into the decision-maker, who is the chief constable.

Baroness Primarolo: Mr Tucker, I cannot help but feel that having 43 different structures of accountability is not going to work clearly when somebody’s human rights or basic rights have been infringed or, worse still, they have been prosecuted because of misuse or failure of the technologies. What guidance would you give so that we can all be clear that every constabulary is working to the same principles?

David Tucker: I will sidestep the 43-force model part of the question.

Baroness Primarolo: I would rather you did not, because it is a good point about how we hold them accountable.

David Tucker: I think Mr Michael and Mr Preston might want to pick that up. The College is just about to release our guidance on live facial recognition, and we set out the principles there.

Broadly, and this would apply in a number of other areas, there are three levels of accountability. There is the procurement decision, which rests with the chief constable. Ultimately, it all rests with the chief constable and, through the PCC, with the public. There is the level of procurement where chief constables have to make a decision about the technology they will purchase and make sure that it falls within acceptable operating parameters, that it has been tested properly and so on.

There is another decision about how that technology will be used in the force. As an operational decision the chief constable might say, “I’ll use it only in these circumstances and not in other circumstances”, and there will need to be public accountability about that decision, a consultation on how they do it. There is another tier, which is how each application happens, and whether it is appropriate for the risk, and recognises and balances the rights of everybody involved. The final tier is the actual operational decisions on whether or not to intervene.

All those elements have accountabilities attached to them and ultimately are the accountabilities back to the courts. We have seen that where decisions are challenged or doubted cases go to court and affect the way policing operates. The case of Bridges on live facial recognition is a good example. The Appeal Court said that there was an absence of policy, so we are filling that gap and moving to apply these principles to this piece of technology, making clear the levels of accountability and how they will be discharged. There are things like equality impact assessments and data privacy impact assessments.

Baroness Primarolo: Mr Preston, if somebody in your policing area has had their individual rights infringed because of fundamental flaws in facial recognition, how would they get recourse? Who would be accountable for all the problems they would have faced?

Darryl Preston: If this has happened to an individual, the usual complaints procedures would be followed. They are clearly defined and set down in regulations. There is a very set, clear complaints procedure, which I also hold the chief constable to account for. I will come back to the point about the chief constable being accountable in the police force area. Accountability reflects a number of thingsthe guidance that is already out there, the guidance that comes with authorised professional practice—but there is a host of other scrutiny bodies too, such as the ICO[4] surveillance commissioner and HMICFRS.[5] In my policing area, if an individual’s human rights were breached, they would have recourse to the complaints system and ultimately a court of law.

Q90             Baroness Primarolo: Mr Michael, do you consider it appropriate that in the negotiations of contracts for AI technology there is an accountability and a pushback on to the providers where there is a fundamental flaw in the technology that is provided? If so, how would that be achieved?

Alun Michael: I go back to the answer given earlier that that needs to be built into the whole process of design of the technology and its application to policing. The lines of accountability are very clear. The operational responsibility is with the chief constable, but the chief constable is held to account by the police and crime commissioner. That has to be a mutually challenging process all the time. I would be more worried about some sort of centrally directed approach to this over the 43 forces than I am by what we have, which is a mutually challenging arrangement.

How policing has developed over the few years since the police and crime commissioner role came in is very interesting; it is a collective responsibility for understanding, being fully informed, and challenging presumptions that are made within operational policing. I think the relationship between commissioners as a group taking a collective responsibility and chief constables taking collective responsibility as a group through the NPCC is increasingly being seen as important. I welcome the refresh that is going on at the moment of the role of the College of Policing, which has been a bit of an absent player in recent years, disappointingly.

Going back to your concern about differences at the local level, as Tom Winsor, the Chief Inspector of Constabulary, said, nothing can be achieved by a merger that cannot be achieved by collaboration. Collaboration is not a soft concept. It is about proper co-operation, recognising the challenges that each of us faces in ensuring that technology is applied for the protection of the public, but applied in ways that protect civil liberties and human rights.

Baroness Primarolo: I am afraid, Chair, that I do not feel I have got an answer. All the contributions have used the words “transparency, accountability, holding people to account”, but there is no clear line that tells me how, across 43 constabularies, we make sure that we have the same principles operating within the United Kingdom. Collaboration is great, Mr Michael, I absolutely agree with you, but it is no substitute for clear lines of accountability.

Alun Michael: No, but clear lines of accountability are at the local level. It is also important that we are constantly inquisitive and challenging, and that is a team game. It is not top-down decision-making, deciding everything from the centre. If you approach the application of new technology on a rule-based approach rather than strict adherence to the law and the application of principles, things will go wrong.

Baroness Primarolo: Yes, but that is not the point I am making, Mr Michael. The point I am making is that these are being developed as technologies and deployed nationally. If the responsibility is to pick and choose in each constabulary, there is no clear line of accountability on what is being developed, what is being used and how people’s rights are defended when it goes wrong. That is what I was trying to get to.

Alun Michael: With respect, I think you are mixing two different things. There are technologies that can be applied generally in all forces, and in that case the Home Office plays a very significant role. There was reference to the police national computer, for instance. The decision-making on the application of technologies that are deployed within individual forces has to be local, but it is applied according to—

Baroness Primarolo: I am not talking about all technologies, Mr Michael. I am talking about only advanced algorithms that are new. The evidence that we have had is that different police forces are doing different things to different levels and standards. The question of accountability and transparency has become—I will put it politely—opaque. I do not think I can take it any further.

Alun Michael: That is only if you are looking for one system and a centrally controlled police force. That is not what we have in England and Wales.

Baroness Primarolo: No, I am not. I am looking for principles on accountability.

Alun Michael: The principles on accountability are in the operational decision-making. It is the responsibility of the chief constable. The holding to account is by the police and crime commissioner. Collectively, policing in England and Wales operates collaboratively. We are constantly inquisitive and challenging. There are challenges between the APCC and NPCC and it is a very healthy model.

Baroness Primarolo: Thank you.

Q91             The Chair: I will come back to Mr Tucker. You mentioned guidance that the College is about to produce on live facial recognition. I have two questions. I am intrigued as to why now and why only now. Has this been prompted by problems and in particular the case that you mentioned? Does one wait for the problems to emerge before issuing the guidance? Can you give us a preview of the guidance? Will it advise forces on how, or indeed if, to use facial recognition for stop and search?

David Tucker: The first question is why now. It is a response to the Court of Appeal decision in Bridges. One of the problems that has been alluded to by Professor Taylor and Mr Michael is that there are local applications of technology, and it is difficult for the College at the centre to be doing guidance on things that are developing quickly. We have to wait for a moment of maturity, because if we do not we run the risk of trying to give guidance on something that has not settled down and is developing.

The broader principles will always apply, which is that you have to recognise people’s human rights and have the accountability and transparency process that applies. As Mr Michael has described, the chief constable is responsible for the operational aspects, held to account by the police and crime commissioner who is then held to account by the electorate. That is how that works. It is a tricky thing to start trying to write guidance for things that are still broadly in a development phase.

The live facial recognition guidance is not the only area where we have operated. We have recently released guidance on extracting digital material from mobile phones and other portable devices. The principles are pretty strong, and very clear that your intrusion has to be the minimum that is required for the purpose that you are trying to achieve; that you should intrude into people’s privacy to the minimum degree possible, that you should look at other ways that are less intrusive. All this comes straight out of the proportionality considerations of the Human Rights Act, and we are applying those in this new technology area.

The live facial recognition authorised professional practice will set out those four levels of accountability, and scrutiny and checking, so that you procure the right thing and you have a policy on how you will use it in your particular force area. The decision then is about when you will do an operational deployment, and there is the fourth level: the individual choices about when you intervene. The structure of the guidance is broadly around those headings.

The Chair: And what about stop and search?

David Tucker: Live facial recognition allows you to identify somebody who is on a watch-list and then a decision is made about whether you intervene. That does not feel like the same thing as stop and search. That is something different, where individual officers need to make a decision about whether there are reasonable grounds. It might be that somebody is identified from an LFR deployment and you feel that you want to do a stop and search on them because the alert was linked to that, but they are not intrinsically linked in that way.

Q92             Baroness Shackleton of Belgravia: I am going to address this question to Mr Michael and Mr Preston, and particularly to Mr Michael because you also sit as a justice of the peace, so you understand why this data is being collected and how the public have to have confidence that the police force has been accountable. This is echoing what Baroness Primarolo said. We can have all this information and we have sacrificed our privacy, but unless the judge or the public feel that it is worthwhile and reliable, it is not much use. What strategies are you going to develop to ensure that the ongoing evaluation oversight of the technology used by the police force is up to date? In colloquial terms, are there going to be regular MOTs as if it were a car? How will you make sure that you are roadworthy and fit for purpose? How will that be reflected in the way you describe your accountability to the public?

Alun Michael: That is a very rich question and a very good one. I ought to say that I last sat as a JP in 1987, when I disqualified myself by being elected to Parliament. I am very proud of the 15 years that I spent as a magistrate and you are quite right that it informs my thinking.

You are also right to require, or to ask for, continual renewal and scrutiny of what is happening, rather than saying, “We have done that, so we can forget about the principles and values.” We did a lot of work scrutinising and challenging the introduction of automatic facial recognition, for instance. There was a lot of internal discussion between me and, primarily, the deputy chief constable who was applying the technology about what was acceptable, what would not be acceptable, how we would open up to the public, and how we would allow the public access to see how it works by putting the van in place and inviting people to come to see how it worked. We had very important discussions about where the threshold is of where it is legitimate to apply and use scrutiny. It was interesting that the courts were not interested in any of that—they were interested in whether it was legal—whereas I think that scrutiny, that question of whether you are applying the right values and scrutinising what the technology does, is crucial to the policing approach.

It then goes on to ongoing evaluation, to come to one of the points in your question, where, for instance, there have been suggestions that there is bias built into the technology, a bit like institutional bias. It is important for us to follow the science there and to constantly look for further investigation and further scientific application of developing knowledge. To use your phrase, regular MOTs are absolutely crucial at every stage of this process, in terms of what the technology is doing and whether it is consistent with the values that we expect to be applied—the protection of civil liberties.

Baroness Shackleton of Belgravia: In crude terms, Mr Michael, do you have in your diaries a regular check-up—not when a disaster has happened, but constant maintenance, fit-for-purpose checks?

Alun Michael: Yes.

Baroness Shackleton of Belgravia: How is that dealt with from police force to police force?

Alun Michael: There will be differences between police forces. I can give you my application. My chief executive is the monitoring officer responsible for a series of meetings in which the force is required to bring us up to date on particular aspects and answer difficult queries. We use those boards to raise questions that might have come from councillors or members of the public. We also gather questions through widespread consultation—we have a young people’s consultation, for instance—which is getting richer by the year. These things do not happen overnight. They are not easy to organise, but I am very pleased with the way that is developing. They are being more and more challenging, but more and more thoughtful and evidence-based in the way that they challenge us. Those are the questions that go into the monitoring system, the accountability system, which is applied in the force.

The other side of it, of course, is asking the force to apply whatever technology can assist us in making the best possible use of the limited number of officers that we have to keep the public safe in order to provide preventive measures at every opportunity, rather than only reacting to bad things when they happen.

Darryl Preston: I completely agree that the regular MOT needs to happen. In my force area we have a diarised forum—a public meeting where I hold the chief to account—but the reality is that there is a professional working relationship there as well. We speak most days, particularly on issues such as this, notwithstanding that deployment is an operational decision. If it clearly has an impact on our communities it is something that I need to be aware of. As I described, we have an independent scrutiny panel. PCCs can also employ HMICFRS to come in to look at some of these particularly detailed areas.

You mentioned the issue of data retention for the data that policing and law enforcement might get from some of these technologies. The retention of material by policing is another area that needs scrutiny. Ultimately, to be lawful, the police must lawfully be in possession of the images or photos that they use from live facial recognition. That is certainly a role for the PCC and I challenge my chief constable to ensure that we hold material lawfully in our force area.

The Chair: This is usually the point in the meeting—and it is today—when I must remind people that we will have to speed up.

Baroness Shackleton of Belgravia: Professor Taylor and Mr Tucker, would you like to share with us your ideal evaluation and oversight mechanisms?

Professor Paul Taylor: It is my role as chief scientist to bring a set of independent oversights to support the meetings that you have just heard described. Unfortunately, that is not a one-stop method; there is a variety of ways to do that. At the pinnacle, we have now a science advisory council, for which we have appointed a chair, so replicating other chief scientific advisers we will have a SAC.[6] That SAC has the opportunity to see some of this new technology but also to call bits of technology to scrutinise and advise on it. It brings a wealth of expertise. Beneath that is a set of over 400 experts who we can draw on. It is my role to encourage and work with those national leads that I described to ensure that those experts are regularly engaged and supported in all the things that you have just described.

Baroness Shackleton of Belgravia: Will that be made transparent?

Professor Paul Taylor: Absolutely. I will make two points about evaluation. First, I do not think it is just for the police to evaluate. I spoke very briefly about open science, but another example of where the public ought to have the opportunity to scrutinise would be open data. I know you have heard evidence before where it has been made clear that some of these are quite complex technical challenges that perhaps the public per se would not be able to scrutinise, but that does not mean that we cannot bring in expert committees and others to act as intermediaries to help with that scrutiny. That is important.

Secondly, it is important to raise to this committee that to be able to sufficiently scrutinise these new technologies and all the permutations, to be able to 100% tell you what they are and are not doing, requires a level of resource that we simply do not have. The amount of testing, testing and testing one would need is just not there.

Going back to the question about industry, that is one of the challenges, of course: if I turn around to the industry to say, “I am going to hold you responsible”, it will then want to do a level of testing that probably makes investment into that technology not viable anymore, and not an interesting product for it to engage in. There is a tension there. We do not want to stifle the market by saying that it must do all of this, but equally we need it to do it. That is a real challenge for us in lots of areas where we just cannot do the full amount of testing that one needs, which means the informed PCC and CC are only partially informed – informed only to the extent of the testing we have been able to undertake.

David Tucker: The College of Policing tries to drive evidence-based practice. We want to be sure that, when technologies are used, there is clarity about what the technology seeks to achieve. We often talk about the intelligent customer. Policing needs to be an intelligent customer: it needs to understand the problem it is trying to solve and how it is thought the technology will contribute to that. Where it is not possible to do that testing in advance, there must be clear baselines so that evaluation can happen, so that people can make an assessment about whether the technology is delivering what it is supposed to deliver.

The College has a crucial role in that and we are looking at various forms of technology. As Professor Taylor said, this is a dynamic area. People are innovating all the time. They are trying to use technology in the best way possible for the benefit of the public. There is a risk of trying to make sure everything is completely covered while not stifling innovation, but against that has to be balanced the accountability, transparency and answerability back to the public so that we police with consent. That is what we try to do through our guidance and evaluation.

Baroness Shackleton of Belgravia: People need the confidence to know that what they see is what is on the packet and that they can use the tool reliably, but human beings are deploying it. It is a matter of confidence, and that comes only with transparency, regulation and adhering to values.

David Tucker: I agree with all of that with perhaps a slight caveat around regulation, because the regulation has to come in at the right moment.

The Chair: Baroness Shackleton, I am sorry, we have only half an hour left and an awful lot more to cover.

Q93             Baroness Sanderson of Welton: To your point about how the police must have confidence, and on the flip side of the accountability point, we hear a lot about the various oversight mechanisms and to us that all sounds fairly complicated. How can a single police force have the confidence that Baroness Shackleton is talking about in the scope, proportionality and legality of what they are using? Mr Tucker, could we begin with you? You have mentioned that mandatory training is not mandatory most of the time, and where there is training available it is available on national products only. How can police forces have confidence?

David Tucker: I seek to make the point I made earlier, which is that the application of a piece of technology—the pressing of buttons or the deployment—will depend on the local circumstances and local procurement decisions. Behind it are the principles, which are clear and well understood across policing: you must intrude to the minimum that is possible, you must have a clear problem that you are trying to solve, you must understand how the technology is going to answer your problem, and you must do that within the law and within the code of ethics that is driven throughout the College of Policing training curriculum. The whole curriculum is underpinned by reference to human rights and the code of ethics to make sure that when you apply these technologies they are used appropriately. The local application of that, as Mr Michael indicated, is back through the chief constable’s operational accountability to the Police and Crime Commissioner and to the public.

Baroness Sanderson of Welton: I think we could go round in circles again and we would discuss the same things that Baroness Primarolo discussed with you. I am not confident that it is a very clear picture or that it would give confidence to the person who it is being used on, or the person using it, but I will leave it there for now.

Q94             Lord Blunkett: To set the scene in 30 seconds, I understand the processes that provide the accountability that Mr Michael and Mr Preston have referred to in terms of the PCC and those advising them, as well as their relationship with the chief constable, but I also feel we are in danger of going round in circles, as Lady Sanderson described.

I want to put a proposition to Professor Taylor and Mr Tucker. How do we understand the quality and the validation of the technology we are using if the scientific advisory committee that has just been established does not feel that it has the wherewithal or the capacity to kitemark the particular developments that are being put forward to the 43 forces in England and Wales and Police Scotland?

If we do not have that, then surely what we started with this morning, with the chair’s reference to overseas experience, which we heard about right at the beginning of our inquiry, will kick in. In America, where there is a great deal of sophistication—it happened to be in California, so pretty near to where the cutting edge technologies are being researched and presented—they found that local police services were being offered trials, tasters and free provision for a period of time, precisely to say, “Let’s test this and then down the line we might come back and say that we don’t validate that”, as has been described this morning, or, “We found that that did work after all.” The problem was, of course, how to get it out once it had been established and embedded.

My question is a very simple one to Professor Taylor and Mr Tucker: where the hell are we on this? If we do not have the capacity to know at central level what is valid, what is appropriate and what safeguards exist within that technology, how the hell do 43 chiefs of police and their PCCs understand it?

Professor Paul Taylor: One of the factors that makes that more complex is the richness in the range of maturity of technologies. When technology reaches a level of sufficient maturity, as we have just heard, we would expect to have an authorised professional practice around it, which is probably close to a kitemark.

With the emerging technologies, you are right to say that we operate a system that relies on expert opinion and evidence to ensure that we meet best practice at the time. That would be, for example, a science advisory council. The ethics lead for NPCC has just established a subgroup, the data ethics group, to support national-level understanding of best practice. As that matures for a new technology, it reaches the point where we would expect it to move into APP territory.

On validity and safeguards, I personally view that as something that the science ecosystem is jointly responsible for, working with others to ensure that we get as much data as we can. In my domain, that is working with BEIS and UKRI[7] to ensure that research councils and others are able to invest, so that we rapidly accelerate our understanding of those technologies that are likely to come to the point where chief constables feel they ought to be using them because to do so will protect lives.

David Tucker: I go back to some of the points I have made before. Across policing there is a very strong understanding of human rights and issues about intruding into people’s lives in ways that go beyond what is strictly necessary. Those are all enshrined in law. That underpins all our guidance and training to make sure that, wherever you are in the organisation, you are fully aware of the balance between what is right and what is not so that you ask, in relation to the technology, whether there are other less intrusive ways to achieve the outcome that you desire.

Chief constables will need to apply those decisions and thoughts when they make procurement decisions. When somebody comes along to offer a new technology to answer a policing problem, the chief constable, with the help of the PCC and all these structures such as the ICO and the other commissioners can assist by asking, “Is this an appropriate technology to use? Is it answering a policing problem? Is it appropriate to use this technology in these areas?” If it is not then that will not happen. We need policing to become the intelligent customer, to ask those questions and make sure that those decisions are made. I feel fairly confident that the principles that underpin all this are well understood across policing.

Lord Blunkett: Do Mr Michael and Mr Preston want to come in? I appreciate that we are running out of time.

Alun Michael: The answer is the absolute requirement of constant challenge and questioning. We and the chief constables would be very conscious of the danger of being too close to a supplier. We can be sure of that careful distancing and careful questioning only if there is constant challenge, and that is very much built into our system. I hesitate to say that what you have described in California could never happen in the UK, but I think it would be under challenge, both within a force and between forces.

Lord Blunkett: Darryl, do you have a quick comment?

Darryl Preston: Yes, very briefly, because procurement was mentioned. The issue for me is that, as we heard, policing has not always been the most intelligent customer. To try to resolve that we now have the Police Digital Service and BlueLight Commercial, which we are looking to procure nationally. That is one step forward in the area of procurement.

Lord Blunkett: I will leave it at that, except to say that there is the issue of reliability, which we are also looking at. Although lie detectors might be better than whether your eyes flick to the left if you are guilty and right if you are not, what is happening with Colin Pitchfork is very interesting: can you learn to breathe in a particular way to outwit a piece of technology? That is for another day, perhaps.

The Chair: I think Lord Ricketts’s question is going to follow on from that.

Q95             Lord Ricketts: It does follow on very well from that. We have heard a lot of very interesting things today. To continue to dig down on this issue of standards in procurement and ensuring their validity, a couple of things rest in my mind as worrying. Professor Taylor, you said that the responsibility for technological standards rests with the science ecosystem jointly. That does not feel to me like a very clear position of responsibility.

Going back to an answer from Mr Tucker to an earlier question about the College of Policing, when it decides to intervene and lay down national guidelines is when the issue has, as in this case of facial recognition, reached the courts and a weakness has been shown. In other words, by that stage people have encountered problems and people’s human rights have been infringed, and at that point the College of Policing acts. It would be more reassuring to think that there was more proactive upstream action by the College of Policing as these technologies roll out.

I want to go into a bit more detail on the issue of those technologies that are being trialled and have high false positive rates, which I think is the case with some of these technologies. Again, Mr Tucker, perhaps you could talk about how your office can go about setting standards on the validity of tools that are shown to have quite a high false positive rate in their initial evaluations.

Professor Paul Taylor: To be clear, I did not mean that the ecosystem as a whole should be setting the standards; I apologise if I misspoke. It is very much about the ecosystem providing the evidence for us to understand those technologies. On scientific standards, I work closely with the NPCC lead for the particular capability. It would be for us to decide jointly what standards we should be promoting within the NPCC for each force to adopt. I apologise for misleading you on that question.

Lord Ricketts: It may have been my misunderstanding.

Professor Paul Taylor: On the high false positive rates, that potentially depends on where the piece of technology is being used. If it is what I would call front stage—operational-investigative—we would have a very low tolerance for false positives. There might be backstage work, where we are looking at demand management, where we might be simply driving some efficiency in the system. There may be a different tolerance for that there. It is very difficult to answer what my single line in the sand is; it depends on operational and other responsibilities.

Q96             Lord Ricketts: Mr Tucker, as you are answering, can I ask you to comment on the specific case of OxRec, which is the Oxford University algorithm for violent recidivism, which I gather the College of Policing is evaluating? There have been some reports of a 40% false positive rate in this particular algorithm. Is that the sort of thing that would ring alarm bells? Could that sort of figure ever be tolerable as a false positive rate or are you looking for much lower figures than that?

David Tucker: Picking up on what Professor Taylor said, there are clearly different levels of acceptability of false positives, depending on what the technology is asking you to do. Where it could lead to infringement of people’s liberties there will be less tolerance for higher false positives, but it will vary depending on what the application is and what you are trying to do. It will all come back to the principles around human rights that I have already set out.

I am not aware of that particular example, but I can find out more if you wish. One of the things that policing wants to do is effectively identify who in society poses the greatest risk of harm, so that the resources that are available to policing are targeted to the right individuals and that those people who do not pose a risk are allowed to carry on their lives unencumbered by any form of intervention or whatever it might be.

The trouble is that risk assessment is very complex and dynamic. We know from looking at things such as domestic abuse homicide rates that there are lots of people for whom there is no history in any of the agencies, so trying to judge risk in those ways is difficult. We need to innovate and try different approaches to see where we can be more effective. That requires a bit of experimentation and proper evaluation but also, critically, the proper application of rules and the principles to make sure that people’s liberties are not infringed. It is a tricky thing to do. I think everybody would agree that it would be a good thing if we could be more effective at identifying risky people, but we cannot do that at the expense of very high levels of false positives, not least because it loads yet more false positives into the system, which sucks away resource. When you have everything looking like high risk, nothing is high risk. We must be careful about that; the balance is key.

You asked about when to intervene and whether we could be more proactive. That is a fair point. I go back to something I said earlier around maturity. Very often we are in a position where a technology like live facial recognition is emerging and going through a process of development, where lessons are being learned about false positives, how the technology works, what the impacts are for people, and whether there would be disproportionate impacts on minority ethnic communities or different genders. All of that is working its way through. The principles behind it still apply. The principles of accountability and minimal intrusion still apply and would still apply to any new experimental type of work. We need chief constables, working with their PCCs and all of the other regulators in the system, to make sure that we are not pushing beyond what is acceptable and that we are not unfairly infringing on people’s lives.

Lord Ricketts: Thank you. Unless Mr Michael and Mr Preston would like to come in with a very quick comment I will pass the floor back to the chair.

The Chair: I am looking at them and neither is coming in.

Q97             Lord Dholakia: My question relates to trust and bias. Some communities are known to be particularly affected by police bias. We see this in stop and search. After 70 years of settlements with some of these communities, we still have the same adversarial contact between the police and communities. How can they trust technological tools built on police data? What work is under way to build and maintain trust within every community?

Alun Michael: It comes down to a proper understanding of the importance of trust and bias, and, in particular, the constant challenge of where there may be an inbuilt bias in the system. That is as important in relation to technology as it is in everyday discussion. I reference the joint summit that we had last week between the APCC and the NPCC, when most of the chief constables and most of the commissioners spent two days in conference on how much policing is trying to respond to that challenge.

In south Wales, we seized the opportunity of the Black Lives Matter protest and the death of George Floyd to accelerate the work that we are doing within the force and the extent to which we are working with the communities across south Wales to try to deal with something that has been there for generations, as I am sure you would accept.

Any suggestion that there is a built-in bias in the application of technology, just as any suggestion that there is institutional bias, needs to be tackled proactively and very positively. That would be the priority.

Lord Dholakia: Can I take up that point? How can you remove bias when the police use data such as community deprivation, immigration status and police intelligence? To what extent should such data be used?

Alun Michael: I accept the challenge there. If the police are using particular bits of data to predict and it is biased and a misapplication then it is not acceptable to do so—simple as.

The Chair: The problem surely is that there is already concern. We have already heard several of our witnesses say there is always bias built in. The very difficult task for everyone involved in this is how to give reassurance, given that issue.

Alun Michael: It is difficult because we are dealing with things that are endemic across society. The police are held to a higher level of expectation, and rightly so, because that is the whole point of providing citizens with those police powers. It is absolutely a challenge for the whole of society, but it is even more of a challenge for policing. If you had been at the two days that we had last week you would have seen reflected in the discussions and challenges, and by the people who were giving evidence to us in that session, how seriously this is being taken.

The Chair: I interrupted Lord Dholakia, who may well want to pursue this.

Lord Dholakia: I am simply concerned about the nature of the consultation taking place. In many parts of the country the bodies for police community consultation that had previously been set up have almost disappeared from the scene. Are you consulting the right people, or are you simply relying on your ethics committee to give you the input necessary for using the technology? Mr Michael, you may be doing something different, but in other parts of the country there is a grave need for such bodies to be set up.

Alun Michael: I agree with that. Basically, for those of us for whom those principles are important there is accelerating good practice, but the fact that there is a collective mutual challenge between chief constables and commissioners is a particularly healthy development. That is something that we have seen accelerate over the last eighteen months.

The Chair: It would be helpful for us to know what is going on by way of work to answer the point, because I appreciate that everyone is concerned—

Alun Michael: We can certainly provide that.

The Chair: That would be very useful. Does anyone else want to come in on this as to what work is going on?

David Tucker: I would draw attention to the requirements under the public sector equality duty, which applies to all of these policy and deployment decisions. In the College guidance there is an equality impact assessment and there is a requirement for a data protection impact assessment when you use new technologies. You have hinted at it: it is very difficult because you must recognise the bias and understand where it is. The understanding and recognition of that has changed immeasurably in the time that I have been involved in policing. It will not be a perfect situation, but policy requirements around equality and data protection impact assessments are the ways in which you surface and have those discussions. A level of consultation about that is required as well. The feedback loop and the structures are there and are quite well developed. We need to make sure they are used most effectively through the accountability and democratic processes.

Professor Paul Taylor: I would simply make two summary points. Openness and public reporting of the science is critical for people to be able to understand. That involves the police and myself as Chief Scientific Adviser taking more of a role in helping and having conversations with the public about what these technologies are doing.

As you rightly point out, chair, there is no place that we will get to that will not be biased on some level. Even the most innocuous decision to look at burglary, say, reduces the resource we have to look at, say, fraud. Those are different populations of offenders. I have had examples cited to me since joining where perhaps policing has been reluctant to have that public conversation for fear of telling the story of what crime goes on in communities, but when you do so you explain to someone, “I’m really sorry, we haven’t been able to investigate crime X, this public disturbance, because we are currently dealing with some significant county line challenges.” I think the public are more of an intelligent customer than perhaps we give them credit for. We should be engaging in that way.

Q98             The Chair: In the couple of minutes that we have left I will see whether anyone else has any further questions. I would like to ask Mr Michael to help us with something arising from the written evidence that we have had from—I will do it in full; novel, I know—the Association of Police and Crime Commissioners, the National Police Chiefs’ Council and the Police Digital Service, which says, “Government should seek to clarify public appetite for new technologies and legislate so that policing has a clearer basis on which to make policies and decisions about deployment.” Can you amplify this request for legislation?

Alun Michael: My explanation would be that the values and principles need to be established in law. That essentially goes right back to the beginning of this hearing on the need to use technology, wherever possible, to protect the public, but to do so in ways that respect and protect civil liberties and human rights. It must be a values-based, rather than a rules-based, approach. There is a history of tension between values and rules. I particularly remember it when I was dealing with the Companies Act. If you take the approach of getting the values right and requiring constant scrutiny and questioning then the application will come right; trying to do it the other way around—prescribing in rules everything that must happen—is a recipe for things going wrong. I suggest that is the clarification: ensuring always that it is not a one-stick approach and that we are concerned only with protecting the public. It must be about a balance of the values according to which we apply our activities.

The Chair: Do you think more legislation is needed or should we be relying on our well-established legislation and principles as developed through case law?

Alun Michael: I agree with that. I spent 25 years as a legislator and that taught me that laws rarely prevent what they forbid, to use the old saw. It is very true. An awful lot of well-intentioned legislation actually complicates and gets in the way. The constant requirement to apply values—things that Sir Robert Peel set down when he established the first police force in London that are as relevant today as they were in his day, such as the prevention of harm and the police reflecting, being engaged with and embedded in their community—means that these principles need to be applied in the field of the use of technology, as in the whole of policing.

The Chair: It would be very useful to the committee if we could try to be specific about any of this. If there are examples that you might think of following this meeting as to how principles are being applied or should be applied, we would be grateful.

Lord Blunkett: I think we have missed the boat on the Police, Crime, Sentencing and Courts Bill, but if we put any more into that then we will all burst.

The Chair: I had a go last night, but it was a bit late to gather much enthusiasm. Is there anything else that anyone feels must be added—a heavy hint there? No?

We are very grateful for your time and your expertise. If there is more that you could add following this, including a family tree of the organisations of our witnesses here today, that would be extremely helpful.

Professor Paul Taylor: It may be more of a family bush, I think.

Alun Michael: It will probably be three dimensional.

The Chair: Let us not get too ambitious. I think that is it, and that concludes the meeting. Thank you.


[1] Information Management and Operational Requirements Coordination Committee

[2] Police National Computer

[3] Police National Database

[4] Information Commissioners Office

[5] Her Majesty's Inspectorate of Constabulary and Fire & Rescue Services

[6] Science Advisory Council

[7] UK Research and Innovation