Justice and Home Affairs Committee
Corrected oral evidence: New technologies and the application of the law
Wednesday 12 January 2022
Members present: Baroness Hamwee (The Chair); Lord Blunkett; Baroness Chakrabarti; Lord Dholakia; Baroness Hallett; Lord Hunt of Wirral; Baroness Kennedy of The Shaws; Baroness Pidding; Baroness Primarolo; Lord Ricketts; Baroness Sanderson of Welton; Baroness Shackleton of Belgravia.
Evidence Session No. 7 Heard in Public Questions 99 - 111
I: Rt Hon Kit Malthouse MP, Minister of State for Crime and Policing; Dr Christophe Prince, Director for Data and Identity, Home Office.
Examination of witnesses

Rt Hon Kit Malthouse MP and Dr Christophe Prince.
Q99 The Chair: Good afternoon, everyone. Welcome to this meeting of the House of Lords Justice and Home Affairs Select Committee. This afternoon’s session is to ask the Minister, Kit Malthouse, Minister of State for Crime and Policing at the Home Office and the Ministry of Justice, about government views on the subject of our current inquiry, new technologies and the application of the law. Welcome, Minister, and Christophe Prince, the director for data and identity at the Home Office, who is on this call in order to assist with technological detail.
Perhaps I can start. I am always trying to find the right person on the tile to look at, even though they cannot see that I am. Minister, the Government have an AI strategy, but as far as we can see there is no particular vision for the use of technology in the legal field; that is, in the application of law. We are wondering what enthusiasm there is on the Government’s part. Perhaps in a couple of sentences you can fill in the context for us.
Kit Malthouse: I could do it in one word, “enormous”. We essentially believe that technology can play a huge part in the prevention and detection of crime, and there are exciting developments under way as we speak that are already assisting and where we think we will see significant increases in police productivity and greater public safety as a result. Obviously we have to balance that against the trust that British people have in the policing system, but broadly, as I say, we have enormous enthusiasm.
I hope you will have seen in some parts certainly of my portfolio some fairly assertive use of technology. Sobriety tagging, for example, which we are using for offenders, is now available in all courts in England and Wales. We are using GPS tagging now for acquisitive criminals in 19 police force areas across the country, and obviously there has been a debate at the other end, the prevention end, about the use of live facial recognition across police forces, as well as a general percolation now of the sense in policing that there are a lot of data analytics that can be usefully used for the prevention of crime in all sorts of areas, from antisocial behaviour right through to very high-harm crimes and identifying those who are likely to be murderous.
The Chair: That, of course, is why we need a strategy. I wonder what steps are required before you can publish the White Paper that was announced in the AI strategy some time between its publication, which I think was last September, and next September. Can you give us any detail about what is happening to lead to that? You have mentioned trust, and governance of AI is something that concerns us a lot. That is very much linked with public trust.
Kit Malthouse: Yes. On your first question about a strategy, “Up to a point, Lord Copper”, in that you certainly need to give some shape and form to direction, but it is very hard to have a strategy about technologies that have yet to emerge or where the full potential of them is yet to emerge. We want to make sure that things are done in an orderly way, not least from a financial point of view.
Obviously we have to make sure that the governance framework is right and that the scrutiny framework is right around the acquisition of those technologies. We do not want to stifle innovation. We do not want to stand between a chief constable and a piece of kit or software that they want to have a go at. Being too bureaucratic or, if you like, strategic—with a small “s”—about that can often stand in the way. We are trying to strike a balance. I hope you know that the police have developed a national digital strategy. They have a data board, and guidance emerges on particular technologies like live facial recognition as it is tested in a policing environment and becomes of utility and more widely available. That is broadly where we want to go.
In terms of the AI governance, the AI paper is a BEIS/DCMS endeavour, so you would have to ask them what steps they need to get to before they go to the next stage on that stuff. We are very excited about the use of artificial intelligence and machine learning in policing, not least because we are seeing some of the benefits already, for example in ANPR analysis, where we are now able to analyse in real time movements of vehicles that give cause for concern. We have seen some pretty significant results in apprehending those vehicles, particularly, for example, for the transport of drugs.
The Chair: We asked the National Police Chiefs’ Council and the Association of Police and Crime Commissioners if they could provide us with a family tree of the organisations that work in this area. The immediate response was that it is likely to be more of a bush than a tree, and we have now had some formal evidence that shows a very large number of organisations that are involved and would have a role. I imagine that this will be more Home Office than MoJ, but what is the Government’s role in rationalising, without stifling innovation, the way proposals are dealt with and ensuring that good oversight is in place and so on?
Kit Malthouse: It is a very good question, and you are right, I think, to say that it is a bush more than a tree. It is quite a crowded field. I think that is a reflection of the fact that the advent or the elliptical curve of technological developments has been so fast that Governments over the last 20 years have scrambled a bit to catch up. We have seen an evolving framework, both legally and structurally, where successive Governments have built on the structure of the past and no one has quite gone back to the beginning and said, “Hold on a minute, if we’d known then what we know now, what structure would we have put in place?”
Our job in all that is to set the frameworks legislatively and to adjust them through Parliament as required. There is obviously a fair amount of interlocking legislation covering these areas. There is also then establishing the regulators and the external scrutiny of those areas. We have just recently, for example, as I hope you know, done a bit of rationalisation in combining the biometrics and surveillance camera commissioners into one person, I hope usefully, not least because—here is an example—of the growing partnership between those two technologies. When CCTV first appeared 30 or 40 years ago, when I was a Westminster councillor, we did not have the ability to analyse and combine it with biometrics in the way we do now. A subsequent Government thought, “Uh-oh, we need something on biometrics” and created a biometrics commissioner. Now those two technologies are coming together, so we can have one, which provides a bit of simplicity.
The other lever that we have to a certain extent is the money that we put into things. We can prompt new initiatives and innovation or not, as we choose. For example, the National Data Analytics Solution project, which is based in West Midlands Police, was brought about through central government funding to explore that area of technology and see if it could show benefits.
The Chair: Thank you.
Q100 Baroness Pidding: Algorithmic technology informs decisions throughout the policing pipeline, from intelligence to investigation to prosecution and offender management, which affects subsequent stages of the justice system. Minister, how does the Home Office work with the Ministry of Justice and the judiciary in its deployment of technology?
Kit Malthouse: It is a very good question. In the end, obviously it is for the judge to decide what happens in the courtroom and what is admissible and what is not admissible. That is ring-fenced a little bit by common law and by some of the statutory rules on exclusion, but in the end there is a fair amount of judicial discretion.
All the techniques that are used, if they are offered in evidence, have to be and will be subject to expert scrutiny and evaluation in the courtroom. They also have to be validated under the relevant rules in PACE. They have to be in compliance with codes of practice; they have to come up to standard. As you might know, we have recently taken through the House, I hope shortly for commencement, putting the Forensic Science Regulator on a statutory footing so that the regulations or the guidelines that that regulator puts in place will now be admissible in court and that evidence can be held against that standard.
I guess to a certain extent you could see why a special case should possibly be made for these new technologies, but in the final analysis this stuff should be subjected to the same evidential scrutiny and standards in court as any other evidence would be. Effectively, it needs to be cross-examined and tested in the same way other stuff is. You may remember back in the early days of the fixed-point speed cameras, Gatso speed cameras, whose ability to assess speed was often challenged in court by experts who said, ‘You couldn’t possibly say he was doing 35 in a 30 zone’. There was measurement of the little white bars on the road as to whether they were correctly placed and whether the camera was correctly calibrated. All that kind of stuff happened in the early days of that technology, and I do not think these would be any different.
The Chair: We always ask about joined-up government. Can you say anything about liaison between your two departments and the judiciary?
Kit Malthouse: I am the living embodiment of that liaison between the two as a joint Minister between the two departments. As technology develops and emerges into the courts, we liaise about its use, how it operates and what opportunity it offers. The big example that we have on the desk at the moment, of course, is the whole area of digital forensics and the extraction of information from people’s mobile phones for use in evidence in court. That is particularly pertinent at the moment, as you know.
We are doing a lot of work on rape and serious sexual offences, where conviction rates are extremely low. We know that the impact of this very intrusive line of inquiry into people’s mobile phones has a significant impact on victims and their likelihood of pursuing a case. A project is being led from the MoJ, but in conjunction with the police, to look at how we can minimise that impact, make it more proportionate and therefore not deter quite so many victims from pursuing their case in court. In these specific areas where difficulties are encountered, we obviously work together to try to get on top of it.
The other example I would point you to is with the Attorney-General’s Office where, as you might know, last year the Attorney-General issued new disclosure guidelines. Some of that pertains to digital evidence and that is causing some consternation in policing, given the time that is being taken, frankly, to comply with those new disclosure guidelines. Some of that can be obviated by the two departments working together. I know that the Solicitor-General is visiting police forces at the moment to understand the impact of the disclosure guidelines.
I happened to be in a car with a couple of police officers not very long ago. I always ask front-line police officers, “If there is one thing I could do to improve your life, what would it be?” Sometimes it is, “Can we have a new coffee machine in the squad room?” but these two guys said to me, “We are a bit fed up with having to pixelate faces on body-worn camera evidence for a submission in court, the faces of people who are not pertinent to the investigation”. We went back to the AG’s office and said, “Where does this requirement come from?” and they said, “We don’t know. That one has been invented by the police”. We are trying to investigate that further to see where it has come from and whether it is actually needed.
Some of these things can obviously be ironed out. Some of them emerge from problems that occur in court or in the pursuit of justice, and we do work together to try to solve them, yes.
Q101 Baroness Chakrabarti: Welcome, Minister. A recent report from the UN General Assembly warned of the potentially quite high risk of race discrimination in the context of certain technologies from a border management point of view. Could you help us a bit on that border control issue in relation to which technologies, particularly the more innovative and advanced AI technologies, are being used and in what context? How are they being used? Crucially, what safeguards exist or are being developed? I listened very carefully to your previous answers about innovation and so on, but of course there are rule of law and rights issues here too. What is your thinking, and what is your practice and your mitigation for race discrimination with technology in the border control context?
Kit Malthouse: Essentially, the technology where this has been of most concern has been facial recognition, where there have been claims that some of the algorithms that are used might have a racial bias in them. There are two safeguards. Certainly from a policing point of view, facial recognition is never used without a human being in the loop. A human being is there as a check and balance. The technology is there to make the human being more effective, not necessarily to replace the human being. Although in facial recognition, or in anything really, chess or otherwise, machine might beat man or woman, machine plus man or woman beats both, generally. We find that that works well.
The algorithms themselves are subject to quite a lot of testing and analysis. Certainly, with the facial recognition algorithm, where these have been tested and evaluated it does not appear from the evidence that there is a detectable bias in the algorithm from a race point of view. That has to be tested, and in a court situation that would be tested by experts, I guess, but those two things together we hope would assist in reassuring the public that that is not necessarily the case.
Christophe, could you just remind us? There is an organisation that has looked at the algorithms and evaluated them for that kind of bias and confirmed that they were not showing any detectable bias.
Dr Christophe Prince: In relation to facial recognition algorithms, it is NIST, the National Institute of Standards and Technology in the US, which has carried out testing on those facial algorithms, yes.
Kit Malthouse: It is also worth us saying that there is quite a move towards increased algorithmic transparency, where you can put your algorithms out there for peer evaluation and for general testing. I think we are one of the first countries in the world to publish a cross-government standard for doing that. That is one of the commitments in the AI strategy. We are trying to encourage police forces to do exactly that. I presume that Paul Taylor will have talked to you about the fact that he has already, I think, made some of the datasets available for independent scrutiny. You can put an algorithm out there for scrutiny. I would not know what it is. It would mean nothing to me, but no doubt there are experts out there who would be willing to challenge and test it.
Baroness Chakrabarti: That is incredibly helpful, Minister. You are trying to urge transparency as a check and balance and you want to encourage the police. My question was about the Border Force, but to some extent the bigger point is the bigger point. You definitely see transparency as the check and balance, and you are trying to encourage—that was the word you used—transparency.
Do you think that part of that encouragement might be legislation? I am just conscious that when it comes to more antiquated, mundane police powers and immigration officer powers, those are all quite circumscribed by hard law, from PACE and the immigration Acts onwards. Yet this technology, which is more exciting on the one hand but potentially intrusive on the other and which you have admitted that you and I would not be able to read with our naked eyes, it is not in primary legislation. Have you given thought to making that transparency a matter not of encouragement but of hard law?
Kit Malthouse: We certainly have to see. I do not think that the police are necessarily resistant to this. We are building on work already. The National Police Chiefs’ Council already has an algorithmic accountability structure, which it pushes, and I think it recognises that as it acquires these new technologies it needs to take the public with it. That means that sometimes if myths emerge around particular technologies it needs to scotch them, but that is true of all parts of policing.
Policing is naturally a difficult and sometimes confrontational thing to do, and if you are to maintain the trust of the public, being transparent about what you are doing and allowing the public either individually or collectively to examine what is going on is critical. Whether it is stop and search, where we are encouraging police forces to have engagement panels in particular areas about stop and search and to take people out with them on stop and search operations so they can see what is going on, or having custody visitors in custody who can show up at random to see what is happening there, along with the CCTV that can be examined, all these things assist with trust in policing. I do not necessarily think we should see technology as a special case. These are areas where it is to the benefit of policing to be transparent.
Baroness Chakrabarti: Circumscribed by hard law, as police powers are, like the Police and Criminal Evidence Act or the immigration Acts and so on?
Kit Malthouse: Yes, although much of that is principles-based. Certainly I would agree with you that we need to set democratically broad principles, but there are always areas of nuance and circumstance that you cannot prescribe in law. For example, the escalation of police use of force is not prescribed in law, although there are principles set down about which circumstances the police can use force in. I think the same is true of the legislative framework around these areas. One of those key principles is transparency. There are other ways in which we can encourage it, and that is what we are doing. I am not saying no, but we will definitely keep it under review.
The Chair: You mentioned stop and search, Minister. In the case of stops with the aid of facial recognition, is there data held by the Home Office about the accuracy and the appropriateness of stops where facial recognition techniques have been used as compared with when they have not? In other words, has the algorithm been, as it were, upheld?
Kit Malthouse: The algorithms have certainly been evaluated. I am not sure, Christophe, if we hold it centrally. Do not forget that it is early days on facial recognition. Some police forces, such as South Wales Police, are quite well advanced. I went to see their facial recognition in operation. As you might know, the Metropolitan Police have been using it judiciously in the capital from time to time. I think it is coming back and being evaluated as we go. Christophe, do we have some of the emerging data on accuracy? We probably have it. Whether we collect it systemically, I am not sure.
Dr Christophe Prince: We do not hold centrally the evaluation of the algorithms that are used by those forces or the specifics of their deployment, but both South Wales Police and the Metropolitan Police have undertaken independent evaluation of their own deployment.
Q102 Baroness Sanderson of Welton: Hello, Minister. You mentioned the transparency standard. We have been struck by the fact that no one really knows who is using what technology or to what extent, certainly not in an official way. As you say, we are leading the way with the standard, but it is a pilot, it is voluntary, and the scope is fairly limited. What are your views on this? Do you see it being extended or made mandatory? What can it achieve currently with the limitations that I have just mentioned?
Kit Malthouse: As I say, on all these things it is relatively early days for us. Speaking with my policing hat on, the police have already made some commitments on transparency. They have a thing called ALGO-CARE, which is about algorithmic accountability.
My general view is that any good Chief Constable—in the end, all these decisions configure around a Chief Constable—would realise that being open and transparent about the use and acquisition of these technologies and their prosecution, if you like, can only help to enhance public trust, whether in live facial recognition or body-worn video camera or whatever it might be. Mandating transparency across the board is more a principle thing than a legislative thing. The trouble is that once you get into legislation you have to be quite prescriptive about the circumstances, the form and the way they are used, and as we move into this rapidly advancing area of technological development that might stifle innovation, to a certain extent.
Baroness Sanderson of Welton: Again, we have been struck by the fact that you have all the different forces and it can feel a bit like the Wild West. There are quite serious powers within this technology. They are doing quite serious things that have an effect on people, and the fact that there is not really central accountability to that and it is therefore left up to a chief constable does not seem quite right to me.
Kit Malthouse: I understand that. Do not forget that although there might not be central accountability, if Lancashire Police wanted to acquire some data analytic capability that could analyse their datasets and then predict, for example, who was likely to be a murderer, they would not necessarily notify me about it, but they would notify their democratically elected Police and Crime Commissioner, who is their local point of accountability. They would have to comply with the interlocking legislation in this area, the Data Protection Act, the Human Rights Act, the Equality Act and PACE, which we have already talked about. They would have to think internally about local ethics committees, scrutiny committees or community engagement committees that they might have.
It is quite hard for the cops to do these things without there being oversight, if you like, if only from the Police and Crime Commissioner. In the end, the Police and Crime Commissioner is the elected permission from the people of Lancashire for the police to do what they are doing, so he or she—it is a he in Lancashire at the moment—would have the initial oversight of what they were doing.
Baroness Sanderson of Welton: Sorry, just to be clear, my final question is about the pilot. Is its purpose improving transparency on a regional level as opposed to centrally?
Kit Malthouse: Yes.
Baroness Sanderson of Welton: That is the point from Government’s point of view? That is what you are aiming to achieve with it?
Kit Malthouse: In the end, the structure of our policing is that although there is a collective trust, the primary trust is with the local people they serve. Those are the people who they have to convince they are doing things correctly, and those are the people who vote for the Police and Crime Commissioner. Of the 41 forces in England and Wales, they should be looking to their local people first.
The prime example of that with live facial recognition is, I guess, in south Wales and in London. In London, where the Met put LFR out there in operation, they jumped through enormous hoops to make sure that they were buying in the local community, that everybody knew what was happening, that they understood why, that what they were doing was scrutinised locally rather than nationally. To do it nationally would be quite unwieldy, given the variety of different techniques and technologies that people may be trying.
Q103 The Chair: Minister, you have mentioned capturing murderers using AI. Is this through predictive technology or what? Dealing with offenders at that level is not something that anyone has specifically mentioned to us before. Can you elaborate?
Kit Malthouse: There is an interesting area of exploration of different datasets coming together to allow us to predict behaviour. That is not a new phenomenon; the insurance industry does that all the time. Whether we are able to say that in particular circumstances—for example, with a particular pattern of offending—there is a greater likelihood of you going on to serious offending that may be violent or to murder somebody strikes me as something that we might think about.
We know, for example, that there are patterns of offending in a domestic setting that mean there is a greater likelihood of there being catastrophic violence at the end. We know that there are certain offenders we are extremely worried about, who on release from prison will go on to commit further offences, notwithstanding their release under law. Obviously we try to manage that through the Probation Service. The question is whether, if we put those datasets together, we could be more predictive about that behaviour and therefore narrow the group of people about whom we are concerned.
The Chair: And whether the person about whom you are concerned can understand how that decision has been arrived at, in this case.
Kit Malthouse: Yes, if they need to know. There are some circumstances in which obviously we do not want to alert criminals to the fact that we are surveilling them.
It works on the other side as well. We know that there are patterns of victimisation that can be analysed and assessed that mean that somebody is more likely to be a victim of something catastrophic in their future. Repeat victims of domestic violence, for example, are obviously of interest from a public protection point of view. We know that, if you look at drug dealing and the phenomenon of county lines, there is what we call cuckooing, where drug dealers take over people’s homes to deal from in small towns like mine in Andover. These gangs target particular individuals who are vulnerable to their homes being taken over.
There are a variety of pointers, which is a modern phenomenon of the police officer knowing who the bad lads were in the community, that we might be able to piece together for the better prevention of crime.
The Chair: I can see that Lord Dholakia would like to come in, I suspect on the word “community”.
Q104 Lord Dholakia: The question I wanted to raise is very much related to the community, as you mentioned, Chair. Minister, you mention that data for our ethnic community is not available at the moment. Secondly, the London metropolitan force is not representative of the ethnic community in this country; nor is it reflected in promotion, recruitment or retention of people from ethnic minorities in this country. The evidence that we have received so far clearly indicates how important it is to train people who take decisions in relation to artificial intelligence, yet you have no information. Going beyond that, the central policing college provides no training whatsoever. In the absence of all these things, you have a very serious problem, where people who have had this adversarial contact continue to discredit the system that you are trying to establish.
Kit Malthouse: In general terms, you make a very strong point about representation in the police. As I hope you know, through the uplift of 20,000 police officers, we are doing our best to try to bring more diversity to policing so that it reflects more closely the great tapestry of human beings that we are. We are showing some success. Certainly if you look at gender, policing has moved over the last 10 years from a quarter female to getting on for a third now. For many forces, more than half their new recruits are female. From a BME point of view, we still have a long way to go, but we are seeing some advances. Certainly the percentage that we are recruiting is much higher than it has been in the past. I think we now have the highest level of BME police officers ever in the history of the police force across the whole of the country.
There are advances to be made now. It does take time for those officers to percolate into other parts of policing. You are quite right that in these areas of the use of technology it will be good to have the same confidence-building representation as elsewhere. The same is true, for example, in professional standards departments. There are quite a lot of PSDs across the UK that need more of a mix of police officers of all kinds too. I do take your point on that.
As I said earlier, I think you are right. Given some of the claims made about this technology with regard to race, it is extremely important that we promote confidence in those communities in its use and that it is agnostic as to race, particularly where it is being used for identification. Do not forget that a lot of this stuff is used for detection, which is obviously focused on finding an individual and then proving that they are guilty. From a prevention point of view, we do have to promote that sense of confidence, but that is where the transparency comes in.
I think I am right in saying that in the Met’s use of live facial recognition, for example, it had local panels in the areas. Christophe, nod if that is right. From memory, I think that is right. It had local panels in the areas so that it could show people why it was using it in particular areas and that it was being done under scrutiny to try to promote that confidence. As you will know, London is a very mixed community, so it will have had a mixed panel to look at the technology.
The Chair: Just pursuing this point about prediction and prevention, when an individual is under the spotlight, as it were, that obviously has enormous ramifications for that individual, who, as you say, may not know what is going on, and you said should not know in some cases. Is this being developed?
Kit Malthouse: I hate to tell you this, but the police have quite a lot of people under suspicion of crime all the time, not necessarily by these means. There are, for example, people who they suspect of crimes who they have yet to arrest or question, but they are gathering evidence about their crimes. When the police go to the scene of a crime, for example, and they gather forensic data that points them towards an individual, they do not necessarily go straight away to get that individual. They want to establish the rest of the case, if you see what I mean. I would not be too alarmed about the notion of prediction. Police have their suspicions, shall we say, just like everybody else. The question is whether they can be better informed by the use of data analytics.
We have done some of that work in the National Data Analytics Solution project, where pools of data analysis of police evidence, interview evidence, would allow the police to piece together networks of individuals that configured around one or two individuals at the centre of modern slavery groupings. NDAS would run that data and would identify an individual who they therefore suspected was part of a modern slavery network. They would then, I guess, do further evidence gathering, not necessarily alerting that individual to the fact that they were building a case, before they arrested them. I am suggesting that it is nothing new necessarily; it is just a development in that area.
Let us put it this way. Parliament has recognised this tendency in legislation. As you know, it is possible, if you have a new partner, to get the record of their offending if they have been a domestic abuse offender. We have a sex offenders register, and we put people on that register so that their past offending behaviour is there for all to see. Patterns of behaviour are already acknowledged as possible predictors of future behaviour.
The Chair: There may be questions about evidential status, but I had better move on from this, I think.
Q105 Lord Blunkett: Minister, Dr Prince, thank you very much for being with us. You have my sympathy, Kit, because you are covering the length and breadth of what used to be the old Home Office. It is helpful to us, but it is quite a challenge.
I want to put two or three things together. In responding to Baroness Chakrabarti, you quite rightly talked about people always being in the loop when using technology. We have used the term, and the Home Secretary has used the term, that no decisions about people should be made other than by people. This is easier said than done, and I would like to refer to the evidence that was adduced by Professor Stuart Russell when he was giving the ‘Reith Lectures’. He said that one of the most difficult aspects in the use of artificial intelligence and algorithms was uncertainty. My fingers and my life have been burnt, and I think you might have had some picture of this when you addressed the Justice Select Committee in the Commons on imprisonment for public protection. Predicting how people will behave and predicting how they may recover from that behaviour is incredibly difficult.
I want to explore it with you, because it will be important to pick up on the issue of awareness training and people’s understanding of the technology. Many of us have been in cars where the driver has relied very heavily on GPS, even though they are supposedly only using it as a tool, and has been led—in fact, once in my case—into a field because people simply followed the instructions. This will be the case with police officers and those working in the criminal justice system in the future as this gets more difficult. Can you explore for a minute this issue of uncertainty and how we deal with it when we think that people are making decisions but they are actually letting the machinery and the technology make the decisions for them?
Kit Malthouse: I absolutely agree with your assertion that no decision should be made about a human being other than by a human being. You are right that the management and the assessment of risk of human behaviour is art and science and that there needs to be a bit of both.
The question is: can we use technology to better inform the human being in making their decision? What advances in behavioural science and data analytics would tell us is that to a certain extent you can, although it is not entirely perfect, as you say. There will always be room for manoeuvre. For example, if you look at what we were just talking about, the predictive thing, you can only ever narrow the field. You will never with certainty point to an individual and say that that person is likely to commit X crime or will 100 per cent commit X crime. It is more about giving human beings better tools that will give them the ability either to identify or to get ahead of a criminal, or indeed to detect them in the future, but the human will always be there. We have to make sure that in the acquisition of this technology that is clear to policing.
Lord Blunkett: I just wondered whether in the delegation to the National Police Chiefs’ Council, the College of Policing and the 40-odd forces in England and Wales we are missing a bit of a trick when it comes to consistency of awareness and training about both the possibility and the challenges of artificial intelligence and algorithms, because human beings tend to think that if something has been authorised for deployment, it will have the certainties built into it.
That is obviously true of ANPR, and I take your point from your opening statement. That has been a great success, and there will be successes of this sort in the future. However, unless human beings are very aware of the dangers of taking on something that does not deal with uncertainty, it takes what it is given and comes out with what it is expected to deliver. All this will be a massive challenge as we rely more and more on the aid of technology.
Kit Malthouse: I completely agree with you. I think your first point about, shall we say, the variable acquisition of the skills and understanding across policing is very fair. We are trying to improve that. We have, as you know, appointed a chief scientific adviser now. The NPCC has its strategies; it has leads. It is trying, as the police do so well, to promote ever greater skills and awareness in this area.
We have learned some tough lessons about overtrusting technology. To use a non-policing example, there are dozens and dozens of completely innocent post office owners, postmasters and postmistresses, some of whom were sent to prison because of a belief that technology could not get it wrong. I am very alive to that possibility, while being excited about the ability of technology to help us.
Lord Blunkett: My encouragement is to think a bit more about how we gain that consistency of approach.
Kit Malthouse: Yes, I do take your point.
Q106 Lord Ricketts: Minister, I want to spend a few minutes on the issue of regulation and procurement of this new technology. It follows on from the point about 42 different police forces making decentralised decisions, in this case about procuring this technology. Some of our witnesses have worried that across that spread of forces not all will have the capacity to assess and evaluate this new technology being sold to them by some pretty persuasive entrepreneurs, in many cases. What do you think about the issue of some sort of central body that could undertake assessing and kite-marking of technologies so that individual police forces could then go ahead and procure it with more confidence than if they were just face to face with the market one by one?
Kit Malthouse: We have that at the moment in certain areas. In more mature technologies, for example, I regularly get across my desk new speed cameras for certification for use, which have been for assessment and are tested and all the rest of it. I was down at DSTL at Porton Down just the other week looking at some new bits of equipment and technology that it is assessing and putting through the wringer at the moment so that forces could consider them for acquisition. Some of that goes on, but it tends to be generally for more mature technology.
At the other end, though, we have to be slightly careful, as I said earlier, not to stifle innovation or, indeed, people willing to try things. There is a problem, which you rightly push on: across the 41 police forces there is variable skill and ability to assess and acquire those bits of kit. It is for Police and Crime Commissioners and chief constables to consider whether they are making the right decisions. In some circumstances, we have seen some pretty poor decision-making. You will know that Greater Manchester Police have been struggling with the iOPS system, their internal software management system, which was meant to be all-singing, all-dancing. I am sure they strode confidently into that procurement decision, but it has proved to be a terrible problem and a headache for them.
Every year we hold a security conference down at Farnborough, where bits of technology are brought and shown and all the rest of it. I wandered around—when we were allowed to wander around these things a couple of years ago—and was incredibly excited about some of the software that was available, particularly for video analysis, which would save weeks and weeks of police officer time. How that gets into policing in a coherent way is tricky.
We have tried to bring some shape and form to that over the last couple of years. I now chair a Strategic Change and Investment Board (SCIB), which looks at both current and future capability requirement in policing and then tries to bring some coherence to a collective investment decision across policing in particular technology areas. I hope that all that combined—the DSTL, the SCIB and allowing Chief Constables to play with the technology at the front end—means that we have a good system to bring the stuff through, albeit that it is an evolution at the moment.
Lord Ricketts: Yes. I can understand that for the more mature technologies there is more of a track record and that you can get to more centralised standard setting. In a way, it is what Baroness Sanderson talked about—the Wild West, the new technologies, the innovative technologies—where perhaps some kind of central kite-marking or assessment of whether these things work as advertised would be necessary.
In terms of procurement, I recall that when I was a Permanent Secretary a decade ago, Francis Maude led a big initiative to centralise IT procurement standard setting so that there was one single standard that applied across all government IT. I do not know how satisfactory that has been over the last decade. Is there a case for more procurement decisions to be made centrally on the big issues rather than having the decentralised decision-making?
Kit Malthouse: As I say, that is what we are trying to get to: that at least we get co-ordinated procurement through the SCIB, if that is required. Our policing system relies on the operational independence of 41 chief constables in England and Wales, and them coming together in a particular procurement is often a co-operative and persuasive exercise rather than a mandated one. There are some powers of mandation; the National Police Air Service, the choppers, came together following threats of mandation rather than actual mandation. There is something we could do there, but at the moment in this area we would have to explore it through the SCIB if we felt that things were getting disorganised.
There is acquisition going on in some areas at the moment where they tend towards the same bits of software because it is proven to work and is evaluated and all that kind of stuff, but you make a strong point.
The Chair: On standards, Minister, is that a subject for the governance part of the White Paper?
Kit Malthouse: You mean the AI White Paper?
The Chair: Sorry, yes.
Kit Malthouse: It could be, yes. The standards are critical, and we have found in policing that as technology has matured and proved itself of utility, guidelines emerge from the National Police Chiefs’ Council, the regulators start to provide guidelines and government starts to mandate standards. I just want to be slightly careful that we do not choke off this stuff at the early stage. We have to accept that there will be some technology that the police acquire and use for a bit but which then proves not to be terribly useful and they let it go. We have to allow them to fail before we jump on everything.
The Chair: There are different points in the process, are there not?
Kit Malthouse: Yes.
The Chair: You mentioned pretty poor decision-making in Manchester. One wants to be able to pick up a problem at the right point, not find out the difficulties by the experience.
Kit Malthouse: Yes. On the operation, we have proposed a code of practice for the police use of new technologies. We have just done a data reform consultation, which we hope might give us more of a principles-based consistent operation, but that is different from procurement.
The Chair: I am sure the committee will have some things to say. We heard some horror stories—not in this country, I am happy to say, but from the United States—which made us think quite hard.
Q107 Lord Hunt of Wirral: Minister, as Lord Blunkett reminded us, you are a pretty key person for this Select Committee as you are a senior Minister in the Home Office and in the MoJ. We were pleased to hear what you have been saying about transparency.
I want to ask you a question about accountability. I am becoming increasingly troubled by what are called fairness metrics. There is a move to use AI not simply to uphold the status quo but to actively make society fairer by rectifying what are perceived to be existing social, economic and other inequalities. I just paused, because I am worried and nervous about what is being factored in. We are back to accountability. Who has ultimate accountability for the legal, ethical and valid use of advanced technologies, particularly in policing?
Kit Malthouse: The primary decision-maker in normal circumstances is the Chief Constable. In the way the Chief Constable decides whether to arrest you, they are completely free within the law to make that decision, broadly, but they are accountable to any number of bodies themselves. They are primarily accountable to their Police and Crime Commissioner, who is the elected representative in their area. Obviously they have to operate within the network of laws that I talked about earlier: PACE 1984, the Equality Act, the Data Protection Act and the Human Rights Act. There is also the common law. There is a web of controls and accountability methods, but in the end they are accountable to their local people through the Police and Crime Commissioner.
Lord Hunt of Wirral: We have heard a variety of views on the West Midlands Police Ethics Committee and its suitability to inspire perhaps even a national ethics body. Will the model inspire a national body, and how should such a national body interact with local committees?
Kit Malthouse: I do think it is a good idea for local police forces to have ethics committees, and I would hope that they would sit and think careful and profound thoughts about the ethics of their practice in all areas. However, in the end, are we not the national ethics committee? If there is police practice that is deemed to be unethical or that people think is unethical—or that you or a Member of Parliament or whoever thinks is unethical—they should surface it through us. I would be concerned about setting up a parallel ethics group. That is us. That is for the House of Commons and the House of Lords to decide. We decide the legal and the moral framework within which everything in society operates.
Lord Hunt of Wirral: You would accept personal accountability as a member of an elected Government, as a senior Minister? The buck stops with you, does it?
Kit Malthouse: The buck for what?
Lord Hunt of Wirral: For overseeing what is fed in. This is where I come back to the fairness metrics. There is now evolving a distinction between bias-preserving and bias-transforming fairness metrics. Will you be keeping a careful eye on this? I think it could develop in ways that perhaps are presently unforeseen.
Kit Malthouse: That is a very good question which I will have to think about. In the end, my view is that I am broadly—whether I like it or not—responsible for most things. The Government are responsible for bringing things to an end by their veto power, by passing legislation to stop things happening or by centrally funding things that do happen. We have some brakes and levers that we can pull, although, as you would naturally understand in our mature democracy, they are not direct. I cannot, for example, remove a Chief Constable, even if I wanted to. That ability falls to the Police and Crime Commissioner. I can do nothing about a Police and Crime Commissioner, because they have their own mandate and they are elected. These are the natural brakes and protections that you would expect in a democracy, but in the end, what do the people think? The people think that we are responsible, yes.
Lord Blunkett: There are ways and means, I promise you.
Q108 Baroness Shackleton of Belgravia: Good afternoon, Minister. Thank you for joining us for our Chair’s birthday party. I am going to switch to governance, which is similar to but not the same as accountability. The bottom line will turn on the confidence people have in the system being fair, honest, applicable and fit for purpose, but we have found at least 30 bodies with a role in this governance of data and technology within the policing system. Is there a sort of Rosetta Stone that explains these complex institutional arrangements, and how would you rationalise them? If there is a Rosetta Stone, can you articulate what that is, because we as a committee are slightly bemused as to what that might be?
Kit Malthouse: I do not know if it is a Rosetta Stone, but there is a chart. You are quite right: we have a model of decentralised policing, and it operates in quite a complicated structure of accountability and influence.
Baroness Shackleton of Belgravia: Sorry to interrupt you. It is sort of byzantine.
Kit Malthouse: It is. It is a bush, not a tree, that is true. It is your own preference whether you think that is right or wrong. Some people may say that within that bush there is protection and within a tree things become more assertive. Others may think that the clarity of the tree is preferable.
Obviously the police have their national digital strategy, which over the next decade is about making sure that they have the right infrastructure and the right governance and that they sort all this stuff out. As I said earlier, we are trying to play our part with some rationalisation. Putting the biometrics and surveillance camera commissioner roles together seemed sensible to us, given the evolution of those two technologies. Following the data reform consultation there may be more that we can do about rationalising it further. Given that all this stuff is in some shape or form data, the Information Commissioner’s Office grows as a body and over time you can see things migrating in that direction, so we will see. You are right: it is complicated at the moment, albeit that most forces are quite clear about their own situation.
There are lots of forces at play on a Chief Constable, from the regulators to the legislation that we have put in place. There are obviously their own organisations, such as the National Police Chiefs’ Council and the Police Digital Service. They have their own leads and a scientific adviser, and they will have their own engagement bodies locally. But, in the end, the Chief Constable has to be accountable before the law, and that normally focuses minds.
Baroness Shackleton of Belgravia: Is the Chief Constable responsible for the governance?
Kit Malthouse: It depends. They are responsible for drawing together some of the local governance internally in policing. The Police and Crime Commissioner is effectively responsible for the governance overall and for supervising the money and performance. We have external statutory governance through some of the commissioners that provides the web of protection, hopefully.
Baroness Shackleton of Belgravia: Are there the equivalent of non-executive directors who sit with the Chief Constable who keep him in check in this respect?
Kit Malthouse: No. We have a Police and Crime Commissioner, and the police and crime panel in each of the 41 areas effectively scrutinises the Chief Constable through the Police and Crime Commissioner. That panel is drawn on political balance from across the area but often contains independent and co-opted members. Yes, there is a web of scrutiny there externally as well as internally.
Baroness Shackleton of Belgravia: What happens if somebody is behaving in a way that requires rectification? What is the process that can pull them into line?
Kit Malthouse: For example?
Baroness Shackleton of Belgravia: If the data that had been culled had been misused, for example.
Kit Malthouse: Any complaint about the misuse of data can be made to the Information Commissioner’s Office. There is also recourse through law. We have a fairly extensive and assertive police complaints structure that people can appeal to.
Baroness Shackleton of Belgravia: I may be putting it badly. Is there a sort of homogeneous application of the same rules among all these 30 different groups that we have been told exist? There cannot be one rule in Scunthorpe and another one in Basingstoke, for example.
Kit Malthouse: Sorry, rule for what?
Baroness Shackleton of Belgravia: There are 30 bodies with the role of governance of data. How do they get all together, and how are they applying the same rules?
Kit Malthouse: It depends what you mean by 30 bodies. There are obviously some central bodies, and I presume by bodies you are referring also to the regulators, are you?
Baroness Shackleton of Belgravia: Yes.
Kit Malthouse: What the regulators are there to do is exactly as you ask, which is to provide consistency in the use of those technologies, biometrics and surveillance cameras. They will have guidelines and codes and all that kind of stuff, which is to provide that level of consistency in operation. If you are saying that those 30 organisations should all sit down together and co-ordinate so they are all doing consistent stuff to that particular police force, that sort of co-ordination does not happen. Among those 30 bodies I presume you are including things like the chief scientific adviser and the local ethics panels and all those kinds of organisations, are you?
Baroness Shackleton of Belgravia: Yes.
Kit Malthouse: That would be quite a difficult logistical exercise. In the end, the overarching co-ordination is the web of legislation that we put in place in which they all have to operate. Christophe, do you want to comment about that point?
Dr Christophe Prince: Just to reinforce your point, Minister, that they are operating to legislation where that is applicable. There will also be more technology-specific guidance developed, for example by the College of Policing, that becomes applicable across those forces and which other bodies would look to apply. It very much depends on the specific question they are being asked to address. Some of those bodies will be providing their domain of expertise, be that commercial or scientific—in the latter case, the Forensic Science Regulator—which will come together in the decisions that the chiefs will make on the basis of the guidance they develop.
Q109 Baroness Chakrabarti: In response to Lord Hunt, the Minister said that it is the Chief Constable who ultimately decides who arrests you, but that is not quite right, is it?
Kit Malthouse: No, it is not right, I am sorry.
Baroness Chakrabarti: Sorry, I am not scoring points. I think it is relevant to our consideration, because of course it is an individual constable who makes that decision.
Kit Malthouse: A constable, correct.
Baroness Chakrabarti: That is why it is not an authoritarian tree. It is an individual constable who does that, within a framework of statute for the most part these days. The Conservative Government enacted PACE, which these days is the seminal piece of legislation in relation to police powers, and of course there is the substantive criminal law about which offences are arrestable and so on. This is the nub of my concern, Minister. I think you can see what I am getting at here. I see a statutory vacuum in relation to 21st century police powers, including the use of facial recognition technology, AI and other new technologies. Some of these powers are far more intrusive than stop and search powers, arrest powers and search warrants to search my house, yet what you are saying at the moment is, ‘Let us innovate and leave it to the Chief Constable’.
Do you not think it is time for a 21st century statute so that this is no longer the Wild West? You admitted earlier that there was a problem with so-called digital strip searching and victims being deterred from going to the police station to report sex offences because of the intrusive searching of their mobile phones. A few years on and your Government are now legislating to deal with that. Do you not think it is time to legislate more holistically for some of this intrusive power, which can be very exciting but could also have huge concerns for the citizen?
Kit Malthouse: In our manifesto—no doubt you were glued to it at the last election—we did say that we wanted to create a satisfactory framework for the acquisition of police technology. I agree with your aspiration. However, at the moment it is not entirely clear to me that the various principles outlined in the web of legislation that we have talked about thus far do not also apply in these areas. Given that we cannot predict where technology will go, it would be hard to design a better system of principles than we currently have, unless you contemplate that we would have specific legislation every time new technology appeared. That is where we are struggling a bit, because this stuff is coming so fast that it is hard for legislators to keep up. We prefer to produce a set of principles and then, as technology appears and is put to use, to work out what is happening and govern it through guidelines and the transparency structures that we have talked about.
From time to time, as technology arrives, obviously there is a bit of discomfort about it, but then as it matures and that work is done to build public trust it becomes accepted. We talked about Gatso cameras before. When they first arrived, and I am old enough to remember them arriving, there was outrage that these cameras meant speeding was no longer a fair fight, that you were going to get caught 100 per cent of the time, but over time they have become an accepted part of the landscape. From your perspective, you might say that that is dangerous, but in the end it is a product of building public trust. I think that is what we have to do with all these technologies. We can legislate all we like, but in the end the public trusting their use and seeing the utility is the most important thing.
Q110 Baroness Primarolo: Minister, thank you very much for the time you have given us. I want to build on the point you have just made about trust, because trust is also based on the public having an appreciation that the Government or the police forces know what they are doing and why. You asked earlier in today’s very helpful evidence session, ‘Can the police be better informed by the use of data analytics?’—a classically important question that we are all trying to grapple with.
My question to you is how the Government can answer that question in the absence of proper evaluation on whether it works and monitoring a consistency of approach through the various police forces. For example, allowing a thousand flowers to bloom, some of which will fail and some of which might succeed, does not follow what you also identified as being important: that democratically we need to set broad principles for the use of AI. At the end of this session, could you perhaps briefly draw those points back together again? What we are looking for here is that the Government have a sense of direction based on encouraging technology but monitoring its use and whether it works.
Kit Malthouse: In the end, you are right: all this has to be evaluated and show utility. From my point of view—as you have probably gathered from today, I am a simple thinker—the utility is that crime falls, more criminals are apprehended and we achieve our objectives.
I will give you a great example. When the Prime Minister stood on the steps of Downing Street in 2019, one of the missions that he set for the Government was the ‘rolling up’ of county lines. We set to this job with a will and funded operations in London, the West Midlands and Merseyside. In London in particular they developed an operation called Operation Orochi, where innovative use of mobile phone data allowed them in fast time to attribute what are known as burner phones to particular individuals. These are drug dealers in London who are dealing drugs in Norwich or in Andover, exploiting young people and all the rest of it. They were able to use that data and location data to enormous effect, to the extent that when these guys—and we have now arrested over 7,400 of them—are presented in court, they plead guilty 90-odd per cent of the time, often just on the telecoms evidence. Without any drugs or money, the evidence is there and they plead guilty. That is a result, and I do not think I need an academic evaluation to prove that it is working.
There are some areas like the National Data Analytics Solution in the West Midlands, where obviously that is evaluated and we want to see how that works. That is being delivered for us by West Midlands Police in conjunction with the private sector, and that is evaluated too. As these things come on board and are of interest, they have to be rigorously scrutinised to make sure that they are doing what they say they do.
I understand your concern about that, and the natural tendency for us to want to do that in some kind of centralised way so that we all have confidence that Kent Police are doing the same as Lancashire Police and Northumbria Police, but at the very early stages of technology acquisition that is hard to do. I definitely agree that as we move into the mid-stage, the maturity stage, there is more scope for doing that. As I say, that is when guidelines emerge and the regulators have a look and we are able to say, ‘As we move into further maturity we can then certify this for use’, like Taser and Gatso cameras and all those things that we have started to use.
There is an evolutionary model of technology in policing where a different approach is needed at each stage. Having said that, at all stages—Baroness Chakrabarti is exactly right—the web of legislation pertains. That is the legislation that our two Houses have put together, and the principles in that legislation of proportionality, reasonableness and the purpose of what they are doing have to pertain. That is the ultimate safeguard, along with that web of scrutiny and regulation that sits alongside. The acquisition of new technology is hard because we find ourselves a bit like Donald Rumsfeld: we do not know what we do not know.
Baroness Primarolo: Yes, I appreciate that. So we do not know that we are not putting innocent people into prison either.
Kit Malthouse: Exactly. Well, yes.
Lord Blunkett: It adds to the uncertainties.
Baroness Primarolo: That breaches pretty fundamental principles which the Government should be aware of, I think you are saying.
Kit Malthouse: But do not forget that all this stuff is testable in court. As you will have seen, the live facial recognition that South Wales Police use underwent judicial review and was tested in court. That is how we test those uncertainties, as Lord Blunkett put it: by testing them in front of a judge.
Baroness Primarolo: Thank you very much, Minister. I will leave it there in case the Chair has any other quick points to make.
Q111 The Chair: I will follow up with questions that are very much related to that, although I would say, Minister, that one can be simple but very clever at the same time.
You mentioned certification. What is your reaction to there being a ban—‘moratorium’ would perhaps be a better term—on using any particular technology until it is certified as meeting particular set scientific standards? You have referred to that just now, but at the same time perhaps you could amplify it. What is your view about a register of algorithmic tools used in the public sector? We are aware of the issue of innovation. We are aware of the difficulty of commercial confidentiality and property rights, but what is your view on those two things?
Kit Malthouse: On the first, I would have to think about that, if I am honest. It might become unwieldy, in that I would be getting submissions on a daily basis: ‘We would like to try this. We would like to try that. Can you give us permission to have a go?’ There are some areas where permission is required. New technology that is to do with weaponry, for example, has to go for testing before it is acquired, and the same is the case with more mature technologies, but I would be a bit nervous about that becoming unwieldy and, in fact, a deterrent to the acquisition of new technology. I would have to think about that. I am sorry, remind me of your second question.
The Chair: It was about a register.
Kit Malthouse: As you rightly pointed out yourself, I think there would be some difficulty there with proprietary technology, albeit my view as a general principle is that the police should be as transparent as they can be, particularly with techniques and evidence that are used for the arrest of individuals. In the end, anything that is used in the prosecution of crime is testable in court, so if you are identified or apprehended because of a bit of technology, it is perfectly possible for the accused to defend themselves with the use of an expert witness testing that technology in court. In those circumstances, I think that would override even the commercial sensitivities.
The Chair: Yes, indeed. I am intrigued to know how the contracts work between the purchaser, the police but I suppose any purchaser, and the manufacturer and provider. If something goes wrong, whose responsibility is it? We have not managed to see any of the contracts that are in use, so confidentiality has been applied even to that level. If you are able to help us on that I would be grateful, but I will not ask you to commit yourself now.
Kit Malthouse: Some of this is governable by statute. As we said earlier, with the Forensic Science Regulator coming on to a statutory footing, the compliance with the code and the guidelines of an evidential standard will be pertinent in a case. You would think the same would be true with other bits of tech as those guidelines and codes emerge.
The Chair: Thank you. Is there anything more that you would like to add, Minister?
Kit Malthouse: No. All I would say is that I welcome your scrutiny of this area, because it is a difficult one. The pace of change is so rapid that it is hard for us to keep up, so we are trying to create a system that is flexible enough to allow early adoption and acquisition and the testing of technology, but at the same time gives confidence to the British public that their privacy and the historic protections we have all enjoyed are still being preserved.
The Chair: The public are the right place for us to conclude, so I will formally conclude the meeting. Thank you, everybody.