Written evidence submitted by Group 5 (Event 1) (DHS0049)
Transcript of roundtable event with members of the health care workforce held on Thursday 22nd September 2022 for the Health and Social Care Expert Panel
Participant A: We're a membership organization and the members that we work with are all providers of services for people with a learning disability, autism or both.
Helen Patterson: OK, thank you. Participant B?
Participant B: I represent an independent advocacy organization. Last year we supported half a million people with their rights and entitlements, and of those, about 25% were within the NHS or the healthcare system, specifically with complaints and PHSO-type escalations. About 12% of our beneficiaries live with LD, 43% with mental health conditions, about 34% with disability, and the rest have other marginalizing factors like poverty, deprivation, substance abuse and racial discrimination, amongst others, and also homophobia. So a very wide range of beneficiaries across the UK.
Helen Patterson: OK. Thank you, and Participant C?
Participant C: Hi there, I represent Med Confidential. We seek to ensure that every use of patients' data is consensual, safe and transparent. We've been around since shortly after the 2012 Act, when GP colleagues asked me to take a look at that. Unfortunately, some of our work often represents when things are going wrong. But I should say we are huge advocates of, as I say, consensual, safe and transparent use of patients' data – obviously for healthcare, but also for research and service improvement.
Helen Patterson: OK. Thank you very much. So shall we start with the first commitment area, which is the care of patients and users? In particular, we're going to talk about the NHS App to start off with. So the first question is: is it your understanding that the NHS App is used by many of the patients and service users your organization represents? It would be really interesting to hear what your experience is. So Participant B, would you like to go first?
Participant B: Thank you. So the answer is no. If you think about why people need advocates… I would say that we have a larger proportion of people living in poverty or deprivation who don't necessarily have PCs or mobile phones. Many of them don't even have landlines, to be quite honest. They also don't have IDs. So if you think about the barriers to having an app: (a) you need to have a device; (b) you need to have ID to be able to authenticate. So I think that's one challenge. The other challenge is that I think there's an accessibility issue around getting on the app, so whether that is easy read, whether that is modifications for people with sensory disabilities as well. I think people with sensory disabilities have had very mixed experiences. When it comes to the website, it is much easier because of text-to-speech readers. But I would say there is a complexity and barrier just about the information, not just access to the device and an economic divide because of that. Those are our observations and the feedback we've had from beneficiaries.
Helen Patterson: That's exceedingly useful. Thank you very much, Participant B. Shall we go to Participant A and see if your experience with your service users is similar?
Participant A: It is. Before I start I just wanted to mention that I was a representative of Voice Ability for four years. I would also say that for people who are in a locked setting or a secure unit, they wouldn't necessarily have access to that technology even if they did own it. So I think the answer is probably the same. I'm basing that not necessarily on my role for the Association for Real Change, but on my own direct experience of managing learning disability services just before, or just as, the pandemic hit us, when this app suddenly became really important. From that point of view, most of the people living in a supported living setting or a residential service, or who are accessing a day service, wouldn't necessarily have the technology or the ability to use the technology. On a personal level: I'm autistic and I don't like a lot of technology. I find it confusing and cluttered and busy. I found that the NHS App is actually not like that, and I do use the app, which is unusual for me personally. I find it quite clean and quick and easy to use. There are some log-on issues with it, which is frustrating, but I understand why it has to be secure. So I think the answer is, in terms of learning disability services, I would be very surprised if many of the people supported by our members were actually regularly using the app, and it's not something you can use on behalf of somebody. A support worker can't use that app, effectively, on behalf of somebody they're working with. I can't imagine how that would work.
Helen Patterson: So for you using it, you said it's easy to use… What do you use the app for?
Participant A: Primarily for the vaccine registration, and getting into places at the time when you needed to show your vaccination status. I've never used any NHS technology before; I've always been reluctant… I'm quite cautious about my data, if I'm honest. I struggle a little bit with that. But for whatever reason I did choose to jump in with both feet with this… and I still use it; I order repeat prescriptions with it and things like that. But that's speaking as somebody with autism. The members that I represent would primarily be people with a learning disability who may also have autism, and I'd be very surprised if many of that group of people were using it. There will be some, but I think they would be quite small in number.
Helen Patterson: Can we go to Participant C?
Participant C: Sure, thanks. So obviously we advocate for all patients, with special regard to data, digital and tech, so I can really only report what people have contacted us about, and that has been – as both Participants B and A have mentioned – I should say that quite a number of patients who said they had autism found the NHS log-in part in particular difficult, you know, the bit needed to use the app and to verify it… What we call user journeys, or pathways, are completely broken for people who find those things difficult. More than a handful of people said that they specifically had problems with that. In some instances, with people who had already been using GP online services through their browser, they were able to use the credentials from their GP practice to authenticate to the app. But once we help someone like that, we don't have… you know, we're a tiny organization and we don't really have any follow-up function. I assume that we helped someone fix the problem that they had, but, as I say, we noticed that. I mean, frankly, putting the vaccine passports in, or behind an authentication process in, the app was, we think, a very cynical move. It may have driven many people to download and install the app, but people were doing so very reluctantly, and that was massively the case in terms of people who were contacting us. And, you know, they may have done that for a period during the pandemic in order to have something which you actually needed. But I think the overall, if you like, declared numbers of people who have the app versus those who are using it are probably quite soft, unless someone is used to doing repeat prescriptions or what have you. As a matter of fact, apart from some national services, the app is essentially just hooking into your own GP practice's services, and I think once people realize that, it's a little bit flat. You know, the app was promoted as quite a big deal.
Again, one of the experiences that we have had is people saying, 'well, OK, fine, I can do it on my phone', which for some people is fine, but other people just say to us, 'well, look, it doesn't do anything really different from what I can do in a browser, and that's what I prefer to do anyway.' I couldn't say what proportion we are talking about – we're a tiny, infinitesimal sample – but these are the sorts of issues… And, very importantly, on the point that Participant A raised: while the national data opt-out is one of those national services that is available via the app, that opt-out is broken by design for families with children. Delegation in the app is a really, really complex issue; delegation generally is. Doing it digitally and remotely is doubly difficult. It is this general problem of which, you know, dissent is just one example. But again, for us it throws up a lot of frustrated patients and people who are just finding things very difficult.
Helen Patterson: That's really interesting. Yeah, that's really useful. Thank you. Participant B, do you want to just make your point on this as well?
Participant B: I just wanted to build on what Participants A and C were saying about two experiences of the app. I actually use the app really to its full extent as a personal user. I particularly find that, as a dyslexic person, the fact that it does facial recognition and so on means less typing, less thumbing around. So my experience of the app is very positive, and I have a long-term health condition. I'm tested regularly for blood work, and I find the fact that I don't have to call anyone – my test results just appear on the app – and the fact that I can access my medical notes, to be very useful for monitoring my health condition, and it just gives me peace of mind. What I would say is – building on what Participant A was saying about closed cultures like secure units, like prisons, like mental health settings, care homes – actually the app is further isolating people. The other thing I would say, and this is just a personal experience I had a couple of weeks ago, is that I'm scared that the app is replacing common-sense human interaction. Two weeks ago I was on the phone with my GP about a repeat prescription and she said, 'I need you to come into the doctor's surgery to put your arm in the cuff and have your blood pressure done.' Very simple, and so she said, 'I'll call you after that's done.' So I went and walked over and she rang me, and I could actually hear her from behind the door. I was sitting right outside the door, and I said, 'well, shall I just pop in?' She said, 'I'm not allowed to have you pop in, because I'm only allowed two minutes for this interaction.' I said, 'but you're two inches behind the door; I can actually hear you through the door.' And she said, 'I'm not allowed to.' She said, 'the way the app works is that we're meant to spend less time per patient than the 15 minutes that used to be there.' And that really struck me. Now for me, I found it funny. But let's say I lived with LD.
I would find that really confusing to be quite honest. That felt strange. I don't know if other participants have had similar feedback from people about this replacing human contact, but that was something that I experienced that I thought was really confusing.
Helen Patterson: That was really useful, thank you. Do you mind if we just move on because we've got quite a lot of questions to get through, and there's another couple in this section and I'm just aware of time? So moving on to question two: other than the NHS App, does the service user group your organisation represents use other online healthcare services? Participant A, do you want to go first?
Participant A: Yeah. I think, again, probably for the same reasons, I would argue that most people who are supported in residential care, respite care, PMLD services, learning disability services, are probably not doing this themselves. It's possible that staff might be, in terms of prescriptions and annual health checks for people; people with learning disabilities should have an annual health check. There may be interactions through online services to connect them with community nurse teams, et cetera. But I wouldn't think that's a widespread thing. I'd say that's probably almost the exception rather than the rule.
Helen Patterson: OK, Participant B?
Participant B: In terms of digital interaction with other things outside the app, the first thing is that a lot of my beneficiaries who are on email, and are using the NHS complaints system, do use email, and they tend to identify their local GP's email address very quickly to work around e-consult or the app or other things. So a lot of complaints and letters are coming through email. There's also probably some confusion, and feedback we've had, about e-consult. What people say is that they're confused by some of the privatised services that are on e-consult. So when you log in, people are sort of saying, 'well, I thought I was logging into the NHS, why am I following a digital breadcrumb trail that's taking me to some paid service?' The other thing that we often hear from people is that they use the websites of local GP surgeries, or e-consult, not to actually fill out the digital form, but to download and print the form and then post it. I think it is very interesting that people aren't using the digital pathways as they're intended; they're actually using them in very different ways because they find them confusing or intimidating.
Helen Patterson: Is that just e-consult, or are there other ones which they find confusing as well?
Participant B: I think, in the experiences that we have – in terms of the data that I reviewed prior to coming to this call – e-consult and the local websites of GP surgeries are often cited, because some of those websites have local digital forms that are not part of a national NHS-type system. Catherine asked about the mindfulness app; I've never heard anyone mention it at all. I know that there are anti-smoking apps and things, but I've never heard anyone give feedback either way, positive or negative, at least from the people that I represent… I think there's no viewpoint, at least from the service users that I'm representing.
Helen Patterson: OK. And Participant C, is your experience the same?
Participant C: No, not at all. We have a very long track record with apps, and specifically the apps library that no longer exists, which went through several iterations, each of which was terrible and contained apps that should never have been endorsed by the NHS. Some of them were breaking the law. Certainly many of them were breaking ethical principles, and many of them had business models that were completely inappropriate. That is still the case, unfortunately. With specific ones – e-consult was mentioned – we get quite a lot of comments about the marketplace that is attached to online provision. So that's roughly speaking doing what you would do with the NHS App, but you do it, you know, at a practice, and you find yourself rapidly in literally something called a marketplace, a list of services that includes private, paid-for services. To my mind, and to the minds of many people who contact us, that is marketing directly to patients, something that is supposedly forbidden within the NHS but which is happening daily and has done now for many years. There are others that have stepped over lines in recent times, including one which actually helped NHS Digital to build the NHS App – which it did for some years, then stopped – and has now resumed essentially marketing its really dodgy and completely useless DNA tests to patients who use that app, and these are DNA tests for things like skin care and what have you. There's another one which was used very effectively during COVID at the height of the pandemic, which has recently received funding and is seeking a new business model. Real questions remain: people basically started reporting symptoms to what was previously just a very high-priced diet app, which was repurposed for that and got a huge user base. Why is it that that company should be able to utilize a user base that was handed to it in an emergency for its own business purposes?
There are other instances, and not just recently – for example, the postal pharmacies. Something presents itself as an app, an online function, to people, but you've got to ask what is going on behind the scenes in the business.
Helen Patterson: Can I just go back to patient experience on this? Are they using other apps like the Mindful app which Catherine mentioned?
Participant C: No, not the mindfulness app – I don't know which one Catherine is referring to – but mental health and mindfulness-type apps are things which patients do contact us about, asking, 'what the hell is this?'
Catherine Davies: Just to clarify, it wasn't a specific app. It was just a category of apps that help people relax or step counters, or any of those. There are quite a lot of those sort of other apps in the healthcare space.
Participant C: Yeah, and with the NHS Digital/NHSX data protection and security toolkit, we've said for years that it should include very clear direction that there is to be no advertising ID used, and no tracking tech used, in any app that the NHS is endorsing. Otherwise we are simply just handing people over to random business models and direct marketing. The apps library singularly failed to erect any sort of process that reasonably checks these things out. There's some tiny amount of research going into the clinical efficacy and effectiveness of these things, and yet some of them are making quite strong claims.
Helen Patterson: OK. Participant C, I think you've made your point there and that's really useful. So thank you. And can we just go to Participant A before we move on to the third question?
Participant A: I'll try to keep it as brief as I can. It was really a concern on a personal level about Palantir and what's happening there. To make sure that this does tie back to the people we represent: I was on a call which had been organized by NHS Digital in relation to digital social care records. All social care providers are expected to have records that are interoperable with NHS records by March 2024. On one of the calls there was a young man representing a commercial software supplier. I've never seen somebody so excited. He wasn't regulating himself very well; he behaved as though somebody had let him into the sweet shop. You could see his excitement, and what I took away from that on a personal level is just a real fear. He can only be that excited if he thinks that the controls over these data are poor. A lot of the people supported in our member services will not have capacity to consent to this. Who is it going to be? I can't consent on behalf of somebody else, nor can you. So how is that going to be managed?
Helen Patterson: OK, thank you. Actually, it was really useful that you mentioned this integration, because one of the questions is: do the service users your organisation represents receive integrated care?
Participant A: Well, I think the short answer is probably no. There's obviously a lot of hope that the integrated care system will change that, but we have had other initiatives that intended to bring together health and social care. So we're very supportive of the idea, but obviously we need to see how it's going to work in practice. But no, I think the experience, probably more for support workers and the managers working in services, is that actually this is hard. It's hard to get primary care services to work in a coordinated way, and a lot of time and effort needs to be put into that.
Helen Patterson: OK. Participant B, is your experience the same on that?
Participant B: No. We see selected programs around the country where there's cooperation between CCGs and local authorities to provide some programs – in small pockets around the country, not in every county. But there are two or three local authorities and CCGs that we work with that we have seen provide digital inclusion-type efforts.
Helen Patterson: OK, thank you. And Participant C anything to add on to that before we move on?
Participant C: I'm just putting a link into the chat to the historic stuff on the apps library. But it was the failure to do the apps library properly – both in terms of efficacy and clinical effectiveness, and in stopping advertising to patients – that led to the policy decision to simply scatter these things around the NHS. I would say that while the apps library failed for very good reasons, it is actually counterproductive to take the approach they're taking at the moment, for precisely the reason that we are talking about: there's nowhere for patients to actually find out whether something actually is NHS or not.
Helen Patterson: Yeah, I have to admit that when I joined this panel, the first thing I did was think, 'I'll just go and look at the App Library' – and it's gone. So I share your frustration there. Right, I think we probably need to move on. Catherine, do you want to do commitment area two?
Catherine Davies: Yes, I'm happy to do the next one. So our next commitment area relates to the health of the population. To find new treatments or improve healthcare services, researchers sometimes need to analyse information from patients’ healthcare records. So the specific question is, in your opinion, how can the NHS help patients and service users feel comfortable about having their information used for research and innovation. So we've sort of started to touch on this a little bit already. Participant B, would you like to kick this one off?
Participant B: I think you have a very good example of this kind of consent with organ donation programs online. I've just recently registered to donate my organs online using a fully digital app, and what I thought was good about it was that what consent involved was very clear at every step on the form, and I understood what that consent looked like, what it meant for my family, and what it meant for my organs at the point of my death. I think that could be used as a case study, and possibly a basis for what you're describing, but that's just one view. I don't have any beneficiary data about this issue, because we tend to deal with people once they're in crisis or at a complaint stage to evaluate their options. So I don't have anything to offer on this other than my own personal experience.
Catherine Davies: Your personal experience is helpful, thank you. That's a really good example. Participant C, what are your views on this? I imagine they're quite strong.
Participant C: Yeah, very strong. It's what we do, basically. I mean, it's not about consent; it's consensual. There are many things that people do, and do because they trust, you know, the NHS or a person in the service that they are interacting with. But when you talk about data and things that are done with it, many of them are invisible. People find out about them after the fact and are not informed ahead of time, and have in many cases quite limited options about what they can do. So we would start at the top and say that the way the NHS can help people engage in shared endeavours beyond their own healthcare is to demonstrate trustworthiness. That breaks down – if you know Onora O'Neill's very good talk about trust – into three simple things: is this person competent; are they honest; and are they reliable? If, and only if, you can demonstrate those things… and things like reliability mean publishing, telling people what you have done with their data, because the only way someone can actually make a judgment of whether they trust you next month is to see what was done last month. There are commitments in some of the strategies, although they've been pushed off into the long grass, about literally telling people how their data has been used – not just data in general, but how their data has been used. We firmly believe – and have been advocating pretty much as long as we've been advocating for trusted research environments, so that you can control the use of data and access to it – in what we call personal data usage reports. In commissioned modern systems, this data is identifiable data. It might be pseudonymized, but it is identifiable. There are audit trails in every modern computer system; therefore it is not beyond the wit of a 'geek' to build a system… In fact, we've done it with existing APIs, to show you could generate a PDF statement, like your bank statement or your mobile phone statement, that actually says your data was involved in this study.
You could get the audit report that your summary care record was accessed by [this person] on this date in this institution. We've had some really quite tricky cases where your summary care record was not accessed on a certain date by a certain institution, or on any date by that institution, which can reveal a breakdown of process that can be critical to someone's care. All of these things are not just nice words; they actually mean engineering, re-engineering or commissioning systems in very particular ways, against a bunch of really very clear principles. That, I believe, is the way, and that's why I do what I do, frankly, and have for the last decade. I believe we can do this, and once we do, I believe many, many more people will engage in these things. But if there are attempts to essentially suck up everyone's data without properly telling them, and all this other stuff, then we will continue to punch holes in the data – to degrade the asset – from all those people who opt out. We're already five million in. I don't think it should ever have got that high, but it was deliberate choices by the department and by NHS England that resulted in that many people opting out.
Catherine Davies: So just to summarize, you're saying to be really clear with people about what they're agreeing to; then give people personalized information on how their data has or hasn't been used; and get it right with the integrity of the system as a whole, so that we have as much data as we need to do the things we need to do.
Participant C: Yeah. But the bottom line is you have to give people choices. These are their medical records. They have actual human rights and data protection rights, and we have the notion of medical confidentiality. Therefore, the system has to be predicated upon the individual being able to make a choice as freely as possible, but being informed – not told, 'oh, you're a bad person if you opt out because it's hurting research,' which is sort of the implied message of much of the communications that go on – but actually being informed: 'well, look, we're looking to help people with a condition that maybe you don't have, but someone in your family has, and if we can look at your data, that's actually going to help us understand this whole condition more or better.' So there are some very clear and sensible things that could be said, but absent the evidence and the openness and the honesty, people are becoming less and less inclined to trust. Once you lose trust, you've lost the whole thing.
Helen Patterson: So how would you turn that round to get people to trust, to share data, just out of interest? What are your views on how you turn that around?
Participant C: There are three words: consensual, safe, and transparent. Move all access into trusted research environments, which is the safe bit of what we've said; be transparent and publish everything that has been done, and preferably what's been done with individuals' data; and provide people the means to exercise choice. Unless you get those three right, you won't be anywhere. And overall, as institutions and sectors – like the research sector and what have you – rather than arguing from a point of narrow self-interest, we need to start to talk about the NHS in the same sort of way as we did right at its inception: as something that we are doing together.
Catherine Davies: Yeah. And then there's this important point about accessibility as well, to make sure people can actually understand what is being said. Can we bring you in, Participant A? Would you like to add something?
Participant A: Yeah. Thanks, Catherine. So until Participant C said it, I was itching to say the word transparent, and obviously he's now covered that. But I think there's something for me about authenticity as well, because you can be transparent but inauthentic. And the bit where the authenticity is really important for me is whether this is a shared endeavour. Is this about syndicating risk? Is this about some of the founding principles of the NHS, or is the objective something else? If it is something else, and it's potentially commercial, then you need to be really, really honest about that, because you can't make the choices that Participant C is talking about if you haven't got the full story. So it's about transparency, but it's about authenticity as well. I think we've got an environment now, post Brexit, where, you know, we've talked a lot about animal welfare standards and food standards, we've talked about human rights, and they're being presented as though we have an opportunity to improve on those areas. People that are able to think this through themselves know that it's the opposite that's true: our human rights are under threat and our animal welfare is under threat. I think the scepticism for me on a personal level – and there's no reason to think it would be different for someone with a learning disability – is, 'are you telling me the truth about this? Is this really why you want the data, or are there commercial operators that want to get their hands on it?' If you're not being truthful about that, then for those of us able to spot that, you'll never get the engagement. So the first thing I did was opt out as soon as I saw this, because I don't trust it. So yeah, I hope that's helpful. There's no reason to think somebody with a learning disability, in a position to have this conversation, would think any differently from anybody else, actually.
Catherine Davies: If there are a number of reasons, because sometimes things are complex, there's not just one thing... Do you think it's best just to state all of the reasons so that people understand all the purposes? And I don't mean like a massive long list of things like a massive disclaimer that's just literally everything. But I mean, a thoughtful list.
Participant A: It could be anything, couldn't it? I think if you're presenting it as something that's connected, you know, the person is contributing to a shared endeavour… When that isn't the purpose then the trust is eroded. So I think it's not about how much information, but it's about being genuine about it. People sense what's genuine. I think most of us are quite good at that.
Participant C: And it's dealing with some of the exceptions, to be honest. You know, a lot of the fears are around relatively minor… things that could only ever deliver minor incremental improvements on anything at all, if that. They could be just jettisoned entirely, and then you would be left with the really quite tricky issues around, for example, pharmaceutical marketing, which has to be understood in the context of pharmacovigilance – an absolutely essential function – and how the NHS itself purchases medicines for tens of millions of people. So some of these things could be dealt with separately from consent, but as a sort of broader public education piece, where we actually start to say, 'look, the NHS is lovely and we all depend upon it, but you need to understand that it's a very complex ecosystem, and the idea that there will be no private companies involved, or no one making any profits at all, or what have you, is frankly nonsensical.' But rather than running away from that, I honestly believe you could actually properly explain it.
Catherine Davies: Yeah, OK. That's very helpful. Shall we move on to the next thematic area?
Participant B: Is it OK if I just add one quick thing? So I've put some comments in about literacy and language and obviously accessibility, and there have been some experiences recently in the last year that I really want to highlight at this point. So if you think about the Afghan arrivals and the Ukrainian arrivals, it's not like these were planned moves. Actually, if you think [that] these are two of the largest programs of relocation in the UK, where people were airlifted out and then sent to various communities, host families, bridging hotels, etcetera, what we found was that people were brought here very quickly, but then it took up to six to nine months to have local language translation. So on the one hand, the government was saying, ‘oh, you can register for the NHS and be seen.’ But actually it took six or seven months for people to actually receive local leaflets, and in the meantime the translators weren't on the ground or anything to help people deal with the NHS. So we found ourselves in this really unfortunate place, sitting in between and helping to advocate for people in this position, or even the host families. I do think there's also literacy – you know, we don't have 100% literacy. And when I say literacy, I don't just mean the binary [of whether] people can read or not; I also mean what level of education. Sometimes the content on the digital websites in the NHS is not plain English enough, or is written at a college or university level. I think that that's something we need to look at, to say what is the reading level of most people, as opposed to writing something that looks very scholarly on the websites.
Participant C: And by the way, this was done back in 2014 with care.data – it wasn't done before. I don't know what's happened. There used to be a translation function. There used to be a reading age function on quite a lot of NHS [sites]… Literally just bung it in and it would come out, print it and everything, and that's gone. I don't know where it's gone, but it's probably been forgotten or defunded or something. But it was a vital function.
Helen Patterson: That's really helpful, thank you. Now we've got a quarter of an hour and four more questions to cover. So we probably have to speed up a wee bit. So the next question is on the cost of care…
Catherine Davies: Sorry, just on that one – I think we might really have covered that one already. When I looked at it, I thought it was similar to the question we'd asked already. So I thought we might clarify: have the patients or service users that your organizations represent received digital services where they are based? So has a doctor or nurse come to them to do something like monitor their blood pressure or their blood oxygen levels where they are, rather than them having to travel to a surgery or a hospital? I thought we could maybe tweak the question like that, if that's OK?
Helen Patterson: That's right. Yeah. You carry on. You're the expert.
Catherine Davies: Participant A, do you have any experience of services being delivered in peoples’ homes or care settings?
Participant A: I do, particularly in vaccinations of course. I'm sure that's the obvious one. We saw a mixed picture there. So the JCVI prioritization model, we think, positioned people with a learning disability in the wrong place. There was another issue with the dependence on the register, so there's a possible misunderstanding that there exists a complete register of people with a learning disability in the UK. There does not. The JCVI prioritization criteria talked about people with a severe learning disability [but] they didn't explain what they mean by that. It is not a universally understood and accepted term. So in terms of how that played out, what we saw was actually quite positive in many ways. GP teams that were able to would use up spare vaccines and give staff vaccinations that they probably shouldn't have, because the prioritization model was upside down. They might have been there to vaccinate people with a learning disability, but they had some spare vaccine and staff were there, so they gave it to them. What we saw there was local creative sorts of people doing the best thing and doing the right thing, probably despite the guidance, not because of it, if that makes sense, and probably in some cases putting themselves in a potentially difficult position because they were doing something they shouldn't really have been doing. But actually [there were] quite a few positive stories of that happening. So there was a lot there around primary care teams, [being] really shoulder to the wheel and getting on and doing things.
Catherine Davies: Yep, great example, thank you so much. Participant C, do you have any examples?
Participant C: I completely concur with what Participant A said. [In the] early pandemic, once we generated the first shielding list there was this horrible mismatch between the food deserts, if you like. So people were told to stay at home and it was the GPs that basically picked that one up, and the supermarkets themselves actually in some instances, which was genuinely terrifying. But the specific example I’ve got – and this was from doctors who contacted me, not patients – was oximetry at home. It was a research study and the doctors recognized it as such. While it was obviously in the context of a pandemic, it was a research study and it didn't follow protocols. They didn't even have a piece of paper that they could give to the patient to explain what the study was. It was just assumed that because people at home were desperate and ill and what have you, and needed some help that they would accept whatever was being done to them and that their data would go anywhere without telling them. And it's things like that which, as I say, might seem a bit petty, but it's stuff like that when doctors get in touch with me about that, I think that's a real concern.
Catherine Davies: Yeah, thank you very much. Participant B?
Participant B: My feedback and the experiences of the beneficiaries we serve are very similar to Participant A's, so there's no need to repeat what Participant A has said.
Catherine Davies: Thank you so much. Helen, shall I hand back to you now for the last question?
Helen Patterson: Yeah. There are a couple on the workforce here, so the first one is: are technological or mobile digital services an important part of the care received by service users you represent?
Catherine Davies: So again, I think we've sort of really touched on this one already, haven't we?
Helen Patterson: In some ways, yeah. But is it a big proportion or just a small proportion or really not very much at all, hardly touching them?
Participant A: I would describe it as growing. I think in terms of learning disability services, the role of technology is expanding. There are lots of companies looking for lots of different ways to support people, some of them a little bit sketchy, if I’m honest, in terms of trying to replace human contact with care and support provided by a machine. But some [are] potentially quite positive, so sensors that tell you when somebody's having an epileptic seizure or has moved and got out of bed, or whatever it may be. So I'd say in learning disability services, the role of technology is growing, is how I would describe it.
Helen Patterson: Are there any technologies you really want which would be really helpful?
Participant A: There are some impressive organisations that are really collecting data about everything. Staff have tablets, so that when they're working with people, everything that happens is recorded. Now that might seem a little bit over the top and a bit ‘Big Brother’, but when a Commissioner comes in and says, ‘we've had fewer incidents of concern, fewer challenging behaviour incidents, so we think that means we can reduce the package of care…’ what the provider is able to say is, ‘hang on a second, the reason why that's happening is because we did this, this, this, this and this, and if we hadn't then those numbers wouldn't be the way they are,’ because they can capture that data. They can show even quite minor interventions, like giving the person a little bit of space, which isn't something that would normally be recorded in the service. So there are providers out there – I think this is probably the most extreme example – that are collecting data using a single system to do it. They do everything on one system; they are really using data in a positive way. I think there are questions about it – some people would have questions about it – but what that says is that probably some providers are doing that and others are at different places. In terms of the sector you’ve probably got quite a broad range. There will be some very traditional services barely using tech at all.
Helen Patterson: Ok. Over to Participant B.
Participant B: I'll talk about two examples [and feedback] that we are aware of. One is amongst a group of beneficiaries with dementia. There have been some very, very successful uses of tracking bracelets to keep an eye on lost relatives who may be in more independent living. Part of me thinks it's great; part of me thinks it's a bit creepy and dubious. I'm not sure where I sit with it. There are a lot of rights issues for me there [which] I'm not sure I've settled on, or [that] my staff have settled on, but it is interesting in terms of safeguarding peoples’ whereabouts if they go walkabout. The second thing we've observed is very, very good pilots and larger rollouts of pain management apps. So this is where people are in long term health or palliative care situations, and there are pain management apps to help make efficient use of, let's say, carers or nurses who might be administering medication. So people log on and click… you know, when you go to the airport and there are pictures of how much pain you are in, from a smiley face to a face in pain, and people very quickly click on an app and say, ‘I'm in no pain today,’ or, ‘I'm in a lot of pain,’ and it then helps to prioritize nurses or other medical professionals who are mobile. Those are the two best examples that we've heard about that are being trialled around the country.
Helen Patterson: That sounds really interesting. Participant C, do you have anything to add to that?
Participant C: I think we're aware of similar sorts of trials and what have you. I mean, the basic problem which we smashed into with COVID is that there is literally no consistency of data gathering, certainly in social care and in large parts of the NHS. I know there's work ongoing with that. We support it. We are actually involved in some aspects, for example [defining] what a social care data dictionary and schema is, because you can't just use SNOMED – you're not capturing the same thing. There's no SNOMED code for when you last actually opened your resident's window, which would have been probably one of the most useful things you could have known during COVID. But I think we should not conflate health and social care as data and digital issues without understanding that there are some profound underlying structural differences in the data that we are talking about. And [in reference to] Participant A’s point – those that just collect it all – well, OK, that's fine, but that's actually breaching the law. There is a data minimization principle, and all data that's collected has to be collected for a known purpose, and that isn't just as broad as we want it. So I think this is an important area, and we should be focusing on what it is that we need to collect or measure, because data is only what we choose to collect. Once we've got that sorted and agreed, then it's going to make a lot of these other issues of, ‘is it creepy? Is it not creepy?’ and all that sort of stuff actually quite a lot easier to tackle.
Helen Patterson: Does anyone want to make any kind of closing remarks before we get moved into the main group?
Participant A: Just a final comment then on Participant C’s point about collecting lots and lots of data. The other issue with that is the more data you collect, the more difficult it is to make sure it's accurate, complete and sufficient for the purpose. So I do recognise some of the issues around that. When I met that organisation, I came away thinking this is amazing – and also that I've got some questions about it.
Participant C: This is the other point, which is that the collection of data can become an exercise in itself, and that doesn't help people, because it's only if data is actually passed through to the appropriate person, [a] human being, who is going to take some action, that things work. We've seen that play out terribly with Baby P and all these things in the past; just having a system collecting data is not the point. It only works if human beings are empowered to take the time and act in a way that will actually help someone.
Catherine Davies: Just to comment on that… where people do want to use technology, and they understand how it will be used, and where they think that it can improve their quality of life and their health – then imagine if we could join that up with a system that worked well in terms of building trust and confidence in the system. Wouldn't that be a great place to kind of end up?
Participant C: I know somebody who has got to the point of actually suing their trust because they simply cannot report the really structured data that they can provide on their condition. I mean, this is after over a decade. So rather than maybe talking about that, I think if we started to do some small scale things that added up to a general ability for patients who are comfortable with technology to be able to report structured data, that again would be a very positive move and save a whole bunch of repeated effort all over the system.
Participant B: If you want to be a people-centred organization you need to design all of your access routes, whether digital or face to face. I think there needs to be a lot more active engagement of – I hate the phrase – ‘hard to reach’ communities, but actually it's the digitally excluded communities, and there's no point asking a group of technology enthusiasts to design your technology, because they're the early adopters and already there. I almost think you have to survey a very wide group – the naysayers, the people who will never use the technology – but also design accepting that you won't be able to reach 100% of the population, and understand and make informed choices about who the technology will serve, how it will serve them, and what alternatives there are for people who can't access the digital technology and are excluded. So I almost think you have to design the change assuming you will exclude someone, and I know that doesn't feel great, but it's pragmatic and realistic. So to minimize that exclusion, it's probably best to engage more widely.
Participant C: To pick up on Participant B’s point… prior to campaigning in the early 2000s, I built a system for looked after children. The premise there was, because these were in many cases kids [that] have been ignored or left behind or failed by various systems, including sometimes their own families, we had to choose to deal with the most difficult problems first. It's not just being inclusive; it is looking for those driving problems that people in those communities, or groupings, or what have you, are going to experience, that will make what you design more robust for everyone. Sometimes it's all seen as sort of, ‘we've got to be nice and we’ve got to be lovely,’ and all this other stuff, and it's not. This is a hard engineering principle. You will find your most difficult problems in those people who are actually struggling, and if you just design for the masses then you will always continue to exclude people. So I really would strongly encourage again that inclusion is not an add-on; it's not a minor part. I think it should be put at the heart of the designing, coproduction, architecting, and engineering of these things, because only if we actually build systems that can deal with – again, a horrible term, but I will use it – edge cases will we have actually built a system that is likely to be able to include – and I agree with Participant B’s point, [it] will never be perfect – but likely to include many more people who are just repeatedly, repeatedly not designed for. So that's a bit of a bugbear of my own. But yeah, I've genuinely tried to tackle this problem over several years with a very…
Catherine Davies: So how can organizations developing innovative services have those conversations, Participant C? Because I think sometimes they want to [but] they're just not sure how to. So presumably they can engage with organisations like the ones that you represent. What are some of the other things that people can do to actually deeply understand the challenges that some of those people face, and therefore how to coproduce solutions with them?
Participant C: You have got to work with organisations that represent the people, because just trying to go and find people yourself is nuts and really quite bad. So go to an organization that does represent those people, and genuinely represents those people, and work with them. What I did was work with those people to find the way to talk to the kids, to the carers, to the local authority… Just go to the representative organizations that are genuinely representative, and allow them to help you to find the people. So I worked with the Who Cares? Trust for kids in care, and I’ve worked with housing trusts for people who have financial inclusion problems. Those people [who] are actually delivering support or services to the communities that you're trying to reach are the people through whom you would need to go, and then they can also sanity check you, frankly, like that enthusiastic chat that Participant A saw in the call, or what have you. You know, you've got to approach this respectfully and properly and demonstrate competence as you go.
Catherine Davies: Yeah. Can we bring Participant A in, then?
Participant A: I'll try to be brief because I'm not sure how long we've got… but just to feed in that point about authenticity again. We've seen, particularly through the pandemic and from the department, I have to say, quite a lot of behaviour that I think is actually a little bit sketchy in some ways – including people in meetings, and ostensibly treating the fact that I've attended a meeting as [though I have] endorsed a particular policy or particular decision; I haven't. So genuine involvement of people, genuine coproduction, takes time, and it takes you in a direction you didn't necessarily think you'd go. If you're not open to that – you just want to tick a box – then it seems to be quite easy nowadays to find the language that suggests that you've involved people, that suggests that you've consulted with people, when in actual fact you've gone through some kind of process that produces an audit trail. But it doesn't mean that those people were properly involved and included, and actually had a chance to say, ‘no, don't do that; do this,’ and then you did what they said. When you create the space for that to happen it takes more time, and it goes in directions you didn't expect it to. That's what coproduction does. I wouldn't necessarily say my own organization is where it needs to be in relation to that, but that's how I understand genuine coproduction to work.
Catherine Davies: Yeah, that's really helpful. Participant B?
Participant B: Well, the irony about this is that every time I do see attempts at digital coproduction, it's always advertised digitally. So if you're trying to reach digital [users], you know, it's very easy. When we think about digital spaces, we ask ourselves, ‘well, where are the digital hangouts in each community?’ We all go, ‘oh, there's a Facebook group in this community, and there's this online forum here.’ It's so easy to ask the question about the digitally included. But to do the work on the digitally excluded, my experience with coproduction is [that] you need to go into the community: you need to go to libraries where people are not online; you need to go to places of worship; you need to go to care homes; you need to go into sort of mental health facilities; you need to go into prisons. That takes time and money. I think you don't just need to go in and have a chat – you need to involve them not just for their views, but in user testing, to say, ‘if we were to bring this in, would you use it?’ every step of the way. I just think the question of how to reach digitally excluded communities is expensive, time-consuming and not easy at all. So I think you almost need to do community mapping locally and ask yourselves: where do people who are digitally excluded go, and how do they communicate? Almost take yourself back to 1984, a world without the Internet, with payphones and, you know, paper leaflets. How did we all communicate? How did information get around? You almost have to take yourself back to that mindset.
Helen Patterson: This is something the Office for National Statistics had to do when it was doing the digital-first census, and it really did a lot of community mapping to get out there into the communities.
Participant C: And it's people too. I remember the days I filled the room with pieces of technology and brought in, you know, kids from [the age of around] eight or nine up into the early to mid-teens, and we went around and played with them, and then I asked them, ‘would you touch this?’ They wouldn't touch anything to do with fingerprinting; nothing at all. This was in 2000 or so. And yet when I showed a scanner, a girl who carried her life around in a plastic bag, which she never let more than 18 inches from her body, was like, ‘would that mean that I could have it safe somewhere in digital?’ But yeah, you've got to go back to basics and expose what it is that you can do, even the very tool that you're going to be using, frankly. With the carers, I just basically got everyone playing ‘The Sims’, but the adults had to play the game and the kids told them how to do it, because the adults were all sort of, ‘I don't do computer games.’ You've got to really be quite creative in breaking through – not just identifying where the communities are, but, once you get the people in the room, breaking through the preconceptions and the way in which they are going to just talk about it because they're being told their views are being listened to.
Participant B: I also think not stereotyping. I think we often make broad assumptions about whole communities. I hear the phrase ‘older people don't like technology.’ What a load of rubbish that is. Don't stereotype a whole demographic. The experience of three people in a coproduction group is not the truth for every person in that community. So something like technology is not a universal truth that maps to EDI nicely. I think there have to be some basic questions of, ‘what are your barriers to using this?’ and you'll find out things that are nothing to do with technology but maybe to do with deprivation or other factors that you haven't considered.
Participant C: Why assume that kids use mobile phones? There was one mobile phone and five kids were using it [to play ‘The Sims’]. This is a few years back. Five kids were using one mobile phone, or one mobile phone's worth of batteries – they just had their own SIM cards. And that totally changed how you would go about designing a thing.
Catherine Davies: I think we're about to be transported back.
Helen Patterson: Thank you.