Written Evidence Submitted by Professor Rosalind Edwards, Professor Val Gillies, Dr Sarah Gorin, Dr Hélène Vannier Ducasse

(GAI0035)

 

Executive Summary:

In response to this call for evidence on the Governance of Artificial Intelligence by Parliament's Science and Technology Committee, we provide evidence and policy recommendations.

Based on our ESRC-funded project Parental Social Licence for Data Linkage for Service Intervention, this response brings specific evidence regarding the social legitimacy of, and trust in, the linkage and analysis of families' administrative data as a basis for public service intervention.

Recommendations:

To guarantee the fair, socially legitimate, and trusted use of data linkage and analytics of personal administrative data for public service delivery, we call for:

            See Implication 2.

            See Implication 3.

            See Implication 4.

About the research project:

The Parental social licence for data linkage for service intervention research project is providing a comprehensive understanding of the social legitimacy of and trust in operational data linkage and analytics as a basis for intervention in family lives.

The research fills a vital gap in knowledge about the dynamics of social legitimacy and trust among parents of dependent children, in a context where policy developments and data processing practices to inform services interventions may be moving ahead of public knowledge and consent.

The project is funded by the Economic and Social Research Council under grant number ES/T001623/1.

Response authors:

Rosalind Edwards, Professor of Sociology at the University of Southampton, has researched and published extensively on family issues, services, and policies.

Val Gillies, Professor of Social Policy and Criminology at the University of Westminster, researches in the area of family, social class, marginalised children and young people, and historical comparative analysis.

Sarah Gorin, Assistant Professor at the University of Warwick, has extensive research experience in the field of children’s social care.

Hélène Vannier Ducasse, Senior Research Fellow at the University of Southampton, researches the political representation of disadvantaged people and the unequal impact of social policies. 

Citation:

Edwards, R., Gillies, V., Gorin, S. and Vannier Ducasse, H. (2022). A Response to the Parliament’s call for evidence on the Governance of Artificial Intelligence (AI).

DOI: 10.5258/SOTON/PP0027

Relevant Sustainable Development Goals:

This evidence addresses the following SDGs:

It calls for the most vulnerable to have ownership and control over new technology (target 1.4), to build appropriate and equal access to services and social protection systems (target 1.3), thereby adopting pro-poor development and poverty eradication strategies (target 1.b).

It calls for the social and political inclusion of all (target 10.2) and for combating discriminatory laws, policies and practices, towards reducing inequalities of outcome (target 10.3) and progressively achieving greater equality (target 10.4).

It calls for ensuring public access to information and protecting fundamental freedoms (target 16.10), and for responsive, inclusive, and participatory decision-making (target 16.7), towards enforcing non-discriminatory laws and policies (target 16.b) and building accountable and transparent institutions (target 16.6). Thus, it strengthens the rule of law and equal access to justice for all (target 16.3).

Research findings:

Finding 1.      The legitimacy of automatically collecting, linking and analysing families' administrative data is low among parents across the UK[1], with only 11% approving of it. Yet 67% would find it acceptable in principle for specific ends.

Such acceptability is, however, withheld when it comes to trusting specific public services to carry this out: only around half of parents overall trust local councils' education, children's or early-years services, the police, or criminal justice agencies to do so.

Finding 2.      The views of more vulnerable parents are particularly important given the lower levels of social legitimacy and trust they accord to such data processing. This is particularly the case for lone parents, younger parents, larger families, and Black parents. These groups are also the most likely to require public services and to be identified for early service interventions.


Table 1 - Trust in organisations to join together administrative records to identify families to target public services across the UK

Finding 3.      While parents are the main stakeholders, they cannot fully assess the legitimacy of actual administrative data processes. Most consider that they do not know how their records are being used (82%). Only around half know that digital records from different sources can be linked together (53%). Yet most think that Government should publicise how it uses families' data (81%).

However, there is no accessible register, nor any straightforward process, for finding out about administrative data processing. Responses to our Freedom of Information requests to 220 UK local authorities were incomplete, irrelevant, or ambiguous about their use of operational data linkage and predictive analytics.[2]

Finding 4.      Parents across the UK consider that they should be asked for permission before their family's administrative records are processed: 60% overall and 66% among parents from marginalised groups.

Yet they criticise their lack of consultation in the process, and how little meaning the consent they provided carried, when it was requested at all. Some parents[3] testify:

“I think that this whole sharing of information and data is fundamentally wrong without consent because it’s my data. It belongs to me, this information. It is not information that should be sold or shared with any other agency without my consent.” (Mother)

“I would like to have been asked but I would’ve agreed, because I think I was more scared about not agreeing... you’re in that goldfish bowl.  So yeah, I wouldn’t have said no.” (Mother)

Finding 5.      For service-user parents, identifying families that might need help is not the main challenge. The social legitimacy of using personal data to do so therefore depends heavily on guaranteeing the resources for, and availability of, services:

“They're only offering short-term interventions or haven't got any space. So what's going to come with that information… You can have all the information you want in the world, but if it's not backed up by a really holistic service that's fully funded, then you could be opening wounds or creating issues for people.” (Mother)

“It does surprise me… People are asking already for help, but they don’t meet thresholds, they don’t meet criteria. So how are they going to make it better sharing information?” (Mother)

“That astonishes me, I'd be absolutely astonished and flabbergasted if that ever happened because the families I know are not supported in any way… I don't think there's capacity to be reaching out to families and doing preventative work with them… That's not what I suspect what gathering the data's really about… I just don't buy that.” (Mother)

Finding 6.      Parents fear a lack of reasonable human agency in interpreting data, and criticise the weight given to records in mediating services, which prevents them from being seen as they are. As such, they worry about the future consequences of data records, linkage and analysis.

“They maybe sort of rush in with sharing too much information too soon or more than is necessary… Too much information can be bad because there's only so much information someone can read before… your brain just automatically decides what type of a person they are… People believe that if it's down on paper, on a document, on a computer, it must be true.” (Mother)

“People make up their minds based on what they've read from other places [and] based on my past rather than speaking to me about what my present is like… People [don’t] have the chance to tell their story as it stands there and then. [They are] being labelled as something they used to be… and sometimes that information could be incorrect… It’ll stand as true forever and I just don't think that's fair.” (Mother)

Finding 7.      In such a context, parents worry about the damaging potential of data linkage and analysis to identify families that might need support. Vulnerable groups are particularly concerned that it will lead to discrimination (57% of Black parents, 52% of lone parents, 60% of younger parents), and to putting families off accessing services (62% of Black parents, 52% of lone parents, 54% of parents in larger families).

Service-user parents see the potential for errors, labelling and families being put off services resulting from further data processing as a double-edged sword, if not a total ‘no go’:

“Individuals might be victims of circumstances… Is that going to be reflected? Then you’re taking raw data and creating a meaning without having all the information to hand, which potentially, could be quite dangerous...” (Father)

“I can imagine if people found out: ‘oh there’s a system deciding whether you’re going to get some involvement’, it would make things ten times worse because you've not just got to worry about people, you’ve also now got to worry about a system judging you.” (Mother)

“I think all situations are different and it’s quite dangerous to use someone’s situation and predict someone else’s because you could be right, but you could be massively wrong… I wouldn’t want [my data] being shared to try and predict for other people...  I think the risks of it far outweigh the benefits.” (Father)

“Well, you're being labelled and categorized… it might not be accurate [but it] travels with you through all your dealings... And I don’t know what that signifies [or] implies for the future but I think you'd have to be very confident your information was accurate if you're going to use it like that.” (Mother)

Finding 8.      Parents’ lack of support for data processing is directly affected by previous experience of highly inaccurate data handling. There was a consensus that administrative data may not be accurate, particularly among more vulnerable parents (up to 79% among Black parents).

Interviewees reported many experiences of inaccurate recording, concerning children’s surnames, addresses and ethnicity, but also health information, portrayals of personality, and important life events. They criticised digital data’s potential to decontextualise information, with potentially damaging consequences:

“In one report, she put that my [child] was outside with no shoes on and a coat undone… but failed to put that he was on a trampoline… just bouncing up and down... So, of course people reading were seeing that this X-year-old was outside with no shoes and no coat on in the middle of October… So, that was one incident where they’d taken bits out, and that’s the problem with the computer.” (Mother)

Finding 9.      The lack of data minimisation – the principle that data must be adequate, relevant, and limited to what is necessary – was widely criticised. Concerns focused on the use of inappropriate language, high levels of personal judgement and unprofessional comments, but also the amount of old, irrelevant and intrusive data dug up, notably regarding parents’ pasts.

Interviewees also reported breaches of confidentiality, with personal information disclosed to a violent ex-partner or passed on to an adoptive family without the parent being told.

Great distrust in the handling of administrative data followed, with parents worrying that wider sharing and data processing would lead to further problems of confidentiality, decontextualisation and irrelevance of the data.

Finding 10.  Public authorities failed to handle properly the 6 Subject Access Requests (SARs) made by five interviewees to the police, children’s social care, and education services. One parent was met with silence; the others, despite relying on advocates or the court, obtained only incomplete access and experienced long delays (from 9 months to still ‘ongoing’ after 2 years). The ICO has called out many public authorities for failing to comply with SARs, following similar complaints of dismissal, withheld data and late responses.[4]

For service users, access to their information is important to gain a fuller picture of the decisions made about them, but also to check and correct their data, and to hold agencies accountable for mishandling it. By accessing school data, for example, one family was able to respond in court to social services’ allegations.

Service users’ rights to have their data updated or rectified do not appear to be respected either, leading to greater anxiety in the process and a lack of trust in professionals:

“I had to [deal] with social services, there was actually like false information placed on there, I told them about, to the point now where at every meeting I have to get in an advocate…” (Mother)

“I know that can't be amended once it's had something entered into it … I feel that my name's been cleared [from previous allegations] but in terms of data, that stuff is on our record… it's going to be there forever probably.” (Mother)

 


Implications:

Implication 1.     For administrative data linkage and analysis to become socially legitimate, it is paramount that policymakers account for the conditions of their acceptability and address the reasons behind parents’ lack of trust in public institutions, especially among more vulnerable groups and service users. Pursuing the processing of administrative data would otherwise exacerbate the gap and social polarisation in acceptability and trust, with serious consequences for a cohesive and equal society.

       See Finding 1 and Finding 2.

Implication 2.     The linkage and analysis of data to identify families that might need support must be considered with particular care, keeping in sight not only good intentions but families' real experiences.

Parents already experience the greater authority given to written records, which they feel are too easily taken as the truth, while their lived experiences and needs go unheard.[5] This seriously undermines the premise, as set out in the UK GDPR, that ‘human control’ is what prevents decision-making from being fully automated.

The potential for labelling that such uses of data involve, all the more so when the data are felt to be highly inaccurate and irrelevant, makes parents’ demand for guaranteed resources and services paramount if such systems are to avoid systematising discrimination rather than fighting social inequalities. Despite good intentions, parents fear such data processing may come close to what the EU AI Act defines as ‘social scoring’.

In a context of ever-increasing demand on public services and reduced resources, exacerbated by austerity, the COVID-19 pandemic, a cost-of-living crisis and rising child poverty, data linkage and algorithms cannot be used as substitutes for the provision of quality social policies and services.[6]

       See Finding 5, Finding 6 and Finding 7.

Implication 3.     Under the GDPR, consent is not the only lawful basis for processing personal data, and public authorities are likely to rely on alternatives. In line with the EU AI Act, it would nonetheless be essential to reinforce access to information and to guarantee the legal and practical conditions for parents to provide meaningful consent to the use of their personal data.

       See Finding 3 and Finding 4.

Implication 4.     The UK GDPR must be consolidated to reinforce public authorities’ compliance with its principles, which they may too easily disregard or be exempted from. Guaranteeing parents’ rights to access and rectify their data, clarifying purpose and storage limitations, and ensuring data minimisation and confidentiality are essential to achieving social legitimacy and trust in the processing of administrative data. Until public authorities provide data subjects with the dignity and humanity of a service that promotes trust, there is a significant question for policymakers to answer about whether it is responsible to continue to collect, link and analyse administrative data.

       See Finding 3, Finding 8 and Finding 9.

 

(November 2022)

 


[1] The statistics provided in this evidence are drawn from a representative probability-based survey of 843 parents of dependent children across the UK. It investigated the conditions of social legitimacy and the bases for trust in processing families’ administrative data.

[2] See http://generic.wordpress.soton.ac.uk/parentdata/wp-content/uploads/sites/394/2021/10/FoI-Responses-working-paper.pdf

[3] The quotes included in this evidence are extracted from 9 focus groups held with parents of dependent children, and 23 interviews held with parents receiving some form of family support service across the UK.

[4] Data sharing: MoD and Home Office ignored people's data requests - ICO - BBC News

[5] Such finding is also supported in: https://www.adalovelaceinstitute.org/report/knotted-pipeline-health-data-inequalities/

[6] Edwards, R., Gillies, V., & Gorin, S. (2022). Problem-solving for problem-solving: Data analytics to identify families for service intervention. Critical Social Policy, 42(2), 265–284.