5Rights Foundation – written evidence (DAD0082)

 

About 5Rights Foundation

The digital world was imagined as one in which all users would be equal, yet one third of internet users are children. Nearly one billion children are therefore growing up in an environment that systematically fails to recognise their age and the protections, privileges, legal frameworks and rights that together constitute the concept of childhood.

 

Working closely with children, we operate in the engine room of the digital world: supporting enforceable regulation and international agreements, developing technical standards and protocols, and helping businesses re-imagine the design of their digital services.

 

We fight for systemic change to ensure that the digital world caters for children and young people by design and by default, so they can navigate it creatively, knowledgeably and fearlessly.

 

5Rights’ submission focuses on the place of children in this important discussion.

 

Overview

We want to congratulate the Committee for recognising the need to consider the effects of technology on democracy, public discourse, and the public square. We share the Committee’s optimism about the potential benefits that the digital environment can bring both to the individual and to society more broadly. But we also share the Committee’s concern that digital technology can frustrate the democratic process, too, and undermine the participation and information rights of individuals and communities, on which the democratic process relies – including those of children.

A fifth of all internet users in the UK, and a third of users around the world, are children. Despite this, children are consistently overlooked both in the design of digital technologies and in consideration of their impact. Where children are included in the conversation, technologists and policy-makers focus narrowly on child protection, to the exclusion of children’s participation rights, their childhood development needs and their welfare more broadly.

 

This is seen most clearly where discussions of the digital environment intersect with discussion of democracy. In all the attention currently given to electoral interference, disinformation, polarisation, fraud, data protection, end-to-end encryption, algorithmic bias, and online targeting, children have remained largely absent from the conversation. 

 

However, these issues actually affect children disproportionately, given both their developmental vulnerabilities and the fact that children’s status as ‘early adopters’ of emerging technologies makes them ‘the canaries in the coal mine for threats to all.’[1] Children consume far more of their news from social media than adults do.[2] They are more vulnerable to the risks associated with profiling and behavioural advertising.[3] They are more likely to be affected by internet-related health threats, such as the growing anti-vax movement and gaming addiction.[4] They stand to be disproportionately affected by the move to encryption of online services, as we discuss in our response to Question 6. And, of course, they grow up with the consequences of democratic decisions in which they are given no part.

 

Article 3 of the UN Convention on the Rights of the Child (UNCRC) states that: ‘In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration.’ Article 12 requires that ‘States Parties shall assure to the child who is capable of forming his or her own views the right to express those views freely in all matters affecting the child’.

 

We recommend, therefore, that in each of its recommendations the Committee properly accounts for the rights, needs and specific views of children and young people.

 

Answers to Questions 3, 5 & 6

 

QUESTION 3: What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?

5Rights’ 2019 report Towards an Internet Safety Strategy identifies education as one of the seven pillars of a framework for promoting children’s welfare in the digital environment.[5] Education is not a replacement for a digital world that recognises childhood by design and default, but it is vital that children have the necessary skills and an understanding of the digital systems that shape their life experiences and life outcomes.

 

5Rights has worked for several years directly with children and young people - from deliberative juries to co-creation workshops - and over time has developed a series of ‘Data Literacy Workshops’. Crucially, the workshops, which focus on Data, AI Systems, Persuasion, and Consent, treat tech companies as neutral players. For the Committee’s interest, a selection of statements made by workshop participants can be found at the following link: https://5rightsfoundation.com/images/5Rights Data Literacy Quotes.pdf. 5Rights has also co-created a newspaper – The Digital Times – with the young people involved in the workshops, which we will share with the Committee shortly.

 

We would like to draw the Committee’s attention to the fact that children have strong views on privacy, transparency, fair terms and harmful content, and that both our work and the wider academic consensus strongly support education that departs from the traditional, narrow focus on safety and teaches children about the motivations, mechanisms and technologies underlying their digital experience.

 

In 2018, 5Rights, in partnership with BT, surveyed the education programmes of 73 providers of digital literacy. Only one mentioned the commercial drivers that dictate the design of the services children use online. We also note that there is both parental and political concern around the world about children’s data protection in the age of edtech. The irony of being taught digital literacy whilst having your data gathered and commercially exploited will not be lost on the Committee.

 

As we explain in Towards an Internet Safety Strategy, ‘There is considerable evidence that children want a very different approach to education; focused on the purposes of technology, and offering social and critical skills.’[6] A more holistic approach to digital and data literacy helps equip children to identify, challenge and overcome the obstacles to responsible online participation, and allows them to advocate for their own digital experience and, as a result, contribute positively to the democratic process and debate.

 

Education and awareness must also extend beyond children.

 

We recommend that, to be a healthy, active, digitally literate democracy, children must understand the purposes of the technology they use, have a critical understanding of the content it delivers, have the skills and competencies to participate creatively, and have a reasonable, age-appropriate understanding of potential outcomes, including – but not narrowly focused on – harms.

 

We recommend that those providing digital literacy in schools be mandated to follow a broad, agreed curriculum, and be required to deliver those services in a manner that protects children’s personal, educational and other data.

 

QUESTION 5: What effect does targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?

The question of targeting is of profound importance to democracy, and the Committee should note that children have repeatedly been found to be more susceptible and vulnerable to online targeting of all kinds.[7] Regulation should reflect this, but it should also ensure that the same ethical controls are applied to digital advertising as to advertising in other settings. We support others who have made the detailed case that political advertising online should be subject to the same level of accountability as political advertising offline – including being clear about the financial and editorial source.

 

On the question of children specifically, 5Rights has previously made the following points:

 

In summary, emerging technologies provide more opportunity for children’s data to be collected for the purposes of targeting, more opportunity for children to be subject to targeting, and more opportunity for children to be impacted by targeting. This impact could, if ethical rules are established and upheld, be positive – but so long as targeting is so significantly motivated by commercial considerations, and remains unregulated, then negative impacts are likely to be amplified. Societal norms and expectations should be clearly stated, mandated and enforced in the digital environment – particularly where children are concerned.

 

* 5Rights is currently producing a paper on the impact of targeted advertising on children and would be happy to share it with the Committee once it is published.

 

* 5Rights is currently working with the UN Committee on the Rights of the Child on the creation of a General Comment on children’s rights in relation to the digital environment. We would be happy to share a draft with the Committee when it goes out for consultation in the New Year.

 

We recommend that children be recognised, in law, as a group in need of greater protection from the impact of targeting online. Specifically, children should not be subject to behavioural advertising and associated profiling.

 

We recommend mandated transparency and accountability reporting within both the commercial and political advertising sector.

 

QUESTION 6: To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?

The growing trend of internet companies implementing end-to-end encryption across their services presents a challenge not just to the democratic process, but to wider efforts to promote the welfare of individuals, communities, and society. 5Rights is a pro-privacy organisation and recognises the societal value of end-to-end encryption to both individuals and specific user groups. However, the current adoption of end-to-end encryption raises several unanswered questions and unexamined consequences.

 

Specifically, we would like the Committee to ask the relevant stakeholders:

 

  1. What information are providers of encrypted services able to access - including by automated means - while messages are on devices, i.e. before they are sent and once they are opened on receipt?

 

  2. In light of ostensible concerns about the potential for child protection tools to be re-appropriated for other purposes, are there any known instances in which PhotoDNA has been compromised or exploited?

 

  3. What proportion and number of the known child sexual abuse images currently being detected will no longer be detected once end-to-end encryption is implemented?

 

  4. What plans does Facebook Messenger have to replace its existing tools for detecting child sexual abuse material?

 

  5. Why have companies such as Facebook not committed to implementing PhotoDNA at the client (device) side, rather than the server side, as a means of continuing to scan for CSAM after the move to end-to-end encryption? (The co-inventor of PhotoDNA, Professor Hany Farid, has repeatedly stated that this is entirely possible.)

 

  6. Is mixing up the specific issue of tackling CSAM with wider issues of security and crime a cynical attempt to make more controversial invasions of privacy seem acceptable?

 

We note the recent position taken by HMG on this issue and, whilst we are partly in agreement, we are deeply concerned that the issue of CSAM has been rolled into a broader political fight in which children will be the losers. 5Rights’ concern is that the push for, and fight against, end-to-end encryption is a commercial-versus-state battle for information, and that neither party is taking full account of the threat this represents to children.

 

In summary, whilst perhaps not within the Committee’s immediate focus, it is possible that, in the name of the democratic process, end-to-end encryption will devastate the global effort to disrupt the distribution of child sexual abuse material (CSAM) online. 5Rights has produced a briefing on this issue with Professor Hany Farid, one of the inventors of PhotoDNA, which can be found here: https://5rightsfoundation.com/uploads/5Rights E2E encryption & CSAM briefing.pdf.

 

We recommend that the Committee question the Government, relevant internet companies, and experts in the field along the lines we have outlined.

 

We recommend a commitment from all internet companies that the encryption of their services, offered by default or otherwise, will be implemented in a way that allows PhotoDNA and other technologies for disrupting the distribution of CSAM to operate. This commitment should be mandated by national governments and international institutions, and disentangled from wider security and criminal concerns.

 

For more evidence or information, please contact Jay Harman on 020 7502 3818 or jay@5Rightsfoundation.com



 


[1] Rethinking the rights of children for the digital age, Prof. Sonia Livingstone, LSE, March 2019

[2] Children and Parents: Media Use and Attitudes Report, Ofcom, 29 November 2017

[3] For example, children of all ages have been shown to consider information that appears in advertisements to be trustworthy, and to be unable to identify certain types of paid-for content - pp. 7, 33, 38, Children’s Media Lives, Ofcom, 29 November 2017

[4] WHO recognises Gaming Disorder as a mental health condition, Mental Health Today, June 2018

[5] Towards an Internet Safety Strategy, 5Rights, January 2019

[6] Towards an Internet Safety Strategy, 5Rights, January 2019

[7] E.g. see Study on the impact of marketing through social media, online games and mobile applications on children’s behaviour, European Commission, March 2016

[8] https://www.theguardian.com/society/2018/sep/16/councils-use-377000-peoples-data-in-efforts-to-predict-child-abuse

[9] https://www.rcpch.ac.uk/resources/personal-child-health-record-pchr

[10] Children and Parents: Media Use and Attitudes Report, Ofcom, November 2017

[11] Recognition of advertising: online marketing to children under 12, Advertising Standards Authority, 2017

[12] Oral and written evidence, UK Advertising in a Digital Age, House of Lords Select Committee on Communications, April 2018