CIE0307

Written evidence submitted by Defenddigitalme

 

Written submission for the Education Select Committee from defenddigitalme relating to the impact of Covid-19 on digital standards and rights in the education environment

As an estimated 90% of the world's student population is affected by school closures[i] in the COVID-19 pandemic, technology is playing a vital role worldwide. Some tools enable the delivery of essential information, connecting school communities outside the classroom. Others provide national platforms[ii] for sharing educational materials, or offer alternative means and modes of Assistive Technology and augmented communications, supporting the rights of those with disabilities.[iii] However, the rushed adoption of technology around the world to deliver emergency remote instruction risks undermining learners' and children's rights at an unprecedented speed and scale. We urge organisations and education authorities to procure and recommend only those technologies which openly demonstrate that they uphold children's rights, and call on States Parties to offer a secure online space for all children to access knowledge without commercial interference.

It is unknown whether the UK government has undertaken the necessary contingency planning to ensure the resilience of systems, or has made any attempt to assess the cost or effects of their use on disadvantaged groups, with a view to preparedness for future lockdowns and a return to extended periods of remote learning.

In the UK, seven COVID-19 case studies demonstrate in summary the failures of both government and companies to meet high standards in addressing their lawful obligations towards children's rights:

  1. Lawful bases for personal data processing have been abandoned: the Welsh Government decided that schools will no longer rely on consent and will instead provide these additional services as part of their "public task" to adopt edTech tools; "a change the government has been considering making […] for some time, but has brought forward the decision." But Google Inc. cannot rely on the public task basis.
  2. Highly invasive data processing of sensitive (special category) data without clear basis in law: University Technical College Leeds' installation of a high-spec thermal camera using facial recognition technology to enable hidden thermal imaging and temperature measurements of students and staff.[iv]
  3. Predictive analytics using machine learning: Local Authorities are under pressure to use these tools, yet their "system errors, unreliable performance and lurking biases may have life and death consequences."[v]
  4. Pupils' Internet surveillance without clear basis in law: surveillance that extends beyond the school gates into the home and personal time, both through schools' safeguarding provision and directly by commercial companies (safeguarding-in-schools software), has expanded at scale since before COVID-19, because many more children spend much more time out of the classroom and are using school devices at home, are logged on to the school network using personal devices, or may have school software clients installed on personal devices. Most families are not aware of this data processing and its implications.
  5. Due diligence failure in procurement: the Department for Education's decision to award exclusive funding to schools to purchase either Google G-Suite for Education or Microsoft Office 365 Education.[vi] Further commercial apps have benefited from rapid expansion, becoming embedded in common use at speed and scale.
  6. A need to level up legislation to protect the privacy rights of the child across the UK: biometric data are processed routinely in schools by commercial products, yet the Protection of Freedoms Act 2012 applies only to schools in England. In other countries the use of biometrics is not permitted for children, and fingerprint use in canteens has been found unlawful under the GDPR. Furthermore, we made recommendations[vii] for a precautionary approach in COVID-19 to restrict the use of fingerprint platens in school canteens and libraries; this has not been taken up by the Department for Education.
  7. Accessibility standards: divides that existed before COVID-19 are exacerbated, compared with previous homework practices, by the amount of time a child is now expected to use a digital tool; not only in terms of infrastructure (hardware, software and broadband access) but also in design, e.g. for pupils with dyslexia.
     

Questions of risk and children's rights in the digital environment
 

Efficacy: To date, the hype of edTech achievement in the classroom far outweighs the evidence of delivery. Neil Selwyn, Professor in the Faculty of Education, Monash University, Australia, writing in the Impact magazine of the (UK-based) Chartered College in January 2019, summed up:

"the impacts of technology use on teaching and learning remain uncertain. Andreas Schleicher, the OECD's director of education, caused some upset in 2015 when suggesting that ICT has negligible impact on classrooms. Yet he was simply voicing what many teachers have long known: good technology use in education is very tricky to pin down."[viii]

 

Yet there are few standards in education, whether in pedagogy, efficacy or safety, that edTech is required to meet before adoption in UK schools, creating risks for individuals, for communities, and for cost and State infrastructure.

 

Procurement: The golden hour of freeware and reduced-cost offers from edTech companies in COVID-19 will soon end; many expire in September, and schools will need to decide, with poor evidence, what is worth keeping, whether they can afford to run the necessary software and hardware, and what the costs of ending use would be.

The public purse should not be exploited for poor technology, be subject to price gouging[ix] or poor practice that results in a waste of public funds, such as the over £1 million[x] spent on West Cheshire College's experiment with RFID chips in 2013. And security and ownership of critical infrastructure systems are crucial to get right. Procurement is the pathway for global giants in private equity, including those that buy out UK companies[xi], and multinational owners to access the UK school system and children's lives. With around one third of the global funding of edTech coming from China and half from the U.S., as Neil Selwyn wrote in Techlash last month, "in terms of digital education, there needs to be sustained scrutiny of the emergency actions and logics that are being put into place." As Bianca Wylie wrote in the Boston Review recently, "technology procurement is thus one of the largest democratic vulnerabilities that exists today."[xii]

 

Exploitation: Emerging technology company CEOs are speaking at events daily in this crisis, for example on training AI products using children's data (likely unlawful without consent), and on using the COVID-19 crisis as an opportunity to onboard "hundreds of thousands" more children who do not understand that their data are collected every few seconds, or with every mouse-click, for the company's own use.

 

"You have to train an AI. And you use teachers and parents and schools to do that. It takes years to train an AI. […] ongoing every second of every single day you're taking teacher, parent, and student interactions, taking their feedback, you're A/B testing. AI doesn't learn about ethnicity by itself."[xiii]


Children are disadvantaged by the power imbalance between them and school authorities[xiv] under normal circumstances. But this imbalance is only made worse in the current situation, as some States Parties choose to impose apps, platforms and surveillance, allowing commercial companies into children's home life without consent.[xv] Our government appears to give no weight to the concerns that other states and supervisory authorities[xvi] have over these tech giants, including banning Microsoft[xvii] and filing lawsuits against Google. Others take the view that tracking student data without parental consent is "not only illegal, it is dangerous."[xviii]

 

Rights: Sustainable Development Goal 4,[xix] adopted by all United Nations Member States in 2015, provides a shared blueprint to ensure inclusive and equitable education opportunities for all, without discrimination. Article 24 of the UN Convention on the Rights of Persons with Disabilities, and the UN Convention on the Rights of the Child, ratified by 196 States Parties worldwide, already offer a robust framework for protecting children's rights that should be applied by all parties in the rapid adoption of online learning technologies during the COVID-19 crisis. Children's rights include non-discrimination (Article 2 UNCRC), that the child's best interests shall be a primary consideration in all things (Article 3 UNCRC), freedom to full development of their personhood and character (Article 6 UNCRC), rights to privacy and reputation, and protection from arbitrary or unlawful interference with family, home or correspondence (Article 16 UNCRC, Article 8 of the European Convention on Human Rights (ECHR), Article 7 CFREU), data protection (Article 8 CFREU), freedom of expression (Article 13 UNCRC, Article 10 ECHR, Article 11 CFREU) and freedom of thought (Article 14 UNCRC, Article 9 ECHR, Article 10 CFREU). And the education of the child shall be directed, according to Article 29[xx], to the development of the child's personality, talents, and mental and physical abilities to their fullest potential, with respect for human rights, fundamental freedoms and principles. Children are also entitled to protection from economic exploitation under UNCRC Art. 32.[xxi]

Advancing children's critical rights under COVID-19

Few of these human and child rights are protected in practice when it comes to applied technology standards in education in England and Wales. In Scotland, some safeguards are in place through a Local Authority due diligence model; Dundee, for example, has a list of apps and platforms approved or declined for failures of efficacy or legal compliance[xxii], some of which nonetheless continue to be used in schools in England.

 

Data Protection law and the GDPR are "not an obstacle to distance education during the Coronavirus pandemic, it gives the possibility for schools to reasonably implement appropriate distance education methods and techniques, while at the same time respecting the basic data protection rules," said Jan Nowak, the President of Poland's Personal Data Protection Office (UODO).[xxiii]

 

In the US, the Family Educational Rights and Privacy Act (FERPA) better protects children's personal data and privacy in education. The FTC issued a statement on both remote learning and children's privacy, and COPPA Guidance for Ed Tech Companies and Schools during the Coronavirus.[xxiv]

 

The Slovenian DPA (IP)[xxv] issued a non-binding opinion regarding the processing of personal data of teachers and pupils when new technologies are used in order to offer or participate in a lesson. Data controllers (i.e. schools) should seek an adequate legal basis and pay attention in particular to their information obligation, the security of personal data, possible data transfers abroad and the principle of data minimisation.
 

Policy makers in the UK:

    should consider the impacts of the current use of e-learning, and conduct and publish impact assessments on children's rights, equality, and data protection as part of due diligence in its adoption.

    must adopt only products that adhere to the obligations to respect, protect and fulfil the rights of the child in the digital environment[xxvi] and UN General Comment No. 16 (2013)[xxvii] regarding the business sector's impact on children's rights, both for domestic product promotion and for those the UK promotes for export[xxviii].

    must make and publish transparently any decisions about new national-level product or service adoptions, including due diligence and funding commitments, and should commit to reviewing practices and their impacts, with civil society (including the most affected and marginalised communities) and the Supervisory Authorities (including the ICO), once the emergency situation has ended.

 

However, as the complexity of computer design and algorithms grows, it is increasingly difficult for a school without the necessary expertise to understand how many of these tools work. Researchers at the Oxford University Department of Computer Science revealed in 2018 the extent of hidden trackers, in an assessment of nearly one million apps. Education apps were some of the worst offenders. If developers, by borrowing the building blocks of code to create some of their apps' functionality, might not even understand the full extent of their apps' place in the ecosystem,[xxix] how can teachers be expected to understand how they work and explain it to families? Or really check their efficacy beyond what the company marketing tells them? All these elements create a need for expert due diligence and support in educational technology procurement. Its cost-effectiveness may mean it is only practical in a shared model at regional level, but without it we cannot hope to realise a rights-respecting model of edTech use across education in England.


Recommendations for development of standards in the digital education environment

The investigative burden in schools at the moment is too great for them to be able to understand some products, do adequate risk assessment, retrieve the information required to provide to the data subjects, and meet and uphold users' rights. School staff commonly accept using a product without understanding its full functionality, and fail to inform children of their rights, the lifelong implications of using the product, and what routes of redress exist if things go wrong. We need a strong legislative framework to empower staff, to enable companies to know what is and is not permitted when processing children's data from education, and to enable a trustworthy environment fit for the future, so families can send their children safely to school.

 

Emerging technologies are increasingly invasive, including elements of social and emotional data, behavioural science and nudge techniques, neuroscience, personalisation, facial recognition and gait analysis, and affective tech[xxx]; these should not be trialled on children in schools. Any research studies should require ethical oversight and opt-in consent under research conditions. Some of these challenges are being addressed by upcoming guidance to be issued by the Committee on Convention 108 on data protection in education.[xxxi]

Researchers at LSE documented in 2019 how children care about their privacy online and want to be able to decide what information is shared and with whom,[xxxii] but also that teachers are unclear what happens to children's data, and that there is common misunderstanding of how much data leaves a school:


              "The only time it does [to the government] is when we do the Year 11 data [...] Because obviously
              they'll do the tracking of different groups." (teacher, London)

 

              and, when it comes to using educational platforms, "I would've thought the fact that it's a
              school-based software, this is all been properly regulated." (teacher, London)


Legislation, statutory Codes of Practice, and enforcement action are needed to protect the full range of human rights of the child and young people in the digital environment in education. For a variety of motivations, there is rapid growth of commercial actors and emerging technologies in the $8bn global edTech market[xxxiii], propagated not only by angel investors and tech accelerators in US and UK English-language markets but across the world, and these business models are often opaque or exploitative. At the same time, under the pressures of austerity and marketisation, the infrastructure to deliver UK state education is exposed to risk via commercial freeware. There is pressure to use benchmarking data analytics, pool pupil data into data lakes, and link student data in Higher Education[xxxiv] with other government departments' data (HMRC, DWP), or use it together with health, policing and commercial data broker data for predictive risk scoring. The implications for the security and stability of the state sector education infrastructure, the costs to privacy, and the effects of normalisation may last a lifetime for this datafied generation and for the State.

 

"Children do not lose their human rights by virtue of passing through the school gates. Thus, for example, education must be provided in a way that respects the inherent dignity of the child and enables the child to express his or her views freely in accordance with article 12, para (1), and to participate in school life." (paragraph 8 of General Comment No. 1 on the aims of education, as the UN Committee on the Rights of the Child stated in 2001)[xxxv]

 

Today, those rights compete unfairly with commercial interests. Data protection law alone is weak protection for children in compulsory education. In addition, there are serious risks and implications for the cost and control of the state education system and its future infrastructure, for schools, and for families, due to routine dependence on key technologies beyond local control at national, local authority and school levels.
 

Proposed elements for legislation

 

Further enquiry: Every State that has ratified the UN Convention on the Rights of the Child (UNCRC) is required to report to the Committee on the Rights of the Child ('the Committee') on how it is fulfilling its obligations under the Convention. The UK review is next due to take place in 2021-22. Ahead of and in preparation for that, we propose the Education Select Committee hold an enquiry on Digital Rights and Standards in Education to review the current landscape of standards of edTech and e-learning platforms:[xxxvi]

 

a) Assess the current landscape of digital infrastructure in schools

   Risk to schools, state infrastructure and individuals of digital dependency on Big Tech giants (including freeware) such as Google (now under regulatory and legal scrutiny in Denmark and New Mexico): what are the current costs in the UK, what is delivered for free, and where are we dependent? What are those companies' future business intentions and their plans for future costs?

   Risks in not delivering adequate digital standards in schools (broadband access, funding, skills)

   The costs to parents of rapidly expanding parent-buy or loan schemes for iPads and Chromebooks, and of required parent purchase schemes.

   The costs to the school system and the environment of obsolete hardware (cupboards full of iPads, no right to repair, and no funds for upgrades).

 

b) Review the rights of the child in the digital environment in schools 

   Assess the scale and volume of commercialisation of delivery of education through apps and platforms and other third-party companies

   Assess current oversight procedures on third-party digital procurement infrastructures and policies

   Are the UNCRC rights of the child in the digital environment that, as Council of Europe members, we signed up to in 2016, and GDPR/UK data protection law, now met in education two years on? If not, what resources and support are still needed by schools and institutions to achieve these necessary standards at national, regional (LA/MATs) and local levels?
 

c) Recommend post-legislative scrutiny of regulations passed since the growth of digital tools (2000)

   review the types of actors, and the volume, velocity and value of DfE pupil data distribution at national level today[xxxvii], which is very different from when the primary legislation was passed (Education Act 1996)

   review the linkage across datasets from ages 2-19, including Local Authorities linking pupil data with data broker data, other commercial actors' re-use in predictive data analytics, and the effects and any unintended consequences of the Longitudinal Educational Outcomes data (created through the linkage of pupil data with HMRC and DWP records)

   assess the fitness of national datasets for policy making, as part of the UK national data strategy.

 

d) Assess the need for new education law for the digital age: an Education Rights and Privacy Act

   a vision to govern access to educational information and records by commercial companies, public bodies and other third parties, including researchers and potential employers, and to address foreign transfers and takeovers. Clarity, consistency and confidence would be improved across the education sector with a firm framework and oversight. Even with regional differences due to devolution of education powers, some aspects of consistency would level up standards and rights across the UK for all

   review proposals for safe national pupil data practice (Department for Education data distribution)

   ensure inclusion of the rights of every child, including those with disabilities and special educational needs.

   address standards and children's rights in education in the digital environment in twelve areas:
 

1. Transparency (the extent of commercial products' use and its implications for children's rights, the effects of freeware on data processing practice, and dependency on commercial product delivery and providers' future business models)

2. Empowerment of families under the rule of law (equality of access, rights and routes for redress)

3. Safe data by default (National Pupil Data processing for 23 million records)

4. Accountability in public sector systems (education data linkage including predictive risk scoring)

5. Avoiding algorithmic discrimination and designing for fairness

6. The role of education data in the national data strategy

7. Accessibility, Infrastructure and Internet access

8. Horizon scanning including emerging technologies

9. Online harms

10. Privacy of communications and profiling

11. Security

12. Teacher training (there is no standard content in ITT on data and digital practice or rights).             



About defenddigitalme

defenddigitalme is a call to action to protect children's rights to privacy. We are teachers and parents who campaign for safe, fair and transparent data processing in education, in England, and beyond. We advocate for children's data and digital rights, in response to concerns about increasingly invasive uses of children's personal information. Funded by the Joseph Rowntree Reform Trust. (June 2020)



[i] UNESCO COVID-19 Educational Disruption https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures

[ii] UNESCO list of National learning platforms and tools (accessed March 28, 2020) https://web.archive.org/web/20200325181822/https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures/nationalresponses

[iii] UN Convention on the Rights of Persons with Disabilities (UNCRPD) Article 24 https://www.un.org/development/desa/disabilities/convention-on-the-rights-of-persons-with-disabilities/article-24-education.html

[iv] UTC Leeds reopens its doors to students (June 2020) Permanent copy https://web.archive.org/web/20200706095742/https://www.utcleeds.co.uk/news/2020/06/23/utc-leeds-reopens-its-doors-to-students/

[v] Predictive analytics https://www.londoncouncils.gov.uk/our-key-themes/our-projects/london-ventures/current-projects/childrens-safeguarding but responsible design standards are not mandated > https://whatworks-csc.org.uk/research-report/ethics-review-of-machine-learning-in-childrens-social-care/

[vi] DfE funding for Google / Microsoft https://www.gov.uk/government/news/schools-to-benefit-from-education-partnership-with-tech-giants

[vii] defenddigitalme COVID-19 recommendations for a precautionary approach in use of fingerprint platens https://defenddigitalme.com/wp-content/uploads/2020/03/Published-Briefing-on-Fingerprint-readers-in-schools-and-Coronavirus-COVID-19-March-9-2020.pdf

[viii] Neil Selwyn, Monash University, Australia, writing in the Impact magazine of the Chartered College (January 2019) https://impact.chartered.college/article/editorial-education-technology/

[ix] Massive Data Breaches, Billions in Wasted Funds: Who Is Holding Edtech Vendors Accountable? (2017) https://www.edsurge.com/news/2017-05-24-massive-data-breaches-billions-in-wasted-funds-who-is-holding-edtech-vendors-accountable

[x] West Cheshire College https://www.theguardian.com/technology/2013/nov/19/college-rfid-chip-tracking-pupils-invasion-privacy

[xi] China's NetDragon acquires Edmodo for $137.5M (2018) https://www.edsurge.com/news/2018-04-09-china-s-netdragon-to-acquire-edmodo-for-137-5-million — Edmodo had a breach of 77m records in 2017. Deep dive into the Edmodo data breach https://medium.com/4iqdelvedeep/deep-dive-into-the-edmodo-data-breach-f1207c415ffb

[xii] Bianca Wylie bostonreview.net/politics/bianca-wylie-no-google-yes-democracy-toronto

[xiii] Century Tech CEO at CogX June 8, 2020 https://www.youtube.com/watch?v=1OrtiBdSPDE [05:20]

[xiv] Facial recognition Sweden GDPR fine (2019) https://edpb.europa.eu/news/national-news/2019/facial-recognition-school-renders-swedens-first-gdpr-fine_en

[xv] Hwb Additional Services for every learner, 20 March 2020 https://hwb.gov.wales/news/article/76979aea-3819-42e9-9c10-121e907ef922

[xvi] Norwegian newspaper Aftenposten: The Data Inspectorate is investigating whether it is legal to use Google in school (February 2020) https://www.aftenposten.no/norge/i/pLvba6/datatilsynet-undersoeker-om-det-er-lovlig-aa-bruke-google-i-skolen

[xvii] German state bans Office 365 in schools, citing privacy concerns  July 2019 https://www.theverge.com/2019/7/15/20694797/hesse-german-state-gdpr-office-365-schools-illegal-data-protection

[xviii] Google sued by New Mexico attorney general for collecting student data through Chromebooks (Feb 2020) The Verge https://www.theverge.com/2020/2/20/21145698/google-student-privacy-lawsuit-education-schools-chromebooks-new-mexico-balderas

[xix] The SDGs work by countries and the UN https://sustainabledevelopment.un.org/post2015/transformingourworld

[xx] UNICEF aims of education https://www.unicef.org.uk/rights-respecting-schools/the-right-to-education/

[xxi] Van der Hof, S. et al (forthcoming 2020) The International Journal of Children's Rights.

[xxii] See list in footnote 39 of submission to the JCHR Select Committee Right to Privacy (Article 8) and the Digital Revolution inquiry

http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/human-rights-committee/the-right-to-privacy-article-8-and-the-digital-revolution/written/103752.html#_ftn39

[xxiii] Security of personal data during remote learning: Polish Data Protection Authority guide for schools https://uodo.gov.pl/en/553/1118

[xxiv] Federal Trade Commission COPPA Guidance for Ed Tech Companies and Schools during the Coronavirus (April 2020)

https://www.ftc.gov/news-events/blogs/business-blog/2020/04/coppa-guidance-ed-tech-companies-schools-during-coronavirus

[xxv] The Slovenian Data Protection Authority (IP) issued an opinion on the processing of personal data of minors in education in response to the pandemic https://gdprhub.eu/index.php?title=IP_-_07121-1/2020/638

[xxvi] The Council of Europe Guidelines on Children in the Digital Environment Recommendation CM/Rec(2018)7 https://rm.coe.int/guidelines-to-respect-protect-and-fulfil-the-rights-of-the-child-in-th/16808d881a

[xxvii] Committee on the Rights of the Child, General Comment No. 16 (2013) on State obligations regarding the impact of the business sector on children's rights https://www.unicef.org/csr/css/CRC_General_Comment_ENGLISH_26112013.pdf

[xxviii] DfE EdTech leadership group meeting transparency (February 2020) https://www.whatdotheyknow.com/request/edtech_dfe_edtech_leadership_gro#incoming-1537342

[xxix] Understanding Value and Design Choices Made by Android Family App Developers, (2020) (Ekambaranathan, Zhao, Van Kleek) published in Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems DOI: 10.1145/3334480

[xxx] Dr Selena Nemorin, University College London, Affective capture in digital school spaces. Emotion, Space and Society, 24. pp. 11-18. ISSN 1755-458

[xxxi] Committee of Convention 108 draft, shared in July 2020 https://www.coe.int/en/web/data-protection/data-protection-views-from-strasbourg-in-visio-1-3-july

[xxxii] Stoilova, M., Livingstone, S. and Nandagiri, R. (2019) Children's data and privacy online: Growing up in a digital age http://www.lse.ac.uk/my-privacy-uk/Assets/Documents/Childrens-data-and-privacy-online-report-for-web.pdf

[xxxiii] UNICEF, Discussion Paper Series: Children’s Rights and Business in a Digital World (p5) Privacy, Protection of Personal Information, and Reputational Rights https://www.unicef.org/csr/files/UNICEF_CRB_Digital_World_Series_PRIVACY.pdf

[xxxiv] Big Data in Education, the digital future of learning, policy and practice (Sage) Williamson, B., (2017) University of Edinburgh, Centre for Research in Digital Education and the Edinburgh Futures Institute

[xxxv] The Committee on the Rights of the Child (2001): education must be provided in a way that respects the dignity of the child https://www.ohchr.org/EN/Issues/Education/Training/Compilation/Pages/a)GeneralCommentNo1TheAimsofEducation(article29)(2001).aspx

[xxxvi] Resolution on e-learning platforms adopted by the 40th International Conference of Data Protection and Privacy Commissioners (ICDPPC) (2018) https://edps.europa.eu/sites/edp/files/publication/icdppc-40th_dewg-resolution_adopted_en_0.pdf

[xxxvii] DfE external data shares https://www.gov.uk/government/publications/dfe-external-data-shares

 

 

July 2020