Written evidence submitted by Dr Nejra van Zalk

 

 

Response to the Digital, Culture, Media and Sport Sub-Committee on Online Harms and Disinformation, June 2020

 

Dr Nejra van Zalk, Head of the Design Psychology Lab and Lead of the Human Behaviour and Experience network, Dyson School of Design Engineering, Imperial College London

 

  1. Young children are at increasing risk of encountering harmful online content, including deliberate disinformation. Although the Internet was not designed with children and young people (CYP) in mind, it is increasingly being used by ever-younger children, with children as young as 5 having access to online content according to a recent report. Worryingly, exploiting vulnerabilities in the human psyche has been a common feature of the design process for many digital innovations, including social media platforms: addictive features, and the spread of harmful or factually inaccurate content, are often there by design rather than by accident so as to increase usage. This is particularly concerning as CYP are spending increasing amounts of time online due to the Covid-19 pandemic.

 

  2. The Information Commissioner's Office (ICO) recently released its Age Appropriate Design Code (currently subject to parliamentary approval), intended as a design guide for companies whose content is likely to be accessed by CYP. Together with Ali Shah, Head of Technology at the ICO, I have road-tested this Code in my Design Psychology module at the Dyson School of Design Engineering. My interest is in understanding how to mental health-proof digital technologies and innovations before they are released for wider use, so as to avoid negative effects on mental health; the Code places the onus on companies to ensure that CYP accessing their content do not come to harm.

 

  3. During this road-testing, students created simple browser add-ons that filter out inappropriate material when accessed by CYP, or proposed intuitive digital interventions, focused on teaching digital privacy to parents and their children, that could be built into phone apps. Importantly, these interventions were designed in a developmentally sensitive way: rather than removing CYP's autonomy, they aimed at raising awareness and educating CYP about the importance of staying safe online. I believe the proposed Age Appropriate Design Code is more important than ever during the current pandemic, as it will require companies to focus on age-appropriate design of their services, placing the responsibility on them rather than on CYP, many of whom are still developing their autonomy and decision-making skills.

 

  4. This exercise demonstrated that the oft-repeated claim by large tech companies that such regulations would inhibit growth and creativity does not hold true. Rather, it seems that companies are more interested in clinging to the current attention-economy business model, which relies on capturing users' attention in order to create a more efficient advertising space. Covid-19 has underlined the threat of online disinformation, particularly for vulnerable users such as CYP. We need fundamental reform to tackle disinformation and other harms online.

 

  5. I have argued that offering social platforms (or even web browsers) as paid services would be one way to implement safety features and good design while promoting healthy social development. Holding companies accountable to the Age Appropriate Design Code will also be an important way forward, as will an increased focus on industry working closely with behavioural and social scientists to conduct rigorous scientific testing and to treat technological applications as planned behavioural interventions.