Written evidence submitted by Dr Cigdem Sengul
Digital, Culture, Media and Sport Committee
Commons Select Committee
Connected tech: smart or sinister?
Dr Cigdem Sengul, Brunel University London[1]
This evidence submission covers the following questions with a particular focus on smart and connected technology at home:
Q1: What has been or will be the most important impacts of increasingly prevalent smart and connected technology in our lives, including in the home, workplace, and in our towns and cities, and are they necessarily better than current systems?
Q2: Are there any groups in society who may particularly benefit from or be vulnerable to the increasing prevalence of smart technology, such as young or elderly people, people with disabilities and people likely to be digitally excluded?
Q3: How can we incentivise or encourage design that is safe, secure, environmentally- and user-friendly and human rights compliant?
Q4: What are the key short- and long-term risks and threats, and how can we ensure the devices, systems and networks of individuals, businesses and organisations are digitally-literate and cyber secure?
Executive Summary
1.1 The adoption of smart and connected technology, or the Internet of Things (IoT), may seem widespread: on several continents, more than half of households already have at least one IoT device[2]. The technology promises to make homes “more liveable—and a lot more automated”[3] simply by adding Internet-connected appliances and devices typically controlled by smartphones. Technological developments have enabled these common Internet-connected devices, ranging from speakers to doorbells to toys, and radically enhanced the ability to collect, analyse, and disseminate information in and from our homes.
1.2 One area that brings about well-founded tech optimism is smart healthcare. Connected medical devices range from monitoring machines in hospitals and clinical settings to small wearable devices like heart rate monitors. The quantified self is growing in popularity, and there is a general appreciation of the improvements to health awareness and the convenience that smart technologies provide to their users[4]. These technologies may also have positive behavioural effects on consumers, e.g. encouraging a routine and establishing long-term patterns among users[5]. Especially relevant in the Covid-19 era, this technology has been considered for assessing the loneliness and social isolation of older adults and for helping alleviate these problems without direct interaction with other people[6],[7].
1.3 Despite this positive outlook, in a DCMS-commissioned study published in 2020, 28% of respondents said they were not planning to buy a consumer-connected device in the next twelve months due to security concerns[8]. The survey results show that “the meaning and value proposition of the smart home have not yet achieved closure for consumers” and that “anxiety about the likelihood of a security incident” influences the adoption of smart home technology[9].
1.4 We observed a similar trend in the THRIDI project[10]: in several design workshops with interdisciplinary researchers run in 2021-2022, the utility of smart appliances proved contentious, and these devices were mainly seen as marketing gimmicks produced without much consideration for interoperability, data portability, data protection and privacy[11].
1.5 This negative sentiment is backed up by numerous issues reported on the web[12],[13] and in the academic literature, e.g. a video doorbell that sends video recordings to its service provider based on movement sensors, without any notification to or consent from the recorded parties[14], or a smart speaker that activates without the wake word and cannot be stopped[15].
1.6 In summary, smart technology still needs to prove its utility to consumers, as it is rightfully seen to bring substantial privacy and security risks, which need to be addressed first.
Q2. Are there any groups in society who may particularly benefit from or be vulnerable to the increasing prevalence of smart technology, such as young or elderly people, people with disabilities and people likely to be digitally excluded?
2.1 Ambient assisted living technology has excellent potential to help elderly and frail people live independently[16]. However, according to Ofcom’s 2022 Adults’ Media Use and Attitudes Report, the group most likely not to have internet access at home continued to be those aged 75+ (26%). This percentage increased to 30% among those aged 65+ in DE households, indicating a greater risk of digital exclusion. The age effect has also been noted in a recent academic study[8]: people aged 65 and over are less willing than younger people to use smart home devices in case of unauthorised data collection.
2.2 According to Ofcom’s 2022 Adults’ Media Use and Attitudes Report, half (49%) of those who did not use the internet at home had asked someone else to do something for them online in the past year, indicating a risk to their privacy when accessing certain services.
2.3 There is also cause for concern due to the wide range of devices with internet connectivity intended for use by children or in caring for them, such as toys, learning development devices, and baby or child monitors. These devices expose children and their families to safety and security risks that are not fully understood, including risks as severe as bullying and psychological abuse[17]. A well-known example is the interactive toy doll named My Friend Cayla, which was banned in Germany because it recorded conversations and stored them unprotected on the internet[18].
2.4 It is of particular concern when smart technology amplifies a power imbalance, as may be the case for an older adult or a young child under guardianship, in the context of intimate partner violence[16], or for citizens facing law enforcement as laws change, as with abortion rights in the USA[19].
2.5 For general-purpose devices, the needs of vulnerable groups are typically not properly considered in the design and development of smart technologies, e.g. smart speakers misactivate more when exposed to unclear dialogue, thereby putting non-native English speakers, users with heavy accents, or users with lower voice volumes at additional risk of privacy exposure[20].
2.6 In summary, there are important concerns that highlight the risks of excluding certain groups from establishing trustworthy relationships with connected smart technologies. Any step taken to improve the experience of these groups would also help improve the accessibility and usability of smart and connected technology for everyone[21].
Q3. How can we incentivise or encourage design that is safe, secure, environmentally- and user-friendly and human rights compliant?
3.1. The 1973 US Department of Health, Education and Welfare report on “Records, Computers and the Rights of Citizens”[22] still effectively captures the current situation on the rights of citizens in the presence of technological advancements: “The net effect of computerisation is that it is becoming much easier for record-keeping systems to affect people than for people to affect record-keeping systems... Although there is nothing inherently unfair in trading some measure of privacy for a benefit, both parties to the exchange should participate in setting the terms.”
3.2. User training and user-friendly interfaces are crucial to ensure both parties can participate in setting the terms. However, most consumers have limited knowledge and understanding of data collection, in which vast amounts of data can end up stored in provider clouds. Consumers do not know what will be done with that data, do not grasp the full implications of consenting to its release, and do not realise that, under the new regulations, they have the power to retract or withdraw their consent[23]. In Ofcom’s 2022 Adults’ Media Use and Attitudes Report, almost eight in ten internet users (79%) said they were confident in using the internet, but a smaller proportion (59%) were confident in managing access to their personal data[24]. Consumers need a better understanding of what to look for, and better transparency and control, to have more trusted interactions with their devices.
3.3. This is especially important in the case of children. The Information Commissioner’s Office (ICO) has published design guidelines for protecting children’s privacy[25] and the “Age-appropriate design: a code of practice for online services”[26]. However, the Code must be kept up to date with a view to the future: for example, AI systems are becoming increasingly pervasive within children’s devices, apps, and services, which may merit a specific “Code for Age-Appropriate AI”[27].
3.4. More work is also needed to operationalise the current Code to ensure it is realised as intended[28]. For instance, age-appropriate privacy policies are touched upon in the Code, but recommendations and resources appear to be primarily targeted at parents, explaining to them the child’s right to privacy. Only when children become preteens (i.e. the age of consent under GDPR) are audio, video or written materials expected to be provided for the child to explain how the service works (at earlier ages, children are notified that their parents have been informed and may be tracking them).
3.5. Privacy policies should be targeted directly at the users of the product. However, recent research finds that only one of the six privacy policies reviewed has a reading age close to the target age of the product (Detective Dot, aimed at children aged 7+)[29]. Similarly, in a recent study with young learners[30], we saw that young people, like adults[31], do not read the Terms and Conditions[32]. In addition, tools like age-verification pop-ups do not seem to serve their purpose: children make their own assessment (or rely on parents) to decide the age-appropriateness of a service. Therefore, investing in directly educating children on the vulnerabilities, risks, and their privacy rights as early as possible would be more beneficial. The documents produced by groups like the 5Rights Foundation are very useful from this perspective[33].
3.6. For adult users, standardising privacy interfaces will help with the learning curve of controlling many devices. Privacy interfaces should be easy to access, intuitive and flexible enough to match the level of ability and privacy expectations of different users without requiring them to go through complex dashboards with long lists of configurable parameters. In the long term, it would be beneficial to use the technology to ensure data flows can be “physically seen”, i.e. through appropriate visualisation or other senses. Such transparency would help to identify the “gossiping” devices[34].
3.7. It is also questionable whether a device needs to connect to a cloud service or make an Internet connection to support its functions. Losing network connectivity should not prevent an appliance from carrying out its main functions. Funding research into, and incentivising the development of, trusted and accessible ways of using personal data at the user edge would help alleviate unnecessary data exposure[35].
3.8. Supporting innovation into privacy-by-(co)design with consumers, industry and academia would consequently enable more privacy-conscious products in the market, giving consumers better choices. We already see global companies like Apple take a positive privacy stance, but it is equally essential to ensure that rules apply to rule setters[36].
Q4. What are the key short- and long-term risks and threats, and how can we ensure the devices, systems and networks of individuals, businesses and organisations are digitally-literate and cyber secure?
4.1 Criminals have already used Internet of Things (IoT) search engines to find the default usernames and passwords of devices on home networks[37]. Any connected home device can therefore be a point of weakness, especially if deployed with a universal default password. The UK’s 2018 Code of Practice[38] sets out the security principles that manufacturers and other relevant industry stakeholders should uphold, and it was very promising to see the Code of Practice contribute to the 2020 European Standard on connected product security, EN 303 645[39].
4.2 It is also encouraging to see that the Product Security and Telecommunications Infrastructure Bill (PSTIB) makes the top three guidelines (“no default passwords”, “implement a vulnerability disclosure policy”, and “keep software updated”) requirements for manufacturers, importers and distributors of consumer tech devices. While it may be argued that the Bill does not go far enough[40], it is a step in the right direction[41].
4.3 Establishing good security and privacy defaults is, in general, a sound measure, as end-users may find recommendations cumbersome and fail to follow them. For example, according to the Ipsos MORI “Consumer Attitudes Towards IoT Security” report of December 2020, one in five consumers (20%) say that when purchasing a smart device they have checked whether the device has a default password that is not unique to it[42]. Consumers also typically underestimate the risks associated with using a particular technology.
4.4 Nevertheless, end-users still need the agency to operate their devices beyond secure defaults. Therefore, consumers should be offered help to set up privacy and security at the point of purchase. It is also worthwhile to ensure privacy and security policies, warnings and updates are simplified, e.g. by providing step-by-step data protection manuals for smart and connected technology (similar to a product operation and safety manual) that explain how to configure and keep devices secure.
4.5 It must be noted that the PSTIB focuses on transparency and requires device manufacturers to set out how long they will provide security updates for their products. However, this does not help with devices already in circulation, or with devices that live beyond their security update period. According to the 2020 report “Evidencing the Cost of the UK Government’s Proposed Regulatory Interventions for Consumer IoT”, “only 17% of consumers dispose of IoT products by throwing away; the remainder remains within the product lifecycle by being retained, passed onto someone else, given to charity, or resold”[43]. These non-compliant products in circulation can therefore be expected to make their owners and others vulnerable on the internet. It would help if manufacturers introduced guidelines for operating such devices at minimum risk or, in the worst case, issued product recalls.
4.6 Protecting people from devices going obsolete[44] is critical when these devices are not only embedded in our homes but in our bodies. Therefore, it may be appropriate to require specific categories of consumer-connected products to undergo an assurance process.
4.7 In addition, user training needs to go beyond manuals. While cybersecurity training may be available at schools and workplaces, outside of these environments the opportunity to develop competence with emerging technologies is limited. One of the reasons for the tech-averseness of the elderly population is unresolved concerns regarding the trustworthiness of new technology[45]. Community drop-in centres offering training courses tailored to these audiences may help avoid digital exclusion.
4.8 All the approaches mentioned above could gradually help end-users gain agency over their smart and connected devices. However, even if end-users can exercise control over their personal devices, user agency is hard to achieve when different users with different relationships share or are affected by devices (e.g., housemates/family members[46], tenants vs landlords, and neighbours[47]). Bystanders also need privacy, which calls for cooperative mechanisms to better support and balance the needs of primary end-users and bystanders[48].
4.9 With connected devices at home, consumers also enter into a dynamic relationship with the device vendor, and the negotiability of that relationship is of utmost importance. Examples of unforeseen risks are everywhere: Peloton treadmill users likely did not expect to have their devices recalled and then locked behind a four-digit passcode requiring a subscription to unlock (a decision the company later reversed)[49]. Nor did smart thermostat customers in Texas who had signed up for an energy saver programme expect that it would give power companies the ability to control their thermostats remotely; the issue was only realised when homes got unexpectedly warm in a heat wave[50]. Even if product changes are managed robustly, privacy preferences and data-sharing contexts change over time, creating new challenges for managing agency (e.g., with the transition to adulthood, it is unclear how children transition from sharing personal information with authority figures such as parents)[51].
4.10 In summary, policies and codes of practice must be able to respond to change and be future-proof.
[1] Dr Cigdem Sengul is a Senior Lecturer in Computer Science at Brunel University and specialises in the Internet of Things and usable security and privacy.
[2] Kumar, D., Shen, K., Case, B., Garg, D., Kuznetsov, D., Gupta, R., and Durumeric, Z. "All Things Considered: An Analysis of IoT Devices on Home Networks". 2019 USENIX Security. https://www.usenix.org/conference/usenixsecurity19/presentation/kumar-deepak
[3] PCMag UK. 2022. The Best Smart Home Devices for 2022. https://uk.pcmag.com/smart-home/137197/the-best-smart-home-devices-for-2021
[4] The Economist. 2022. The quantified self https://www.economist.com/technology-quarterly/2022-05-07
[5] Swan, M. Sensor Mania! The Internet of Things, Wearable Computing, Objective Metrics, and the Quantified Self 2.0. J. Sens. Actuator Netw. 2012, 1, 217-253. https://doi.org/10.3390/jsan1030217
[6] Latikka, R., Rubio-Hernández, R., Lohan, E., Rantala, J., Nieto Fernández, F., Laitinen, A., Oksanen, A. Older Adults’ Loneliness, Social Isolation, and Physical Information and Communication Technology in the Era of Ambient Assisted Living: A Systematic Literature Review J Med Internet Res 2021;23(12):e28022 DOI: https://doi.org/10.2196/28022
[7] Sofia von Humboldt, Neyda Ma. Mendoza-Ruvalcaba, Elva Dolores Arias-Merino, Andrea Costa, Emilia Cabras, Gail Low & Isabel Leal. Smart technology and the meaning in life of older adults during the Covid-19 public health emergency period: a cross-cultural qualitative study, International Review of Psychiatry, 2020, 32:7-8, 713-722, DOI: http://doi.org/10.1080/09540261.2020.1810643
[8] GOV.UK. 2022. https://www.gov.uk/government/publications/regulating-consumer-smart-product-cyber-security-government-response/government-response-to-the-call-for-views-on-consumer-connected-product-cyber-security-legislation
[9] Cannizzaro, S., Procter, R., Ma, S., & Maple, C. Trust in the smart home: Findings from a nationally representative survey in the UK. PloS one, 2020, 15(5), e0231615. https://doi.org/10.1371/journal.pone.0231615
[10] THRIDI project https://www.brunel.ac.uk/research/projects/trust-in-home-rethinking-interface-design-in-the-internet-of-things
[11] Chen, J. and Urquhart, L. ‘They're all about pushing the products and shiny things rather than fundamental security': Mapping socio-technical challenges in securing the smart home. Information & Communications Technology Law, 2021, 31(1), pp.99-122 https://doi.org/10.1080/13600834.2021.1957193
[12] Mozilla Foundation – Privacy Not Included https://foundation.mozilla.org/en/privacynotincluded/
[13] Smart Spies: Alexa and Google Home expose users to vishing and eavesdropping https://www.srlabs.de/bites/smart-spies
[14] Jingjing Ren, Daniel J. Dubois, David Choffnes, Anna Maria Mandalari, Roman Kolcun, and Hamed Haddadi. Information Exposure From Consumer IoT Devices: A Multidimensional, Network-Informed Measurement Approach. In Proceedings of the ACM Internet Measurement Conference, 2019 (IMC '19)., 267–279. https://doi.org/10.1145/3355369.3355577
[15] Guardian. 2019. 'Alexa, are you invading my privacy?' – the dark side of our voice assistants. https://www.theguardian.com/technology/2019/oct/09/alexa-are-you-invading-my-privacy-the-dark-side-of-our-voice-assistants
[16] https://www.ageuk.org.uk/products/mobility-and-independence-at-home/personal-alarms/
[17] Datta Burton, S., Tanczer, L.M., Vasudevan, S., Hailes, S., Carr, M. The UK Code of Practice for Consumer IoT Security: ‘where we are and what next’. The PETRAS National Centre of Excellence for IoT Systems Cybersecurity, 2021. DOI: https://doi.org/10.14324/000.rp.10117734
[18] Samuel Greengard. Deep insecurities: the internet of things shifts technology risk. Commun. ACM 62, 5. May 2019, 20–22. https://doi.org/10.1145/3317675
[19] https://www.wired.com/story/fertility-data-weaponized/
[20] Dubois, Daniel & Kolcun, Roman & Mandalari, Anna & Paracha, Muhammad & Choffnes, David & Haddadi, Hamed. When Speakers Are All Ears: Characterizing Misactivations of IoT Smart Speakers. Proceedings on Privacy Enhancing Technologies. 2020. 255-276. DOI: https://doi.org/10.2478/popets-2020-0072.
[21] https://www.w3.org/WAI/fundamentals/accessibility-usability-inclusion/
[22] U.S. Department of Health, Education and Welfare, Records, Computers and the Rights of Citizens (July 1973) https://www.justice.gov/opcl/docs/rec-com-rights.pdf
[23] Ooijen, I. v. and Vrabec, H. U. Does the GDPR Enhance Consumers' Control over Personal Data? An Analysis from a Behavioural Perspective. J Consum Policy, 2019. p. 91–107 https://link.springer.com/article/10.1007/s10603-018-9399-7
[24] Ofcom Adults’ Media Use and Attitudes report, 2022 https://www.ofcom.org.uk/__data/assets/pdf_file/0020/234362/adults-media-use-and-attitudes-report-2022.pdf
[25] ICO Children’s Code, Design Guidance https://ico.org.uk/for-organisations/childrens-code-hub/childrens-code-design-guidance/protect-children-s-privacy-by-default/
[26] ICO Age-appropriate design: a code of practice for online services https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-a-code-of-practice-for-online-services/
[27] Ge Wang, Jun Zhao, Max Van Kleek, and Nigel Shadbolt. Informing Age-Appropriate AI: Examining Principles and Practices of AI for Children. In CHI Conference on Human Factors in Computing Systems, 2022 (CHI '22). Association for Computing Machinery, New York, NY, USA, Article 536, 1–29. https://doi.org/10.1145/3491102.3502057
[28] A reflection on the UNCRC Best Interests of the Child principle: in the context of The Age Appropriate Design Code https://pure.hud.ac.uk/en/publications/a-reflection-on-the-uncrc-best-interests-of-the-child-principle-i
[29] Connected Toys: What Device Documentation Explains about Privacy and Security, https://petras-iot.org/update/connected-toys-what-device-documentation-explains-about-privacy-and-security/
[30] Revealing Young Learners’ Mental Models of Online Sludge https://spritehub.org/2021/08/23/revealing-young-learners-mental-models-of-online-sludge/
[31] Jonathan A. Obar & Anne Oeldorf-Hirsch. The biggest lie on the Internet: Ignoring the privacy policies and terms of service policies of social networking services, Information, Communication & Society, 2020 23:1, 128-147, DOI: https://doi.org/10.1080/1369118X.2018.1486870
[32] R. Farthing, K. Michael, R. Abbas and G. Smith-Nunes. Age Appropriate Digital Services for Young People: Major Reforms. IEEE Consumer Electronics Magazine, vol. 10, no. 4, pp. 40-48, 1 July 2021, DOI: https://doi.org/10.1109/MCE.2021.3053772.
[33] https://5rightsfoundation.com/uploads/demystifying-the-age-appropriate-design-code.pdf
[34] Said Jawad Saidi, Anna Maria Mandalari, Roman Kolcun, Hamed Haddadi, Daniel J. Dubois, David Choffnes, Georgios Smaragdakis, and Anja Feldmann. A Haystack Full of Needles: Scalable Detection of IoT Devices in the Wild. In Proceedings of the ACM Internet Measurement Conference, 2020 (IMC '20). 87–100. https://doi.org/10.1145/3419394.3423650
[35] B. Varghese, et al. Revisiting the Arguments for Edge Computing Research. IEEE Internet Computing, vol. 25, no. 05, pp. 36-42, 2021. https://doi.ieeecomputersociety.org/10.1109/MIC.2021.3093924
[36] POLITICO. 2022. Apple’s privacy rules targeted by German competition watchdog. https://www.politico.eu/article/apples-privacy-rules-targeted-by-german-competition-watchdog/
[37] Guardian. https://www.theguardian.com/technology/2016/jan/25/search-engine-lets-users-find-live-video-of-sleeping-babies
[38] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/971440/Code_of_Practice_for_Consumer_IoT_Security_October_2018_V2.pdf
[39] https://www.etsi.org/deliver/etsi_en/303600_303699/303645/02.01.01_60/en_303645v020101p.pdf
[40] https://www.itpro.co.uk/network-internet/internet-of-things-iot/361985/iot-product-security-telecommunications-infrastructure-gaps
[41] It's Time to Regulate IoT to Improve Cyber-Security - Schneier on Security. https://www.schneier.com/news/archives/2017/11/schneier_its_time_to.html
[42] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/978685/Consumer_Attitudes_Towards_IoT_Security_-_Research_Report.pdf
[43] Evidencing the Cost of the UK Government's Proposed Regulatory Interventions for Consumer IoT, Technical Report, RSM UK, Great Britain. Department for Digital, Culture, Media and Sport, 2020. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/900330/Evidencing_the_cost_of_the_UK_government_s_proposed_regulatory_interventions_for_consumer_internet_of_things__IoT__products.pdf
[44] BBC News. 2022. Bionic eyes: Obsolete tech leaves patients in the dark. https://www.bbc.co.uk/news/technology-60416058
[45] Bran Knowles and Vicki L. Hanson. The wisdom of older technology (non)users. Commun. ACM 61, 3. March 2018, 72–77. https://doi.org/10.1145/3179995
[46] Geeng, C. and Roesner, F. Who's In Control? Interactions In Multi-User Smart Homes. CHI Conference on Human Factors in Computing Systems, 2019 (CHI '19). Paper 268, 1–13. https://doi.org/10.1145/3290605.3300498
[47] Sparrow, M., British Judge Rules That Amazon Ring Cameras And Other CCTV Could Be An Invasion Of Privacy. 2022. https://www.forbes.com/sites/marksparrow/2021/10/14/british-judge-rules-that-amazon-ring-cameras-and-other-cctv-could-be-an-invasion-of-privacy/
[48] Yao, Y., Basdeo, J.R., Mcdonough, O. R., and Wang, Y. Privacy Perceptions and Designs of Bystanders in Smart Homes. Proc. ACM Hum.-Comput. Interact., 3, CSCW, Article 59. November 2019 https://doi.org/10.1145/3359161
[49] PCMag UK. 2022. Peloton Restores Free 'Just Run' Feature for Its Treadmill. https://uk.pcmag.com/old-fitness/134077/peloton-axes-free-just-run-feature-from-treadmill-bricking-it-for-non-subscribers
[50] Vox. 2022. How your power company can remotely control your smart thermostat. https://www.vox.com/recode/22543678/smart-thermostat-air-conditioner-texas-heatwave
[51] https://petras-iot.org/update/connected-toys-what-device-documentation-explains-about-privacy-and-security/