Written evidence submitted by REPHRAIN - the National Research Centre on Privacy, Harm Reduction and Adversarial Influence Online

 

Call for Evidence: Connected tech: smart or sinister?

 

 

INTRODUCTION

Thank you for the opportunity to respond to this call for evidence. We are writing on behalf of REPHRAIN, the National Research Centre on Privacy, Harm Reduction and Adversarial Influence Online. REPHRAIN is the UK’s world-leading interdisciplinary community focused on the protection of citizens online. As a UKRI-funded National Research Centre, we bring together a critical mass of over 100 internationally leading experts at 13 UK institutions, working across 37 diverse research projects, alongside 23 founding partners from industry, the non-profit sector, government, law, regulation and international research centres. As an interdisciplinary and engaged research group, we work collaboratively on the following three missions:

We are addressing this call because REPHRAIN researchers are international experts on smart and connected technologies across domains such as agriculture, energy, consumer electronics, smart homes and financial technologies, among many others. For years, we have been working to identify, evaluate and address risks to smart systems in areas such as privacy, security, usability and responsibility.

This is a submission from the REPHRAIN centre. Specifically, the following researchers contributed to the formulation of this response (in alphabetical order): Dr Chuadhry Mujeeb Ahmed, Prof Madeline Carr, Dr Partha das Chowdhury, Prof Lynn Coventry, Dr François Dupressoir, Dr David Ellis, Dr Nadin Kokciyan, Prof Shane D Johnson, Dr Jose Tomas Llanos, Dr Mark McGill, Dr Ola Michalec, Dr Marvin Ramokapane, Prof Awais Rashid, Yvonne Rigby, Dr Bianca Slocombe, Dr Kami Vaniea, Dr Baraa Zieni.

  1. What has been or will be the most important impacts of increasingly prevalent smart and connected technology in our lives, including in the home, in the workplace and in our towns and cities, and are they necessarily better than current systems?

We believe that smart and connected technologies have brought about both positive and negative impacts on society, and both require attention in ongoing policy and regulatory processes.

Starting with the positives, we can outline the following:

However, smart and connected technologies are reconfiguring our society in the following ways:

Given this demonstrated societal impact, smart technology can be seen as an improvement over existing manual and non-smart systems. However, it will only be an enduring force for good if privacy, security and responsible innovation underpin its vision.

 

  2. Are there any groups in society who may particularly benefit from or be vulnerable to the increasing prevalence of smart technology, such as young or elderly people, people with disabilities and people likely to be digitally excluded?

We would like to highlight the following vulnerable groups:

Additionally, we would like to stress that identifying vulnerable users may become increasingly challenging as smart technologies proliferate. Understanding barriers or a lack of access is difficult when millions of people use a specific technology. Overall, there needs to be an evaluation of what real opportunities people have when accessing smart systems. This evaluation framework ought to move beyond usability considerations and include accessibility, human capabilities and second-order effects. This is because people affected by smart technologies include active users, indirect users and bystanders; while usability evaluations are often concerned with direct users, bystanders are rarely heard. In smart technology evaluations, we need to consider human diversity in terms of age, gender, ability, and political and economic circumstances (Chowdhury et al. 2022).
 

  3. How can we incentivise or encourage design that is safe, secure, environmentally- and user-friendly and human rights compliant?

We outline the following recommendations:

 

  4. What are the key short- and long-term risks and threats, and how can we ensure the devices, systems and networks of individuals, businesses and organisations are digitally literate and cyber secure?

We outline the following risks and threats:

 

We can ensure digital literacy and security in society through:

 

  5. How will current geopolitical concerns influence domestic consumers, e.g. regarding standards of imported goods or in how we can deal with cyber threats?

We call for increased international collaboration despite geopolitical concerns. In particular, we remark that:

 

  6. Do existing frameworks, like data protection legislation and the Product Security and Telecommunications Infrastructure Bill, adequately address concerns with smart technology, and if not, how could they be changed?

We posit that existing frameworks lag behind what is occurring in practice now, and behind impending advancements in IoT (e.g., in the smart energy context) and smart wearables (e.g., head-worn Augmented Reality (AR) devices). Below we outline the areas that require regulatory improvement and recommend regulatory mechanisms:

 

 

 

REFERENCES

Abdi, N., Ramokapane, K. M., & Such, J. M. (2019). More than smart speakers: security and privacy perceptions of smart home personal assistants. In Fifteenth Symposium on Usable Privacy and Security (SOUPS 2019) (pp. 451-466).

 

Abdi, N., Zhan, X., Ramokapane, K. M., & Such, J. (2021). Privacy norms for smart home personal assistants. In Proceedings of the 2021 CHI conference on human factors in computing systems (pp. 1-14).

 

Ahmed, C. M., MR, G. R., & Mathur, A. P. (2020, October). Challenges in machine learning based approaches for real-time anomaly detection in industrial control systems. In Proceedings of the 6th ACM on cyber-physical system security workshop (pp. 23-29).

 

Ahmed, C. M., Mathur, A., & Ochoa, M. (2017). NoiSense: Detecting data integrity attacks on sensor measurements using hardware-based fingerprints. arXiv preprint arXiv:1712.01598.

 

Almeida, J.B., Barbosa, M., Barthe, G., Dupressoir, F., Grégoire, B., Laporte, V., and Pereira, V. (2017) A Fast and Verified Software Stack for Secure Function Evaluation. CCS 2017: 1989-2006

Anderson, R. (2018) Making security sustainable. Communications of the ACM 61.3: 24-26.

Blythe, J. M., & Johnson, S. D. (2018).  Rapid evidence assessment on labelling schemes and implications for consumer IoT security.   Department for Digital, Culture, Media and Sport, https://www.gov.uk/government/publications/rapid-evidence-assessment-on-labelling-schemes-for-iot-security.

Blythe, J. M., Sombatruang, N., & Johnson, S. D. (2019). What security features and crime prevention advice is communicated in consumer IoT device manuals and support pages?. Journal of Cybersecurity, 5(1), https://academic.oup.com/cybersecurity/article/5/1/tyz005/5519411?searchresult=1.

Chowdhury, P. D., Hallett, J., Patnaik, N., Tahaei, M., & Rashid, A. (2021, October). Developers Are Neither Enemies Nor Users: They Are Collaborators. In 2021 IEEE Secure Development Conference (SecDev) (pp. 47-55). IEEE.

Chowdhury, P. D., Dominguez, A., Ramokapane, M. K., & Rashid, A. (2022). The Political Economy of Privacy Enhancing Technologies. arXiv preprint arXiv:2202.08548.

 

Christianson, B. (2013) Living in an impossible world. Philosophy and Technology, 26(4), 411-429.

Colnago, J., Feng, Y., Palanivel, T., Pearman, S., Ung, M., Acquisti, A., Cranor, L.F. and Sadeh, N. (2020) Informing the Design of a Personalized Privacy Assistant for the Internet of Things. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, 1–13.

Davidson, B., Ellis, D., Stachl, C., Taylor, P., & Joinson, A. (2022). Measurement practices exacerbate the generalizability crisis: Novel digital measures can help. Behavioural and Brain Sciences, 45, E10. doi:10.1017/S0140525X21000534

Ellis, D. A. (2020). Smartphones within psychological science. Cambridge University Press.

Giddens, A. (1990) The consequences of modernity. Oxford: Polity Press, 1:1–19.

Github (2022) RISC-V Crypto ISE https://github.com/scarv/xcrypto

Golomb, S.W. (1971) Mathematical models: Uses and limitations. IEEE Transactions on Reliability 20.3. 130-131

Haagh, H., Karbyshev, A. Oechsner, S., Spitters, B., Strub, P.Y. (2018) Computer-Aided Proofs for Multiparty Computation with Active Security. CSF 2018: 119-131

Kokciyan, N. and Yolum, P. (2022) Taking Situation-Based Privacy Decisions: Privacy Assistants Working with Humans. In Proceedings of the 31st International Joint Conference on Artificial Intelligence and the 25th European Conference on Artificial Intelligence (IJCAI-ECAI) [to appear].

Jakobi, T., Patil, S., Randall, D., Stevens, G., and Wulf, V. (2019) It Is About What They Could Do with the Data: A User Perspective on Privacy in Smart Metering. ACM Trans. Comput.-Hum. Interact. 26, 1, Article 2, 44 pages.

Kwasny, D., & Hemmerling, D. (2021). Gender and age estimation methods based on speech using deep neural networks. Sensors, 21(14), 4785.

Marres, N., & Stark, D. (2020). Put to the test: For a new sociology of testing. The British Journal of Sociology, 71(3), 423-443.

 

McGill, M. (2021) The IEEE Global Initiative on Ethics of Extended Reality (XR) Report: Extended Reality (XR) and the Erosion of Anonymity and Privacy. White Paper, pp. 1-24, 18 Nov. 2021.

 

McQueenie, R., Ellis, D. A., Fleming, M., Wilson, P., & Williamson, A. E. (2021). Educational associations with missed GP appointments for patients under 35 years old: administrative data linkage study. BMC Medicine, 19(1), 1-7.

Meng, N., Keküllüoğlu, D., Vaniea, K. (2021) Owning and Sharing: Privacy Perceptions of Smart Speaker Users. Proceedings of the ACM Conference on Computer Supported Cooperative Work and Social Computing.  https://groups.inf.ed.ac.uk/tulips/papers/meng2021cscw.pdf

Michalec, O., Hayes, E., Longhurst, J. and Tudgey, D. (2019) Exploring the potential and communication of metering in the energy and water sectors. Utilities Policy.

Michalec, O., O’Donovan, C. & Sobhani, M. (2021) What is robotics made of? The interdisciplinary politics of robotics research. Humanit Soc Sci Commun 8, 65.  https://doi.org/10.1057/s41599-021-00737-6

Michalec, O., Milyaeva, S. and Rashid, A (2022) When the future meets the past: can safety and cyber security coexist in modern critical infrastructures? Big Data and Society (In press)

Michalec, O. and Chitchyan, R. (2022) Smart Lens project https://www.bristol.ac.uk/bristol-digital-futures-institute/research/seed-corn-funding/energy-systems/get-a-smart-meter/#d.en.577338

Michalec, O. (2022) How to Talk about Cybersecurity of Emerging Technologies A Report to Board Level Executives in the Energy Sector. Policy briefing  https://petras-iot.org/wp-content/uploads/2022/03/How-to-talk-about-cybersecurity-of-emerging-technologies.pdf

Ministry for Europe and Foreign Affairs (France) (2018) Cybersecurity: Paris Call of 12 November 2018 for Trust and Security in Cyberspace https://www.diplomatie.gouv.fr/en/french-foreign-policy/digital-diplomacy/france-and-cyber-security/article/cybersecurity-paris-call-of-12-november-2018-for-trust-and-security-in

Morrisett, G., Shi, E., Sojakova, K., Fan, X., Gancher, J. (2021) IPDL: A Simple Framework for Formally Verifying Distributed Cryptographic Protocols. IACR Cryptol. ePrint Arch. 2021: 147

NIST LWC (2021) Lightweight Cryptography. https://csrc.nist.gov/Projects/lightweight-cryptography, last accessed 20th June 2022.

The Oxford Internet Institute (OII) (2022) An open letter to Mark Zuckerberg https://www.oii.ox.ac.uk/an-open-letter-to-mark-zuckerberg/#contributors

Quayyum, F., Cruzes, D. S., & Jaccheri, L. (2021). Cybersecurity awareness for children: A systematic literature review. International Journal of Child-Computer Interaction, 30, 100343.

Parkin, S., Patel, T., Lopez-Neira, I. and Tanczer, L. (2019) Usability analysis of shared device ecosystem security: informing support for survivors of IoT-facilitated tech-abuse. In Proceedings of the New Security Paradigms Workshop (NSPW '19). Association for Computing Machinery, New York, NY, USA, 1–15. https://doi.org/10.1145/3368860.3368861

Patnaik, N., Dwyer, A. C., Hallett, J., & Rashid, A. (2021). Don't forget your classics: Systematizing 45 years of Ancestry for Security API Usability Recommendations. arXiv preprint arXiv:2105.02031.

Privacy International (2021) Best Before date policy brief: Device sustainability through long-term software support https://privacyinternational.org/advocacy/4636/best-date-policy-brief-device-sustainability-through-long-term-software-support

Ramokapane, K. M., van der Linden, D., & Zamansky, A. (2019a). Does my dog really need a gadget? What can we learn from pet owners' motivations for using pet wearables? In Proceedings of the Sixth International Conference on Animal-Computer Interaction (pp. 1-6).

 

Ramokapane, K. M., Mazeli, A. C., & Rashid, A. (2019b). Skip, Skip, Skip, Accept!!!: A Study on the Usability of Smartphone Manufacturer Provided Default Features and User Privacy. Proc. Priv. Enhancing Technol., 2019(2), 209-227.

 

Ramokapane, K. M., Bird, C., Rashid, A., & Chitchyan, R. (2022). Privacy Design Strategies for Home Energy Management Systems (HEMS). In CHI Conference on Human Factors in Computing Systems (pp. 1-15).

 

Rashid, A. (2021). Developer-Centred Security. In Encyclopedia of Cryptography, Security and Privacy. Springer.

 

RISC-V (2022) RISC-V Announces First New Specifications of 2022, Adding to 16 Ratified in 2021 | RISC-V International. Community news https://riscv.org/

REPHRAIN (2022) SOXAI – Social Explainability for trustworthy AI: What types of explanations can help users develop appropriate trust? https://www.rephrain.ac.uk/soxai/

Sharemind (2015) Using Sharemind to Estimate Satellite Collision Probability https://sharemind.cyber.ee/satellite-collision-security/

Shaw, H., Ellis, D. A., & Ziegler, F. V. (2018). The Technology Integration Model (TIM): Predicting the continued use of technology. Computers in Human Behavior, 83, 204-214.

Shaw, H., Taylor, P. J., Ellis, D. A., & Conchie, S. M. (2022). Behavioural consistency in the digital age. Psychological Science, 33(3), 364-370.

Sovacool, B. K., Kivimaa, P., Hielscher, S., & Jenkins, K. (2017). Vulnerability and resistance in the United Kingdom's smart meter transition. Energy Policy, 109, 767-781.

 

Strohmayer, A., Slupska, J., Bellini, R., Coventry, L., Hairston, T., & Dodge, A. (2021). Trust and Abusability Toolkit: Centering Safety in Human-Data Interactions. Northumbria University

Sugawara, T., Cyr, B., Rampazzi, S., Genkin, D., & Fu, K. (2020). Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems. In 29th USENIX Security Symposium (USENIX Security 20) (pp. 2631-2648).

The University of Cambridge (2022) CHERI RISC-V https://www.cl.cam.ac.uk/research/security/ctsrd/cheri/cheri-risc-v.html

Tahaei, M., Vaniea, K., and Saphra, N. (2020) Understanding Privacy-Related Questions on Stack Overflow. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.

Tahaei, M., Frik, A., and Vaniea, K. (2021) Privacy Champions in Software Teams: Understanding Their Motivations, Strategies, and Challenges. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.

Tahaei, M., Li, T., & Vaniea, K. (2022). Understanding Privacy-Related Advice on Stack Overflow. Proc. Priv. Enhancing Technol., 2022(2), 114-131.

Tu, Y., Rampazzi, S., Hao, B., Rodriguez, A., Fu, K., & Hei, X. (2019). Trick or heat? Manipulating critical temperature-based control systems using rectification attacks. In Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security (pp. 2301-2315).

Weyns, D., Bures, T., Calinescu, R., Craggs, B., Fitzgerald, J., Garlan, D., ... & Schmerl, B. (2021, September). Six Software Engineering Principles for Smarter Cyber-Physical Systems. In 2021 IEEE International Conference on Autonomic Computing and Self-Organizing Systems Companion (ACSOS-C) (pp. 198-203). IEEE.

Zhang, G., Yan, C., Ji, X., Zhang, T., Zhang, T., & Xu, W. (2017). DolphinAttack: Inaudible voice commands. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security (pp. 103-117).

Zieni, B. and Heckel, R. (2021) TEM: A Transparency Engineering Methodology Enabling Users’ Trust Judgement. 2021 IEEE 29th International Requirements Engineering Conference (RE), pp. 94-105, doi: 10.1109/RE51729.2021.00016.

Zieni, B., Spagnuelo, D., & Heckel, R. (2021) Transparency by default: GDPR Patterns for Agile Development. In International Conference on Electronic Government and the Information Systems Perspective (pp. 89-102). Springer, Cham.