Written evidence submitted by Professor Claudia Aradau (TFP0027)

 

I am Professor of International Politics at King’s College London and my current research focuses on how digital technologies reconfigure security practices. I also research datafication technologies and the role of algorithms in governance processes globally. In this contribution, I offer a set of comments on the relation between technologies and power, as well as on the role of private companies, particularly in response to Questions 1, 2, 3, 4 and 5.

 

  1. The relation between digital technologies and power is a complex one, and it is important to avoid framing it as ‘technologies shifting power’. Technologies are shaped by and express power relations; they can also transform power relations. To understand what technologies do, they need to be analysed in the particular context of their design, production, circulation and use. We need to ask who produces technologies, for which purposes, and how they are acquired and implemented. To give a concrete example from my research on humanitarian action and digital technologies, my colleagues and I have investigated the development of apps by humanitarian actors to communicate with and support displaced populations. The production of apps for refugees marked a new development in the practices of digital humanitarianism, which had started by harnessing the power of digital technologies and social media to respond to emergencies and disasters worldwide. Although apps are developed by different companies and humanitarian actors, we have shown how Facebook and Google are positioned at the centre of the humanitarian ecosystem we studied. The companies at the centre of this ecosystem provided part of the infrastructure of digital humanitarianism. While developing these new technologies, humanitarian actors are also particularly vulnerable themselves, as many lack in-house digital expertise, fund technology development through ‘tech for good’ initiatives, and lack the capacity to hold technology producers accountable.[i] Thus, we found many deprecated and obsolete apps and software, as humanitarian actors often did not have the capacity and resources to maintain these digital technologies. This research highlights that the FCDO should first answer questions about which problems it wants to tackle, rather than about using technology in general.

 

An understanding of global problems must come first in order to avoid the illusions and pitfalls of techno-solutionism. Secondly, it is important that both governmental and non-governmental actors involved in technology development and regulation have independent and interdisciplinary knowledge; this could require in-house teams of technologists and social scientists, or an independent centre that could provide such expertise. Thirdly, I think accountability is more important than influence, and it should be closely considered in how problems are formulated, who gets a say in the description of a problem and which answers are given.

 

  2. Given that the big tech companies own, maintain and extend digital platforms, other state and non-state actors are increasingly reliant on these companies and their infrastructures. It is important to acknowledge that private companies are not external to state institutions, but that they already provide the infrastructures that many governments and state institutions use. Public actors and private companies are thus already entangled in many ways, and relations with private companies cannot be approached from a position of exteriority. What are the requirements for technological developments within the government, and how do contracts awarded to private companies integrate oversight of technological developments and accountability for the design and implementation of these systems? Governments often implement technological systems with little oversight, citizen involvement, expert consultation or requirements for public accountability. Rather than fostering the illusion of influencing private actors who are dispersed around the world, it is important to evaluate and revise the UK government’s own practices of using data and technology. What is the role of publics and public accountability within these technological developments? Integrating practices of accountability and public representation within discussions about technological developments would go a long way towards changing existing practices. See, for instance, the initiative by the Ada Lovelace Institute on a citizens’ biometric council to develop recommendations for technology governance.[ii]

 

  3. Norms for the use of social media already exist in different forms, having been produced by states as well as by social media companies themselves. The Network Enforcement Act, commonly known as NetzDG, which Germany adopted in 2017 and which came into full force in 2018, is a useful example of how states have tried to regulate social media companies. Social media companies with over 2 million registered users in Germany have 24 hours to remove reported content that is obviously in contradiction to German law (‘manifestly unlawful content’). If the violation is less clear-cut (‘unlawful content’), the platforms have seven days to remove the content.[iii] The legislation has been widely hailed as an advance for the regulation of the digital sphere and of social media companies. However, NetzDG is limited in its reach, while social media companies already govern globally through the production of ‘community guidelines’ for communication on their platforms. In so doing, the companies have acquired quasi-sovereign powers of decision, as their own community guidelines take precedence over national legislation, as in the case of NetzDG I briefly discuss here. Having received a user complaint about content, a Facebook content moderator first checks whether the content violates the company’s own community standards. If that is the case, the content is taken down immediately. If not, and if the content has also received a complaint under NetzDG provisions, the German NetzDG rules are applied.

 

There are three elements that need to be considered in relation to the development of social norms for the use of social media. Firstly, which norms are already used and what is the relation between national and global norms? Currently, global norms are implemented by the social media companies through community guidelines, while national norms are set in place by states. Company norms often have de facto priority over national ones. Secondly, implementing these norms, whether NetzDG or those of the social media companies, requires work and workers. Any discussion of social norms and their development should also take into account the labour force that would be needed, where these workers would be based and what conditions of employment they would have. Thirdly, the governance and regulation of communication is also a democratic question about how harmful speech is understood.

 

  4. Both Questions 4 and 5 seem to overlook the role of international organisations. Despite their limitations, international organisations like the United Nations enjoy more accountability and global representation than partial and ad hoc coalitions of states. The legitimacy of global frameworks will depend on close engagement with a range of publics and actors across the world, so that they reflect the views of those who are affected by technology. It will also need to account for what can be called technological cosmopolitanism from below: how people use technology to challenge inequality and build new connections and relations. It cannot be limited to states, but will need to recognise that people contribute to technology globally.

 

 

1 June 2021

 



[i] Aradau, Claudia, Tobias Blanke, and Giles Greenway. ‘Acts of digital parasitism: Hacking, humanitarian apps and platformisation’. New Media & Society (2019): 1-18.

[ii] Ada Lovelace Institute, The Citizens’ Biometrics Council, 30 March 2021, https://www.adalovelaceinstitute.org/report/citizens-biometrics-council/.

[iii] Heldt, Amélie Pia. ‘Reading between the lines and the numbers: an analysis of the first NetzDG reports’. Internet Policy Review 8, no. 2 (2019). doi: 10.14763/2019.2.1398, https://policyreview.info/articles/analysis/reading-between-lines-and-numbers-analysis-first-netzdg-reports.