Written evidence submitted by Index on Censorship
Online Safety & Online Harms
Index on Censorship was launched on the pages of the Times by Stephen Spender in October 1971. Our mission was, and remains, to defend media, artistic and academic freedom throughout the world: shining a spotlight on those repressive regimes which seek to silence their citizens, and publishing the works of the inspirational people who stand up against this tyranny.
In our work supporting global free expression, we are very aware of the UK Parliament's impact as an exporter of legislation, setting the tone of debate in other nation states. This is most alarming when inadvertent restrictions on free speech and free expression emerge in a British context, as repressive regimes then use them to justify more draconian measures. Index believes that this legislation carries a genuine risk of that outcome.
Is it necessary to have an explicit definition and process for determining harm to children and adults in the Online Safety Bill, and what should it be?
- Yes, it will be important to explicitly define the term “harmful” in the new bill. Without a clear definition the bill will effectively outsource decision making on what is and isn’t permitted to tech companies.
- We work tirelessly to protect freedom of expression and combat censorship on a global scale, and are therefore well positioned to recognise the threat that this bill poses. The threat of large fines will create a commercial incentive to over-censor, with no way of tracking what has been deleted or on what scale. This over-censorship would disproportionately impact marginalised communities as a result of discriminatory algorithmic moderation and pressure from hate groups. This is already a feature of online moderation: in July 2021, LinkedIn removed a coming-out post by a 16-year-old following complaints.
Does the draft Bill focus enough on the ways tech companies could be encouraged to consider safety and/or the risk of harm in platform design and the systems and processes that they put in place?
- No, the bill does not focus enough on the risks in the systems and processes that tech companies put in place. The bill will necessitate a massive increase in the use of AI moderation programmes, which are prone to over-censoring. The bill must ensure that any algorithms used by platforms to moderate content are entirely transparent and accountable.
- A ‘HateCheck’ study tested the accuracy of hate speech detection models against 29 key functionalities. It found that the models tested with HateCheck had fundamental weaknesses, including wrongly blocking rebuttals to hate speech and bias against certain groups (HateCheck 2021).
- As the MIT Technology Review pointed out, artificial intelligence systems do not understand meaning based on context (MIT Technology Review 2016).
- In a shocking example, videos of anti-Government demonstrations in Lebanon were deleted because chanting included anti-Hezbollah messages – the algorithm had picked up Hezbollah but not the anti-extremist context.
- A 2019 study by Washington University found that tweets from African-American users were twice as likely to be labelled as offensive as tweets from other users (Washington University 2019).
What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?
- The bill does not mandate a system of storing deleted content. Without this, it will be impossible to evaluate whether platforms are deleting content appropriately or whether they are engaging in mass censorship.
- Significantly, this omission makes it harder to prosecute criminals and protect victims of online abuse, because the bill forces platforms to delete valuable evidence before it can be handed over to the police, with no obligation to ensure it is properly archived. The bill must be amended to mandate the establishment of a Digital Evidence Locker, ensuring that content deletion is appropriately audited and crucial criminal evidence is not permanently lost.
- As a charity that works to defend freedom of expression worldwide, we are well aware of the vital role that online content can play in highlighting and documenting crimes globally. The impact of this bill on evidence gathering will be felt far beyond the UK's borders. Human Rights Watch released a report highlighting how social media content, particularly photographs and videos posted by perpetrators, victims, witnesses to abuses and others, has become increasingly central to some prosecutions of war crimes and other international crimes, including at the International Criminal Court (ICC) and in national proceedings in Europe. The report stresses the need for platforms to preserve this evidence in the event it is removed.
- A clear example of this is the Syrian Archive, which preserves evidence of war crimes in Syria: 23% of the evidence it has stored has since been deleted from its original platform and would no longer be accessible to the ICC if the Syrian Archive had not already captured it.
What are the lessons that the Government should learn when directly comparing the draft Bill to existing and proposed legislation around the world?
- The UK wields a significant amount of soft power globally. Often, legislation that is first drafted and adopted here is closely imitated by many countries across the world, as British democracy and law is held in such high regard.
- Passing this legislation in its current form would grant authoritarian regimes legitimacy in utilising similar legislation to crack down on free speech and journalism in other countries.
- As a leading charity working to protect freedom of expression globally, we are already witnessing the impact of this bill elsewhere: stringent social media regulations passed secretly in Pakistan last year appear to have lifted the term “online harm” directly from the UK government's 2019 white paper on the topic. The government must be aware of the risk this legislation poses to freedom of expression globally.