
Written evidence submitted by the Board of Deputies of British Jews (OSB0043)

 

Background

In recent years, a significant percentage of antisemitic hate has been transmitted via the online sphere, particularly through social media. Figures recorded by the Community Security Trust (CST) suggest that approximately two fifths of recorded antisemitic incidents took place online[1]. Furthermore, the CST states frankly that its records are just the tip of the iceberg, as in the following passage, written earlier this year:

These totals understate the scale of online antisemitism: CST only records antisemitic incidents if they have been reported by either the victim or a witness; if the content shows evidence of antisemitic language, motivation or targeting; and if the offender is based in the United Kingdom or has directly targeted a UK-based victim. Targeted campaigns directed at individual victims often involve dozens of accounts sending hundreds or even thousands of tweets, images or posts within a concentrated timespan, or hundreds of tweets from a single offender, but each campaign of this type will only be recorded by CST as a single incident.[2]

The Board of Deputies of British Jews is determined to ensure that social media companies take proper action to protect Jewish users from antisemitic abuse on their platforms. We believe the following steps are necessary to ensure that such protection is in place.

1)      Social media companies must adopt the International Holocaust Remembrance Alliance (hereafter IHRA) definition of antisemitism as part of their community guidelines, or be compelled to do so.

a)      A major problem we encounter in our interactions with social media companies is that they dispute what even constitutes antisemitism. With no baseline agreement on what the problem looks like, even if social media companies had flawless enforcement, it is unlikely that they would be able to tackle antisemitism on their platforms. As such, we need an agreed minimum standard for what constitutes antisemitism. Fortunately, there is an internationally agreed definition of antisemitism, known as the IHRA definition, which could and should be used.

b)      We have discussed this with some prominent social media companies, and their responses are always the same. We ask them to adopt IHRA; they say that many of IHRA’s points are already part of their community standards. When we point out the significant gaps in these standards – gaps which would be covered by IHRA – they essentially shrug their shoulders and say that we will have to agree to disagree. In one unusually frank meeting, a representative of one of the main social media companies stated that they would not adopt the definition unless required to do so by Government regulation.

c)      This led us to conclude that the only way in which social media companies would adopt the IHRA definition was if they were compelled to do so. At our urging, Oliver Dowden MP, Secretary of State for Digital, Culture, Media and Sport, has written[3] to all the major social media companies in the UK, “strongly encouraging” them to adopt IHRA “and consider its practical application in the development of your company’s policies and procedures.” We hope that social media companies will take notice of this; however, if they do not, we urge Parliament to ensure that the guidance which will accompany the Online Safety Bill specifically includes the requirement for the IHRA definition to be adopted in full as part of social media companies’ community guidelines.

d)      As part of this, we have followed up with the Secretary of State and asked him to write to Ofcom, as the intended regulator for the online space, to similarly strongly encourage it to adopt the IHRA definition and use it when assessing whether social media companies are effectively combating antisemitism. This would be a means of ensuring that social media companies have due regard for the definition. There is a precedent here – earlier this year Gavin Williamson MP, Secretary of State for Education, wrote to the Office for Students (OfS), the Higher Education regulator, asking them to do the following:

 

“to undertake a scoping exercise to identify providers which are reluctant to adopt the [IHRA] definition and consider introducing mandatory reporting of antisemitic incident numbers by providers. This would ensure a robust evidence base, which the OfS could then use to effectively regulate in this area. If antisemitic incidents do occur at a provider, the OfS should consider if it is relevant in a particular case whether the provider has adopted the definition when considering what sanctions, including monetary penalties, would be appropriate to apply.”[4]

 

Additionally, two successive Secretaries of State for the Ministry of Housing, Communities and Local Government have written to local council leaders asking them to adopt the IHRA definition of antisemitism, in line with the Government’s own adoption of the definition.[5][6]

We believe that similar action is necessary regarding the online sphere. Furthermore, in 2017 Ofcom adopted the IHRA definition in the context of undertaking its statutory broadcasting functions[7]. In our discussions with Ofcom, representatives have said that they would be very willing to use it in the context of undertaking their new duties with regard to social media, but that this would require the Government to ask Ofcom to do so.

 

e)      We also note the Carnegie Trust’s recommendation of a specific code of practice for Hate Crime and wider legal harms. They have noted that, while the Codes of Practice regarding Terrorism and Child Sexual Exploitation and Abuse have rightly been given prominence, the Hate Crime code of practice has not been, nor does it include wider harms as proposed. We note their draft code of practice on the subject and believe it deserves proper consideration[8].

 

2)      Social media companies should be required to appoint a minimum number of staff to UK-based teams to moderate content that is harmful in the UK.

a)      We believe an in-country team monitoring suspected breaches of community guidelines will be more likely to have the political, cultural and linguistic context for cases than a team based elsewhere. Additionally, we think it would be likely to improve accountability on the part of the social media companies in question.

b)      To clarify – we think that social media companies operating in the UK should have teams of people based in the UK to moderate complaints raised in the UK, by users in the UK.

c)      Regarding the need for political, cultural and linguistic context, we will provide an example from the wider sphere of hatred. In one case we have heard of, an MP was subjected to homophobic abuse on social media and complained to the platform in question, only to be told that the comments he had reported “did not violate our community standards”. It transpired that this was because an offensive homophobic slang term had been used which is not widely recognised outside the UK. One must presume that the person halfway around the world reviewing the incident saw nothing wrong with the comment and acted accordingly.

d)      Social media companies tend to zealously guard their data – including the exact number of complaints they receive and the outcomes of those complaints. When data is occasionally shared, it is invariably presented in a way that casts the social media company in a positive light. The number of staff required to work in such a team should therefore be calculated not by complaints received, but by the number of UK users.

e)      Regarding in-country teams, organisations should be required to hire one member of staff per hundred thousand monthly UK users. So, for example, a social media company with 30 million users should have a community guidelines team of 300, while a company with 700,000 users should have a team of seven. (Suggested guidance on smaller social media companies can be found in 3c.) An illustrative sketch of this calculation follows after this list.

f)       All members of the community guidelines team should be thoroughly trained in their social media company’s community guidelines, which we believe should incorporate the entirety of the IHRA working definition of antisemitism.

g)      Each week, a number of decisions should be chosen at random and checked by senior staff to ensure that the decisions being taken are indeed in line with the company’s policies. If not, the individual who made the decision should receive additional training.

h)      Such teams should endeavour to respond to complaints within 10 working days of the complaint being made. In more egregious cases (e.g. a high-profile figure on social media making antisemitic comments), that response time should be one to two days.
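By way of illustration, the staffing requirement in point (e) reduces to a simple calculation. The following is a minimal sketch in Python, not a prescription: the function name, the rounding-up of partial blocks of 100,000 users, and the deferral of sub-threshold platforms to Ofcom (following 3c) are our own illustrative assumptions.

    import math
    from typing import Optional

    STAFF_RATIO = 100_000  # one moderator per 100,000 monthly UK users (point 2e)
    SMALL_PLATFORM_THRESHOLD = 100_000  # below this, Ofcom sets the team size (point 3c)

    def minimum_team_size(monthly_uk_users: int) -> Optional[int]:
        """Return the minimum community guidelines team size for a platform.

        Returns None for platforms below the threshold, whose team size
        would instead be determined by Ofcom (see point 3c).
        """
        if monthly_uk_users < SMALL_PLATFORM_THRESHOLD:
            return None  # deferred to the regulator
        # Round up so any partial block of 100,000 users still counts (our assumption).
        return math.ceil(monthly_uk_users / STAFF_RATIO)

    # The two worked examples from point (e):
    assert minimum_team_size(30_000_000) == 300
    assert minimum_team_size(700_000) == 7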

 

3)      Ofcom’s Role

As the new regulator for the social media space, Ofcom should be entitled to the following:

a)      Reports from social media companies regarding complaints made and decisions taken by the monitoring team as a result, including the average time taken to reach decisions.

b)      The ability to inspect such monitoring teams at a moment’s notice, to determine whether the team comprises the correct number of people and whether their training is acceptable.

c)      With regard to platforms with fewer than 100,000 UK users, or platforms whose number of users is unknown but which have a reputation for the sharing or promotion of hate speech (e.g. 4Chan, 8Chan, BitChute), Ofcom should determine the size of the team necessary to monitor such speech.

d)      In the event of a company refusing to comply with these requirements (whether by failing to operate a serious community guidelines policy which combats hatred, or by declining to create such a monitoring team or to staff it adequately), Ofcom should have the power to issue significant fines to such companies.

 

4)      A system of heavy fines must be adopted for social media companies that fail to comply with the newly agreed standards.

a)      Sadly, we fear that in some cases the social media companies will only act, not out of principle, but when their bottom line is at stake. Meaningful fines are therefore crucial to driving meaningful change. 

b)      We are glad to say that the Government has included the potential for extremely heavy fines as part of the Online Safety Bill.

 

Conclusion

Antisemitism has been given a new lease of life by the internet. Like-minded individuals can work together to disseminate Jew-hate and other falsehoods, both attacking Jewish users and attempting to spread antisemitism within wider society. It is the responsibility of us all to ensure that this is not allowed to happen.

The United Kingdom recognises and values free speech; however, it acknowledges that there is a difference between free speech and hate speech, and that the former must not be used as a licence for the latter. With this in mind, we believe there needs to be a concerted effort by lawmakers in this country to ensure that the IHRA definition is adopted by social media companies in the UK as a minimum standard.

While the online sphere, including social media, brings many blessings in terms of the connectivity it offers, there can be no doubt about the dangers of harmful content posted online, both in and of itself and because of its real-world impacts. We hope that the Online Safety Bill and the measures around it will be an opportunity taken, and not an opportunity missed, to clamp down on hatreds of all kinds online. Some of the measures we propose have wide applicability to different forms of abuse; but even for those that refer specifically to antisemitism, we believe that if social media companies are made to institute strong measures to combat antisemitism, these can serve as a blueprint to help other minority communities similarly fight back against the hatred directed at them online, making social media a safer space for all.

 

17 September 2021

 

 


 


[1] https://cst.org.uk/news/blog/2020/02/06/antisemitic-incidents-report-2019

[2] https://cst.org.uk/news/blog/2021/02/11/cst-antisemitic-incidents-report-2020-published-today

[3] https://jewishnews.timesofisrael.com/oliver-dowden-urges-facebook-and-twitter-to-adopt-ihra-antisemitism-definition/

[4] https://www.officeforstudents.org.uk/media/48277145-4cf3-497f-b9b7-b13fdf16f46b/ofs-strategic-guidance-20210208.pdf

[5] http://modgov.southnorthants.gov.uk/documents/s19217/Letter%20to%20Local%20Authority%20Leaders.pdf

[6] https://democracy.cheltenham.gov.uk/documents/s31920/2020_02_17_COU_IHRA_appendix.pdf

[7] https://www.ofcom.org.uk/__data/assets/pdf_file/0025/123388/ihra-foi.pdf

[8] https://d1ssu070pg2v9i.cloudfront.net/pex/pex_carnegie2021/2021/06/05092047/Draft-Code-of-Practice-in-respect-of-Hate-Crime-and-wider-legal-harms-covering-paper-June-2021.pdf