Written evidence submitted by 5 Sports: The Football Association, England and Wales Cricket Board, Rugby Football Union, Rugby Football League and Lawn Tennis Association (OSB0111)



Across all of our sports - from players to pundits, referees to team managers, administrators to coaches - many of those taking part have been subjected to racist, sexist, homophobic and other discriminatory abuse on social media.


In many instances, the content is legal but incredibly harmful, and causes immense distress to those who are forced to face it day in, day out.


It is proving very difficult to ensure that social media companies prevent or take down offensive content before it is seen, that online abusers are prevented from deleting and re-registering accounts, or that authorities have sufficient information and evidence to take prosecutions forward. Individuals can abuse others online anonymously, meaning that online hate and discriminatory abuse often carry no real-world consequences.


Such abuse would not be acceptable in any other walk of life, and we are calling for the Online Safety Bill to be expedited in order to ensure that social media, one of the great communication tools of our time, is safe for everyone to use. We propose that the Online Safety Bill can be strengthened in the following ways:


  1. The Equality Act 2010 affords certain groups statutory protection from discrimination in certain activities. The Online Safety Bill should extend the same protection to these groups online.


  2. Ofcom should be given powers in relation to ‘legal but harmful’ content.


  3. A sliding scale of verification should be required for accounts, and limiting the reach of partially verified accounts should form part of the Codes of Practice (s.29).


  4. Comments on news publishers’ platforms should be included within the Bill’s scope with respect to discrimination and hate speech.


  5. Discrimination and hate speech should be the subject of specific Codes of Practice (s.29).


  6. Discrimination and hate speech should be categorised as “priority illegal content” in the Bill, placing an increased obligation on service providers to take positive action to minimise the presence of such content on their platforms (s.5(2) and s.9).


  7. There should be a specific obligation on providers to specify in their terms of service how they will mitigate and manage the risks of content that is harmful to adults (including racism and/or hate speech), mirroring the obligation in the current draft Bill in relation to services and children (s.5(5) and s.11).


  8. There should be a statutory power enabling the Secretary of State for Digital, Culture, Media and Sport to specify clearly, in secondary legislation, content that is harmful to adults (including discriminatory abuse and hate speech).


  9. Transparency reporting requirements should be defined by the Bill.


  10. Social media companies should be required to assist the authorities with their criminal investigations (s.49).


For more detail on these and other areas of the Bill, please refer to our individual submissions.


21 September 2021