[COR0148]

Written evidence submitted by Center for Countering Digital Hate (COR0148)

 

1. The Center for Countering Digital Hate is a UK-based non-profit NGO, launched publicly in September 2019.

2. CCDH seeks to understand digital hate and misinformation and the means by which they are proselytised, in order to disrupt their spread. We do so, in part, by creating economic, social and political costs for malignant online behaviour.

3. The same digital architecture used to spread identity-based hate and misinformation is being used by actors in the Covid-19 crisis. The same Facebook forums used to proselytise hate and anti-scientific misinformation are now focusing on Covid-19. The same fake news sites and techniques for encouraging the sharing of misinformation are being used.

4. As such, our set of solutions has been retargeted at Covid-19 misinformation since March, but has not had to change substantially.

5. CCDH is using its resources, experience and insight to bolster prosocial forces, including state actors, in dealing with:

(a)    the exploitation of Covid-19 by identity-based hate actors; and

(b)    the social contagion of misinformation and hate relating to Covid-19.

6. The Center’s CEO, Imran Ahmed, has recently joined the steering committee for the Commission for Countering Extremism’s Pilot Task Force, as an expert on digital hate and misinformation.

Nature, prevalence and scale of misinformation on Covid-19 online

7. Coronavirus has unleashed two parallel pandemics. One is the biological pandemic of Covid-19. The second is a social pandemic of digital misinformation. Misinformation not only militates against our success in containing Covid-19; it also fundamentally threatens to weaken the liberal democratic values that underpin our societies.

8. Both pandemics were, in fact, predictable. They are acute eruptions of chronic problems: the first of coronaviruses, variants of which caused SARS and MERS; the second of globalised, digitally-transmitted misinformation, which has exacerbated social maladies in recent years, including those of political instability, illiberal democratic politics, vaccine hesitancy, climate denial and rising identity-based hate worldwide.

9. When it comes to Coronavirus, misinformed people put themselves and others at risk by taking dangerous “cures” which may harm them or give them false confidence, and by mistrusting official guidance designed to protect us all.

10. The challenge governments face is to persuade the population to accept measures such as the adoption of mandatory and relatively inflexible social distancing, orderly mass testing, vaccination, and enormous changes to the patterns of our lives. 

11. There is particular resistance to these measures from various actors, including but not limited to populist anti-Government forces; economically-motivated hucksters who use misinformation to drive people to their solutions; and misinformed people who transmit misinformation without intent. Some of these forces will cynically exploit individual and communal stresses to drive opposition to the measures needed to keep the public safe. Because traditional media has been relatively good at keeping these voices outside of mainstream discussion, they rely on social media to reach a mass audience.

12. Across social media, there is a growing ecosystem of misinformation about Coronavirus. On Facebook, YouTube and elsewhere, malignant actors are spreading false claims that there are cures for the virus and that there are measures which can prevent an individual from contracting it, as well as conspiracy theories about the virus and its origins, including that it was designed as a precursor to a malevolent mass vaccination programme.

13. These ideas are being shared in spaces previously dedicated to other forms of quackery, such as Facebook groups for anti-vaxxers and alternative medicine. Increasingly they are also being shared in new spaces set up for normal citizens seeking information about Coronavirus. The crisis has, as such, presented a prime opportunity for ideologues to attract potential converts from a much larger pool of concerned people.

14. Additionally, hucksters are exploiting this crisis for profit. We have found individuals selling fake “cures,” “treatments,” and preventative solutions on websites like eBay and Facebook Marketplace as well as their own websites. These hucksters use large social media platforms not just to advertise the products they sell, but also to sow conspiracist mistrust of evidence-based medicine and of the intentions of the Government and health authorities.

15. Our research has found that social media platforms are failing to live up to their public pledges when it comes to removing damaging content and banning the sale of dangerous products. Preliminary research from our partnership with Restless Development, through which twenty volunteers are finding and reporting misinformation to social media platforms, finds that less than 5% of reported misinformation is being removed. As people are spending more time on social media due to restrictions on unnecessary travel, the risk of members of the public being taken in by these extremist ideas increases.

16. There is a danger that, in the long term, a significant proportion of the population is being primed to reject any future vaccination, and that, in the short term, people are being misled into harming themselves with dangerous pseudo-medicines or into harming society more broadly by refusing to follow the government’s advice on social distancing or self-isolation, in the false belief that the virus is not real or that they can make themselves immune.

17. Research by Dr Daniel Allington of King’s College London, published by the Center for Countering Digital Hate, shows that people who subscribe to conspiracy theories about Covid-19, including that it is caused by 5G, are less likely to be following the government’s guidance to wash hands regularly, socially distance, and stay at home.

Radicalising spaces: Facebook

18. If you want to radicalise normal people, you don’t use Telegram. You use the platforms used by most people: in the UK, the dominant social media platform is Facebook.

19. Facebook Groups provide ideal polarising and radicalising conditions, thanks to the ease with which extreme ideas can be normalised by didactic administration; punishment of heresy with opprobrium or exile; and the encouragement of group polarisation through “likes” and supportive comments. Groups have become the prime driver of radicalisation in digital spaces. While the majority are innocuous and serve a positive social purpose in connecting people, a substantial minority have internal cultures that are toxic. Facebook has in the past taken down Groups that we identified as antisemitic or as containing calls for violence against black people and Muslims, yet it refuses to do so for Groups that radicalise members of the public using misinformation and lies about science.

20. The CCDH began tracking fifty anti-vaccine Groups a year ago, and we have seen them pivot to coronavirus misinformation in recent months. The Groups’ combined membership now totals 2.3 million. There has also been an explosion of Facebook Groups set up to share information about the crisis, with many being exploited by malignant actors to share misinformation.

21. We presented briefings on two Groups at Facebook’s UK HQ: We Brought Vaxxed To The UK, and Coronavirus UK. The first was set up to promote Andrew Wakefield’s documentary, while the second was a new Group run by a page named Coronavirus Conspiracy Theories. Both contain vast amounts of conspiracism, fake medical remedies and preventatives for the virus, and misinformation designed to discourage vaccination. Facebook refused to take down either Group, instead removing some posts and leaving up posts claiming that hand sanitiser causes cancer and that the flu vaccine makes one more susceptible to Coronavirus, as well as links to sites selling yoghurt suppositories as a cure for Covid-19.

Radicalising spaces: YouTube 

22. CCDH has found videos on YouTube, viewed by millions of people, promoting misinformation about Coronavirus and its causes. These videos include false claims that the government is lying about the threat of coronavirus, that it is caused by 5G phone signals, and that it can be cured by prayer. Further, we found that YouTube is serving these videos to its users through its automatic ‘play next’ function and by recommending misinformation videos on coronavirus.

23. Following the publication of our research in the media and a briefing from us, YouTube removed some videos on coronavirus. However, this did not translate into closing down the offending accounts, which continue to produce misinformation, or into broader action against transgressions of its own rules.

24. John Bergman, a US chiropractor whose videos on the subject amassed over one million views, advocates the use of essential oils and vitamin C to treat Covid-19, claiming “everyone will survive.” However, YouTube allowed his account, with 685,000 subscribers, to remain on its site, and he has since repopulated it with new videos making the same claims.

25. David Icke, the leading producer of conspiracy theories and misinformation about Covid-19, saw his YouTube channel removed following our report on Icke, and a call from leading medics, MPs and anti-hate groups. However, videos of Icke repeating his dangerous misinformation remain on the site, hosted by a wide network of conspiracist channels that YouTube have kept online. Some of the videos which are still available have over one million views.

26. Much the same problem exists with Instagram and Twitter.

Profiteers: Personal websites

27. Behind the YouTube channels or Facebook Groups there are often personal websites belonging to those seeking to sell fake medicines. Individuals behind these sites may pose as consultants, experts and other figures of authority. We have found sites advertising a range of vitamins, pills, oils, and other products that they claim will tackle Covid-19.

28. Unfortunately, the state does not seem to have speedy or strong enough tools to tackle these hucksters. In March we found and reported a Bishop in South London who was selling “divine plague protection kits” to parishioners concerned about coronavirus. One month later, it was reported that, despite Charity Commission and trading standards investigations into the church, it was continuing to sell the fake cures.

Profiteers: Facebook | Amazon | eBay

29. The CCDH has found adverts for quack coronavirus cures, and other products supposedly banned from these platforms, on eBay, Amazon and Facebook Marketplace.

30. eBay has hosted a range of products which claim to protect against Coronavirus, or other viruses, despite containing no active ingredients that actually do so, including “Ultimate covid immunity formula,” “Immune boost formula Covid19” and organic "anti viral" capsules.

31. Amazon has removed products which explicitly claim to cure or treat Coronavirus, but a wide array of products on its site are evading this filter by claiming to be “anti-viral”: for example, “High voltage true colloidal silver” and “anti-viral patches.”

32. In March, research by the CCDH was reported showing that Facebook Marketplace was hosting advertisements for fake cures and falsely advertised products, including colloidal silver, which can cause argyria. The product is still being sold on Marketplace, advertised as a cure for flu or coronavirus.

Knowledge Architecture: Faux-Populist, Conspiracist “News” Sites

33. A critical component of the delivery of misinformation is the set of evidence points that can be used to justify extreme opinions. A series of websites operate in this space. They exist on both sides of the conventional political divide between left and right. They are characterised by conspiracism, anti-elitism, appeals to populism and a casual indifference to truth as long as their content advances their political agenda. By publishing health misinformation and conspiracism, they grow a pool of readers who can then be cross-fertilised with other components of the conspiracist mindset and worldview.

34. What’s more, these websites are a profitable business thanks to Google’s advertising network, which places adverts on websites known to be contributing to the misinformation contagion. Our research shows that websites which publish false stories about the coronavirus being a bio-weapon developed in a lab could be receiving millions of dollars in advertising thanks to advert placements by Google.

Solutions

35. The global pandemic has demonstrated the danger posed by social media platforms’ permitting misinformation and conspiracy theories to reach mass audiences. If they continue to resist acting to remove these forces from their platforms, then legislative and regulatory changes are required to force those changes upon them.

Social Media/ User-Generated Content Platforms

Short Term

36. HMG should coordinate with civil society groups so that, when we identify misinformation agents, hate actors or toxic forums such as particular Facebook Groups, we publicly advocate for their removal and HMG uses its levers and relationships to press for the same.

37. The argument that this will drive bad actors into dark spaces does not hold water: other forums set up to evade Facebook’s moderation are always smaller and less diverse than the major social media platforms, which are the main route by which the broader public can be infected by misinformation agents.

38. Furthermore, simply removing posts is not enough. Meaningful action needs to be taken. This is true across all platforms, which uniformly resist removing bad actors. For example, we reported a video to YouTube for containing egregiously dangerous Coronavirus misinformation, which was then removed. However, it rapidly popped back up with the same content because, despite the user’s channel being full of misinformation, the channel itself was not shut down.

Medium Term

39. There needs to be clarification of where legal responsibility lies for online forums which exist to spread dangerous medical disinformation. Facebook claims it is up to administrators to moderate the content within their Groups, but it is clear they are not doing so.

40. One potential solution is the proposal outlined in Lucy Powell’s Online Forums Bill 2017-19: that administrators should be legally liable for content posted within these groups if they set the groups up or have a role in moderating their content, and fail to act to remove harmful content.

Long Term

41. Ultimately, the most effective solution is likely to be the threat of prosecution and fines for social media companies that fail to remove harmful content or spaces in which disinformation is routinely proselytised. Facebook, YouTube and others clearly have the ability to act, but they need to be pushed to have the will to act. Too often they do not enforce their terms, always coming down in favour of “free speech”, on the basis that harmful speech is best countered by “more free speech”, i.e. keeping bad actors on the platform and having ads served to participants.

42. For example, at the start of the Coronavirus crisis, Facebook claimed:

We are working to tackle vaccine misinformation on Facebook by reducing its distribution and providing people with authoritative information on the topic. We are starting by taking a series of steps:

We will reduce the ranking of groups and Pages that spread misinformation about vaccinations in News Feed and Search. These groups and Pages will not be included in recommendations or in predictions when you type into Search.

When we find ads that include misinformation about vaccinations, we will reject them. We also removed related targeting options, like “vaccine controversies.” For ad accounts that continue to violate our policies, we may take further action, such as disabling the ad account.

We won’t show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages.

We are exploring ways to share educational information about vaccines when people come across misinformation on this topic.

43. All of these are demonstrably untrue or insufficient.

44. The accounts we use to track health misinformation still have Coronavirus misinformation served into the News Feed on a regular basis from the Groups we have reported to Facebook. Those groups are still available from the Search function.

45. We have shown that adverts containing Coronavirus misinformation are still being accepted by Facebook.

46. Coronavirus misinformation is still being shown across Instagram hashtag pages.

47. The vast majority of posts which the volunteers we work with have reported to Facebook remain on the platform.

48. The claim by social media companies that all of this is down to either a deluge of misinformation or a lack of resources rings hollow when you consider their unwillingness to address evidence even when it is handed to them.

Fake News Sites

49. The ecosystem of websites which spread health misinformation is adaptive and somewhat resilient. In the past it has exploited monetisation platforms such as Google Display Network; payment platforms such as PayPal, Patreon and Donorbox; and white-label merchandising stores to generate income to fund operations.

50. Campaigns directly attacking the economic infrastructure of digital fake news include Stop Funding Fake News and Sleeping Giants, which seek to defund the hate and misinformation infrastructure. However, that model is limited by a number of factors, all of which the Government could help to unlock.

Medium-Long Term

51. The digital advertising market lacks sufficient transparency and control for advertisers. At the moment, a large brand such as Sky might find its digital adverts appearing on a wide range of sites, from the Guardian and Daily Telegraph through to G News (which contains Coronavirus misinformation) and smaller, fringe misinformation sites. This is largely in the control of Google, whose platforms for advertisers and publishers are overly opaque.

52. The French Parliament has passed an advertising transparency law which requires brands to declare where their adverts appear; this both makes it more difficult for misinformation sites to obtain funding and gives advertisers the transparency they want about where their ads appear, allowing them to crowdsource refinements to their ad deployment. CCDH’s discussions with advertisers such as Sky have revealed intense frustration at their inability to adequately control the placement of their ads with the tools given to them by the oligopolistic digital ads market. A law like this, which mandates only transparency, would seriously impede the ability of misinformation sites to generate income and reach.

Profiteers

53. Combatting those seeking to profit from this crisis requires a government-led response, involving the police, CPS, trading standards, advertising standards and medicines and health regulators, in order to bring down the full force of the law on those selling false cures and treatments.

54. We cannot predict the size of any deterrent effect caused by criminal prosecution and visible police action or attention. However, merely stating an intent to actively police hucksters might have an immediate effect on their activities and their willingness to exploit open platforms, and might raise general social awareness of the disreputable nature of their businesses and the harm they can cause, making the general public less likely to accept their claims and purchase their products.

55. A number of civil society organisations, including CCDH, would be able to help in creating a flow of cases to HMG that identify misinformation-related scams in real time and provide options for action.

 

May 2020