House of Lords Communications and Digital Committee inquiry into Freedom of Expression Online
1. Is freedom of expression under threat online? If so, how does this impact individuals differently, and why? Are there differences between exercising the freedom of expression online versus offline?
○ Women are particularly impacted by online censorship. This happens when we challenge ‘gender ideology’, or the idea that we all have an ‘inner gender identity’ that is more important than our biological sex. There is no scientific basis for this ideology, and it is having a serious impact on women’s rights via the implication that men should be able to ‘self-ID’ into women’s single-sex spaces, sports, and wider opportunities. Yet individual tweets and whole accounts are removed when women seek to challenge the ideology. Simple scientific facts and self-evident statements are treated as grounds for removal from online platforms: ‘there are only two sexes’; ‘it is not possible to change sex’; ‘men are not women’; ‘sex is not a spectrum’. It has become risky and difficult to make these self-evident arguments on a number of different platforms. Yet how can women engage in one of the most important debates of our time if these sorts of arguments are considered unacceptable on widely used online platforms?
2. How should good digital citizenship be promoted? How can education help?
○ My personal view is that online platforms and publishers have every right to remove abusive language and content, irrespective of free speech arguments. We will not survive without manners. As Edmund Burke said, ‘Manners are of more importance than laws. Upon them, in a great measure, the laws depend. The law touches us but here and there, and now and then. Manners are what vex or soothe, corrupt or purify, exalt or debase, barbarize or refine us, by a constant, steady, uniform, insensible operation, like that of the air we breathe in. They give their whole form and colour to our lives. According to their quality, they aid morals, they supply them, or they totally destroy them’.
3. Is online user-generated content covered adequately by existing law and, if so, is the law adequately enforced? Should ‘lawful but harmful’ online content also be regulated?
○ The same legislation around free speech should be applied online as offline. We cannot start regulating ‘lawful but harmful’ speech because, who decides? That said, I have to reluctantly agree that private platforms should be allowed to set their own rules for the type of discourse they allow. Government efforts should therefore be focused on creating an environment in which multiple technology platforms can flourish.
4. Should online platforms be under a legal duty to protect freedom of expression?
○ I am not sure how this would work. Private companies should be allowed to publish or not publish whatever they like, as long as it is legal. If anything, we just need to ensure that existing online platforms are not given privileged positions or protections in law that give them an advantage simply by virtue of already existing. That way diversity can reign.
5. What model of legal liability for content is most appropriate for online platforms?
○ I may be missing something, but it should be the same as for offline publishers.
6. To what extent should users be allowed anonymity online?
○ Online anonymity has become essential. There are so many sensitive issues where people benefit enormously from being able to discuss them anonymously online: trouble at work, issues to do with mental health, problems at home. The concern over anonymous accounts seems to revolve around abusive accounts; focus on these, not on anonymous accounts as a whole, the vast majority of which are sane and reasonable.
7. How can technology be used to help protect the freedom of expression?
8. How do the design and norms of platforms influence the freedom of expression? How can platforms create environments that reduce the propensity for online harms?
9. How could the transparency of algorithms used to censor or promote content, and the training and accountability of their creators, be improved? Should regulators play a role?
10. How can content moderation systems be improved? Are users of online platforms sufficiently able to appeal moderation decisions with which they disagree? What role should regulators play?
11. To what extent would strengthening competition regulation of dominant online platforms help to make them more responsive to their users’ views about content and its moderation?
○ We need to make sure that competition law and public policy do not stand in the way of allowing multiple platforms to flourish. Where there is a genuine monopolistic or oligopolistic situation, we may need government regulation to protect a level playing field in which multiple end-user platforms and publishing opportunities can flourish.
12. Are there examples of successful public policy on freedom of expression online in other countries from which the UK could learn? What scope is there for further international collaboration?
13 January 2021