
Select Committee on Communications and Digital

Corrected oral evidence: Freedom of expression online

Tuesday 20 April 2021

3.30 pm

 


Members present: Lord Gilbert of Panteg (The Chair); Baroness Bull; Baroness Buscombe; Viscount Colville of Culross; Baroness Featherstone; Baroness Grender; Lord Griffiths of Burry Port; Lord Lipsey; Baroness Rebuck; Lord Stevenson of Balmacara; Lord Vaizey of Didcot; The Lord Bishop of Worcester.

Evidence Session No. 24              Virtual Proceeding              Questions 194 - 198

 

Witness

I: Senator Marsha Blackburn, US Senator for Tennessee, Republican.

 

USE OF THE TRANSCRIPT

This is a corrected transcript of evidence taken in public and webcast on www.parliamentlive.tv.

 



Examination of witness

Senator Marsha Blackburn.

Q194         The Chair: Our next witness is Senator Marsha Blackburn. Senator Blackburn is the first woman to represent Tennessee in the United States Senate. She has served on a number of committees and is the ranking member of the Sub-Committee on Consumer Protection, Product Safety, and Data Security. She led the Senate Judiciary Committee’s tech task force, examining the influence of technology on American culture. In committee, she has questioned the CEOs of Google, Facebook and Twitter.

Welcome, Senator, and thank you for taking the time to be with us today. Today’s session will be broadcast online and a transcript will be taken. This committee is examining freedom of expression online in the context of forthcoming legislation in the UK, which will seek to tackle harmful online behaviour, while having regard to the right of freedom of expression. Today, we would like to hear from you on some issues around regulation, competition and the role of platforms. Shall we start with the fundamentals? We only have 45 minutes for this session and lots to get through so, if committee colleagues and the Senator could do their best to be concise, that would be helpful.

Senator, could you give us a brief overview of your perspective on freedom of expression online, as an American lawmaker? This matters to us, because we are grappling with the role of large American corporations, whose values and policies are shaped by American attitudes and laws. Could you kick off by giving us your top-line thoughts on the issue of freedom of expression online? Then we will move on to questions.

Marsha Blackburn: When we talk about free expression, we have to go beyond the legal constraints we have already worked with and consider two main underlying questions: how private parties, individuals, exercise their power over speech online, and how that show of power affects free expression overall.

In the US, we enjoy uniquely permissive legal constructs that, in the past, informed private interactions as well. Over the past 15 years or so, the explosion of digital platforms has warped our perception of our right to free speech. In January, the Pew Research Center completed polling showing that more than 80% of the American public get their news from a digital device. Half of US adults get their news from social media. Here is the problem: the algorithms employed by companies such as Facebook and Twitter created little bubbles for us. Slowly but surely, this has convinced a growing number of people that disagreeable opinions they encounter outside of their bubble should be punished.

Because these companies are glorified ad agencies and not free speech advocates, they have every reason to err on the side of censorship. They protect themselves with content guidelines that are so vague that they can remove just about anything without recourse. This encourages conformity, because the digital space is a massive source of revenue for news outlets and other content creators. Losing these clicks is like a death sentence for their ad business.

We now have a cycle of censorship, encouraged by companies that both demand and punish scandals as a condition of participation in the marketplace they control. We have confronted similar challenges with big oil and the railroads, and we will use the same laws as we take on big tech here in the US. That is the state of play from where we are sitting right now, as we look at protecting the American public.

The Chair: Let us apply some of that to the situation we face here, dealing largely with the same platforms and most of the same issues.

Q195         Viscount Colville of Culross: Good morning, Senator. Thanks very much for coming before us. I am very interested to hear you talking about the tech platforms as glorified ad agencies. In the past, you have said that there is an evasiveness from the tech companies about taking responsibility for content moderation and that they subjectively frame objectionable content. Do you see a conflict between the tech companies’ requirement to preserve freedom of expression as a human right and their commercial interests to engage users, or to gather ads, as you might say, more intensively with their platforms?

Marsha Blackburn: Let us walk back. As we look at freedom of expression online, it was less than a decade ago when we all watched the Arab spring take place. I know that you all watched it; we watched it here in the US. That blossomed in real time over Twitter and Facebook. Young people in those countries were mobilised to get online. That interaction with people who were here in the US really seemed to bring about a hopefulness. I look back at that situation and see a lot of opportunity in how technology was used to open up discourse and debate around the world about freedom and freedom’s cause.

Fast forward to today. We see that some of those windows of opportunity are actually closing. Rather than broadening diversity of opinion, tech platforms have sought to limit opposing ideas. That is a part of the cancel culture: “Agree with me, or we will shut you down”. In my opinion, the greatest emerging threat to free and open discourse is censorship at the hands of corporations. We do not want violence or terrorist propaganda indoctrinating the minds of our children, but we need clear definitions of what counts as hate speech, misrepresentation and disinformation.

Content moderators hold too much power in deciding what can or cannot be posted. Content moderators are expressing their opinion. This is not something that is founded in federal law. Algorithms alone cannot solve the bias problem, because your content moderators weigh in. The American experience with government-led censorship shapes my desire to protect freedom of expression online. From our founding, our history has taken a vigorous approach to protecting free speech. We should also protect that online. Our founders feared oppressive government censorship and protected our right to free speech via the first amendment to the constitution.

The first amendment states, “Congress shall make no law … abridging the freedom of speech, or of the press”. The amendment makes clear that congressional action, some form of government action, is required to find a violation of that right. The founders’ goal was to prevent government-led censorship and sedition-type laws, such as laws barring discussion of controversial political topics or ideas that might spark violence. The amendment does not speak to the power of corporations to censor and how that should be regulated. That is up to Congress, and we look to what the courts have said for guidance.

Our Supreme Court has said that businesses have free speech rights just like individuals do. For example, businesses can donate to campaigns through political action committees, refuse service to individuals based on political beliefs or refuse to make cakes or flowers for a wedding the owner does not approve of. The difference between those situations and the tech companies is the lack of competitive options available when service is refused. If Facebook refuses to allow my views on voter fraud or the mental health implications of school closings to stand, where can I go that is just as visible and equally competitive as that platform?

That is a rhetorical question. The answer is that there is no alternative that is just as competitive. These are the keys to entry in the online public square. Social media is meant to be the modern-day public square. We are finding those keys to entry and that online public square in the hands of a very few incredibly wealthy and powerful CEOs, who believe their success has given them a mandate to control what we say, how we think, how we choose to vote and how we shape our opinions.

Q196         Baroness Featherstone: Following on, in a way, from the lack of competition and huge platforms with unwieldy power, we heard concerns from several of our witnesses that regulation, if regulation were to be introduced, would favour big platforms and might penalise new entrants or prevent them from entering. The big boys, so to speak, are always going to be well resourced and able to deal with regulation; the small ones and new entrants, not so much. How can Governments ensure that regulation does not entrench the market power of the largest platforms?

Marsha Blackburn: I am so glad you asked that question. That is something we are dealing with every day now. Our approach here in the US has been much more nuanced than in the EU, and that has been for good reason. We want to regulate anti-competitive and abusive conduct, but not at the expense of the free market. That is one of our differences. Let us look at privacy regulations. When it comes to online privacy, we have to maintain an environment where companies are free to innovate, but with the understanding that their consumers are more than products. Right now, what I say many times is that, if you are using a social media platform, you are the product.

More people are beginning to realise that, and a lot of start-ups are struggling with GDPR. I know you are all looking at this. Very few small businesses have the legal department and the resources to absorb such large compliance costs as they are seeing from GDPR. It has caused tech giants such as Facebook less pain. If anything, it appears to have solidified their dominance in the marketplace, while increasing barriers to entry for new entrants, because they are having difficulty dealing with those compliance costs in their start-up phase.

In the US, we are working towards legislation that is based on my Bill, the BROWSER Act, which has been our point of discussion for a few years. This would be one set of privacy rules for the entire internet ecosystem, with one federal regulator. The goal of consumer privacy is not to have a regime that is so complex that it creates a new cottage industry of compliance professionals who are there to try to help these companies walk through that compliance. We think that it is our responsibility to bring forward clear, conspicuous notification processes, so that consumers will be able to make a more educated choice about the nature of their relationship with tech companies. They should have the ability to opt in if they want to share information. They should opt out if they do not want some of their non-sensitive data to be shared.

Congress is continuing to push forward with investigations on these data privacy abuses, content censorship and anti-competitive conduct. When you do that, that leads you to moving on into competition regulation and enforcement, which is kind of the next part of that. I have stepped out of a hearing that we have this morning with our Federal Trade Commission. It is going on right now, so I am going to have to get back and get my questions into them.

In the US, the focus on competition regulation and enforcement is on harm to consumers and how it affects the consumer. It is a consumer-focused approach, as opposed to a market-structured approach. Current US law empowers our authorities, primarily the FTC and somewhat the DOJ, to go after tech platforms for monopolising conduct under Section 2 of the Sherman Antitrust Act. State and federal authorities in the US are aggressively using this law to go after companies that engage in illegal anti-competitive activity. Our state AGs, the FTC and the Department of Justice recognise that tech platforms represent a threat to the free market and to competition. Consumers and businesses will suffer in the absence of any action.

The crime here is not in being too big. The crime is taking action that hurts competition, in other words action intended to drive out new entrants. This approach traces back to Judge Bork, who was a Supreme Court nominee, and to the Chicago school of economics: the best measure of competition is the consumer welfare standard. Whether conduct is anti-competitive does not hinge on structural dominance, such as company size or market share. It focuses on the bottom-line impact on the consumer, on consumer harm. That is why in this country we call it the consumer welfare standard. Harm to the consumer is determined by any combination of four factors: price, choice, quality and innovation.

When the actions of a platform raise prices or reduce choice, quality and innovation, that harms consumers and violates the law. When looking at price effects in a proposed merger, enforcers look at the likely post-merger impact on price due to the removal of a direct horizontal competitor. On choice, it is important for consumers to have a choice of services and products in a given market. If an incumbent firm decides to buy up a rival firm, the only option left will be that merged entity, and consumers will have little choice but to rely on it.

On quality, the quality of service or products has plenty of dimensions and privacy is one of those. Privacy protections empower the consumer and give them control over how their personal data is shared and used. Some people like the option of turning off cookies. Others do not like sharing their data with third parties or getting targeted ads. People who work out may not want their Fitbit data shared. Patients who use health apps certainly do not want sensitive data getting into the wrong hands.

Some critics resist the idea of privacy or even quality being a key component of the test. The reality is that enforcement agencies routinely analyse non-price harms in other sectors, such as autos and healthcare. For example, when looking at hospital mergers, the FTC evaluates quality of care, such as treatment, and provider options. In the auto industry, car safety is a big quality issue. We have had plenty of hearings here at the federal level on that issue. If two carmakers want to merge, with no effect on car prices, but the cars end up being less safe, consumer safety is a concern, just as consumer privacy is a concern when we look at tech platforms.

We had a Supreme Court case from 1958, Northern Pacific Railway Co v United States. It lays out nicely how anti-trust has long focused on issues beyond price. The Sherman Act, the court says, rests on the premise that the unrestrained interaction of competitive forces will yield the best allocation of our economic resources, the lowest prices, the highest quality and the greatest material progress.

Progress refers to innovation, and innovation effects are something entirely on their own. They are the effects that we want to see: new products, new entrants. The classic case of innovation harm is the Microsoft case. Many of you will remember that Microsoft’s dominance of the Windows operating system meant that it had network effects, a network of locked-in Windows users. That case is now paving the way for the Google case currently at the DOJ.

Baroness Featherstone: That was a very full answer. I think we have strayed into another question, which is on competition. Before I leave you, could I have a quick response on your proposed Online Freedom and Viewpoint Diversity Act? What are its benefits? Very quickly, what is the reasoning behind the amendment removing the phrase “considers to be” and replacing it with an “objectively reasonable belief” standard? The second one replaces “or otherwise objectionable” with “promoting self-harm, promoting terrorism, or unlawful”. Could you explain your thinking behind those switches?

Marsha Blackburn: We are looking at Section 230 of the Communications Decency Act. That is legislation enacted in 1996. It gave tech companies immunity from certain types of litigation in order to incentivise innovation. The legislation I have would clarify the original intent of the law and increase accountability for the content moderation practices that I mentioned earlier. These tech companies have stretched their liability shield past its limit and this is the reason for the Bill. The national discourse now suffers because of it, as you see censorship of certain viewpoints.

We feel that today’s internet is very different from the internet of 1996. These polished mega platforms that we associate with online research and debate exert unprecedented influence over how Americans discover new information and what information is actually available for discovery. The contentious nature of current conversations provides a perverse incentive for companies to manipulate the online experience in favour of the loudest voices in the room. There exists no meaningful alternative to these powerful platforms, which means that there is no accountability for the devastating effects of this ingrained ideology and its bias. It is important that Congress steps in and helps to rein in and reshape these platforms.

My legislation would clarify that the Section 230 liability protections apply to instances where online platforms choose to restrict access to certain types of content. It would condition the content moderation liability shield on an objective reasonableness standard. That is the wording we are using. It would be that, to protect from liability, a tech company may restrict access to content on its platform only where it has an objectively reasonable belief that the content falls within a certain, specified category.

It would remove “otherwise objectionable” and replace it with concrete terms, including promoting terrorism, content that is determined to be unlawful and content that promotes self-harm. It would clarify that the definition of information content provider includes instances in which a person or entity editorialises or affirmatively and substantively modifies the content originally created or developed by someone else, but does not include mere changes to the format, layout or basic appearance of such content.

Baroness Featherstone: Is that something the individual could take to law, or is it the platform that is charged with the duty?

Marsha Blackburn: Your individual, the harmed entity, would then have recourse through the Federal Trade Commission. That would be their point of entry.

Q197         The Lord Bishop of Worcester: Thank you, Senator. It is really fascinating and helpful to hear your perspective. You have talked a fair bit about possible harm to the consumer. I want to dig deeper on a particular sort of harm, not in relation to competition.

There is quite a lively discussion here about the proposal to include legal but harmful content in new legislation. In other words, if content is deemed to have the possibility of harming someone psychologically, the tech platform would have responsibility for removing that. That is provoking a lively debate, and I suspect there will be one in this committee.

For those on the one hand who want to protect freedom of expression, the worry about this is that it might result in tech platforms taking down too much material. As one of our witnesses says, “If a Government believe that a category of content is sufficiently harmful, the Government may make that content illegal directly, through transparent, democratic processes, in a clear and proportionate manner”. There are those who feel that. On the other hand, there are those who feel that it is really important to protect people from harmful content, which might reasonably be thought to cause psychological harm. What is your perspective on that?

Marsha Blackburn: Bear in mind that we are not trying to move Section 230 off the books. We are seeking to clarify language. As I mentioned earlier, you repeatedly see that the opinion of these content moderators is being used to silence companies or individuals. Therefore, it is really important that we remove the language “otherwise objectionable”, which is nebulous. That is the phrasing that big tech companies are hiding behind. We will replace that with concrete terms, which would be specific, “unlawful”, “content that promotes self-harm”, “promoting terrorism”. We will be specific in those definitions, so people know that you cannot get on there and promote things that are unlawful, going to cause self-harm or going to promote terrorism or terrorist activity.

The Lord Bishop of Worcester: Could I give you another example? I think we would all be agreed that promoting self-harm is an appalling thing, that terrorism is an appalling thing, et cetera. There are things that are very controversial, both here and in America. For example, there are the rights of the trans community and whether someone saying something that is anti-trans, which might be reckoned by that person to cause harm, should be included in what is deemed unacceptable. There are other examples of that type that are much less clear-cut in a controversial sense, if you see what I mean.

Marsha Blackburn: This is why we think it is important for us to put the clarifications on this law in place, and then allow the FTC to go about its rule-making process. We think that the best way for us to approach the platforms is to say, “An objectively reasonable belief that the content falls within a certain category”. We will remove the “otherwise objectionable” and replace it with the specifics: promoting terrorism, content that is determined to be unlawful and content that promotes self-harm.

The Lord Bishop of Worcester: You could say, for example, that content that is anti-trans could promote the self-harm of that trans person, because they would feel victimised by it. Would that be determined to be unacceptable? If so, who would make that decision, according to what you were saying?

Marsha Blackburn: Our goal is to update Section 230. Then, we would give the FTC the rule-making authority for the implementation of that rule.

Q198         Lord Vaizey of Didcot: It is a great honour to have you giving evidence to us, Senator. As a US citizen myself, I am particularly grateful that you have been able to spare the time, although most of my family is in the great state of Texas, not, unfortunately, in Tennessee, but we can fix that.

I was going to ask you about competition, but you have covered quite a lot of those issues. I noticed this week that Apple is going to let Parler back on to the App Store. Is one of the answers to this question of bias and freedom of expression that you can create, quite rapidly, platforms for people to express their views? I do not think Clubhouse is particularly political, but nobody had heard of Clubhouse about six weeks ago, and now everybody is on it. In about six weeks, everybody may have finished with it. These things come and go very quickly. I wonder whether that is the answer.

Marsha Blackburn: They do. I have looked at your UK Competition and Markets Authority report on promoting greater and more dynamic digital platform markets. In that, I saw three ideas that I think were really quite excellent. First were the data-related interventions, such as consumer control over data. My approach to that is to ask, “Who owns the virtual you?” We all know that consumers want to own their presence online.

When you look at those data-related interventions, such as consumer control, interoperability, data access and data separation powers, this type of regulation would occur through what we in the US would do as privacy legislation. The UK could certainly take some of the consumer-centric ideas of the GDPR and the CCPA, which is California’s law, and develop federal privacy laws that enhance the user’s control over their data. As I mentioned earlier, that is something that we are working towards here in the US, so that the individual has that control over that data.

The second important thing is consumer choice and default, being certain that you are promoting that consumer choice. My BROWSER Act, which we talked a little about earlier, allows for opt-out on non-sensitive personal information and opt-in on sharing sensitive personal information. If you allow that for consumers, you will see more competition and more innovation in the process. As I said earlier, as lawmakers we pass the statutes and then leave it up to the enforcers, such as the FTC and the Department of Justice, to handle appropriate remedies and enforcement.

We know that there are things that are being investigated now. As we go forward, we will see a little more of what is going to happen with Google. You have people talking more about the Facebook settlement with the FTC. There are things coming at us from the competitive aspect, such as Facebook trying to do Libra, which is a cryptocurrency project that it has at the top of its list. We will be watching to see what kind of service Facebook would want to be giving away in order to have greater access to that consumer data.

Lord Vaizey of Didcot: You mentioned the CCPA, the California privacy law, which is, as I understand it, to a certain extent based on the European data protection regulation. Do you think there will be more connections between European and US legislation as tech regulation moves up the agenda in the US? I was thinking of one Western example, if you like. Australia recently passed a law that means that Facebook and Google have to pay money to newspapers, because a lot of people access newspaper content through Facebook and Google. Are these the kinds of ideas that could be debated in the US now?

Marsha Blackburn: You are going to see some debate on this. You are talking about the concept that the newspapers are working through with Google and Facebook, because they have become news sources. I mentioned the Pew research earlier. That is a great poll to view. Some 80% of US adults are getting their news from social media feeds. The newspapers want to make certain that news organisations are compensated in some way by these tech giants for the content that they are posting, and that the tech giants do not have the opportunity to pull and place content without compensation and recognition of the creation of that content.

Likewise, in Tennessee, we hear the same thing from many of our content creators, people who are musicians, people who are writing books. They want to make certain that social media and these tech giants are paying, because they are making money. We have to realise, as I said earlier, that these are big ad agencies. They build their worth off the number of eyeballs that they draw to their sites. That is why they build a bubble out of what you are reading and accessing. They market to you within that bubble, and take your information and market to third-party entities.

The Chair: Senator, thank you very much indeed. I think it is coming up to midday where you are. It is very kind of you to spend some time from your morning with us. I know that you have to go back to the hearing you are holding yourself. You have given us some very useful evidence and a bit of a reading list, some reference points that I think will be useful to us in the inquiry. We will publish a report, probably in two parts. We will certainly be making some recommendations in terms of the forthcoming legislation in this country. We will be following what you are discussing in the Senate and elsewhere in US politics. There will be a wider-ranging report later in the year. We have a wide range of very interesting evidence. Your evidence today has been very useful and interesting to us. Thank you very much indeed, Senator, for taking some time out this morning to be with us.

Marsha Blackburn: Thank you. I have enjoyed being with you. I hope that you will all stay in touch. Feel free to reach out and ask questions. We will keep you posted as we move forward. I call it the virtual you protection agenda. We are addressing privacy online, data security online, the anti-trust issues and the Section 230 reform. Those are the four components where we are focused.

The Chair: It is the same agenda. We are finding that issues of competition are very important among the wider issues of regulating tech and preventing harm. We have had an opportunity to explore some of those relationships today. It would be good if you or your staff could keep us informed of anything you might find useful. We will happily do likewise. Thank you again for your time this morning. I hope your hearing at your end goes well.