Draft Online Safety Bill
Corrected oral evidence: Consideration of government’s draft Online Safety Bill
Thursday 21 October 2021
12.55 pm
Watch the meeting: https://parliamentlive.tv/event/index/4df31e2e-50c7-4b92-a745-a5fddc227498
Members present: Damian Collins MP (The Chair); Debbie Abrahams MP; Lord Black of Brentwood; Lord Clement-Jones; Lord Gilbert of Panteg; Darren Jones MP; Baroness Kidron; Lord Knight of Weymouth; John Nicolson MP; Dean Russell MP; Lord Stevenson of Balmacara; Suzanne Webb MP.
Evidence Session No. 10. Heard in public. Questions 148–153
Witnesses
I: Professor Jonathan Haidt; Jim Steyer, Chief Executive Officer, Common Sense Media.
Professor Jonathan Haidt and Jim Steyer.
Q148 The Chair: Good afternoon, in London time. Welcome to the members of our third and final panel for evidence today. We are very grateful to the witnesses for joining us, particularly given the early start for Jim, who is on the west coast of the United States. Thank you for being with us.
I have a question about harm. The piece of legislation we are scrutinising is now called the Online Safety Bill. It was previously referred to as the Online Harms Bill. One of the most difficult areas of it is trying to come up with a meaningful definition of what we think harm is, particularly if we are defining some harms as not necessarily illegal but somewhat harmful. Jim and Jonathan, I would welcome your views. For the purpose of legislation, how do we go about defining this? How can we be clear in law about the harms that we need to protect people from? I appreciate you cannot see us. As I can see you, left to right, Jim, would you like to go first and then Jonathan?
Jim Steyer: Sure. Thank you so much for having me today. It is a delight to be here. I was listening to the prior panel. This is a really great discussion, and it is an honour to be addressing you all. Having listened to it, you are really doing this in a very thoughtful way. In light of all that we have learned, even in the past few weeks from the whistleblower Frances Haugen, this legislation has extraordinary implications for all of us. Those of us here in the United States have a lot to learn from the quality of your debate, so thank you very much for inviting me to participate.
One thing I would say about the definition of harms is that the psychological impacts on young people—on children and teens—are discernibly more important and significant because their brains are still developing. If you think about children and teens and their use of online platforms, social media in particular, they are much more likely to look to platforms such as Instagram, YouTube or others for social validation in the comparative culture that they are growing up in. They are especially receptive to rewards, and they can be deeply impacted by negative messages. For example, we saw all the recent research on body image that Instagram had tried to cover up.
At Common Sense Media, we have conducted a bunch of different research about the negative impacts of some of the social media platforms on kids’ self-esteem, their sense of anxiety and depression, their body image, and their sense that their body images and their overall existence, if you will, do not measure up to the idealised images they can see from influencers on platforms like Instagram. When you define harms, you should do it with a particular focus on young people and children. There is clear data we would be happy to share with the committee about how young people, particularly vulnerable young people, oftentimes feel worse about themselves because of the comparative culture that they experience.
Listening to the really thoughtful discussion in the previous panels, I urge that you folks define harms with a particular focus on youth and set out specific regulations that protect the best interests of children and teenagers because they are more vulnerable to the platforms. I would also recommend that you call for more investment in research. Here in the United States there has been a dearth of that, so large NGOs such as Common Sense Media have had to fill the void in the research area about the impact of technology and social media on the health and social and emotional well-being of young people.
I would, potentially as part of this legislation, make sure that you have ongoing research into those issues. I would not—repeat not—trust the large tech companies to do the research themselves because it tends to be biased and, if they do not like the results, they simply will not share it with the public, as we saw with Instagram.
Thank you very much for having me. I am really excited to have this discussion with you and to help you all craft good legislation. That is my opening thought about harms and the particular nature of harm as it relates to children and teenagers.
The Chair: That is great. Thank you very much, Jim. Jonathan, what are your thoughts?
Professor Jonathan Haidt: First, I echo what Jim said. It is an honour to be speaking to you and a real pleasure. Jim and I live in a country in which our legislature is about as dysfunctional as any on the planet. It is not likely to do anything. Your Parliament is way ahead. Looking through what I have read about the Bill—I have read the coverage of it—it is extremely thoughtful. The idea of a duty of care is great. Where I could be most helpful is to explain, as a social psychologist who has been studying this, how exactly social media is harming kids and why the Online Harms Bill is not getting at the central causal mechanisms. Here is the puzzle that we face.
If we look at mental health trends in the US, the UK and Canada, there is no trend from the late 1990s until 2012. Things move around but there is no trend. Suddenly, out of nowhere, in 2013 and 2014 you get these hockey stick-shaped functions, especially for girls. It is clearest on self-harm, and we are talking hospital admissions, not self-reported. Hospital admissions for self-harm in the US and the UK are stable until 2012, and then, by 2015, it is a hockey stick; it more than doubles, and for pre-teen girls it triples. Something happened that started sending girls in particular to hospitals, and the suicide rate greatly increased. The pre-teen suicide rate in the USA more than doubled in those couple of years. Something terrible and huge is happening, but here is the puzzle.
Researchers, particularly at the Oxford Internet Institute—Amy Orben and Andrew Przybylski—keep putting out papers showing that the correlation between digital media use and depression is tiny. They keep saying it is no larger than for eating potatoes. What they do repeatedly—Jean Twenge and I have reanalysed those large datasets—is lump together all kids rather than looking at girls, and lump together all screen use rather than social media, and they conclude that there is a tiny, almost non-existent, relationship, and that is just wrong. If you zoom in on girls and social media, the standardised effect size goes up from 0.04, which is what they report in their main study, to 0.2, which is a huge increase. That is just the correlational data.
Here is the main thing I want to communicate to you. Everything you have heard—everything I have said—is within the dose response framework. We are treating Facebook and Instagram like sugar. Facebook people use the analogy too; it is like sugar. In small quantities, it is fine, so all we have to do is help teens limit their consumption, and then we are okay. But Facebook is not like sugar. The dose response framework is not the right way to look at it. What happened around 2012 is a complete rewiring of teen social relationships.
There is data from the UK. Girls in 2011 used to go over to each other’s homes after school sometimes; they would actually see each other. By 2014 and 2015, that plummeted—they do not do that. What happens? Many of you have children. What would happen if someone said, “Hey, I’ve got this thing. It’s going to take your kids 20 hours a week. You think your kid is busy. How about she gives 20 hours a week to this thing that is really bad for her and she’ll stop seeing her friends and sleep less? What do you think?” None of us would have said yes to that, but that is what happened—the complete rewiring of childhood. What is happening is not that they are consuming harmful content so much as that they are now totally focused on looking at pictures of each other.
That is my concern about the way you have defined harm. You are treating it in a dose response framework as though kids are taking in all this stuff and some of the stuff is bad for them, like scenes of child abuse and sexuality, and if only we can filter out the bad stuff—the harmful content—we will be okay. But content moderation is a losing game. One of the biggest things we learned from Frances Haugen is that Facebook—best in class—gets about 5% of hate speech and less than 1% of violence and intimidation. Content moderation is a losing game. Even if they double or triple what they do, it will be barely detectable. That is not the way to go. You need systemic changes that keep the kids off until, for God’s sake, they are 13, but that, as Jim says, is way too early.
In America, we need to reverse our terrible mistake of setting the age of internet adulthood at 13 in 1998, when we did not know what was coming. You are not going to succeed by trying to filter out the harmful content, because do you know what the harmful content is? It is a girl seeing a picture of her best friend celebrating with other friends. That, in part, is the harmful content. You are not going to filter that out. There has been a complete transformation of childhood caused by Instagram and other platforms. TikTok is now quickly rising to be as powerful. That is much harder to regulate, but that is, I believe, where most of the harm is.
The Chair: Thank you.
Q149 Baroness Kidron: Hello, both. It is good to see you even if you cannot see us. Sorry about that.
I am going to start with you, Jon, on this point. It is very hard for regulators to go, “Okay, now we’re just going to ban social media up until a certain age”, and so on. I want to explore some of the things that we have been thinking about here. Take, for example, safety by design; you take an approach and say, “Actually, largely speaking, we know what features drive engagement, we know what features drive spread and we know what features are very hard for young people to raise their eyes from”. If those are detoxified in the system for younger people, they have more interruptions in the experience that you have just so beautifully described. I would really like your thoughts on that. Start there.
Professor Jonathan Haidt: Certainly. I think we all agree that the number one indisputable change that has to happen is that Facebook must be forced to share data on what is happening to the kids and on what its algorithms are doing. There should be no debate around that. The metaphor is this: there is a crime scene, there are a lot of dead kids, and we cannot even look at the crime scene because the crime scene is completely controlled by the lead suspect. We have to beg for permission, and the suspect says, “No, I’m not going to let you see it”. I think what you are suggesting is that maybe there are tweaks that will make it less addictive. But you know what—even if we could make it less addictive, there is still no world in which 12 and 13 year-old girls spend three hours a day looking at photos of other girls and turn out okay.
The basic premise of Instagram, it turns out, is incredibly bad for girls. Girls are much more socially connected and therefore girls are much more susceptible to social contagions. There was an article in the Wall Street Journal a couple of days ago—girls are developing Tourette’s syndrome, a neurological disorder, because if they see other girls with Tourette’s syndrome they develop it. Three hours a day of Instagram is just never going to be okay for 12, 13 or 14 year-old girls. The biggest change that you need to make, and we need to make, is to remove liability protection from any platform that chooses and selects what people see.
Frances Haugen is very clear about this. If it was just a reverse chronological feed, technology connects people and they communicate, and we are not going to stop that. But as soon as a platform is saying, “This is what you are going to see, and that is what you are going to see”, it is like the New York Times except that the New York Times is curating things for some professional standard, not to keep you hooked. As long as the platform is using an algorithm to select what people see, especially children, we should treat the companies like regular businesses. Why on earth do they have liability protection? If it is our fault, if it is America that did that with Section 230 of our Communications Decency Act, I apologise. We need to change that. It will have huge benefits for all the democracy concerns. We are just talking about children.
Baroness Kidron: Yes.
Professor Jonathan Haidt: But that is a small piece of a gigantic global problem. Removing liability protection from any platform that uses algorithms to feed things to people is the most important thing we can do.
Baroness Kidron: I will come to Jim in a minute, but can I push on that particular thing? You said, “Until they have professional standards”. One of the other things that we could do here is to say that there should be minimum standards by which the companies operate in their terms and conditions. The Bill does not currently do that, but a lot of evidence has come in saying that we should impose standards, and that we should impose them to the degree that we do elsewhere in society, picking up on your point that they are just businesses. What would you say to that?
Professor Jonathan Haidt: As a social psychologist witnessing the very rapid transformation of our world, our institutions, our children and our democracy, I think that pushing here and there and begging, even incentivising, will not do very much. There are certain parameters that have changed, and we have to think about changing parameters. Sure, you could try to mandate that they do something, and they will minimally comply. I am not very optimistic about it.
Baroness Kidron: Jim, we worked together on the age appropriate design code, and you have been a great saviour in the battle. I want to ask you a question about “likely to be accessed”, the idea of where we should protect kids. Kids are not always where we want them to be. In the US, you have got stuck on services directed at kids. Here, we have extended it a bit, but the Bill currently only deals with user-to-user services and search. It does not really say that kids are protected wherever they are. What is your take on that? Where should we be offering protections?
Jim Steyer: Beeban, it is great to see you, and thank you very much for your leadership on this. I agree with a lot of what Jon just said. It is great to see that you are ahead of our country in looking at this. There are several things, and I will answer the question in a second. First of all, I underscore Jon’s point about liability and the fact that they have to be held liable for the content. I agree with your broader point that, in safety by design, you have to look at where kids are, not just at little restricted areas where companies would say that is really the only place that matters. The liability issue that Jon raised will help with that.
This goes to some of your question, Beeban. The legislation should address not just harmful content but the amplification and broadcasting of that content. If you regulate that, you are much more likely to be covering young people where they are, not just the specific content that a platform could say was aimed at children. The prior panel was talking about algorithms a lot. The risk assessments should explicitly address amplification and the algorithms that drive what kids see. That should be a primary consideration in the legislation. Essentially, safety by design should be required. We would be happy to work extensively with you offline about that, but that is the overarching principle.
In the context of the algorithms and amplification problem, because that is as much of a problem as, or a greater one than, the content itself, I fully agree that there should be full transparency of algorithms, as Jon just mentioned. The idea that they control the crime scene and do not even allow you to examine it without their permission is a very good analogy. It is an insane model but it is exactly what we have.
Beeban, we have worked with you for years. You cannot trust Facebook and Instagram and so many other platforms to regulate themselves, period. Whether you give the power to Ofcom or another regulator—I heard the last panel say it was difficult to do that—it is worth it if you are talking about the fundamental mental health and well-being of your kids, your teenagers and your society. You have to give the regulator more power, not just for full transparency of the algorithmic broadcasting of the content and the amplification of the content, but for the ability to heavily regulate it and for there to be liability.
Baroness Kidron: Okay. We had a panel earlier today where people were very concerned about the chill on free speech and what that means. You put in protections, you start risk assessing, you look at algorithms, and in the end you make the platforms conservative about what is said, what is done, what is up there, and you start the chilling effect. That is an argument you are very familiar with in your part of the world, so maybe each of you—Jim first—could say how you would argue the case. You have both been very robust. How would you argue the case against the chilling nature of any of this on free speech?
Jim Steyer: First of all, it is a balancing process. I am a constitutional lawyer in the United States; that is what I teach at Stanford. Even in the more free speech-oriented American mindset, there is a balancing process when harms are involved, whether it involves hate speech, pornography or other issues that are obviously damaging to society. First, it is a balancing process, and you all should approach it in that light.
Secondly, when it comes to people under the age of 18, there need to be absolutely clear and additional protections. I agree with what Jon said earlier, or maybe it was Damian in the opening question, about the arbitrary nature of just assigning it to 13 and under and forgetting all the implications. Age verification is such an issue that many people under the age of 13 are on adult platforms in general. There need to be both special protections and balancing for people under the age of 18. It goes to the standard of care that you have, and the duty of care that the companies’ leadership should have when you are talking about specific content and the amplification thereof that is damaging to young people. We can put aside some of the democratic issues that we have all experienced, or the impact on democracy.
You have to have a very high duty of care imposed on the companies and their leadership. As a First Amendment professor, our motto at Common Sense is “Sanity not censorship”. The evidence is clear. It is time for much greater balancing. Leaving the decisions to the Mark Zuckerbergs of the world is an insane strategy, and it has backfired dramatically on my kids, on your kids, on everyone else’s kids, and on our democracies as well. I urge you to be even tougher than you might normally be in evaluating these questions.
Baroness Kidron: Thank you. Jon?
Professor Jonathan Haidt: For the first time, I disagree strongly with Jim by saying it is not a balancing issue. There is no problem with balance here. Balance only becomes a problem if you are focusing on content moderation.
Baroness Kidron: Yes.
Professor Jonathan Haidt: As we do in America, if it is all about, “Do I get to post this?”, it turns out conservatives tend to get more censored. If we are all talking about content moderation, it is a balancing problem. What I am trying to say is, stop it. It is a dead end. You are barking up the wrong tree. Do you know who is up that tree? All the civil libertarians are up that tree. All the free speech people are up that tree. You keep barking up that tree. Stop it. Move to different trees. Tree number one—we have to get kids off these platforms until they are 16, or at least end the social dilemma in which we only let our kids on because everyone else does.
None of us wants our kids on Instagram at 13, and certainly not at 11. When my kids started middle school at 11, they said, “Dad, everyone is on Instagram. Can I have an account?”, and it was very painful for me to say no. Focus on keeping the kids off. As Jim said, teenagers’ brains are really delicate. Focus on keeping kids off as late as possible, and then they will be much less damaged by spending three or four hours a day looking at pictures of other girls. That is the first thing.
Secondly, focus on the amplification, not on who gets to speak. As is often said in these circles, freedom of speech is not freedom of reach. If you are going to be saying, “You can’t say this. You can’t say that”, it is a balancing censorship act and it is hopeless. You should be saying, “You can say whatever you want but we are going to make sure that the company is not amplifying the most toxic stuff”.
I just heard Frances Haugen on a podcast with Tristan Harris. She said that there are all these great ideas from people at Facebook that they never tried. I learned, for example, that they looked at how many comments people make and at how many people users invite to groups. One guy invited 300,000 people to QAnon groups. If they simply put a limit on how many people you can invite, saying you can invite as many people a day as 99% of users do, which is maybe 100, or limiting comments to 100 times a day, and you put on some rate limiters, you are not censoring people. You are just saying, “We need room for everyone to have a voice”. We have to work on the amplification aspects. If you do that, you will have a much bigger impact than if you force them to triple the size of their content moderation team.
Baroness Kidron: Fantastic. Thank you very much.
Q150 Dean Russell: Thank you, Chair. I will come to you first, Jon. In a previous session, I asked a question—I forget which witness, if I am honest—about whether the way that social network platforms tweak content is a bit like the way that drug dealers tweak their product to make it more addictive. The word “addictive” has come up quite a lot. Are we looking at this from a slightly different angle? We are talking about publishers and platforms, but are we actually talking about the fact that we are creating a generation of people with addictive traits because they are addicted to content? These are more like drug companies; it is not a white powder they are taking—it is something else. Is there an analogy to addiction and drugs rather than it being about content and the way that we have seen it for centuries?
Professor Jonathan Haidt: Yes, there is absolutely an analogy to addiction. You have heard the word “dopamine” a lot. Anything that is pleasurable causes a little flash of dopamine. Kids are very vulnerable to that, as we have all seen with our kids playing video games.
In the last few months, I started investing in cryptocurrency. It is so much fun to watch it go up and down that I keep checking. I know I should stop checking. It is crazy and stupid, but I cannot stop myself. I have a fully developed frontal cortex, so what chance does an 11 year-old have when coming up against all the brilliance Facebook has hired? Facebook has hired most of the social psychologists and behavioural psychologists. Addiction is a real issue. It is not that it is going to make the kids addictable for life, it is that it is going to make them anxious and fragile for life.
The frontal cortex myelinates. We have what is called experience-expectant neural development. That is part of our human heritage. If you grow up in a really threatening, dangerous and unpredictable environment, your brain gets set to see threats everywhere. If you grow up in a peaceful, calm, stable place with a reliable attachment figure, your brain is set more towards approach. What happened to us in universities in 2015 was that we were suddenly flooded with kids coming on to campus who saw speech and speakers as violent threats, and we could not understand what was happening. What happened to us was that Instagram changed the brains of kids born in 1996 and later so that when they arrived on campus they were paranoid. They saw danger everywhere—they will be like that for life. Their frontal cortex is set. That is especially true for girls.
I predict here and now, if you care at all about gender and closing gender gaps, you are going to see gender gaps closing for the next five or 10 years as millennial and Generation X women narrow them. In the 2030s, you are going to say, “Why are they expanding? Why are women suddenly no longer rising to the top ranks?” It is because most of them have anxiety disorders. They do not take risks, they are afraid, and they are fragile. Facebook’s motto was “Move fast and break things”. They did. They broke our kids. They are breaking our democracy. We could at least maybe take away liability protection.
Dean Russell: May I ask something on that and then I will come to Jim with a slightly tweaked question on it? With regard to the addiction piece, do we need to be addressing it somehow in the Bill? At the moment, a lot of the conversation is about the 20th century view that my generation would have, which is, “You can choose to read content. You can choose to watch a video. You can choose those things”. The social bit of social media is the bit we do not talk about. We talk about the media bit. It is the social bit that is doing the damage. Do you know whether there is a huge body of research about dopamine and the impact that we should be referencing? Should we be referencing it in the Bill somehow with regard to the nature of harm?
Professor Jonathan Haidt: You cannot start telling people, “Do you know what? You shouldn’t do that so much—you’re addicted”. For adults in America, you can never say that. People get to choose. They are adults. For children, maybe you can, but even then it is really hard. What are we going to do? Are we going to say, “Children, you can only play Fortnite 40 minutes a day”? That is what China is doing. Closed societies can deal with Facebook. Open societies, so far, cannot deal with Facebook. It will be the ruin of us, I believe. The way to go is not telling the users, “You know what—you shouldn’t do this because it is bad for you”. It is telling the dealers, “You shouldn’t produce this product because it is so addictive”. It is the virality. You have to change features of what they are allowed to do, especially if they want access to children.
In America now, we are having a big conversation about whether Facebook is like the tobacco companies because the question is whether there would be big class action lawsuits. I have a better analogy for you. It is not tobacco—it is the British East India Company. Facebook is the British East India Company, chartered by the US Government in this case, and we said, “Go do whatever the hell you want in other countries. We don’t care. Start wars. Start genocides. Do whatever you want to children. We don’t care”. That is what we have done. We have unleashed the British East India Company on the world. As far as I understand it, it ran roughshod over everybody, and it was finally the British Government who did something about it, so please do something. Put us out of our misery.
Dean Russell: Thank you, Jon.
Jim, I have to say first of all that I am the chair of the Film and Production Industry All-Party Parliamentary Group, so I am passionate about film. Through Common Sense Media, you have solved so many family arguments about whether we could watch a film, based on my daughter’s age at the time, when we have referenced your site. At the moment, if somebody wants to watch a film or TV show, they may reference your site and say, “Actually, it is a 15, and my child is 12, so we’re not going to watch it”, but the same child can go on social media at any point and watch a video with real-life murders, real-life brutality, real-life misogyny and so on. Specifically on video content, does there need to be something that recognises ratings of the content that is put on, so that we see platforms almost as broadcasters rather than platforms, if that makes sense?
Jim Steyer: Absolutely. That is a great question. Yes, you should see them as that. They are the same. Look at it from the point of view of the end-user, and, yes, they are. That rating system could easily be, and should be, applied there.
I agree with almost everything Jon said. The British East India Company example is hilarious, Jon. It is true, but it reflects the failure of the United States Government. Let us be honest about that. I am speaking to you from California at 4.30 in the morning, or whatever it is. The failure of our Government to take on these issues will be looked at historically as one of the great failures of our country. That said, I think we are going to see in the coming months leadership in the White House finally—and also, I hope, from Congress—but you have to move more quickly. I guarantee that you will move, in many cases, more quickly than the United States does, and we will follow you, and the same with the EU. That is why we have an office in London, and that is why we have been working in Brussels for the past few years; you are going to be ahead on this.
There are a few things. Yes, I would use the ratings. Thank you very much for your kind words about Common Sense Media. I will tell the 100 or so editors how much their work is appreciated. Obviously, companies should show the exact same kind of ratings system no matter what the platform so that parents and users can have a sense of the content—the nutritional information, if you will, about it—and the age recommendations, et cetera.
The other thing is your question to Jon about addiction. Good luck with your crypto usage, Jon. That is your own issue. Fortunately, I am not doing it. I think it is an addiction issue. You could legislate around manipulative design. I encourage you to include manipulative design as part of the legislation because it is regulatable. It is similar to the fact that both of us are telling you to regulate the amplification and the algorithms and force the complete transparency of that, because the consequences are so significant. When you put it in the terms that Jon is framing it in, whether it is about childhood and adolescence or about democratic norms and institutions, those are the basic issues of our existence.
The idea that we have allowed one company and one person, whose values and judgment you could comment about for a long time, to make decisions with the implications they have for our children, our teens and our democracy is mind-boggling. I urge the committee to make the final legislation stronger than you would normally make it, and to take some risks in both the liability factor that we have been discussing and the full transparency and regulation of some of the algorithmic and amplification behaviours. That includes design because the design is intentional. Jon mentioned that.
Our colleagues who have worked at the companies and left are very familiar with the intentional design that has been done by the engineers and the product development folks, which is based on their surveillance capitalism business model. The point is that you are trying to keep people there, whether they are young people or adults, as long as possible. Engagement is really an arms race for attention—data and attention being the two holy grails. Do not be shy or too cautious in the way you draft the legislation. I would almost like to see, after well over a decade of non-action by Governments, more action than necessary to try to compensate for where we currently are. This committee’s work is critically important, and we will work with you on all of that.
I know you talked earlier about media literacy, digital literacy and digital citizenship work. That has to be an element of what you do, too. Failing to fund that and to make it required for all young people would be a grievous error. The more that young people are able to understand some of the examples that Jon gave and what is being done to them, the more I think they will resist that, or they will at least be educated to some of the downsides. Please include digital literacy and digital citizenship as part of your recommendations as well. That is the field we pioneered, in many ways, at Common Sense Media, and we do it in the UK as well. We would be delighted to work on that with you. Be as strong as you possibly can. We will be with you, and the children and youth of the UK will be grateful to you for your leadership.
Dean Russell: Thank you very much.
Q151 Lord Black of Brentwood: Thank you. What a great rallying call for this committee. Thank you for that. I want to take up the point that both of you made about removing liability from the platforms. In many ways, we have quite a complicated Bill. What you are suggesting would be a way of short-circuiting that. You could do it, I guess, by classifying the platforms as publishers.
Professor Jonathan Haidt: Yes, exactly. They are.
Lord Black of Brentwood: Does that seem to you a sensible way forward? Would it have the effect that you want it to have by just being in UK domestic law when the platforms are, of course, domiciled elsewhere?
Professor Jonathan Haidt: When Facebook was facing down Australia, it did not blink and Australia had to blink. I do not remember the details. Australia is pretty small. Britain is much bigger. The EU might do it. This is the way forward. It is very important to start moving the Overton window. Right now, when I talk to people about this, they say, “Oh, what can you do? Kids are on the platform. We’re not going to change that. They’ll find a way”. The sense of “Nothing we can do” is amazing. At least broach the idea and put it forward, and see what reactions you get. We are not stopping anyone saying anything. We are just saying, “If you are choosing what gets put forward in ways that are world-changing, you are not the telephone company carrying people’s phone calls. You are closer to a newspaper”.
Lord Black of Brentwood: Yes.
Professor Jonathan Haidt: That was a big mistake we made, which has been tremendously beneficial to Facebook in particular. It has now sucked up all the advertising dollars in the world with none of the liability. It is a horrible situation that has been created, and there is no way forward until it is either stripped of Section 230 protection or forced to change its business model in order to keep it. The problem is the business model. As long as the business model is allowed to flourish, democracies will fall. As long as children as young as nine are sucked into that business model, a generation will continue to be weakened, anxious and suicidal.
Lord Black of Brentwood: Jim, is there anything you want to add?
Jim Steyer: I completely agree. I loved it when Jon said do not go up the First Amendment tree. Too many of our friends in the advocacy community are lost over there in the wilderness. I actually think they are publishers. They are the biggest publishers in the history of the world. How are they not held liable for the content they are amplifying? That is why the distinction we are making is very important and clear. You are responsible for the content you amplify. They are amplifying it, in most cases, and broadcasting it through algorithmic processes that none of us could possibly understand, but they are responsible for it. If they were treated the same way that newspapers are or other large broadcasting entities—television, et cetera—it would change overnight.
I will give you an example. Because the US Government have failed over 20 years to pass a strong federal privacy law, in 2018 we wrote and spearheaded the passage of a California privacy law quite similar to GDPR in the EU. The one thing that we gave in on in our negotiations was the liability issue—the idea of whether or not there could be lawsuits against the major companies, which we call a private right of action. Most of the companies agreed with us about privacy. Facebook and Google did not, but the other folks did.
Everyone was worried about the private right of action. That should tell us something. It is because if they are held liable for their businesses they will have to change their business practices. The liability issue is absolutely critical, and you should be bold on it, basically comparing them to publishers. They have lawyered up more than any other companies and lobbyists in the history of the planet. If you step back from all the verbiage and think about it, they are the biggest publishers ever, and they are spreading content at a massive scale, so regulate them that way.
Lord Black of Brentwood: Exactly.
Jim Steyer: We have done very well with other industries, and we have never had these problems before, or they have been quickly moderated in other industries, so treat them that way.
Lord Black of Brentwood: Thanks very much.
The Chair: Thank you. We have a question remotely from Jim Knight.
Q152 Lord Knight of Weymouth: Thank you both for coming and being so animated at 4.30 in the morning. I hate to imagine how lively you would be after a glass of wine.
I am interested in getting back to media literacy. The Bill tasks our regulator, Ofcom, with media literacy. Is that sufficient for children, or are there actors that we should be mandating to be active in media literacy other than the regulator? Do you have any evidence and research around how we improve the media literacy of parents? Jon, in respect of what you said about barking up the tree of getting kids off social media until they are 16, we have a generation of parents who did not grow up with it. They have no idea how to parent it. Give us some help with that challenge, too.
Jim Steyer: Jim K, that is a very good question. I will be very frank with you because we have worked on this. I am sure many of you know Lord Vaizey. He has been our lead colleague in the UK on it. Let me be very blunt with you. Although I have spoken with Melanie Dawes on several occasions about the media literacy issue, I do not think Ofcom is where it belongs. I think it belongs in the Education Department. It is so important. I will get to the parenting issue in a second.
It seems like such a no-brainer. I will be very honest with you and tell you exactly what our experience on this has been in the UK, for better and for worse. Everybody understands it is important. People understand that you basically have to teach digital literacy, media literacy and digital citizenship, which is the safe ethical responsible use of platforms, basically from kindergarten on. Part of that is that you are also educating the teachers and parents at that point.
At Common Sense Media, other than the ratings and reviews that your colleague mentioned earlier and the advocacy work we do, the great contribution we have made in the past 15 years, in creating the field of digital literacy and citizenship with some colleagues at Harvard—as a Stanford professor that is very painful for me to admit, but the leading researchers were at Harvard, Howard Gardner being our closest colleague—is the development of an extremely robust, sophisticated curriculum, K through 12, that teaches young people, as well as parents and educators, the core basics of digital literacy and media literacy for a digital world.
You are correct: there has to be a parent education component as well. Our experience in the UK has been that Wales and Scotland adopted it at the top, so we had massive adoption across Scotland and Wales. Remember, it has to be culturally adapted. The Queen’s English is different from American English, obviously. It has been very successful. It has been more complicated in Britain because you had a programme that was put together by the industry called Internet Matters. By the way, Google has its own media literacy programme called Be Internet Awesome. It is done by the industry. You are getting going and the industry says, “Hey, we’ll take care of this”. That is a flawed solution from its inception. The idea that the industry will do high-quality media literacy or digital literacy and citizenship is crazy. You should take the curriculum that we have built, have spent $17-plus million developing, and that is in the vast majority of American schools and is successful. I would adapt it to the UK and then I would have the Education Department take the lead in making it happen. I would also put resources into teacher training and professional development.
There is a parent education component to it, Jim. You also have to train the teachers because they are the same age as many of the parents and they also have to be educated. You need fundamental investment in that. I think it belongs more on the education side of your Government. You can tax the companies to pay for it. Letting Google do its Be Internet Awesome or having the major companies themselves design it is crazy. There are people like us who specialise in this. There are scholars who have developed the field. It is really up to you guys to put the resources in place and then require it in every school K through 12.
You would not let a kid get behind the wheel of a car without some degree of driver education. To me, media literacy, particularly digital literacy and citizenship taught through this extraordinarily robust curriculum, is basically like driver education for the internet and social media, and you ought to have it from kindergarten through 12, and you ought to train the teachers and the parents at the same time. We will get that done. You could do that in the next year, and you would take care of the issue in the UK.
Lord Knight of Weymouth: That is really helpful. Jonathan, what do you have to add?
Professor Jonathan Haidt: Here, I will slightly disagree with Jim. It certainly is a good idea to do it, but with driver education kids actually want to drive, and they would rather be good drivers than bad drivers. If they have the information to be a good driver, they will use it. In America, one of our folk figures is PT Barnum, a circus magnate who said, “Nobody ever lost a penny underestimating the intelligence of the American people”. The same thing goes here. Nobody ever lost a penny underestimating the effectiveness of public health education when it was up against social pressures. We can tell people all we want about how bad this stuff is and how you should be careful, but the kids are trapped in an overwhelming social dilemma, which is, “Everybody else is on Instagram so I have to be, even though I know it is bad for me”.
The parents are trapped in the same dilemma. None of us wants our kids on, but the kids say, “I’ll be excluded”, and we do not want our kids excluded. If we try, they lie to us, as they lie to Facebook. I had a call with Mark Zuckerberg. He reached out to me to talk about polarisation. I created a fake account for my daughter just before I called him to make sure that there is no obstacle and you can make up whatever you want. He told me, “We don’t allow underage. We don’t allow under 13”. That is what he said to Congress as well. The urgent thing, if you want parent literacy, is that you have to break the social dilemma. The way to do that is that the default is that kids are not on until they are 16, and, if worst comes to worst, not until 13. Even that would be an improvement. We need to hold them legally responsible for underage use.
If the tobacco industry had cigarette machines all over and they said clearly, “You must be 18, but what are you going to do if the kids lie?”, we would not say, “Oh, yeah, you’re right, the kids lied. Okay, we tried”. The fact is that 11 and 12 year-olds are routinely on Instagram. If the company were 95% effective, that would be fine. That would break the social dilemma. Only a few kids are on. But it is 1% effective—it does not kick off anybody. It should be sued to oblivion for that. We know the marketing strategy. Instagram knows it is losing to TikTok. It is working really hard, it says, to get 13 year-olds. It is going after the 10 and 11 year-olds. What it is doing by recruiting pre-teen kids is criminal. That has to stop. This is so much more important than any kind of education of the parents. We have to break the social dilemma, which is why all the kids end up on it.
Jim Steyer: I do not think what Jon said eliminates the absolute critical value of media literacy and digital literacy and citizenship education for kids.
Professor Jonathan Haidt: Yes, I agree.
Jim Steyer: They need to be in parallel. I still think you should invest in and require that from kindergarten on. It is not a substitute for some of what Jon is saying, but it is an absolutely valuable investment. The more young people understand what is happening to them, the better. The more that they are educated about the process, including the manipulative design, the addictive stuff, and all the self-esteem and privacy issues that are hard to teach to a kindergartener but you can certainly teach to a 7th or 8th grader or a high-schooler, the more value they get. I do not think what I am saying is at all in contrast to what Jon is saying. I would invest in both.
Professor Jonathan Haidt: Agreed.
Lord Knight of Weymouth: Thank you both very much. That was great.
Q153 The Chair: I have a final question. We have a Bill in the UK that is going to get rid of protection from liability as you know it in America. There will be no UK equivalent of Section 230; the companies would be liable for not just what they host but what they direct people to, and the regulator would be able to investigate them for what they do and demand access to data and information as part of those investigations.
The liability will be really clear on the company’s role in hosting illegal content or promoting illegal content to users. The area of definition, which is harder to get right in the Bill, is around harmful content that the companies are actively promoting to users. We know it is harmful to them, but it is not necessarily illegal at the moment. Do you have any thoughts on that? How do we successfully define the responsibilities the company has for which it would be held liable if it was found in breach of the guidance?
Professor Jonathan Haidt: There is no way to solve that problem. It is the same problem that we have been talking about. Most of what we are concerned about is harmful but not illegal. I will give you one example of what I learned from listening to Frances Haugen, relating to the person who invited 300,000 people to join QAnon groups. Do you know what Facebook does? If you get an invitation to join QAnon—a conspiracy group—and you do not click on it, Facebook’s algorithm says, “You might be interested to see what this group has to say”, and they will put into your feed stuff from those conspiracy groups. One person can create millions and millions of impressions from crazy conspiracy theory stuff. That stuff is not illegal. There is nothing illegal about QAnon groups.
The Chair: I listened to the same podcast that you referenced. We need more circuit breakers for legal but harmful things to slow down the virality of content.
Professor Jonathan Haidt: Yes, that would help. I do not think you are going to be able to hold companies to account by saying, “You are liable because you printed this one thing that we judge to be false”. Are you not allowed to lie? It would be very hard to do it that way, post by post. Hold them liable for the effects. If it leads to violence, they are liable. That is what I would think about it.
The Chair: Jim, do you have anything to add?
Jim Steyer: I have three things. I reiterate what we have both said and what you heard on some of the earlier panels. We need a really big focus on the algorithms and the amplification of the content. You have to look at that. Secondly, you have to make sure that the legislation is enforceable. That speaks to the liability issue that Jon keeps bringing us back to. You need to make sure that the legislation is enforceable, otherwise it is not going to be credible. I would be as strong and tough as you possibly can be on that. You are going to get lobbied heavily by the most lawyered-up industry in the history of the planet, so you should be prepared for that, but you should use common sense.
I remind you, to Jim Knight’s point, about media literacy and digital literacy and citizenship. You should ensure that all UK schools have the funding and resources for professional training and learning support for kids and teachers. At the end of the day, you have the chance to lead not just the UK and your kids’ future but the world. The tougher you are and the stronger you are, the more chance you have with this legislation to be the most important standard bearer across the world. You will set the example that the US will then grudgingly follow. If we can work with you offline on some of the specific definitions, we are happy to wordsmith some of that with you. We know Beeban. We really appreciate the leadership of everybody on the committee. We are here to help. At the end of the day, our kids will be the biggest beneficiaries when you do this job very well.
Professor Jonathan Haidt: What Jim said is exactly right, even more than he said. One of the big things I learned from listening to Frances Haugen is that what we see on Facebook and Instagram is by far the best version of it because we speak English, and they put most of their moderation resources into the English language. They also do French and Italian. They do not touch the other 200 languages. They do not do anything for most of the world. What we see is the best-behaved Facebook and Instagram possible. There are children all over the world who are suffering much more from it. There is no filter, no content moderation. Nobody is looking out for them. You are the premier Parliament, the premier legislative body that is looking at this. Of course, your primary duty is to children in the UK, but the rest of the world’s children will benefit if you can make progress on this.
The Chair: Great. Jim Steyer and Jonathan Haidt, thank you so much for your evidence. At the end of nearly four hours of public evidence, you have given us all a boost.