Science and Technology Committee
Oral evidence: Impact of social media and screen-use on young people’s health, HC 822
Tuesday 3 July 2018
Ordered by the House of Commons to be published on 3 July 2018.
Members present: Norman Lamb (Chair); Vicky Ford; Bill Grant; Darren Jones; Liz Kendall; Stephen Metcalfe; Carol Monaghan; Damien Moore; Neil O’Brien; Graham Stringer; Martin Whitfield.
Questions 214 - 352
Witnesses
I: Susie Hargreaves OBE, Chief Executive, Internet Watch Foundation; David Austin OBE, CEO, British Board of Film Classification; and Emily Cherry, Assistant Director of Policy and Public Affairs, Barnardo’s.
II: Anna Clark, Cardinus Risk Management; Dr Vicky Goodyear, Lecturer in Pedagogy in Sport, Physical Activity and Health, University of Birmingham; Dr Heather Woods, Lecturer, School of Psychology, University of Glasgow; Dr Max Davie, Officer for Health Promotion and Mental Health Lead, Royal College of Paediatrics and Child Health; and Professor Peter Fonagy, National Clinical Adviser on children and young people’s mental health, NHS England.
Written evidence from witnesses:
– British Board of Film Classification
– Dr Heather Woods, University of Glasgow
– Royal College of Paediatrics and Child Health
Witnesses: Susie Hargreaves OBE, David Austin OBE and Emily Cherry.
Q214 Chair: Welcome, all of you. Thank you very much for coming along this morning. May we start by getting you to introduce yourselves? We have a lot to get through, so I would really appreciate it if you gave succinct answers. Don’t feel that all three of you have to answer every question. Hopefully, we will then get through the questions we have for you.
David Austin: I am David Austin, chief executive of the BBFC, the UK’s independent regulator of film and video. We have just been designated by the Government to be the regulator of online pornography in the UK under the Digital Economy Act.
Susie Hargreaves: Good morning. My name is Susie Hargreaves. I am chief executive of the Internet Watch Foundation, which is the UK hotline for the reporting and removal of online child sexual abuse. We are a self-regulatory body, funded by the internet industry—around 130 companies. We operate globally and are part of a trusted triangle between ourselves, Government law enforcement and the technology industry.
Emily Cherry: Good morning. I am Emily Cherry, the assistant director of policy and public affairs at the children’s charity Barnardo’s. We are the largest UK children’s charity. We have over 1,000 services across the UK. Last year, we worked with over 300,000 vulnerable children and families.
Q215 Chair: Thanks very much. We are going to consider the type of content children and young people are exposed to online and on social media. I am not sure which of you feels able to comment on this, but can you talk a bit about the sort of inappropriate content children are exposed to and what causes you concern?
Emily Cherry: Fourteen years ago, we at Barnardo’s wrote a report called “Just one click”. That was before the likes of WhatsApp, Snapchat and Instagram were born into the lives of children. In 2015, we wrote another report, “Digital dangers”. For us, the type of content that is really concerning for children is child sexual abuse imagery and the grooming of children by offenders. We are starting to see a very big emerging pattern of children being groomed by criminal gangs, for both sexual exploitation and criminal exploitation.
Q216 Chair: Are you seeing a clear trend of increase?
Emily Cherry: We are seeing increases coming in via our services. We have seen increases to our child sexual abuse and exploitation services over the last few years.
Q217 Chair: Can you be sure that that is due to more illegal activity going on, rather than better reporting?
Emily Cherry: It could be a mix of better reporting and more illegal activity. We cannot rule that out. When we looked at it in our “Digital dangers” report, our practitioners told us that, in a three-year period, the percentage of referrals for child sexual abuse with an internet aspect had gone from 20% of young people in CSA services to 75% of the referrals coming through the doors.
Q218 Chair: Was the total number of referrals going up, as well as the mix changing from offline to online?
Emily Cherry: Yes. Barnardo’s is commissioned by local authority areas. We cannot always say that the number of services is going up, because we change our service portfolio based on local need, but the number of children we are seeing for whom there is an internet-enabled aspect to the abuse that they are experiencing has certainly been on the increase.
David Austin: We are in the middle of a large-scale public consultation that we carry out every four or five years to ask the public what issues concern them in film content, video content and on websites. Although the research has not finished—we will publish it around the end of the year or at the beginning of the new year—a number of issues are coming through.
The first is sexual violence. There is concern about the classification of sexual violence and exposure to sexually violent content. We already take a very strict line in relation to sexual violence. We remove some content that is of particular concern—for example, content that promotes the rape myth—but the public seem to be saying, “We want you to be even stricter in how you classify sexual violence.”
The second is discriminatory language and behaviour. Since we carried out research in 2009, concerns about exposure to discriminatory language and behaviour, both online and offline, have increased. We have tightened up our standards as a result.
The third area, which is coming through loud and clear from parents, is mental health. We are seeing concerns about depictions of pro-anorexic content, self-harm and suicide. We already take a strict line. We work with suicide prevention experts. The Samaritans feed directly into our policy on how we classify such content. SelfharmUK does the same.
The fourth area is pornography. There are all sorts of protections offline, where we regulate a great deal of pornography, but, largely, those protections do not exist online. Pornography is just one click away. Under the new Digital Economy Act, we are looking for a step change in behaviour from the big commercial porn companies. We want them to have effective age verification and to put their content behind controls, to stop children stumbling across online pornography.
Susie Hargreaves: The Internet Watch Foundation deals only with illegal content. We deal not with harmful content, but with illegal content to which no one should have access.
The important thing to appreciate with child sexual abuse is that it covers from the age of zero up to 18. There is a huge difference between the child sexual abuse of a two-year-old and the self-generated images of a 17-year-old that are unwittingly shared online. We are seeing that the younger the child is, the more serious the level of abuse is. In the last three years, 65% of abuse of nought to two-year-olds was what we call category A abuse, which is the most serious form of abuse.
At the other end of the spectrum—in the 13-plus age range—we are seeing an increase in the amount of abuse where young people are self-generating images, and those are being shared in many ways. Normally, they are being coerced in some way to do that or the images are being shared without their permission. Those tend to be lower-level images, but the young people themselves are actively participating in them, even if it is under coercion. Obviously, they were there when the pictures were taken. Therefore, the range of what we are trying to deal with is massive.
Q219 Chair: What are the overall trends in what you are dealing with and taking down? You are taking stuff off, or arranging for stuff to be taken off.
Susie Hargreaves: Yes. Last year, we removed 78,000 web pages. A web page could have one to 100 or 1,000 images or videos. Of course, these are hosted all over the world. The UK has an exemplary record in hosting. Less than 0.2% is hosted in the UK. If we find it, we remove it in under two hours.
Q220 Chair: How does the 78,000 figure for last year compare with the previous years’ figures?
Susie Hargreaves: In the previous year, we took down 57,000. In the year before that, we took down about 70,000.
Q221 Chair: Did you say 17,000 or 70,000?
Susie Hargreaves: Seventy thousand. One of the reasons it dipped in the previous year was not that there was a drop in content, but that we were working on different technologies. We have a hashing technology—which I can explain later, if that would be of use—where we were working with the Home Office and law enforcement on grading and categorising images. That took us away from our ability to search proactively for images. Therefore, the drop does not actually represent a fall in the numbers. Last year, there was a more realistic figure—78,000.
Q222 Chair: Does it suggest that the numbers are relatively stable overall? Do you see anything else in terms of trends?
Susie Hargreaves: No. The reality is that nobody knows how many images are out there. The vast majority of images and videos are duplicates. As more people come online around the world, more images and videos are shared. It is a kind of war of attrition—you just have to keep going at it and removing them. There is no definitive number of images out there, but we would say that the trend is that it is continuing to increase. Of course, there are new threats coming on all the time, such as grooming and live streaming.
Q223 Vicky Ford: You might want to point out how you compare with the rest of the world and what the trend has been, because what IWF has done is world class.
Susie Hargreaves: When we started 22 years ago, 18% of child sexual abuse was hosted in the UK. Since 2004, it has been less than 1%. It was 0.2% last year. You can compare that with other countries. The Netherlands hosts 35% and the US about 28%. The UK has an absolutely exemplary record on zero tolerance for hosting.
Q224 Chair: That is very good to hear. Presumably, it means that it is possible to confront this more effectively.
Susie Hargreaves: If every country in the world did what we do, it would have a massive effect. One of the reasons we are able to do what we do is that we are a self-regulatory body. We are governed by an MOU between the CPS and the National Police Chiefs Council, which enables us to look at content and take action without having to go to court. We can work very quickly, because all the industry works with us on a voluntary basis.
Q225 Chair: Have other countries copied you?
Susie Hargreaves: All countries are different. None of them has exactly the same arrangement as we have. In many countries, you have to go to court and get a court order to take the content down. We are in quite a unique position as regards the way in which we work. We are also one of only two countries in the world that search proactively for content.
Q226 Chair: It is extraordinary.
Social media sites often say that they are appropriate for those aged 13 and above. How easy is it for under-13s to get access?
Emily Cherry: Far too easy. We have seen many children come through Barnardo’s doors because of this, particularly in things like our harmful sexual behaviour services. That is where children are perpetrating sexual abuse against other children, which is an increasing concern for us. We are seeing children come through the doors at a younger and younger age, below the age of criminal responsibility. We are seeing children as young as five and an increase in the number of children aged eight to 10 coming through our doors for this. They are telling us that they are on social media sites. The technology-assisted side of harmful sexual behaviour is definitely growing, from the numbers that we see coming through our doors.
They are getting on to these sites too easily. They are then able to interact with potential groomers and to get access to pornography. As David said, we need much stricter regulation. We also need much more robust age verification on all sites, which would help children not to go on to these sites in the first place without parents and carers being able to control it.
Q227 Chair: Susie, your remit deals with child sexual abuse. Is there a case for extending the remit to cover other harmful content, such as violence?
Susie Hargreaves: The thing about child sexual abuse is that it is very clearly defined in law. In terms of legislation, it is absolutely clear what we are doing. It is not subjective. It is very clear, and industry can trust our judgment. It is either illegal or not illegal. It is really simple, which makes the self-regulatory model really effective in the circumstances in which we operate. Our model is useful to look at as a reference point for other types of content, but it is perhaps not the ideal model.
There are other areas of content that we are interested in taking on, potentially, if they fall within the child sexual abuse remit.
Q228 Chair: Which areas are you interested in?
Susie Hargreaves: We are in discussion with the Home Office about whether there is a role that the IWF could play in relation to grooming. These are very early discussions. We are always working with technology companies to look at the technological threats. We have a number of hackathons and work very closely with those involved. We have engineers in residence and try to look at the new technological challenges, but it is all within the field of child sexual abuse.
We are an inch wide and a mile deep, so everybody can get behind what we are trying to do. It is illegal across the world. It is a really simple proposition. When you come to definitions of what is harmful, it is much more complex. That legal definition is what is required for a model like ours to work.
Q229 Chair: Emily, in the written evidence, some pointed to Tumblr blogs containing very inappropriate graphic content. There is no requirement to sign up; you can just get access. Should there be a requirement for all social media sites to log in, with some age verification, before you get on to the site?
Emily Cherry: Yes. We believe absolutely that that is needed. We talk about the safety-by-design principle. For any site, any app or any game that children will play, there should be a set of rules and features that look at things like privacy settings, location features and the role of parents and carers in helping to verify how the child is using the site. We absolutely need to have those safety-by-design principles in place.
To take a real-world analogy, nobody can launch a new shop children will go into or a new playground where children can play without having health and safety features in place. Why should the online world be any different? It is really important that, through the White Paper and the work that the Government are doing in this area, we look at how all sites, not just the bigger sites, need to have those safety-by-design principles.
Q230 Vicky Ford: Before I start, I should declare that I have been an Internet Watch Foundation champion for seven years, working with Susie and supporting her work.
Many social media sites have age restriction as part of the terms of use. Do you think that there is a need for more rigorous age verification? How could that be made to work?
David Austin: We carried out a public consultation, as part of our new role as the age verification regulator. We finished that consultation at the end of May and handed the results to Government.
We set out four principles of effective age verification. I come back to Emily’s point. The age verification that social media sites have does not meet the four principles. Those principles will come to you for debate in the second half of the year. It is fairly straightforward technologically for those social media sites that contain pornography, for example, to have AV restrictions on the pornography itself, without encompassing the whole service. It is technologically doable.
We have been working with the AV industry for the last 12 months. Over that period, we have seen massive technological innovation. A year ago, the industry was saying, “We can’t age-verify at a reasonable cost. It will cost us £1 to £1.50 each time we age-verify.” The progress it has made over the last 12 months means that now it is free or costs only a fraction of a penny to age-verify. We have seen massive technological innovation. In the Government’s recent paper on the internet safety strategy, we saw that they are at least considering more robust age verification in relation to social media.
Q231 Vicky Ford: Would it make a difference?
David Austin: Yes. We all know of a tech-savvy 17-year-old who is determined to watch porn and can get around any control that you put in place. However, you have heard from other witnesses that around half of children, particularly young children, stumble across pornography online, rather than actively seek it out. Age verification would have a massive impact in stopping that happening.
Q232 Chair: We have the technology to stop that, if we choose to implement it.
David Austin: There are always ways around any law or restriction that you put in place, if you are determined enough and tech savvy enough. For huge numbers of children, it would be a really significant protection. When the Government launched the Digital Economy Bill, their own figures showed that 1.4 million children from this country visited a pornographic website. The NSPCC reckons that half of those children accidentally stumble across that. Therefore, I think that AV would have a really significant effect in protecting children.
Q233 Darren Jones: I have a question about how we age-verify—what we check. The original conversation was around things like credit cards and passports. I suppose that that makes sense if you are 18 and above, but I want to go to the broader question of age verification for 13-year-olds. When we were doing the Data Protection Act, the response from industry was that it is really hard to verify people below 18. I feel that that is unacceptable. Do you have any ideas about how we can age-verify people who are under 18 if they do not have a credit card or a passport?
David Austin: Obviously, our focus has been on 18-year-olds. There are plenty of ways of doing that. You can age-verify by having them go to a retail outlet and buy a card with an anonymous 16-digit number on it, or you can use a credit card. For children, it is more difficult, but it is definitely doable; you just need a different database. Schools hand out the CitizenCard. It is a project that they have in co-operation with Yoti, which is one of the age verification providers. That kind of database can be used for 13-year-olds. The principles remain the same. You just need a proper database.
Q234 Vicky Ford: Again, I have a declaration of interest. Yoti is based in my constituency.
Emma, you published a report about live streaming. You said that live streaming is used by adults to stream when they are sexually abusing children online. How widespread is it? What sort of platforms are used? Periscope is one that has been named. What others are used?
Emily Cherry: We published a report that looked at children’s experience of using live streaming. Over half of children, particularly between the age of 10 and 15, were regularly using live streaming apps—the likes of Twitter and Periscope. They talked about using Facebook Live, Musical.ly, Live.ly and YouTube Live. Children are using a whole range of sites to live-stream.
This Committee is looking at the impact on mental health and wellbeing. One thing that the survey told us was that over half of children regretted posting content after posting it. They are putting out live-streaming content and then experiencing negative comments, trolling and, potentially, adults grooming them. They are not getting the right education around the risks and dangers of live streaming. We certainly need to strengthen that in this country.
The mechanisms on those sites are not robust enough in helping children to understand the risks and dangers. For example, they could be able to live-stream only to parent-approved contacts—people within their network. We know that live streaming can be fun for children, if it is done in the right way and they are educated, but it is a real concern to us.
I cannot give you prevalence figures for sites such as Musical.ly and Live.ly, because they are global sites, but we know that children are using them.
In BBC and Channel 4 News reports that I did a few years ago, I saw myself that you almost have predatory adults live-grooming children. You see children just sitting there innocently, in their bedrooms, talking about their day, with very innocuous content. Then you start to see adults coming on, almost with a pack mentality, and text-messaging, “Can you lift up your top?” You see the child respond in real time, “What do you mean, ‘Can I lift up my top?’” A whole series of messages then go back to the child, asking them to do things on camera. There needs to be much more robust support in place to stop that. There must be live moderation, to make sure that children do not get those kinds of messages.
Q235 Vicky Ford: Susie, you use different technology to find the sexual abuse images. Can you describe that? Can it be used to stop the live streaming and to identify risks of that type?
Susie Hargreaves: On the child sexual abuse side, a lot of live streaming involves predators in the UK having live sex on demand with children in developing countries. That is a real, key problem on which law enforcement is very focused. At the moment, there is not the technology to detect when that is happening in a live moment. What happens for us is that it may be recorded and then come on to sexual abuse websites. That content will reappear.
At the moment, we have a technology that enables us to apply a digital fingerprint to any images and, now, videos that we have seen before. We can then go out and use our crawling technology to search for them, to try to bring down the duplicates so that the children in them are not revictimised. We are not yet able to stop it at source.
Q236 Vicky Ford: If you have seen an image before, you can stop it reappearing, technically, but you cannot—
Susie Hargreaves: Yes. Catching it in the moment is still very much a law enforcement issue.
Q237 Vicky Ford: You have talked a bit about British perpetrators abusing children in other countries. How much does the fact that the internet is a global phenomenon affect your work? What are you doing to work with other countries? What is the best way for the UK to be involved in the international challenge?
Susie Hargreaves: As you rightly say, it is a global challenge. Of course, the majority of internet users are not in the UK. We are dealing with the problem on a global scale. Our membership is broadly international. We have about 136 companies, including the big US companies: Apple, Amazon, Facebook, Microsoft and Google. We also have to work with industry across the world. We are currently in the middle of a long, ongoing discussion with all the major internet service providers in India to bring them online.
The more companies can engage, step up and take responsibility for their platforms, the more they can take services, work together and do what they can to remove this content from their platforms.
Government and law enforcement need to step up as well. That is why I am part, as many people are, of WePROTECT, which is an international initiative, formed by David Cameron, to bring all the countries online. There is a model national response that shows how people can build their capacity and what the key ingredients are that you need to have in place to fight online child sexual abuse.
However, it varies massively from one country to another. We currently have 22 reporting portals around the world for countries that do not have a hotline to report child sexual abuse. We will have another 30 by the end of 2020. The sad story is that it is a really big issue here, but it is not a really big issue in lots of other countries.
Q238 Chair: Do you find resistance when you approach companies in other countries and other Governments?
Susie Hargreaves: Absolutely. For instance, one of the reasons we do our portals is that it is a really cheap solution for them to have a reporting page in their country that comes to us in the UK. We will deal with all the content. The cost of setting up a hotline is exorbitant. Child sexual abuse online probably comes pretty low on the list of key issues that people are trying to deal with.
As the internet is used more broadly in developing countries, people are going straight into 4G and are having access to this content straightaway.
We always say that there are three pillars to this. One is organisations like us. Technology is a way of dealing with it, but technological solutions will go only so far. Education is absolutely key, from people being educated about the issue or zero tolerance through to young people being able to find ways to protect themselves more readily online. The third pillar is legislation. You need all those pillars in place to attack the problem.
It is very different in some countries. India is a great example, because the internet service providers there just do not see that it is their responsibility. That is exactly what happened in the UK 20 years ago. They have the “mere conduit” defence. You can see a step change with internet companies. I have seen it when apps come on. They say, “It is nothing to do with us. We are just a conduit.” As Emily said, people really have to start taking on board safety by design.
Q239 Vicky Ford: Talking of safety by design, if we can have age verification for pornographic websites, why can’t we have it for social media websites?
David Austin: Technically, you can. In the Digital Economy Act, the Government’s focus was on targeting commercial pornographic companies—those companies whose business model is to make money out of pornography. Effectively, that excluded social media. In itself, targeting commercial pornography companies will make a big difference. We talked about the 1.4 million British children in a month who access pornography.
Over the first year of implementation of this legislation, we will look closely at how behaviour changes. We will work with charities such as Barnardo’s, which is part of our charity working group, the NSPCC and others that talk to children regularly about this kind of issue. There is a legal obligation on us to report back to the Government 12 months after implementation to say what has and has not worked well. If after 12 months social media are an issue in relation to pornography, we will certainly make that clear.
Q240 Chair: Do we not know that it is an issue now?
David Austin: It is an issue now.
Q241 Chair: Why do we have to wait a year?
David Austin: Parliament has given us our role, which is to focus on commercial pornographic services. That is what we are going to do. We are able to talk to social media companies about pornography on their services. For example, the legislation defines a group of companies called “ancillary service providers”—people who enable and facilitate the making available of pornography online. They include advertisers, search engines and social media companies.
We have created guidance, which will come to you for scrutiny later in the year, about what those companies are and what we expect them to do. We can say to a social media platform, “We have come across an account on your platform that is driving traffic to a commercial pornographic service that is violating the terms of the Digital Economy Act,” and ask it to close down that account.
Q242 Vicky Ford: You can only ask them. You cannot force them.
David Austin: That is correct. We can only ask them.
Q243 Vicky Ford: Does the power need to be stronger?
David Austin: We do not know how they are going to react. At this stage, we can ask them. That is exactly what we are going to do. After 12 months, we will come back. If they do not take any action in response to us, there may be a case for saying that the power needs to be stronger. If they do, maybe the power is sufficient.
Q244 Vicky Ford: May I suggest that, at the minimum, you should not wait 12 months before saying whether the power needs to be stronger, if they are not reacting? This is non-compliant commercial pornography. If the social media companies are asked, are given notice and do not take it down, that should be more—
Chair: I am conscious of the time. Can we have a quick answer?
David Austin: In that case, we can take action against the non-compliant commercial pornographic service itself. We can instruct ISPs to block, for example. Possibly the most significant power is that we can ask payment service providers, such as Visa and Mastercard, to withdraw payment to the non-compliant service.
Q245 Vicky Ford: So you have other tools.
David Austin: We can ask them, but they have already told us that they will take action when we ask. That is a potentially helpful measure.
Q246 Vicky Ford: This is my final question. I am sorry; this was a long block. Age verification of pornography was supposed to come into effect this April? Why has it not done so?
David Austin: I know that the media have been talking about a delay. As far as the BBFC is concerned, there has not been a delay. We were designated on 21 February, after Parliament debated whether we should be designated in February.
There are a certain number of processes we have to go through to satisfy the legislation. They include creating guidance on what is effective AV and what we want ASPs to do. We have to consult on that guidance, which we have done. We have created the guidance and have consulted on it. We have given the results of that consultation to Government. Government have to table it, and you have to scrutinise it.
The one thing that has changed, which is really important, is that the Government have said that, once you have approved the guidance, as I hope you will, there will be a three-month window to enable AV providers and the big commercial porn companies to get their house in order, so that they are compliant with the legislation. That is really important, because what we want from day one is the maximum amount of compliance.
Q247 Neil O'Brien: This is a question for Susie. You have published a report saying that children are coerced or pressured into exposing themselves on social media in order to get likes. First, could you say a little about the examples that you have encountered? Secondly, is there anything that you think the social media companies themselves could be doing to solve the problem?
Susie Hargreaves: Recently we published a report on the use of webcams. We looked at young people in their bedrooms, on their own, who were being filmed. Clearly, they were being coerced. That film was then taken and harvested on paedophilic websites. In one case, we saw a child we assessed as being as young as five. Often these are children in nice bedrooms. Parents are oblivious to what is happening there. I think that that is the report you were referring to.
Q248 Neil O'Brien: It is. What service was the child aged five on?
Susie Hargreaves: We do not know, because we do not know what the original source was. We know that they were being filmed in their bedroom. We were very keen to promote the idea of appropriate supervision, where there is a camera and an internet-enabled device.
Q249 Neil O'Brien: Are you finding that this is all about coercion? To some extent, is it about older children feeling some kind of more gentle social pressure to do it? How do you distinguish between the two?
Susie Hargreaves: It is both, with respect. Obviously, the younger children are not in a position to make an informed decision about things. When you see stuff involving very young children, clearly the level of coercion is appalling. There is a trend for older children to go on some of the sites that Emily mentioned. Sometimes the content does not constitute illegal content, but clearly it shows inappropriate behaviour in order to get likes and to move up the rankings.
Less than 1% of the illegal content that we removed last year was on social media. Sixty-nine per cent was on image hosting boards. The majority of the content is not on social media. The bigger companies—I am talking about the ones that are within our membership—have stepped up and have zero tolerance of child sexual abuse content. I am not saying that there are not issues in other areas of content. Where it is clearly child sexual abuse, they take our hash list, ensure that the images are not uploaded in the first place and do everything they can to work very closely with us. Emily is probably in a better position than I am to talk about the harmful behaviour of young people.
Q250 Neil O'Brien: Is there anything that you think the social media platforms themselves could do about the harmful, rather than the illegal, material? In the IWF report, you mention a child aged 12 who was exposing herself online and saying that she would stop the broadcast if people did not continue to like the content that was being provided. Do you think that there is anything more that the social media companies could do to stop that kind of thing happening?
Susie Hargreaves: Where we know the original provenance, we will always go and talk to that company and bring it into membership. Snap joined recently. We work with companies so that they can take our services and work with us to show their approach to the problem. I will pass over to Emily, who will pick up the point.
Emily Cherry: There are a couple of things that I would like to say before I get into the context of your question. One thing that Barnardo’s has seen is quite a shift in the vulnerability of children coming in for our services. We are seeing young people from loving, stable family homes coming in and needing support. That has been around parents not understanding the rules and giving children early access to material, websites and technology without an understanding of the risks and the dangers.
You asked what more social media companies should be doing around here. It is not just an industry responsibility. Yes, there is more that industry can do, but this is where we need concerted action. We need leadership from Government, in the form of an education campaign to parents and children, backed up with incredibly good-quality RSE—relationship and sex education. We know that the Government are making that statutory, but it must also be made statutory within personal, social and health education. Children need the best world-class education, teachers and all social workers need world-class training in how to manage the online world, and parents need that guidance as well.
UKCCIS has already started to do some work in this area. There is a UKCCIS education framework, which talks about the different development stages of children and breaks that down into ages—what children should be doing at each different age and development stage. We are starting to get some action there.
All of that needs to be underpinned by social media companies not giving children early access. There are things that they can do. There are grooming tools. If a child is sharing an image of themselves, social media companies can send push notifications to that child. That educates the child and says, effectively, “Are you sure?” and, “Here is where you go to get support,” be that a trusted adult or—
Q251 Neil O'Brien: How does that work? One question that I had for you was whether there is anything about the design of these platforms at the moment that, unwittingly or wittingly, encourages children to post content that may not be appropriate for their age.
Secondly, how would you go about doing what you have just described? If a child was exposing themselves online, how would you alert them or their parents with a push notification? How would you spot that?
Emily Cherry: This is where you need a mix. You need to be able to use artificial intelligence to pick that up. There is facial recognition software that can look at children’s ages. The tech solutions need to be designed by the tech experts. As child protection experts, we can talk about the harms, the content that might be harmful for children and the behaviour that might be content.
It is then about working with the technology experts to say, “This is the technology solution behind it.” Industry is starting to work more with the sector on that. We are trusted flaggers. There is a trusted flagger process that you can share with social media companies for that type of behaviour.
Q252 Neil O'Brien: Other than proper age verification, is there anything that you would like to see the social media companies themselves doing or changing about the design of their services? You have mentioned future technological solutions.
Emily Cherry: It is still about safety-by-design principles. At the moment, behaviour and content are regulated by their own community guidance. We want safety principles to be enforced by an independent regulator, to make sure that they are designed safely for children from the outset, rather than allowing companies to regulate their own behaviour and content online, which is the situation we have currently.
Q253 Neil O'Brien: Are there particular principles of safe design that you think are not being met at the moment? Are there things that are not compatible with your safety-first approach?
Emily Cherry: Obviously, there is age verification, which we have talked about. It is about things like location features. Those should be set absolutely off for a child on a platform, so that you can never see where that child physically is. We should be looking at parents and carers having a role. YouTube has moved and has YouTube Kids. That is a safety-by-design principle app that it has brought in to enable children to use the platform safely. It uses algorithms to curate the right kind of age-appropriate content. Those are some of the examples we can look at.
Neil O'Brien: Thank you. That is very useful.
Q254 Martin Whitfield: What is your view of moderated sites? Frequently platforms will say, “This is moderated.” There seems to be a widely held view on the value or effectiveness of moderation. What comments do you have to make on that?
Emily Cherry: You need to have a mix of both artificial intelligence—machine-learning moderation—and an army of human moderators who can be there to moderate live content, particularly when it comes to the points that Susie was making about live streaming. There have to be humans looking at the behaviour and the content to be able to do that. We need more moderation.
Q255 Martin Whitfield: In the short term, that is really the only effective measure that we have, provided there is confidence in the moderators.
Emily Cherry: It is certainly arguable that moderation is effective.
Susie Hargreaves: I differ slightly from Emily in her last response. There is no moderation like human moderation. People think that the technology exists to do all of this using AI. Actually, there is no facial recognition software in the world that is accurate. There is no technology in the world at the moment that can age a child accurately. If a child is 13, 14, 15, 16 or 17, there is no technology that can do that.
A child’s age can be verified only by other means. That does not mean that it will not change and that we are not using technology to develop, but one of the reasons we are trusted moderators is that our analysts are experts in assessing child sexual abuse. Many of the social media have thousands of moderators, but they are dealing with all sorts of content. Our analysts are trained to be very specific. There is nothing like human expert moderation. More emphasis needs to be put on that area and more resources need to be put into it. There needs to be more acknowledgment that it is a really specialist skill.
Q256 Martin Whitfield: To take the first question, my understanding is that 13 years was chosen because of data protection. A 13-year-old can consent to hand over their data. Given the challenges with age verification, what is your view on holding 13 sacrosanct as the age at which young people can start to understand and give genuine consent to what is happening with the internet?
Emily Cherry: You will have incredibly competent 13-year-olds, backed by very supportive families, but you will also have 13-year-olds who absolutely do not have the right understanding and ability to give consent. Therefore, putting in an arbitrary limit of 13 is not necessarily the right approach. We should look at that as part of the Government’s internet safety strategy. We should look at where the limit of 13 came from. We have an age of consent of 16 in this country for sexual content. WhatsApp has just moved its age of consent up to 16, in line with that. We would recommend that Government look at the issue again. This comes from COPPA legislation in the United States. It is not a UK look.
Chair: May I ask you again to try to keep your answers as succinct as possible? We are running massively behind schedule, so do what you can to help in that regard. Have you finished, Martin?
Martin Whitfield: Yes, I have.
Q257 Damien Moore: Education was mentioned a few minutes ago. Digital literacy education has been cited as a way of reducing the risks that young people face online. Is the current curriculum fit for purpose?
Emily Cherry: We are about to look at the new guidance coming out from the Department for Education on relationship and sex education, which will have digital literacy as part of it. The guidance that is being taught in schools right now is not fit for purpose. The new guidance, as it comes out, will have strengthened digital literacy. It is really important that that is taught across all the ages and development stages.
Q258 Damien Moore: Is there an argument that expecting schools to do this takes away from some of the parental responsibilities?
Emily Cherry: It is really important that parents, schools and industry all work together. It would be dangerous to allow just parents, just schools or just the industry to solve this issue. We need absolutely to work together to make sure that children are getting the world-class education, advice and support to navigate the online world safely.
Q259 Damien Moore: Are there any good examples from around the world that we could use?
Emily Cherry: Australia has a digital commissioner. That is a fairly good example of the things it has put in place.
Susie Hargreaves: We are a third of the UK Safer Internet Centre. There are examples such as safer internet day. Forty-five per cent of all children were involved with that. As Emily said, the most important thing is that there is a combination of approaches.
Q260 Damien Moore: Parents have a significant role, but how can they get more involved, to deal with problems before they arise?
Emily Cherry: Parents have a really significant role. You cannot parent a child these days without understanding the digital world. It is an area where parents need support. That is why I said previously in this session that we think there should be some kind of campaign, led by Government. In the same way as we have a five-a-day campaign for healthy eating and healthy living, why do we not have a really pervasive, Government-backed campaign that looks at screen-time use and the age at which children can go online, so that parents can understand the rules for that?
Q261 Damien Moore: What about sharenting, where parents themselves are not responsible?
Emily Cherry: That is a really important point. It is doing it without the child’s consent and creating a digital footprint for the child. Parents need to understand the impact of that. That could be helped by a Government-backed education campaign.
Q262 Damien Moore: David, age verification for pornographic websites comes into force this year. Do you think that parents need to be educated about the new age verification systems and safeguarding tools?
David Austin: Yes, absolutely.
Q263 Damien Moore: How?
David Austin: DCMS has given us funding to explain to parents and consumers what is going to be involved. We are going to launch a new website to educate parents on what this new law and these new restrictions mean for their children. We will launch that website on the day when the guidance comes to you for parliamentary scrutiny. We expect that to be later this month.
We are also working with children on education. Age verification is a technological solution, but it is only part of a much bigger picture. We have heard from Emily about the importance of industry, parents and schools all working together. We work in schools. We speak face to face to tens of thousands of children. We have online resources to promote online resilience and making safe choices. With the PSHE Association, we have developed a lesson plan, which we have just trialled in 11 schools, about making safe choices online. Ninety-nine per cent of teachers who took part in the pilot said that the lessons were engaging. Ninety-five per cent said that their students had made progress in better understanding how to stay safe online. We are rolling that out to all schools in the UK. Education is absolutely key, both for children and, with age verification, for parents—and for consumers.
Q264 Damien Moore: I go back to the issue of sharenting. Susie, criminals are downloading these videos and pictures and using them for sexual content. Have you seen that in your work? Have they been part of the 78,000 posts that you have removed?
Susie Hargreaves: Clearly, the role of going after the perpetrators is for law enforcement. Our job is simply to deal with removal of the content from the internet.
Q265 Liz Kendall: Is industry doing enough to remove inappropriate content? If not, can you give me two or three concrete steps that need to be taken?
David Austin: We are expecting a step change in behaviour from the adult industry, so that pornography is no longer one click away. For the last 12 months, we have been engaging with all the big pornography companies in the world—all the household names we all hear about, based mainly on the west coast of the United States. All of them have said that they will comply with the new legislation and that, when it enters into force later this year, they will put in robust age verification.
The proof of the pudding will be in the eating. Let us see whether they actually do it, but I believe that their engagement with us has been genuine and that we have sufficient enforcement powers, should they not comply, to get them to comply.
That is the first thing. You asked for two examples, so I will give you a second one. We talked about moderation earlier. For the past three or four years, the BBFC has been working with three other countries to develop a tool, to be used by industry, to help people viewing content online to rate it—to give it a nationally sensitive age-appropriate rating—and to give content advice. This is a really simple tool that any member of the public can use. It is a simple questionnaire that can be attached by any platform to pieces of content. Anyone can answer six questions. That will produce a traffic light in the UK, and different age ratings in different systems in other countries. Those results can be linked to parental filters. There is a really significant way in which this tool, called You Rate It, can help to protect children online and enable people to make informed choices. The countries participating are the UK, Italy, the Netherlands and Ireland.
Q266 Chair: Presumably, that does not help people who are intent on putting online stuff that is bad and who do not want people to rate it in that way.
David Austin: We trialled this in Italy. People viewing a video were invited to answer these questions. Eighty-one per cent of the videos were rated.
Q267 Chair: Who puts up the facility to rate them?
David Austin: The platform. We need platforms. We had a successful trial in Italy. We are now looking for other platforms that are bigger than our Italian partner to take part in a more extensive trial of this tool. We think that it is really significant. It was an initiative of the European Union, and these four countries have taken part. The results of the Italian trial were successful and encouraging.
Q268 Liz Kendall: Thank you. Emily?
Emily Cherry: Social media companies and all tech companies should be doing more in this area. Given the good work of IWF and others, we know that child sexual abuse imagery is something that they take very seriously, but that needs to be the case across all social media platforms, games and apps, not just the big tech giants that are engaged in this space.
Q269 Liz Kendall: Can you name companies that you think are not doing enough?
Emily Cherry: I have mentioned Musical.ly and Live.ly as platforms. Those are two platforms where we have seen live grooming of children and that are not doing enough to keep children safe in that kind of space. We are very supportive of the brilliant work that IWF is doing. Our concern is about disrupting that grooming behaviour earlier. That is where social media companies are not doing enough.
I will give you an example. It is about sexual exploitation. At the beginning, I talked about criminal exploitation. We are seeing gangs use social media. I am going to give you a term that we put out last week in the press. Large criminal gangs are using social media for “baiting the skets.” They are deliberately trawling social media platforms to find vulnerable girls who are posting photographs of themselves. These girls are suffering from low confidence and low self-esteem; they may be in care. Gangs are friending them, grooming them and then coercing, deceiving and tricking them into gang activity. Social media companies need to do more to disrupt that grooming behaviour.
Susie Hargreaves: On the work we are dealing with, child sexual abuse, the big companies do step up. We are always looking to bring in new members and big international companies. Across industry, there is a responsibility to ensure that their platforms are kept clean of illegal material. They need to be held to account.
A lot of work is being done to try to develop the new technology to fight the new problems. We are not there with technology for grooming, but there is work happening. You are right to say that it cannot come quickly enough.
Q270 Liz Kendall: Barnardo’s is a trusted flagger. Do you think that all trusted flaggers should be paid for their work by the companies?
Emily Cherry: We are a trusted flagger. Google actually offers a grant. It is a voluntary grant to charities, and you have to ask for it to take it up. The trusted flagger programme is absolutely right in principle, but it needs to exist across all platforms and sites. It cannot just be for engaged tech companies such as Google and Facebook.
However, there is a concern for us, as the charity sector; in fact, there are two concerns. This is on top of people’s day jobs. We do it because it is the right principle and the right thing for children and families for us to engage in this process, but we do it on top of our busy jobs of dealing with children.
Q271 Liz Kendall: Would you like to be paid?
Emily Cherry: I think that there should be more funding available to charities.
My second point around the trusted flagger programme—we have said this directly to companies—is that it is quite a one-way process. We will share in context intelligence on what is happening in individual cases. Aside from our knowing that action has been taken, there is very little coming back out of the companies. They are aggregating across the UK different harms, new trends and things that are happening to children, but they do not share that back with the trusted flagger community. We then have to play catch-up. New terms such as “baiting the skets”, which I have just shared with the Committee, should be shared across all flaggers, so that they can look out for that kind of thing.
Q272 Liz Kendall: Why don’t they do that?
Emily Cherry: There is no requirement on them to do it.
Q273 Liz Kendall: You think that they should be required to come back to publish, especially with you, details of new trends. They are not doing that currently.
Emily Cherry: They are not sharing that back.
Q274 Bill Grant: Emily, we have heard some horrendous figures. We have heard about the criminal gangs grooming kids and their rapid percentage increase, and about the 1.4 million children viewing porn. With that in mind, do you think we need more regulation for social media companies? Will you say briefly what form that regulation should take?
Emily Cherry: Obviously, we are pleased to welcome the Government’s commitment to looking at a White Paper later in the year, with the Home Office and DCMS. We believe that we need to look at legislation—new legislation or amendment to legislation can be part of the consultation. As part of that, Barnardo’s has been calling for some time for a statutory code of practice, to apply to all social media sites, and an independent regulator with the teeth to hold social media companies to account. That means bringing them to the table, issuing fines if they are unable to comply with the code of practice and having proper transparency reporting that looks not just at what they have taken down, but what action they have taken and why.
Bill Grant: Do the other witnesses have any brief comments to make on this?
Chair: You do not have to. Keep going; move it on.
Q275 Bill Grant: Here is another opportunity. The Government have released their response to the internet safety strategy consultation. To what extent are you supportive of that response?
David Austin: We are very supportive. We certainly support the goal of bringing offline protections online. We could do a lot more in that space. The Digital Economy Act is a first step, but there is more that can be done. I talked about You Rate It, the international tool that has been developed. I should have said earlier that we believe that it brings trusted advice, based on the wisdom of the crowd, to enable people to make safe viewing choices online for all sorts of user-generated content.
There is also the facility to report abuse. We found in the Italian trial that people could report any abusive content—anything they felt really uncomfortable with or they thought went beyond the pale—to the platform, and the platform took action on it. That is the kind of thing that we would like to see as part of the strategy. Essentially, we are very supportive.
Susie Hargreaves: As a director of the UK Safer Internet Centre, I am very supportive, of course. As the IWF, we have a concern that, with the intention in the White Paper to look at harmful and illegal content, there is a danger of grouping us into harmful content, which is undefined in law. We have a model in the UK that works for criminal content and child sexual abuse. It is really important for that not to be affected, because it is a model that is unrivalled across the world in terms of what we are able to achieve. However, fundamentally, we support the elements within the internet safety strategy.
Q276 Bill Grant: Sticking to that response, the Government have released a draft code of conduct for social media companies. In what way do you think that that code tackles the horrendous problems we have heard about this morning? Is there a danger that, when it is finalised, the draft code might, as Emily said, prove to be toothless? What is your view on that?
Susie Hargreaves: I suppose that there is always a danger of that. From our perspective, we are absolutely determined to protect what works. It is not the self-regulatory body per se; it is simply to say that different harms need to be dealt with in different ways. It is not sensible to say that you can deal with everything in exactly the same way. We need to be really clear that, when we have not defined what is inappropriate or harmful, where something is clearly illegal, any legislation or codes are properly effective within the area that they are tackling. I am sure that Emily will have more to say on social media.
Emily Cherry: I want to add one final point. A code needs to have flexibility, to add new harms over time.
Susie Hargreaves: That is true.
Q277 Bill Grant: In a fast-moving environment, the dynamics change.
Emily Cherry: Absolutely. We know that perpetrators of sexual abuse against children, in particular, will look at new ways of doing that. They will move ahead of technology.
Q278 Bill Grant: You are supportive of the code, but the code has to know what it is covering.
David Austin: That is absolutely right. Different harms need different measures, but different sectors of industry need different measures. Some sectors have absolutely stepped up and are doing everything that they can to protect children online. That is less true of others.
One really good example is the mobile network operators. I think that you heard evidence from Carolyn Bunting about this. When you take out a mobile phone contract, the adult filters are switched on by default. The BBFC’s role in this is to determine what content goes behind filters and what content is allowed in front of them. Things like pro-anorexia content, promotion of self-harm, glorification of racism or other discrimination, if they are not illegal, go behind filters. This is an example of industry stepping up, on a voluntary basis, and putting real protections in place. As a result of that, hundreds of millions of websites from around the world are filtered according to standards that are acceptable to the UK public. I should say that the standards that we apply are all based on the results of large-scale public consultations.
Q279 Bill Grant: The Government have pledged—the key word is “pledged”—to require social media companies to publish transparency reports, sharing information on things like what content has been removed. Is there anything you would like to see added to these transparency reports? Do you think they will be valuable if that pledge is honoured?
David Austin: I think they will be valuable. In our research we find that people want trusted signposts in the online as well as offline space. The code should include trusted signposts to help people make safe choices about what they look at online.
Q280 Bill Grant: Would that be parents? In whom do you seek to have trust?
David Austin: There are a number of ways of dealing with it. We know from our own research what people do and do not trust and what they expect to see as being appropriate for different ages of children. I would hope we would be involved in some way alongside charities such as Barnardo’s and IWF; it is a job for all of us. We are all members of UKCCIS; some of us are on the UKCCIS executive board. That is certainly a job UKCCIS should look at pretty closely.
Q281 Bill Grant: Do you have confidence in that trust being established? Dare a politician use the words “trust” and “politician” in the same breadth?
David Austin: If there is sufficient will on the part of all the participants—industry, Government and regulators—we can take really positive steps. All of us have done so. Susie is head of a self-regulatory body. We have statutory backing for some of our work, but part of it is absolutely voluntary and not based on any obligation. We work with big companies—Netflix, Vodafone and O2—on a voluntary basis. If there is a will on all sides we can achieve a great deal.
Q282 Chair: I have a final question to you, Susie, on human moderators. My understanding is that Facebook employs 7,500 content moderators. Apparently, Pinterest[1] employs just 11 for 200 million users. I do not know whether those figures are accurate. What is your view of that very wide variation? If that figure for Instagram is accurate, do you think it has to change its approach, given what you said about human moderators?
Susie Hargreaves: Platforms have a responsibility to monitor their content. All of them have a responsibility to deal with complaints and people’s reports, and they have to deal with them appropriately. I cannot say what the right number is. I have no idea about Instagram or anything, but certainly it has a responsibility and duty of care. They have their own terms and conditions to monitor as well. It is impossible for me to say one way or the other, other than that I think they have a responsibility.
Chair: I thank the panel very much. I appreciate your time.
Examination of witnesses
Witnesses: Anna Clark, Dr Goodyear, Dr Woods, Dr Davie and Professor Fonagy.
Q283 Chair: Welcome, all of you; it is good to see you. I will ask you to do introductions in a moment. We have struggled to keep on time and failed with three witnesses. We now have five of you, so there is massive pressure on you to be highly disciplined in your answers and keep them succinct. We will try to be disciplined as well in the questions we ask. May we have introductions, beginning with Peter?
Professor Fonagy: My name is Peter Fonagy. I am national clinical adviser to NHS England on children and young people’s mental health. I am also chief executive of the Anna Freud National Centre for Children and Families, and a professor at University College London.
Dr Woods: I am Heather Cleland Woods. I am a lecturer at the University of Glasgow and I run the Superteams project there.
Dr Davie: I am Max Davie. I am the health promotion officer at the Royal College of Paediatrics and Child Health, but most of the time I am a community paediatrician working in Lambeth.
Anna Clark: I am Anna Clark, a PhD student. I am working with Salford University and Cardinus Risk Management looking at the impact of touchscreen devices on the mental health of young people, the risk factors and what that means for industry in the future.
Dr Goodyear: I am Vicky Goodyear. I am a lecturer in pedagogy at the University of Birmingham. I have been working with young people to understand their uses of digital technologies.
Q284 Chair: Thank you. May I start by asking you to comment on the link between social media and mental and physical health issues? I am focusing on what evidence there is to show whether there is a link between poor or adverse impacts of social media on mental and physical health.
Dr Davie: We do not know nearly enough. We have a degree of evidence about screen time, but a lot of that is fairly old and not of fantastic quality. Newer evidence, which is perhaps more relevant, tends to give quite equivocal results, certainly across mental health. I know you have already had an evidence session with Andy Przybylski, Amy Orben and a few others who are very good on this, so I will skip over that.
On physical health, there seems to be an association between screen time more generally and obesity. Obviously, as social media are a particularly compelling form of screen time, we have to explore that concern in more detail. The other thing we are concerned about is the association between screen time and poor sleep habits, and that has wide-ranging ramifications.
We have associations. Whether we have causal associations is very difficult to unpick. We probably do, but the implications of that and what we do about it is a tricky question.
Q285 Chair: Is one of your calls for the research to continue to increase our understanding?
Dr Davie: That is always our call.
Professor Fonagy: I totally agree with Dr Davie, but the Committee should take seriously the complexity of the issue. In the same way screen time may give rise to depression and anxiety, it is quite clear from longitudinal studies that depression and anxiety are often met by increased screen time as a way of the child or young person regulating their mood.
Q286 Chair: It is a chicken-and-egg problem.
Professor Fonagy: There is a bi-directional causation, which gives rise to two issues. It is a chicken-and-egg problem and it is difficult to untangle causation, but, more importantly, there may be some kind of vicious cycle that needs to be interrupted. If more screen time leads to depression, which in turn leads to more screen time, unless you interrupt that for a vulnerable child, particularly one who is, for example, overweight, it may be a solution to their life problems that is inappropriate. Unless you interrupt it they are trapped.
Dr Woods: Both my colleagues here have used the term “screen time”. It is important that we unpick that a little bit as well. I agree with both Peter and Max that research needs to be ongoing. We have a lot of studies, including my own work, which is correlational, on the effect sizes we see in the relationship between sleep quality and social media use, particularly night-time social media use, and anxiety, depression, self-esteem. They are significant but small.
I agree that Andy Przybylski is in a much better position to dig deeper. As he told you, he is working on the impact of social media where the use of screen time, as perhaps they call it from the data they had, is in minutes, but we do know there is a much stronger relationship between the impact of poor sleep quality and anxiety and depression. I agree that we need to unpick this.
We looked at the millennium cohort study, which looks at kids using social media. In particular, for those who do that for more than five hours a day—I want to return to that in a second—it will have an impact on their sleep. You still have only 24 hours in a day.
It is not just those two factors. Why are those kids having access for that amount of time in a day? That links to physical activity and obesity issues. I am agreeing in a very long-winded way, which you asked me not to do, that we need to understand all these factors.
Q287 Neil O'Brien: You mentioned children using social media for more than five hours a day and the impact on sleep. Give us a feel for what that impact is?
Dr Woods: Can I have a second to get my papers together to make sure I am giving you the correct information? This comes from the millennium cohort study—not a study done by us but an open access dataset. We classified about 12,000 kids who were recruited into this large-scale study. We classified zero to one as low. This was not screen time. We need to be careful about how we use that expression. This is social media use, or social interaction—all of us were once teenagers and we remember how important that was to us. Low was zero to one hour; average was one to three; high was three to five hours; and very high was five or more. About 20% of that sample was five or more hours.
Q288 Vicky Ford: What age range?
Dr Woods: The mean age was 14—so, it was 11 to 13. Of the 20% very high users—five or more hours—67% were likely to have a bedtime later than 11 pm. That is comparable with an adolescent using it for one to three hours, which is about the norm if you want to seek an average.
Eight per cent of the high users were likely to have a wake time of later than 8 am on a week day. We tried to see whether we could control for distance to school and look at those children who could lie in for a little bit longer because the school was right next door. That is another thing to think about when considering the data, but we could not see any evidence for that.
The very high users were also likely to wake after 11 am at weekends. That gives you, hopefully, a little more information about how we understand the data.
Interestingly, with sleep onset latency—the time it takes you to fall asleep once you have gone to bed—there was not any effect from the control variables. We were trying to control a number of covariants here.
Q289 Neil O'Brien: Where can we find this study? What is it called?
Dr Woods: The UK millennium cohort view.
Q290 Neil O'Brien: What is your paper on it?
Dr Woods: It is not published yet.
Q291 Bill Grant: You refer to poor sleep quality. Has any research been done on the educational performance of those individuals on returning to school the next day? We are all fearful throughout the United Kingdom that education standards tend to be slipping. Maybe you do not have evidence of that, but is there a link between sleep deprivation and performance at school on week days?
Dr Woods: It is a very good question.
Q292 Chair: What is the answer?
Dr Woods: I cannot think of a study off the top of my head that gives you that solid evidence.
Dr Davie: There is quite a large body of evidence linking sleep deprivation with various cognitive deficits. The question, therefore, is: can you extrapolate that to academic achievement? I do not think we necessarily have data on that, but if there are deficits in the cognitive skills required in school you are not going to thrive in school.
Dr Woods: It is not only the cognitive deficits but new material recruitment that has to come on board to maintain any level as sleep deprivation kicks in. We now have technology that can show us how our brain is working when we are in different states—for example, sleep deprived. For somebody who has poor sleep, we can see the amount of brain matter they have to recruit, if you like the amount of energy they have to expend, to maintain a level of performance. There is a difference there. If you are sleep deprived you have to work harder.
Chair: I am conscious that we are still on the first question.
Professor Fonagy: There is evidence that sleep deprivation does mediate the relationship between screen time and depression and anxiety that directly affects performance. Poor sleep is undoubtedly a signal that tells us this kid is depressed or anxious, and we recognise that as really important.
Dr Goodyear: We need to think about what young people are doing on social media and on their screens. One thing we have found is that apps have helped young people regulate their sleep and give feedback to their parents, so they have improved those kinds of things.
When we talk to young people—we have worked with about 1,700—they tell us about the positive aspects of social media: how they have seen campaigns and been motivated to be more physically active; they have seen information on diet and nutrition. They report on the positive roles.
Social media is a very powerful medium. If they are vulnerable, for whatever reason, that does not need to be a category of vulnerability; it can just be a concern about body image or something else. Adolescence is a period when interest in your body grows. They can get looped into different kinds of issues on social media, and that is when they can engage in more screen time, not necessarily become addicted, and those issues become apparent.
Dr Davie: When you change the start time of a secondary school and, therefore, slightly increase the amount of sleep young people get, you will improve results. It is not necessarily sustainable within a community, but increasing the amount of sleep young people get will improve their results. That is a very robust finding.
Dr Woods: I do not disagree with that point, but a caveat is that a lot of these studies have been done in the States where the school start times are 7 am; in the UK the start time is 9 am.
Q293 Chair: We have to be cautious about interpreting it.
Dr Woods: We have to be cautious. I have just come back from a sleep conference in Paris that looked at this issue. Everybody understood it was a challenge, but we need to be careful about where the data come from.
Q294 Chair: Vicky, you mentioned that some young people who talk to you find information about healthy living and go out and do more exercise, but there are also concerns in the other direction, in that it might lead to a more sedentary lifestyle. Is there any evidence in either direction in terms of links to more sedentary lifestyle and potentially issues around obesity?
Anna Clark: I have recently done some research of the literature. From the articles I found for my PhD, 108 of them were viable, but when looking for this only four were linked to screen time. I can talk only about physical health. Quite an interesting systematic review was done. It was old; it was done in 2010, but it tried to look at whether active video games were helping children with their obesity and things like that.
It found that pre‑school children aged three to five did not do their 60 minutes of exercise a day that the Government recommended, but eight to 10-year‑olds were spending 65 minutes playing on a computer.
They wanted to see whether they could change it to an active game. That was when Nintendo Wii and other things came out, which are much less common now, but they found it helpful; it increased children’s physical movement.
The problem is how to measure a child’s physical movement, because all the studies have been based on adults. They found it very difficult to compare children with adults and set standard protocols.
From my experience as a children’s physio, we used to ask lots of children up to the age of 16 how long they spent on a computer. We looked at the core stability of a child who was more likely to sit and watch. You need to be very careful how you define screen time. TV is classed as screen time and people do not think that it is screen time. When you ask a parent how much TV they watch, it is completely different from how long they are meant to be on their phone. We found that core stability decreased massively, which can lead to problems in the future.
Q295 Chair: We have heard about the limited evidence—there is some—about the link to adverse health effects and the chicken-and-egg issue. Given all of that, what should be done, if anything? What is the policy response?
Dr Goodyear: Education is important for young people. We heard a lot in the previous hearings about the education of young people, but there is also the education of adults and adult digital literacy, if they are going to be able to help young people.
We are talking about health: physical activity, sleep, diet and nutrition. Perhaps what has been missing is the role of physical education. Not only should there be aspects of PSHE but physical education. Most schools have two hours per week, and it is a key way to reach young people.
In our recent book we looked at the use by physical education teachers internationally of technology. We saw huge enthusiasm for technology. Most of the focus was on physical skills. I think a big impact in physical education would be to take on technology and its role in health education, mental health, sleep and physical activity.
Q296 Chair: The use of it as a facilitator to promote health.
Dr Goodyear: To open up critical discussions with young people. Physical education teachers are trained in schools as perhaps experts in these aspects of health, so it could complement PSHE programmes.
Professor Fonagy: I agree totally. Low physical activity appears to interact with high screen time—that is, you are more vulnerable to the negative effects of high screen time if you do less physical activity, so physical activity could be a protection. There is a Canadian experimental study, which I think is quite persuasive, that giving students a chance to engage in physical activity in non-teaching time reduces the negative impact of social media.
Dr Woods: I want to refer back to a qualitative study we did that looked at these relationships and found them to be significant but quite small. We wanted to dig a little bit deeper. I want to pull back from the term “screen time” and talk specifically about social media. The reason kids were finding it difficult to disengage from their phone was the social interaction aspect of it. They were using quite strong words; they said, “I feel that I am not being a good friend; I feel an incredible sense of guilt that I am not there to finish that conversation with someone because that reflects badly.”
Q297 Chair: Is there a danger that sometimes that can be late at night?
Dr Woods: This particular study was about why they use social media at night. It is pushing back sleep, so it is particularly at night time.
What was interesting about it was the elongation of social interaction that fed into the 24/7 cycle and pushed back bedtime and sleep onset time. Although I do not have hard data to prove it, my feeling is that you can talk to kids about peer support and interaction, or you can get a group of older kids to speak to younger kids and say, “What I do is just say at nine o’clock, ‘Okay, guys; I’ll catch you at school in the morning.’” That is the routine and that is what everybody else is doing, instead of everybody else talking on the phone, sending messages or whatever late at night. That is how we start reducing the impact.
I totally respect your point about PSHE teachers; that is important as well. But if we have someone telling adolescents in particular to do something, I am not sure how effective that is. When you get their pals and friends to do it and feed it back to the families is when you start to have an impact.
Q298 Vicky Ford: You mentioned PSHE lessons. Should they be compulsory, because that is a decision the Government are soon to take?
Dr Goodyear: PSHE on digital literacy and online safety education.
Q299 Vicky Ford: Should it be compulsory?
Dr Goodyear: I think so, yes.
Dr Davie: Yes. It has been our policy for a number of years that it should be compulsory.
Q300 Vicky Ford: I would like to get that on the record, if you all agree. That is an important decision the Government are about to take. Do you all agree?
Dr Goodyear: To add to that, young people said to us they would like some education within schools because it is important that it can reach all young people and a consistent message is given.
Q301 Chair: Does everyone agree that it should be compulsory?
Dr Woods: Can you say what should be compulsory?
Q302 Vicky Ford: Can you say which bits you want to be compulsory?
Dr Woods: Is it digital literacy?
Q303 Vicky Ford: Digital literacy and the online safety element.
Dr Woods: Yes. I would want to see that go a little bit further and take a more general view about the different facets of social interaction. It can be person to person; it can be with peers, and we should talk about online and offline and the distinction between those two, because I do not feel we have that distinction.
Q304 Chair: There is clarity that we all want something to be compulsory. There are different views about the extent of it.
Dr Davie: On the policy response, I would agree with the panellists that education is important but it cannot be just top down. We have to educate the public about this. We are going to produce some guidance for professionals, parents and young people on this hopefully early next year. We hope that will be part of a movement around sleep, obesity and social media—all these interconnected problems—to try as a society to get more common sense at family, school and community level.
Dr Goodyear: On guidance, there is a lot of research coming out at the moment, but there are also a lot of start-up companies offering guidance and tips about social media and digital literacy. A key point that needs to be made is that if it is put into the curriculum, or the programme is to be used by teachers or parents, there is a need for evidence-based practice and quality assurance.
Q305 Chair: Is there not a bit of a problem at the moment, in that there may be lots of different bits of information coming from different directions but no one quite knows what to believe?
Dr Davie: The royal college is hoping that it can be authoritative, particularly if the evidence changes.
Q306 Martin Whitfield: My question is about the physical effects. I will not use “screen time”; I will use “social media”. In what ways are social media affecting young people’s physical development?
From a physical point of view, it all depends on how they are sitting and doing it. There are studies that say we need education on sitting posture and how many breaks people are taking. Is the posture a dynamic one? You see children walking down the street while texting; you also see them slumped and texting like that. We just need to be doing a lot of education on how to operate the device they are using.
Q307 Martin Whitfield: How significant do you think this will be in the long-term physical development of children?
Anna Clark: Studies have shown that adults are off work for 31 million days per year with lower back pain, which equates to £41 billion. Those are people in their early 30s and 40s. They did not have this technology when they were children, but people are coming into the workforce with these problems, so it is already occurring.
Q308 Martin Whitfield: If we look at repetitive thumb movements, is there a technological answer—for example, altering screens or the dynamic—or is this an educational matter?
Anna Clark: It is more of an educational thing. Phones and tablets have changed so much over the years. You used to have a little keypad and screen; then you had Blackberry phones, which had a whole keyboard on them; and now you have complete touch screens. It is changing all the time. I do not think the device makes any difference to how it is going to affect physical development.
Q309 Martin Whitfield: Like so many of the health and safety issues and employment stuff, this is an educational answer by raising awareness, using hopefully peer group influence, and making teachers and parents aware of the potential long-term risks.
Anna Clark: Yes.
Q310 Martin Whitfield: If we look at young people and particular social media use at the expense of other activities—the physical meeting of people, going out to play and riding bikes—what evidence is there that that is affecting their physical development, or is it a million-dollar question that needs more research?
Anna Clark: It is a million-dollar question.
Dr Davie: I am going to use the term “screen time” because that is where the data are in the connection with obesity. It is a compelling form of screen time, so I think it applies.
There are a few ways in which it impacts obesity. One is that it is sedentary. Secondly, it appears that you increase your intake of high-density calorie food when you are engaged in screen time. You only have to go to a cinema after the lights have come up to see how much of a problem that is. Thirdly, there is exposure to high-density food advertising, which online is not very well regulated. As well as obesity making sleep worse, poor sleep seems to have a causal connection to increasing levels of obesity, which connect with back pain—so it is all interlinked.
Q311 Martin Whitfield: It is a complex Gordian knot that needs to be unpicked.
Dr Davie: Exactly.
Q312 Martin Whitfield: To tie this back to the mental side of it—this is probably as much for Peter as anyone—how are social media changing the way young people perceive themselves and their body and character? What is the evidence on that?
Professor Fonagy: On the whole, not only is “screen time” an inappropriate term; it is social media stress, or what predicts forward the impact of the time that kids spend in that way, that has mental health consequences. The amount of time accounts for 1% or 2% of the variability. The experience of that accounts for 10% of the variability in terms of depression or anxiety. It is easier to regulate the amount of time because parents can just take away the phone or whatever.
Q313 Chair: You think that’s easy.
Professor Fonagy: Probably what is much more important is content. That is a much more challenging target for us to intervene. As I think Heather was hinting at, probably the quality of the social interaction that can occur now is different from what it used to be. Everybody is saying that these factors interact. For example, we know that if you are overweight you are more vulnerable to the psycho-social pressures that the media present than if you are not overweight. Probably people need slightly different interventions. It is quite complicated.
There is a very high correlation between parents’ screen time and children’s screen time. One of the best ways that might be directly controlling children’s screen time is for us parents to cut down our screen time.
Q314 Chair: Is there a serious issue about parents’ screen time resulting in effect in neglect, with children not having enough social interaction?
Professor Fonagy: We know from experimental studies that, if for a 30‑minute period you switch on the TV, the quality of parent-child interaction decreases for three-year-olds. That is just switching on the TV quite passively, so the more parents engage in screen time in front of the kids, the more it impairs the kids because they lose something that they naturally need for their healthy development.
Dr Woods: You used the word “content”. I agree that is very important. What we are looking at and doing online is very important.
The other important thing to think about is context. There is content and context. We see a dose-response curve where social interaction can have a positive effect on wellbeing. I am talking here as a researcher and as a mum who understands the importance of my daughter developing independence by making arrangements and having that social interaction that leads to face-to-face social interaction, but also for atypically developing individuals, for example, and platforms that enable them to facilitate social interaction, make connections and communicate.
Q315 Chair: People who may be isolated at school because of all sorts of issues may find new communities.
Dr Woods: Exactly. We are starting to see research coming out about that. I have a student who is looking at that and at gaming and individuals with autism.
Q316 Martin Whitfield: In that social interaction, empathy and friendship development we are talking about a balance to be had from the value of it through to where it becomes damaging. Anna, do you think there is a similar balance with regard to physical changes and development, or are you more concerned that social media/screen time are physically detrimental?
Anna Clark: Yes. I am looking more at the physical detrimental effects.
Q317 Martin Whitfield: Do you think that is right and it is less of a balance and more a bad picture?
Anna Clark: I think so. There is no denying it. Children will pick up a phone. My 18-month-old picks up a phone and instinctively knows how to use it, and it is only going to get worse as they get older. I am sure that parents of teenagers cannot get them off their phones.
Q318 Chair: I am conscious of time. Dr Davie, are you okay with that?
Dr Davie: In a way, we are skirting round a parenting issue here. Most of my work is about supporting parents with children who have behavioural difficulties clinically. Getting parents and children to agree a regime at home about everyone’s screen time is a powerful way of making progress.
Q319 Darren Jones: I visit schools frequently as a constituency MP. Two things that keep coming up, regardless of background, areas or performance in school, are sleep and mental health, not just in terms of academic outcomes but of attendance and turning up on time. I often ask head teachers, “Do you think it is to do with things like social media use or technology?” They all say, “Absolutely.” They talk about the fear of missing out and the screens. The thing I do not yet have a grasp of is the credibility of the evidence base and, therefore, what we as a Select Committee ought to be recommending to Government or the research councils about what we need to do about it.
I would like to separate out screen use, which is a hardware issue—lighting, pop-ups on time use or whatever—and social media use, which is a software issue. I would be keen to understand on those two separate questions, which I will take in turn in a second, what the evidence base is. What are the weaknesses and, therefore, what recommendations ought we to be making?
On the first one—screens, which is hardware—is the research base solid on the consequences of links to things like mental health and sleep? If not, what recommendations should we be making on the hardware question?
Dr Woods: Dealing with the sleep part of that, there is evidence to show that the blue light emitted from devices has an effect on a chemical in the brain called melatonin. Melatonin facilitates the onset of sleep and blue light suppresses that. However, you would need to be on a screen for a very long time and have it very close to your face for it to have an effect.
This highlights very clearly almost the headline and the signs behind it and how we need to understand things like effect sizes. I was talking earlier about the link between night-time social media use and depression. There is a relationship, but it is small. I agree with you that we have an evidence base, but we need to start going further. We have all been calling for a move beyond correlational studies and to have the funding and ability to start digging deeper down to fund things like active watches, or technology we can use, to monitor in real time who is doing what and when and what they are looking at, as well as the nature of the social interaction.
On the light and hardware question, I would say there is evidence but the effects are small.
Q320 Darren Jones: Is it primarily a question of light on the screen, or are there other things we ought to be looking at?
Dr Woods: If I answer that, I am going to start moving into the discussion on content and context. I was at a conference a couple of years ago where a big study had been done on light-filtering glasses. It was very flash and interesting and the effects were there. I said, “That’s really interesting. What were they looking at?” They could not tell me. Were they looking at porn? I could not understand why they had not given them a specific task or had not looked at that.
That is incredibly relevant to try to understand. It does not seem to me that people have any difficulty disengaging from reading the news; they have difficulty disengaging from social interaction. Gaming possibly has another role.
I also think we are talking about education in schools on this topic. We are moving more and more into online provision. My other hat is that I run level 1 psychology in Glasgow and have 600 students to look after. We use virtual learning environments. We have availability 24/7. We send and receive messages at the weekend. It is also thinking a bit about other tools that we are bringing into education. Education plays an important role. What other activities are we engaged in that may impact best practice in how we educate our children?
Dr Davie: I completely agree and add that the blue-light story is a compelling one and a helpful one clinically—it makes sense—but it is by no means the whole story. The response of the tech companies—for example, “Okay; we will have orange light instead”—is not adequate. I think you are right to separate out hardware and software, because hardware is a pretty small part of the story.
Dr Woods: You cannot blame the phone; it is the use of it.
Q321 Darren Jones: They are different companies, are they not?
Moving into software and social media use and sleep and mental health, what are the gaps in the evidence base?
Dr Goodyear: There is limited evidence. Building on what has already been said about the contextual influences, we worked with young people mainly in a qualitative study and found out that school physical education, parents and other family members and peers played a key role in how much time they spent on social media, but also what they were looking at and why. We need evidence from broader samples of young people in different contexts and different demographics to be able to understand what the influence is and define an effective response.
Dr Davie: Here is the frustration for me. The tech companies have the data. Facebook and Twitter have data, but they are not sharing it with researchers to look at the actual consequences, the patterns of use and the effect.
I am very interested in the idea of a decision architecture within social media, or any online environment. For instance, in Snapchat you will lose a streak if you do not message that person within a 24‑hour period. That is a very strong pull to keep the person on that social network. They must know the effect of these mechanisms within social media.
Q322 Chair: You make the plea that those social media companies should be releasing the data.
Dr Davie: They should release them because they are having an effect; they are sucking data from all of us and they should be sharing it for the public good.
Professor Fonagy: As Simon Stevens recently pointed out in a number of different contexts, including “The Andrew Marr Show”, we are making tremendous progress in general in relation to children’s mental health, not least because of the initiative of members of this Committee. We are getting much broader access to children, and that is to be celebrated. However, as this is happening, we recognise the tremendous need out there. Health services, even when combined with education, will not be able to meet all the needs, and other actors need to step up to the plate in relation to the causation of mental distress in children.
Q323 Chair: Do you share Max’s view that the data should be made available?
Professor Fonagy: That is a very specific suggestion. I would make the general plea that, if we recognise there is an understandable cause of distress in children, it should be investigated and we should do the best we can to intervene to reduce that source of stress.
Q324 Darren Jones: I do not put a question but make a comment, for which I apologise given the time. If you look at the regulated sectors—energy, telecommunications and others—regulators require companies to share data about consumer harm. I see no reason why we cannot do that in the technology sector. The question is: to which regulator do we have to give authority?
I think that answers my question. I am still not entirely clear on what the research recommendations are. Unless anybody wants to add anything to that, I will finish.
Professor Fonagy: There is one thing that is not out there yet but is really important and interesting. Our smartphones are probably the best device available currently to give an indication of the functioning of our brains; that is to say, the way that we use our phones is different in the morning from the evening. The data are generally available to everyone and we could use it to monitor the impact, among other things, that social media have on functioning.
Dr Woods: I do not think we can give the answer to the question you are looking for at this point. I would hope to be in a position to continue what we are doing. We have a good foundation on which we can build. All of us round this table are looking at unique aspects of it. I think we have a foundation on which we can build. We are showing that there are relationships there, but we have to keep going because it is very difficult to access data. As you were saying, the data are there, but enabling us to have access to it would give us a much more constructive answer to your question.
Dr Davie: To refer back to Andy Przybylski’s work, which is probably the best quality I have seen, it appears that a little bit of social media is better than no social media for wellbeing, so it is a question of balance. It is not an objective harm from zero onwards; it is not a curve that goes down from the beginning. Therefore, when you get to the point where there is extensive use that is harmful, you have the difficulty of causation. Is it causing harm, or is it more like what Peter is saying? Is it a vicious cycle that the young person has got themselves into? What offline interactions are they avoiding? That is always my question when presented in clinic by someone who is objectively using a screen excessively. What are they avoiding doing? What is going on in their life that means they are on “Call of Duty” 12 hours a day?
Q325 Bill Grant: Vicky, in your submission you mention the health information that young people can access when on social media. Can you briefly tell us what sort of information they are accessing?
Dr Goodyear: They said that on social media they would access information relating to physical activity, diet, nutrition and body image. They considered that the material most relevant to them was motivation to floor exercise. That was reported by 78% of young people. They were interested in clean eating information, which perhaps contrasts with what you might think would interest them—junk food—or different advertising, and they were interested in physical activity workouts. Those were the three top aspects.
On what they were looking at, a sizeable majority changed their behaviours. Most of them thought that the information they could find had a positive effect. The issue was that not all of it in our interpretation was necessarily having a positive effect on their health in terms of the workouts and information they were looking at.
Q326 Bill Grant: On balance, is the information sourced from social media positive?
Dr Goodyear: Most of it was positive. The problematic issue is how information is mobilised to them. We have heard a lot about social relationships and the importance of peer interaction. If we asked young people whether they used social media primarily for health, they would probably say no. If we asked them whether they used it to communicate with their friends, they would say yes.
From that process of communicating with friends, they begin to become interested in the health aspects available on Instagram and YouTube, for instance. If all my friends liked FitTea, when I go on to my search engine I will see FitTea, even though I have never searched for it and am not interested in it. If it keeps coming up on the board, you start to think, “What’s FitTea?” I might click on it. Once you have done that, the next time you go on it there is more and more; there is green tea and other types of tea.
Q327 Bill Grant: You are drawn into that.
Dr Goodyear: It is like a consciousness. The only reason I am on social media in the first place is to be with friends. That is why I say it is a very powerful medium. A young person might have no body image concern, and something happens in school and they become a little more aware of their body. These things start happening. They can quickly shift from being in control of social media to it controlling them, and the issue is the process of invading peer networks.
Q328 Damien Moore: Did you find out what they were looking at, or were you just asking them?
Dr Goodyear: We ask them. We do different class activities with them. For example, we use Pinterest. We ask them to create a pin board of all the different things they would look at. They had to go away and source things they had seen and looked at, and we used interviews and surveys to quantify it and find out in further depth how well they used it.
Q329 Damien Moore: Sometimes people give the impression that they are healthier than they really are. If people go to the doctor, I am sure everyone does cut down on everything they have on the list.
Are some aspects of health a bit like cookery programmes? Everyone likes to watch them, but nobody actually cooks the stuff that they make.
Dr Goodyear: We had young people talking about avocado and egg on toast, which did make us think, “Why are you doing that?”
Damien Moore: Delightful.
Dr Goodyear: One of the biggest messages from young people—this concerns social media campaigns, different health-related apps and wearable devices such as Fitbits—is the novelty effect. They describe it as being similar to their parents going to the gym in January after Christmas and then they stop going. There is a novelty effect. They will use things like Pokemon Go because it is in trend and then they stop using it because it is not popular or in trend any more.
They also find things boring to use over time. While they are initially interested in perhaps regulating their diet, or doing something to improve their body image, from what they describe to us the effects generally last for about four weeks. They might try these things, but then they are not interested any more.
Q330 Bill Grant: The Committee has heard that there is a perception that social media can be used to promote bad or poor nutritional information, which is probably linked to that. Does it lead to harmful lifestyle changes? Is it simply a perception, or is there a problem with poor nutritional advice sourced from social media?
Dr Goodyear: We did not come across anyone reporting poor information, but one thing young people told us, which I think is significant, is that they place a significant amount of trust in official organisations. Fifty three per cent of them reported that they would consider changing their behaviour if something was shared by the NHS, the Football Association, Sport England and the Youth Sport Trust. They have a significant role to play in positive health messages and campaigning against perhaps more harmful messages, and young people reported to us that that was one action they would give to adults.
Dr Woods: One of the key things here is enabling young people to develop critical thinking skills and evaluate the sources they are seeing. I do not see us shutting down the internet. There will always be a huge amount of information that is overwhelming. Especially in the current state of affairs, evaluating information that is given to us is absolutely key in understanding sources that are reliable, are giving good information and should be listened to and thinking, “Why am I getting this message? Where are the data behind this message? What is the actual source behind this message?” That is absolutely key, and we need to think about that in the policy.
Q331 Bill Grant: Are organisations such as the NHS already using social media to put forward a positive lifestyle choice, or should they be doing that? Are they already on board? You mentioned them as a trusted source for some people.
Dr Goodyear: As for the NHS, we had the Yellow Men campaign with its characters and Change4Life. It has also talked about the This Girl Can campaign. There is a need to make messages equal. For example, most messages target girls.
Q332 Bill Grant: Not males.
Dr Goodyear: Most messages are targeted at girls—for example, This Girl Can. It was noted that this was not an equal message for everybody.
Q333 Chair: In other words, are you saying that the NHS could be doing more to use social media to target positive information?
Dr Goodyear: Yes. Young people have told us that they trust them; they believe them as a trusted source.
Dr Davie: Of course, the NHS is not a single organisation, as you very well know. Public Health England has a very active Twitter account. I think NHS England does so too. All the different organisations do bits and bobs, but we always have to be wary of the message and whether it is effective.
Change4Life is a good campaign, but we need to build on it because we are not at a point where we tell people that chips are bad for them and they say, “Oh, are they?” They know. It is a question of how we change the decision making around food. It is a much bigger issue than just information; it is about nudges and pressure on people’s lifestyles.
Q334 Vicky Ford: Max, you just made a throwaway comment. I think you referred to somebody spending 11 hours playing “Call of Duty”.
Dr Davie: I mentioned “Call of Duty”.
Q335 Vicky Ford: To make it clear for the record, you consider interactive video games to be social media.
Dr Davie: I think the distinction between video games and social media can be quite difficult to break down, if you think about multiplayer games online. We have had this for 10 years with “League of Legends” and “World of Warcraft”. That is a social network, so I think the definition of social media is really problematic.
Q336 Vicky Ford: It is really important that when we mention looking at social media we talk about interactive multiplayer online video games as a form of social media that can be hyper-addictive.
Dr Davie: Yes, because you have your mates. That is how “Call of Duty” is advertised; you can do it with your mates for ever.
Q337 Martin Whitfield: I am going to use “screen time”. What guidelines are out there now with regard to screen time in the wider sense of those words, from television through to smartphones and things? Is there anything out there?
Dr Goodyear: Yes, there is.
Q338 Martin Whitfield: What is the guide?
Dr Woods: I believe there are guidelines from the American Academy of Pediatrics for certain age groups. For example, children less than two should have no screen time at all. I think it jumps to five. They may be permitted an hour a day, or something like that. Forgive me, but there are guidelines out there.
Q339 Chair: Are they good guidelines or not?
Dr Davie: That is not the approach we are planning to take, because we do not think it has had a massive effect on the amount of screen time American children have.
Q340 Chair: We do our own thing and rely on your college to produce the definitive guidance.
Dr Davie: We are going to take a very consensus-based approach.
Q341 Martin Whitfield: Is it more complex than just simple guidelines on periods of time and that to try to reduce it to that simplicity is to make a mockery of what parents, children, particularly older ones, and adults need to know?
Dr Woods: Yes.
Dr Davie: Yes.
Dr Woods: Unless you are going to cut all media out of education, especially those guidelines, even at that one point, are thrown out of the window.
Anna Clark: It is also a matter of defining it. We have all argued about what is screen time and what social media are. Some parents do not think TV screen time counts. It is about education and defining it as a whole.
Q342 Liz Kendall: But surely you could just exclude what they do at school. There is a simple thing that parents want to know: when their children are back from school, how much is too much in front of the telly or on their iPad, or whatever it is? You are detailed and excellent academics, and there are all sorts of caveats, but it is like the five a day—it is simple and everybody remembers it. There is a saying, “Don’t let the perfect be the enemy of the good.” We need something that people can grab on to, where we know the effects.
Dr Woods: This is where sleep comes in, because we know the effects of poor sleep, so we can put that as our starting point.
Q343 Chair: So do we have a guideline about the time rather than about the number of hours?
Dr Woods: We have a 24-hour day, and we know what chunk of that is spent at school and what chunk of it we would recommend you spend sleeping.
Q344 Liz Kendall: What does that mean the limit should be—where you can say, “Above this, it’s a bit of a disaster”? What is the limit?
Dr Woods: Parents ask that, and I say that you need to think about the routine at home. As soon as you say to someone that they need to get off their phone, that is a negative message and it is not effective. You have to have a conversation that is not just about telling people what to do but is about being a family, saying, “This is the routine in the house.” Depending on the age of kids, you could say that nine o’clock is when the devices will be put away and they start thinking about whether they are ready for tomorrow. That means the whole family.
Q345 Chair: So you are suggesting that some sort of guideline about a time, dependent on age, when phones should really be off, would be helpful for parents.
Dr Woods: That feeds into the evidence that we have about the behavioural and cognitive impact—the CBT approach, basically—to insomnia. We know that it is effective and has more long-term efficacy, compared to any kind of medication that you can give. We know that works in improving sleep and mental health.
Q346 Chair: Would you suggest it for adults, as well?
Dr Woods: A lot of that evidence comes from adults, so we know that it is effective in improving sleep and health outcomes—so I do not see how it can be any different here.
Q347 Liz Kendall: So you are saying nothing after nine o’clock.
Dr Woods: It depends on the age of the child. You also have to take in the circadian factors. I just picked nine o’clock as an example. We need to make people understand the importance of adolescent sleep, as we do with sleep for newborns; then we can start building that into the 24-hour day.
Professor Fonagy: I would totally agree that sleep gives you an excellent signal and lever. I want to underscore what Heather has just said: it cannot be parents making children responsible. It turns out that parents are part of the problem, and they have to become part of the solution.
Q348 Liz Kendall: Are you talking about turning off devices an hour before bed?
Dr Woods: Would you think about giving your 10-year-old a can of Coke an hour before bed? It just sounds silly. It is the same thing. Okay, it is a different stimulant, but it is still a stimulant. I certainly would not think about having a coffee then. Allow yourself that time to feel sleepy tired, to de-arouse and allow that sleep onset to come.
Professor Fonagy: A wonderful recent brain-imaging study shows that adolescents as opposed to college-age students show a higher activation of the nucleus accumbens, which is a broad area, when they look at Instagram pictures, and lower arousal of cognitive control areas—this is comparing it across ages. They just get more pleasure out of it. The addiction side of it is very simple to understand. You have to deal with it as you do with all pleasurable things—in moderation.
Chair: So, Liz, you have to switch it off.
Q349 Martin Whitfield: I know that there was a big push in East Lothian for children to read books when they were in bed, rather than looking at their tablets or phones—and actually for their parents to be there reading the story, almost irrespective of the child’s age. Certainly, the subjective evidence was that that was very positive for them.
I know that there has been a call to ban junk-food advertising on social media. Do you have any confidence that that is going to be successful, or are we going to be caught up with the algorithmic problem that, the second you look at McDonalds, you are inundated with that advertising?
Dr Davie: We have had some involvement with this in the college. We called for a watershed for television advertising for foods that have a high calorie density, but we are very mindful of the television broadcasters’ argument that if we tie their hands but not those of the online companies we are being unfair. We understand that—so the next step would be to look at social media.
I do not accept that it is not feasible for these companies to do this, because they know exactly who they are advertising to, and when they are advertising. They can do this; it is whether they are willing to take the hit on their advertising revenue to have better health outcomes.
Q350 Chair: But, if they are not, you would say that the Government need to intervene.
Dr Davie: To be fair to the Government, they are taking quite a sensible approach to companies in trying a stepwise approach and engaging companies—but then having it in the locker to say, “If you don’t comply and play ball, we’ll have to go tougher on you.” It is a conversation at this point.
Q351 Martin Whitfield: I have one final question—two stars and a wish. You have this Gordian knot, which needs to be unpicked and not chopped through, you need people to accept responsibility across the board, and you want research into all of this. What would your wish be for the code of conduct on the transparency of social media companies? If you can put something into that code, what would it be?
Dr Goodyear: Young people have said that they would like some support on age verification—not age limits—on what types of health content they should use. Are body-building exercises for them, or are they for those aged 13 to 14? As with film, which is certified universal or for those aged 15 or 18, they want those kinds of recommendations to help them to navigate what they can use and what they should not.
Martin Whitfield: Anna?
Anna Clark: I am going to pass on that question. I am sorry, but I do not have enough information.
Martin Whitfield: That is fine.
Dr Davie: I always like to see more transparency about data and uses of data that can be used by researchers—
Martin Whitfield: And shared.
Dr Davie: They are anonymised data, of course. Once it is out there that this is what is happening, these companies do respond to public pressure. That is going to be the mechanism—not necessarily the Government so much as public pressure with the Government as well.
Dr Woods: I absolutely agree with that. I would like open access and collaborative research to access the data, so we can all pull together and present an evidence-based approach, which the media can then pick up on to present a realistic evidence-based picture, rather than scaremongering.
Professor Fonagy: There are new sources of data available that are untapped, which nobody has looked at and which may be absolutely critical in making judgments about social media. Companies should work alongside scientists, who have input on this—and, if we can, we can use the smartphone as a window on the brain.
Q352 Chair: That is a nice expression.
We heard from the first panel that they would like the code of conduct to be enforced with an effective regulator. Do you share the view that it needs to have teeth—in other words, that companies need to comply with it and, if they do not, there are consequences?
Dr Davie: I think that any code of conduct needs teeth.
Dr Woods: Yes, it has to.
Chair: Thank you all very much indeed. We appreciate your time.
[1] Note by Chair: ‘Instagram’ cited in the session in error, see https://www.theatlantic.com/technology/archive/2018/02/what-facebook-told-insiders-about-how-it-moderates-posts/552632/