Digital, Culture, Media and Sport Committee
Oral evidence: Influencer culture, HC 258
Monday 1 February 2022
Ordered by the House of Commons to be published on 1 February 2022.
Members present: Julian Knight (Chair); Steve Brine; Clive Efford; Julie Elliott; Damian Green; Simon Jupp; John Nicolson; Jane Stevenson.
Questions 453 - 499
Witnesses
I: Chris Philp MP, Minister for Technology and the Digital Economy, Department for Digital, Culture, Media and Sport; Sarah Connolly, Director, Security and Online Harms, Department for Digital, Culture, Media and Sport; and Mark Griffin, Deputy Director, Creative Economy, Department for Digital, Culture, Media and Sport.
Witnesses: Chris Philp, Sarah Connolly and Mark Griffin.
Q453 Chair: This is the Digital, Culture, Media and Sport Select Committee. The first panel hearing is going to be on influencer culture and the second on online safety and online harms. We are joined for our first panel by Chris Philp, Minister for Technology and the Digital Economy at the Department for Digital, Culture, Media and Sport, Sarah Connolly, once again, director of security and online harms at the DCMS, and Mark Griffin, deputy director, creative economy, at DCMS.
Before I welcome you and we start the questioning, I want to see whether any members have any interests to declare. I would like to declare that I am the Chair of the all-party parliamentary group on new and advanced technologies. With that out of the way, Chris, Sarah and Mark, thank you very much for joining us this morning.
Chris, a first question for you. You have obviously been across what this Committee has been doing in terms of the influencer culture and been looking at all the witnesses we have had in. What struck you as interesting and potentially something that needs to be addressed in those hearings?
Chris Philp: Thank you, Chair, and thank you very much indeed for the opportunity to appear today for this session. In talking about the influencer question, it is worth mentioning at the outset that the issues that we are likely to discuss this morning probably cover three different ministerial portfolios. There is clearly an interaction with online safety and online harm, which is my responsibility, and I know that we are going to be discussing it in a lot more detail later this morning. It also touches on issues concerned with advertising and creative industries, which are looked after by my colleague, Minister Julia Lopez. Mark works in that team. Questions of employment rights and employment status are ordinarily covered by Paul Scully in his capacity as a Minister in BEIS. It was worth mentioning that this covers a number of ministerial portfolios, only one of which is mine.
To answer your question, the context is very important to keep in mind. We live in a world now where the reach of social media is extraordinarily wide, where pretty much everyone in the United Kingdom has access to social media of one kind or another, and globally half the population of the world—nearly 4 billion people—are online. The pervasive and constantly accessible nature of social media gives it an impact and an influence that goes substantially beyond the impact and influence that more traditional forms of media have had, particularly where the content being conveyed is particularly compelling. Video content, for example, is much more compelling and engaging and potentially influential than traditional static advertising or even TV advertising. Therefore, we are talking about an environment that is qualitatively different from the kind of environments that we have seen in the past.
You asked specifically what issues had caught my attention in the evidence that the Committee has heard so far. I think the Committee should be strongly commended for looking at this important topic. I found the questions around disclosure and labelling particularly interesting. If somebody is seeing content promoting some kind of product or some kind of service, and if the person who is producing that content is being paid to do so—they are not impartial; they are, essentially, a paid advocate, an issue that we have encountered in Parliament, obviously—it is important that the person viewing that content knows that the thing that they are viewing is not done impartially or out of strong conviction but because they are being paid. That was an interesting area that I know the Committee has heard evidence on.
The other question that you have looked at that is an interesting one, and not necessarily a straightforward one, is the question of exploitation and who is being exploited. Is it an influencer who may be in some way vulnerable and getting coerced into doing what is essentially a performance, or is it the recipient who is being potentially manipulated into buying a product or service without realising that they are being manipulated? The question of who is being exploited and who is the exploiter is a complicated but important question. If I had to pick out two things that particularly caught my attention, I think those two are the most interesting, but I am sure that there are lots of others.
Q454 Chair: Yes, we will probably look at exploitation particularly in the sphere of children during this session later. Julie Elliott had some questions on that.
Returning to advertising, we had the ASA in front of us last week. One of the things that struck me is the scale of this. If you think about it in a traditional media sense, you may have a few hundred fast-moving consumer brands that are regularly advertised through television, radio or other media. It is a big job but it is relatively easy to keep track of that. However, when you are dealing with InstaStories and you have hundreds of thousands of influencers, you literally are shooting fish in a barrel from a regulatory point of view. How do you think that your Department can help square that circle to ensure that many of the things that we value as a society, in terms of openness and transparency, are replicated in this ever-diversifying world of influencers?
Chris Philp: That is a very good question. The regulators—in this case principally the Advertising Standards Authority—need to make sure that they have adequate skills and technology to properly monitor what is, as you say, a very broad and voluminous part of the market. That would include making sure that they have the technical capability to do that, certainly in the context of the Online Safety Bill, for which the principal regulator is Ofcom rather than the ASA. Making sure that the ASA has the resources and the technical capability to monitor this is an important part of the work that we are doing to ensure that the regulators are prepared.
This is also a topic that I am sure will be addressed by the Online Advertising Programme, which my ministerial colleague Julia Lopez is working on, which will be published in consultation form relatively shortly. That will seek to address questions like that, to ensure that there are mechanisms and structures to cover the whole waterfront, because, as you say, it isn’t the same as TV advertising, where there are a finite number of slots on ITV or whatever it may be. By definition, it is potentially an infinite universe. It is certainly a much larger one than things like TV advertising.
Q455 Chair: How do you monitor the monitors, then? You talked about the ASA there. How does the Department—with BEIS, across Government—go about ensuring that they have the processes in place to ensure that this is a level playing field in influencers and advertising?
Chris Philp: Committees like this have an important role to play in looking at the work that regulators do. It sounds like you have had evidence from them and no doubt you ask them to appear before you periodically. Certainly, when I was on the Treasury Select Committee, we had people like the FCA and the PRA in pretty regularly. Of course, Ministers keep a close eye on what is going on as well. This is really Julia Lopez’s area rather than mine and I do not want to step on her toes. Mark, do you want to address that?
Mark Griffin: Yes. Good morning. Thank you for inviting me to give evidence today. What we do as a Department is we regularly meet with the advertising regulator, the Advertising Standards Authority, and Ofcom, which has responsibility for broadcast advertising. What we are discussing with them is the overall regulatory framework rather than any specific enforcement activities that they are doing.
What I would say by way of context is that we completely recognise that the growth of the online advertising market from a decade ago to now—it now accounts for over half of all advertising spend in the UK—has created huge challenges with the pace and scale of what is happening. What we are doing is a specific programme of work, through the Online Advertising Programme, to see whether the current regulatory framework is still fit for purpose. As part of that, we published two research reports and we issued a call for evidence in 2020. We will shortly issue a consultation that sets out our next stage of questions and proposals in this area.
What we are looking at are the questions that have come up as part of this evidence inquiry, which are: is there sufficient accountability of advertisers and, in particular, of other intermediaries in the advertising space, recognising that digital intermediaries and platforms may need to have more responsibilities in this area? Second is the question of transparency, which is: can we really be confident that we know who sees what and, therefore, that the regulators can act to reduce the harms in this space?
Q456 Chair: Thank you. You mentioned the research that you had conducted. Could you outline that research very briefly, the scale of it and how it compares to the research from the United States that this Committee has taken as evidence?
Mark Griffin: We commissioned and published two research reports, one in 2019 and one in 2020. The first report looked at the overall regulatory framework and landscape in advertising and the players in the online advertising ecosystem. The second report looked at the issues and problems that have emerged.
One of the key things that we have been identifying is a taxonomy of the different types of harms that exist in advertising. Some of those overlap with the things that you have been looking at in terms of influencer marketing, but they go broader into other areas that the Committee is interested in, for example scam advertising.
Having done this research, we are looking at how we address all these issues as a whole and, in particular, can we improve the whole system of advertising regulation? Of course, I should say that we think that the Advertising Standards Authority has made huge efforts in its regulation. I think it spoke to the Committee last week about this in particular, upping the level of proactive assessments of the market, not just responding reactively to complaints that it receives. What we are thinking about is whether the ASA has the right powers and tools to go as far as it needs to, recognising that it is a non-statutory body and that, therefore, it does have to work with other parties rather than impose rules.
Q457 Chair: Has any of your research encapsulated pay and conditions?
Mark Griffin: No, we have not looked specifically at the pay gap issues. It was very helpful that the Committee drew our attention to the US research on evidence of a pay gap. We are not aware of any UK research in the same area. Therefore, that does suggest that there is an area that should be looked at further.
Q458 Chair: Who is going to look at it? Are you going to look at it? Are you going to ensure that this is part of an ongoing process? We note that your two pieces of research were in 2019 and 2020. It is now 2022. This is a very fast-changing world and these are areas of concern. Will you commit to this Committee to commission research that looks at this particular area?
Mark Griffin: The next stage of research is looking to see if we can identify all of the harms in the advertising space and their severity and frequency. This is slightly different to the pay gap issue. What we will do is take that away and work with the stakeholders to ensure that that evidence gap is filled.
Q459 Chair: When is the Online Advertising Programme due to be published?
Chris Philp: I don’t think there is a precise date in circulation but, again, this is a matter for my colleague Julia Lopez. I think the intention is that the consultation will be published—I do not want to say too much—as close as possible to the revised Online Safety Bill, so that parliamentarians have a complete picture of the proposed reform, because they do cover adjacent spaces, obviously. Given that we intend to introduce the Online Safety Bill in this Session, relatively imminently, the Committee can infer from that that you will not have to wait too many more months before receiving satisfaction on both points, both on the Online Safety Bill and on advertising.
Q460 Chair: It has been a long wait for satisfaction when it comes to that particular Bill.
Chris Philp: Yes, but it will be all the more satisfying when it arrives, particularly with the benefit of this Committee’s recommendations.
Q461 Chair: Yes, thank you, we will turn on to that in our second stage. To drill down on that for one minute before I turn over to Jane, do I take it, therefore, that there is a role for legislative action against some of the abuses that we are highlighting through this inquiry, in the online influencers space in the Online Safety Bill? Is there a likelihood that some of that will permeate into the Online Safety Bill?
Chris Philp: We will obviously talk more about this later, but clearly, to the extent that anything done by influencers is straight up illegal, by definition that will be in the scope of the Online Safety Bill because it will be in the scope of user-to-user content. It is user-generated content by definition.
Secondly, to the extent that it is causing harm to children—whether the children are the producers, as influencers themselves, or the recipients of the content—if that content is harmful either to a child producing it or a child receiving it, it would be in the scope of the Online Safety Bill under the provisions that cover content that is legal but harmful to children.
To the extent that influencer-generated content might be taken as legal but harmful to adults, it will be subject to provisions that we will talk about later. For example, there will be an obligation on category 1 companies—the big social media firms—to have done a risk assessment and to have clear policies that they are then obliged to properly implement via those mechanisms that I have just described, which will apply to all user-generated content. That will include content generated by influencers.
Issues to do with things like labelling are an advertising matter. I don’t think that they will be addressed by the Online Safety Bill. However, I think it is likely that they will get picked up by the Online Advertising Programme.
Mark Griffin: Yes. In the Online Advertising Programme, hidden advertising is definitely one of the issues that we will pick up. Some of the proposals that we will consult on will require statutory changes to implement.
Q462 Jane Stevenson: Following on from that, as advertising shifts and influencer advertising is so different from what has had to be monitored before, I am especially concerned about children on the receiving end of influencer advertising. There are no enhanced disclosure requirements suggested at the moment for adverts from influencers aimed at children. I am glad to hear that there is a call for evidence around this, but how are you going to gather information about the influence on children and how can you protect them from seeing things that they do not realise are advertising?
Chris Philp: I will try to answer that first and perhaps Sarah may come in on the harm side, where the advertising content being produced is actively harmful. On the research and disclosure side I might turn to Mark.
If the content being produced that you are asking about, Jane, is harmful to children, that will be covered by the Online Safety Bill in the provisions that deal with content that is legal but harmful to children. Essentially, there is a duty on any social media platform that has a significant proportion of its users who are children to take proactive steps to prevent harm occurring to those children.
If it does not do that, it will be subject to regulatory enforcement action by the designated regulator, Ofcom, whose powers include levying a fine that is the greater of 10% of global revenue or £18 million. In the case of a platform like Facebook, that is an unimaginably large sum of money, even by Nick Clegg’s standards, so there are some quite powerful measures there. I will come on to disclosure on the advertising side in a second.
Q463 Jane Stevenson: It is not just harmful content. It is the clarity. We know that there are no watermarks saying, “This is an advert. This is sponsored”, so it is more around that.
Chris Philp: It is both. There are two parts to the answer. I am dealing with the harmful bit first and we will come on to the disclosure point second. In terms of research into harmful content and how it gets identified and exposed, do you want to comment on that before we come on to the transparency point, Sarah?
Sarah Connolly: Ofcom will have a duty to understand risk, to understand the harms, and that includes harm to children. As the Minister said, online safety goes much wider than just influencers; it is general harm to children. Mark might want to add a bit more on the influencer-specific bit.
Mark Griffin: The first thing is that the Committee of Advertising Practice is very clear that advertising targeted at children or which features children must not lead to any harm to them. If there is evidence of a problem, I know that the ASA would want to consider that and take that away. I am sure that that was one of the points that came up in your evidence session.
Advertising rules and consumer protection laws are also very clear that you have an absolute requirement to disclose when you are advertising. You have heard about issues with the level of compliance with those rules and we would agree that those compliance rates are too low. There are several things to say. First, the 35% statistic that came out through the ASA’s research in 2020 obviously predates the undertaking that Instagram has given to the CMA. It also predates the strengthened sanctions regime that the ASA was talking to the Committee about. What we would like to see, of course, is a similar exercise repeated and to see that those compliance rates have increased. If those compliance rates have not increased, what we will look to do is strengthen the system of regulation through the Online Advertising Programme.
There are two key themes of that work; one is accountability and the other is transparency. On the accountability theme, we want to ensure that there are appropriate sanction powers available for advertisers who do not wish to co-operate with a non-statutory regulator at present; and secondly, that accountability is not just for the advertiser but is spread throughout the supply chain in advertising, whether that is the other digital intermediaries involved in the open-display advertising market, or the platform in the case of influencer marketing. Some of those proposals that we will consult on will require legislative change to implement.
Q464 Jane Stevenson: Because it is children that we are talking about, is there not a real need for disclosure to be really clear and transparent across every platform? How do we educate children to recognise an advertisement if there is no standard? If they do not have a watermark in the corner going “ad” in big pink flashy letters, I don’t think you can expect young children especially to know when they are being sold a product.
Sarah Connolly: If I may, although this is not a complete answer, the Online Safety Bill will require platforms to age-gate appropriately. Quite a lot of the platforms that I think you are thinking about in this space have a minimum age of 13, so children under 13 should not be on the platforms to start with. As I said, it is not a complete answer, but certainly the younger end of the children should not be on those platforms to start with when the legislation comes through. One would hope, therefore, that that takes away some of the problem for the younger end.
Q465 Jane Stevenson: Lots of young children are on YouTube, and if they have a favourite influencer or groups that their friends all follow, you can see how it can spiral.
Chris Philp: Clearly, there is already an obligation for paid-for advertising to be clearly labelled, which, as of 2020—on Instagram at least—was not being properly implemented. The compliance rate was only 35%. Getting that higher is critical. Mark mentioned a moment ago the possibility of creating statutory sanctions for the ASA, which do not currently exist. That work is in hand via the Online Advertising Programme.
The second point that you raised was about making sure that the label is clear enough. If the product being advertised is not supposed to be marketed to under-18s they should not see it at all. If it is a product that is allowed to be marketed to under-18s, like a toy or something, you are making the point that the label should not just be present but should also be clear and comprehensible to an eight-year-old. I hope that is something that the Online Advertising Programme might consider, or the ASA in its codes of practice.
Mark Griffin: Yes, the key thing is the requirement that it is obviously identifiable to the person looking at it. We recognise that it is not just a question of including the words in some form that the advertiser is comfortable with; what matters is the recipient of the ad. Both the ASA and the CMA, in their written evidence to you and in their guidance on their websites, have set out a range of things that they think are acceptable forms of disclosure and a range of things that are not acceptable forms of disclosure. Even though there might be some words to that effect, they may not be very easily visible or identifiable. If there is evidence that children still find those means of disclosure insufficient, that is a clear area where the platforms should provide better tools to the influencers on them to make sure that that disclosure is more prominent.
Q466 Jane Stevenson: Where does the education fall? Is it on parents, is it on schools, or is it on the platforms to make this system really clear to children?
Chris Philp: You are touching on a very important wider point about what we call media literacy. That is essentially educating the public—but children particularly—about the pitfalls of life online. That is a programme that specifically looks to educate on these sorts of issues. Beyond that, it falls to parents and to schools to make sure that children are aware of the risks that exist online.
This is obviously one of the risks but clearly there are far more serious risks. My eight-year-old children are twins and they are getting into using online devices. As a parent, you are very conscious of the risks that that opens up and so you try to educate as to the risks, and schools do as well. This is probably at the lower end of the harm, but it is a risk and it is important that it is covered as well as the more obvious, very serious risks that exist online.
Q467 Chair: Thank you. Just one quick question before we go to Damian. We can all agree that the ASA has its work cut out when it is dealing with the burgeoning world of influencers and online advertising. Given the fact that it is funded by a 0.1% levy on adverts, many of those on traditional media, isn’t there a case for that charge to be levied more heavily on the social media companies so that they pay for the regulation of this space and the potential harms, given the fact that the ASA’s work has been extended to such a degree?
Chris Philp: It is a very good question. It is one that I think lands squarely in the portfolio of Minister Julia Lopez. I hesitate to make any public utterance on a colleague’s policy portfolio, other than to say that you are raising a very interesting point. Mark, is it one that you feel able to comment on?
Mark Griffin: Perhaps I could add something. You are aware that the ASA is currently doing a piece of work called the Online Platform and Network Standards, which is trying to build and formalise its relationship with the platforms and other digital intermediaries in the space. There is obviously a question of whether, as it extends its regulatory remit to the platforms, they should contribute to the funding of that system. I would also add that, were there to be statutory regulation in this area, we would expect it to be funded by those market participants that contribute to the problems that we are seeking to regulate. We would expect all of those market participants to be contributing to that system in the fairest way possible.
Q468 Damian Green: Last July the Department produced an Online Media Literacy Strategy. Under the 2003 Act, Ofcom has the duty to promote media literacy, which I assume these days includes online media literacy. Are you conscious that Ofcom is doing anything about it?
Chris Philp: Have you had Melanie Dawes in front of the Committee?
Damian Green: Not on this.
Chris Philp: Not on this. Ofcom has a wide range of responsibilities. As part of the Online Safety Bill work and its increased duties, we are giving it commensurately substantially increased resources. The total funding package over a three-year period runs to £10 million, not all of which Ofcom gets. Given the scope of the Online Safety Bill, we would expect at least some of that to go towards making sure that the media literacy duties are being fully discharged.
It is an important question. It is one that I have not received a specific assurance on so, as an action to take from this Committee, I will undertake to more directly probe the question that you have raised. Sarah, do you want to add to that?
Sarah Connolly: No. That is exactly right. Ofcom is conscious that with the additional powers and additional responsibilities it will need to do more in this space and, as the Minister says, is gearing up to do exactly that. However, we will take it away.
Q469 Damian Green: I am glad that you are taking it away, because I take it that the answer is, “No, not yet,” or, “They haven’t got around to it yet.” Given that the Department produced a whole strategy on the subject and it is the regulator, it is perhaps a bit surprising and disappointing that it does not appear to have done anything in the last six months.
Sarah Connolly: Without wishing to leap to Ofcom’s defence, I would say that Ofcom is looking at it. It has quite a lot of other things that we are asking it to do at the moment in this space but it is high on its list of things to think about. I would not take silence as a lack of activity.
Chris Philp: I did not say that Ofcom was not doing anything. I just said that I had not been briefed on what it is doing. You cannot infer one way or the other from that; we will just undertake to make sure we get a decent answer.
Q470 Damian Green: I am interested because, as long ago as 2019, there was an article written in the Journal of Cultural Policy describing this area of media literacy as one of the zombies of cultural policy. It has been a bit of a—to change metaphors—black hole for some time. Given that what we are discussing this morning is a whole new area of media activity that has arisen in the last few years that particularly influences children, it feels particularly important that the Department and the regulators should be proactive about it, and I hope we can encourage you to do that?
Chris Philp: Yes. It is a very reasonable point. To add to the previous comment, there is work going on, on the back of the Online Media Literacy Strategy published last summer, to do additional work through libraries and other organisations like that to ensure that media literacy training is being delivered to the public, including children but the public more widely as well.
It might be helpful to write to the Committee with further particulars of that work and to answer the question about Ofcom. I would not want to do Ofcom any injustice by omission.
Q471 Damian Green: That would be useful because my next question was going to be: do you have the strategy and what is happening to implement it? If you could write to the Committee to explain that it would be really helpful.
Chris Philp: On both questions, the Online Media Literacy Strategy implementation, which includes the work through libraries, and also the question about Ofcom, we will come back to you on both of those.
Q472 Julie Elliott: If I may add, if you could say how else that is being delivered, not just through libraries, because many areas have had local government cuts over the last 10 years. We have one library in my constituency. Libraries do not cover the community any more in most of the country. I would like to see what else is happening beyond just libraries, however good they are.
Child influencers represent a legislative grey area. This inquiry has found that child influencers are falling through the cracks of UK child labour legislation. Will you commit to addressing this by expanding the 2014 child performance regulations to require performance licences for children appearing in influencer content?
Chris Philp: Once again, this falls into Julia Lopez’s area in the creative industries so I cannot give you a firm commitment on behalf of another Minister. What I can do is ask Minister Lopez to respond to you in writing on this specific question.
You are right to point out that influencers in general mostly seem to be working in what you would consider to be a freelance capacity. They are not employees of a particular brand; they are providing an ad hoc service in exchange for payments. Some of them are children, one assumes acting with the consent of their parents, although not necessarily so. For children in particular, it is something of a grey area. It is conceivable, given that children—16-year-olds and so on—do often have unfettered access to smartphones, that they are able to produce a self-made TikTok video in five minutes.
Q473 Julie Elliott: I would suggest a lot of them are under 16.
Chris Philp: They could be a lot younger, yes. Sadly, you are probably right, it is entirely conceivable they might be doing that without any parental oversight or knowledge.
Q474 Julie Elliott: We had a session that looked at this and, even with parental oversight, there does not seem to be any protection at the moment. Mr Griffin, is this something you can comment on?
Chris Philp: Before you do, Mark, under the Online Safety Bill, if there is evidence there is harm being caused to a child—either to the recipient or in the case you are describing, the producer—that will be covered by the Online Safety Bill. There will be an obligation on all social media platforms where there are a significant number of children active to take steps to prevent harm to children. That will be covered by the Online Safety Bill but, Mark, do you want to comment on the wider point?
Mark Griffin: I am not aware in DCMS that we have looked at the specific question you are asking about, the change to the legislation. We are very happy to take that one away.
Q475 Julie Elliott: The other thing that is connected: France introduced its Bill on the exploitation of the image of children on online platforms in 2020. Will the Government be looking at bringing in any legislation to mirror that, which seems to give quite a lot of protection to children, or has the Department looked at that?
Chris Philp: Exploitation of children—would that be a BEIS issue? Harm to children will be covered by the Online Safety Bill. To the extent that exploitation constitutes harm, which one would assume it would, the propagation of that exploitative and therefore harmful content online—
Q476 Julie Elliott: It is about the hours that children can work because there does not seem to be any regulation in this area?
Chris Philp: Matters concerning hours of work, for example, are an employment issue, so it is a BEIS matter. We are into BEIS employment law territory here.
Q477 Chair: Forgive me, Minister. I understand the parameters working within Government when you say it is an employment matter, but Julie has a really powerful point. In our very first evidence session, we heard cases of children who were effectively on show from breakfast all the way through to bedtime, with their parents making tons of money from this—serious amounts of money—and yet in the laws of this country if you are under 14 you are allowed to work 12 hours per week. Some of these kids are on show on stage for 12 hours per day. As a family man, I have to say don’t you think that is fundamentally wrong and something that, as a Government, we should be doing something about?
Chris Philp: I agree that it does sound fundamentally wrong. From what you have just said, it sounds like that is already prohibited under existing labour legislation.
Julie Elliott: No, it is not.
Chris Philp: Did you say there was an existing law governing that?
Q478 Julie Elliott: Child influencers are not covered by current legislation. That is the problem. We are looking to the Government to introduce something that does regulate this and does protect children because at the moment, as far as I can see, they are not protected in law.
Chris Philp: Are you suggesting that applies to any child acting as a freelancer, or are you suggesting there is some specific loophole that means that child influencers are not covered?
Q479 Julie Elliott: Influencer culture is not covered by legislation at the moment. If you are an actor performing on stage it would be covered. Influencer culture is not covered.
Chris Philp: Even if they are being paid and everybody is essentially a freelancer?
Q480 Julie Elliott: They have not got a contract of employment, so they are being used to create income with their parents’ consent and often their parents are managing this. However, the child is not protected at all. Perhaps if you do not know the answer to it, Minister, you can go away and have this looked at, because it is a serious issue we came across where there does not seem to be any protection in law for children in this country being used in this way.
Chris Philp: You are highlighting what is potentially a lacuna in the way employment rights legislation operates. That is a BEIS area. If this lacuna or gap that you are describing exists, I would say at face value that does sound like an issue that should be sorted. What I can undertake to do is to draw the matter to the attention of my BEIS colleagues. Paul Scully is the Minister responsible for employment legislation. I will ask him to take a look at it based on what this Committee has found.
Q481 Chair: The key piece of legislation that works is, for example, if you go on the stage and you are performing in a musical—I am stretching an analogy there—you fall under the Children (Performances and Activities) Regulations 2014. If you are at home, and your parents are filming you eating your breakfast, falling off your skateboard and doing this and that, and then they are basically making money from you, that is not covered by any of this legislation whatever, and there are no parameters around what you would term working hours that we would recommend.
Chris Philp: If that is the case, we—the Government—should look at filling that gap. It will either be BEIS or, if it is in the creative industries, it might be our Department. We will find out which bit of Government is responsible and ask them to give you a response, because you have done exactly what a Select Committee should do: taken evidence and identified a gap that has arisen. This is a new thing. This did not exist 10 years ago, or possibly five years ago. This is a new phenomenon.
Chair: Probably the nearest phenomenon would be the freak shows—the travelling freak shows and circuses. That is the closest analogy I can bring to exactly what we are seeing now regarding child exploitation in this world of online influencers. Julie, do you have any more questions?
Julie Elliott: No.
Q482 Simon Jupp: Good morning, Minister, and the rest of the panel joining us this morning.
Before I move on to compliance, is it quite a frustration that you have just heard a potential problem highlighted by the evidence this Committee has heard and it falls under a different Department, sometimes with a different Minister? Is it very difficult to co-ordinate all the work between these different Departments and Ministers to achieve the right outcome and ensure the legislation you are bringing forward covers off all these aspects?
Chris Philp: You are very eloquently describing one of the challenges of Government in any area, not just this one. We have 20-odd Departments, maybe slightly more—22, 24—and about 70 or so Ministers covering all of those Departments. To function, you have to define which Department does what and, within each Department, which Minister does what. You have to do that so that you know what you are working on. Inevitably, issues arise—like the ones that we are discussing today—that cut across a number of areas, and you have to try to find ways of co-operating with your colleagues across Government to make things happen.
Damian was in the Cabinet Office; the Cabinet Office sometimes provides a co-ordinating function if it is cross-departmental. I am sure Steve had similar experiences.
Steve Brine: We used to do it brilliantly.
Chris Philp: Yes, exactly. It has gone downhill since. I didn’t mean that. That was a joke.
An example of that is in the Online Safety Bill, where we have been working quite closely with the Home Office, because some of the online safety stuff interacts with things like child sexual exploitation, national security, Home Office issues.
It is getting off topic here, but on the digital economy type issues there is a massive cross-over in promoting the UK as a tech centre, which is in my portfolio. There is significant cross-over with the Treasury and the tax system, the British Business Bank, research and development in BEIS, UKRI—a huge impact on the tech sector. Particularly at DCMS there is significant interaction with other Government Departments, so you spend a lot of time trying to co-ordinate that.
I was in the Home Office previously and it was more siloed, where you were dealing with a topic that had a ringfence around it, but the DCMS issues—technology, media, online safety—reach into all kinds of other Departments.
Q483 Simon Jupp: Indeed. Moving on to compliance, in 2020, the Competition and Markets Authority did a three-week disclosure monitoring exercise, which was mentioned by Mr Griffin earlier. It looked at influencer advertising compliance and suggested that one in four stories were ads but only 35% of them were labelled as such; 2020 is the only example we have of this. Are you aware of any other reviews of this kind to ascertain how much of a problem it is now and not two years ago?
Chris Philp: I think they have done more since. The 2020 study you referred to was specifically on Instagram. I understand work has been done with both the ASA and the CMA subsequent to that to try to drive up compliance because that 35% under the regulations should be 100%, so it is a long way short.
As far as I am aware—and Mark can elaborate—there has not been another check-in to see where that 35% has moved to. I know with Instagram there has been a particular programme of engagement, because that study two years ago did highlight a particular problem. Mark, do you want to elaborate?
Mark Griffin: I am aware that, after it did that study in 2020, the ASA wrote to all of the influencers and brands that it found had undisclosed advertising to ask them to address it. What it has been focusing on is strengthening the sanctions, and I think it described those to you. The key steps there are the register that it now holds on its website of influencers who do not disclose advertising, who have done that repeatedly and not corrected it.
A further step it has taken very recently, for persistently non-compliant influencers, was to take out advertising against them to make it clear that those influencers are not following the rules, and that has brought—as the ASA described it—those influencers into line. Having done that work, it is obviously thinking about what its next step is. I know the ASA is considering that at the moment, and I am sure it will be considering the recommendations of this Committee at the same time.
Q484 Simon Jupp: The CMA say that the level of compliance is unacceptably low. Those sanctions and changes are not enough. There are still more solutions to be found to these problems, as Mr Griffin has alluded to, Minister.
Chris Philp: Yes, and there is more to do here. The Online Advertising Programme is a vehicle that I expect will address that. As Mark said earlier, at the moment, the ASA’s regulatory regime is non-statutory and it may well be there are proposals coming forward on the Online Advertising Programme requiring statutory regulation; regulation that carries the force of law to ensure compliance. The example you have been highlighting quite rightly just now is an illustration of why that may be needed.
Q485 Simon Jupp: Do you think the ASA needs additional powers to enforce compliance?
Chris Philp: That is a question that the Online Advertising Programme consultation, which is due in the relatively near future, will address. This failure to disclose paid-for advertising—essentially, paid-for influencers—is a problem and it does need sorting out. Progress has been made and action has been taken since 2020, but it strikes me that there is more that needs to be done.
Q486 Simon Jupp: When the ASA was in front of us, one of the things that struck me was it admitted it prioritises areas separately from influencer culture because of limited resources, as the ASA put it. Should its focus change?
Chris Philp: As the Chair said at the beginning, it covers quite a wide area and this interacts with the question of resourcing asked about previously. The ASA does need to pay particular attention to the online world because it has grown so much. As Mark said, it now represents over half of the advertising spend. It is very important that regulators in all areas, this area—CMA, Ofcom, ASA, all of them—keep pace with technological change.
If you look at regulation in any of those sectors, quite often regulatory change lags real-world technological change by periods of time ranging from a month to, in some cases, years. Regulators in general, including the ASA, need to be agile and keep up with changes in the industry. Where there are new trends that emerge, they need to jump on them quickly and make sure their resourcing is proportionate to the risk.
I think it is true to say that traditional advertising, like TV, radio and newspapers, has been around for so long that the rules, the participants and what is and is not acceptable are all well established. It also tends to be a relatively small number of quite large companies participating and it is easy to regulate. This area is hard. It is fast growing. It changes quickly. It is much more fragmented and it is harder to regulate. Therefore, it does merit commensurate attention.
Q487 Simon Jupp: Does it also merit additional resources within the ASA?
Chris Philp: Again, I am straying outside my territory here. I think that is a question the Online Advertising Programme will properly consider along with the questions about powers and funding, but the Online Advertising Programme is the place where that sort of question belongs. Unless, Mark, you want to add to that?
Mark Griffin: As I think I mentioned in one of my earlier answers, the ASA is focused predominantly on regulating advertisers. If it takes on responsibilities for regulating other parts of the advertising supply chain, digital intermediaries and platforms, that will incur additional regulatory costs and we would expect those to be fairly distributed across all of those players who are causing harm in the system.
Simon Jupp: Very interesting, thank you.
Q488 Clive Efford: Minister, I must say that I have found your approach to this Committee very engaging today. We have had other experiences with Ministers who have come before us. When you have identified an area that you are unsure about, you have said, “I think you are right and I will go away and look at that.” That should be a lesson to other Ministers who come here, so thank you.
We mentioned earlier on the MSL study, “Time to Face the Influencer Pay Gap”. That found there was a 35% pay gap between white and black influencers. It highlighted the gaps in other industries—education 8%, business and financial 16%, construction 19%, sports and entertainment 16%—and that the gap uncovered in influencer marketing vastly overshadows the gap in other industries.
None of those figures are good but it is more than double any comparator in that report. Does that suggest there is an urgency to deal with this, and what is the Department doing to address diversity in the influencer field?
Chris Philp: Thank you for the question. It does suggest there is a problem. I think it is worth saying that the MSL study, “Time to Face the Influencer Pay Gap”, which identified the 35% racial pay gap, came from the United States but it is reasonable to assume that issues may exist here as well that are serious and should be addressed.
One of the initiatives currently underway to try to do that is via ISBA, the Incorporated Society of British Advertisers. It produced a code of conduct last year, which I believe is being updated. The update is designed to focus on these issues of diversity and equality. The people who are doing the advertising are the ones paying influencers, so it is in their gift to pay people on a fair and equitable basis.
In the first instance, we would look to this code of conduct being updated to try to address this issue in the UK. Mark, do you want to add to that?
Mark Griffin: It has been very helpful that you have drawn the report to our attention. As I said earlier, we are not aware of any comparable UK research. Within the creative industries we have a significant programme to improve diversity and inclusion, and the creative industries themselves have committed to an eight-point plan to do that. Part of that is the measurement of what they are doing. Once the ISBA code of conduct is updated to cover these equality and diversity issues, we would like to see a measurable improvement in those outcomes as a result.
Q489 Clive Efford: The report also suggested that one of the simplest solutions to dealing with this issue is pay transparency. Is that something that is being considered as part of the review?
Mark Griffin: It is not part of the Government’s Online Advertising Programme. Influencers all have income from multiple areas. They may receive payments from the platform they are on and they will have commercial arrangements potentially with one or many brands. The contractual arrangements they enter into would be a private matter for them and those companies.
What we might be seeing, particularly in the United States, is that influencers want to begin to share more of that information across their community. That is very helpful because it allows them to understand, if they are entering into a particular arrangement, whether they are being treated fairly compared to others.
Q490 Clive Efford: Minister, you said it is reasonable to assume that we may see similar pay gaps here in the UK, because this is an American study. Do we intend to fill that gap with any research?
Chris Philp: What we will do is take that question away and look to explore whether we and our partners in industry can fill that.
Q491 Clive Efford: It has also been reported to us that influencer talent agencies struggle to find talent from under-represented groups because they are not profiled enough by social media companies. This suggests that there may be concerns about racially and gender-biased recommendation algorithms and content moderation. Is that something the Government are addressing, or do they have concerns about it?
Chris Philp: Regarding algorithms in general, including algorithms that promote content, we are looking at artificial intelligence algorithms as well; many of these would be considered to be AI algorithms. It is an area we are considering. We are very aware of, and concerned by, the suggestion that AI algorithms contain inadvertent bias. They are not designed to be biased but they may develop bias simply by the way that the algorithms operate.
We are publishing some work relatively imminently on artificial intelligence standards and regulation, which we hope will be world leading. One of the things contained in that is a set of provisions on bias that may derive from algorithms, because we are concerned that this is a growing issue not just in this area—identifying which social media people get promoted—but in all kinds of other areas where AI algorithms get used: for example, financial decision-making. You get a loan or you do not get a loan, all that kind of thing. I think there is a wider issue for us to get on top of and the AI work happening at the moment is designed to address that.
Q492 Clive Efford: Any idea when that will be completed?
Chris Philp: It is going to be produced this calendar year, and I hope the first half of this calendar year, but it will be in 2022 for sure.
Q493 Chair: Just on that point, do you think the Equality Act needs to be extended to cover specific algorithms and the way they interact in decision-making, how we are advertised to but also, frankly, how we make our money, just given the way in which it impacts our daily lives now?
Chris Philp: Equality Act duties apply generally to the outcomes that different businesses deliver. In areas that are quite opaque, like algorithms, they have not been properly looked at from an equality perspective. The algorithms that are most effective, or most intrusive—that have the most impact—are in the AI environment. That is why it is so important to get this AI regulation right, to ensure that biases do not creep into AI algorithms.
AI is having already—and will continue to have—a transformational effect on society and the economy. We have to make sure it does not have biases baked in. It is not a question of legislation; it is a question of making sure that, in the application of algorithms, this issue is properly regarded. It is quite easy to overlook or neglect it unless it is consciously looked at and addressed.
Q494 Chair: How do you bake in that best practice rather than use it to be reactive?
Chris Philp: By the regulatory environment. In things like AI, making sure that, when using algorithms, there is a general duty to have regard to this issue rather than just ignoring or forgetting about it. That is what we are working on at the moment. You do not want to go so far that you stifle innovation. There is a balance to strike between making sure important issues like equality are properly heeded and paid regard to, but not to the point that you prevent all innovation. That is the balance that needs to be struck. Historically, the Europeans have gone too far on the regulatory side and have stifled innovation, so we need to ensure we strike the balance in the right place.
Q495 John Nicolson: Thank you for joining us this morning. Regarding the influencer inquiry the Committee has done, it has been quite striking that lots of the witnesses we have heard from hitherto say that there are specific under-represented groups, specifically women, surprisingly—I am surprised by that—and also racial minorities. The talent agencies have told us that it is very difficult to book these groups because the social media companies and their algorithms do not represent them sufficiently.
It sounds quite obscure but it is important in practical terms because we want the faces we see online, influencing particularly young people, to represent society as a whole. What do you think you can do, Minister, to address this from the Government’s perspective?
Chris Philp: That is a very good question. The point I made a second ago about the way algorithms operate is an important one, particularly in the context of artificial intelligence. That is already playing—and will continue in the future to play—an increasing role in the way that content is selected and displayed online, and also far beyond that, in financial services for instance. We need to make sure that, as those algorithms become even more pervasive, impactful and influential, they are carefully monitored to ensure there are not unintended but nonetheless, as you say, serious implications for diversity, fairness and representation.
The first thing is to make sure that those algorithms are properly looked at, and the current ongoing artificial intelligence work is designed to address that issue. That is the first point, about the algorithms themselves, which you have quite rightly pointed to.
Secondly, industry itself needs to play an important role here. There is data from a UK survey, I think, which essentially validates the point you have just made. That suggests that 9% of UK influencers are from BAME backgrounds, which is a lower percentage than the population as a whole, which suggests that there is indeed an issue. The work by industry, particularly the Creative Industries Council and their eight-point diversity charter, is important. I hope that will provide a push from the demand side but the algorithms are also important because they become a self-fulfilling prophecy, as you suggested in your question.
Q496 John Nicolson: You accept that Government intervention will probably be necessary in order to redress this because the companies themselves do not seem to be doing it.
Chris Philp: In relation to the algorithms, yes, and this is a much broader point that we may come on to in our second session. Generally speaking, tech firms often optimise their algorithms for profit, for money, and do not give regard to either harm that may be caused or inadvertent discrimination of the kind that we are discussing now. I think Frances Haugen’s evidence to the US Senate, The Wall Street Journal and so on, gave a powerful illustration of the way these companies—in that case, it was Facebook—run their algorithms for profit with no regard for wider social issues such as those we are discussing here. It is a problem, and we need to address it through the AI work we are doing and through the Online Safety Bill.
Q497 John Nicolson: Moving on, we all know that women are subjected to a disproportionate amount of abuse on social media but, in the course of our inquiry, social media influencers who have come before us have told us of absolutely horrific levels of abuse. One woman read out some of the things she has been getting all the time, day in, day out. It is obscene and the social media companies are doing nothing about it. They pretend that they are interested in it. They encourage people to report but, of course, if you report, absolutely nothing happens. She told us that she has given up. She just endures it. Nobody should have to endure this stuff.
What more do you think can be done, specifically, to protect people in this category, not just women in general but—since we are looking at social media influencers in our report—social media influencers who choose to put their heads above the parapet and talk about their lives?
Chris Philp: That is exactly the right context to set. Social media influencers have just as much right as anybody else to be protected. They should not suffer the sorts of abuse that you are describing and nor should anyone in public life, in private life or in using social media platforms to earn a living or part of their living. People should not suffer disproportionate abuse, or indeed any abuse as a result. They deserve protection just like everybody else. The vehicle that will deliver that is the Online Safety Bill that we are going to discuss later. If the abuse that is being suffered by people—like the witness you are describing—crosses the criminal threshold, it will be covered by the Bill. There will be a duty on social media companies of all sizes to prevent that abuse from happening. That is the first step.
The second step, in relation to content that falls below the criminal threshold but is nonetheless harmful—and it sounds like some of the things you are describing may fit into that category as well—is that there will be a duty on the category 1 social media platforms, the largest platforms, to do a risk assessment. In that case, if one of the risk indicators—say, for a role as a social media influencer—is gender, that will have to be taken into account by the social media platform in doing its risk assessment and it will have to have policies in relation to that. The policies are not prescribed but there should be a policy, which should be clear and transparent and, having set that policy, it will have to be consistently enforced.
One of the problems at the moment is that very often the platforms do have policies that notionally protect people but they are not applied in practice. I think it is exactly so for the example you are giving. Under the Online Safety Bill, the big category 1 firms will have to properly enforce those terms and conditions to protect people, like the witness you described.
Q498 John Nicolson: Right. Therefore, it sounds like we are entirely agreed on this. For our constituents who are sitting watching this session in surprising numbers, anything that is illegal offline should also be illegal online. In addition, you as a Government are going to impose on the social media companies some sort of rules whereby, if something is harmful but not illegal, the social media companies will have to take action. What action will that have to be in practical terms and what sanctions will the companies face if they do not uphold their own rules?
Chris Philp: On the first point about illegality, not only will what is illegal offline be illegal online but there will be a duty on social media firms to stop it from happening. It is not just a slap on the wrist, “That was illegal”. They have to stop it. They have to show Ofcom what they are doing to stop it. If they don’t they can suffer a massive fine.
On your second point, something that is legal that falls below the criminal threshold but is harmful, they first of all have to do the risk assessment I mentioned. Secondly, they have to have a clear policy as to what they are going to do about the risk that they have identified and they have to properly apply that policy, and often they do not do that. If they fail to do any of those three things—they have to do all three—Ofcom can take regulatory enforcement action. That includes ultimately levying a fine of 10% of global revenue or £18 million, whichever is the higher.
Q499 John Nicolson: Even if it is not illegal?
Chris Philp: Yes, even if it is not illegal. The thing I should say, in the interests of transparency, is that the second of those steps—having to have a policy to deal with the risk they have identified—the Bill does not prescribe what that policy should be. In theory, for example, it would be open to Facebook to say, “We have done a risk assessment. We have identified a risk and we choose to ignore it”. Facebook would be able to do that but that would be publicly exposed and advertisers and so on could choose to withdraw because it would be forced to admit it was not taking action.
However, as a matter of practice, I think pretty much all of the big social media platforms—Facebook, TikTok, Snapchat and so on—currently have policies that prevent harmful activity of the kind we are describing; they just do not enforce them.
The third of those three steps—you must enforce your policies—would bite. Assuming there is no change to their current policies, if they continue to fail to enforce their own policies, which is the situation you are describing, they would face regulatory enforcement action by Ofcom.
Chair: Thank you. I feel as if we have just gone into the second session but we are going to do that officially in one second. We need to change one of our witnesses. Mark Griffin, thank you very much for your evidence today. We will take a very short adjournment of two minutes.