Supplementary written evidence submitted by Helen Wills



Laws applying to platforms

Platforms need to be held accountable: most social media platforms talk a lot about their responsibilities but don't actually do the work to keep people safe, including children. They need to police far more effectively the content that can harm adolescents and children, including racism, misogyny, hate speech and sexually suggestive posts. Facebook, for example, limits its interventions to full-scale pornography, and doesn't monitor accounts like the one I mentioned yesterday, where underage girls submit their own photos of suggestive poses in bikinis.

They also need to do more to limit DMs - my son is currently ignoring more than 20 accounts that have DM'd him video links offering live masturbation.

In addition, I think platforms need better management of age restrictions. The minimum age is 13, I believe, but in reality that limit is never enforced. It's accepted that by Year 6 of primary school everyone will have an Instagram account, and usually TikTok too. Parents accept it and facilitate their child signing up, because they know this is how children organise their social lives. My daughter missed a sleepover when she was 11 because it was arranged via group DM on Instagram and she didn't know about it - so I let her have an account. Until the platforms get better at limiting underage accounts, this will be the norm.

Finally, I don't think anonymity has any place on social media; it is what allows trolling. The only place I'm really trolled is Twitter, and if it became serious I'd have no recourse, because even Twitter doesn't know who is behind the avatar. I think this is dangerous and needs to stop. Anonymity is useful for people who want to open up about mental health issues and similar, and the platforms provide a wonderful opportunity to many in this respect. But when anonymity is abused there needs to be a way to identify the troll (or worse), even if only by the platform owners.

Laws applying to influencers

Influencers still don't adhere to the ASA guidelines. Ads get much lower engagement, so it's very tempting to bury the disclosure lower down in the text, but the rules are fairly clear; they're simply not being enforced. It would also help to give influencers clearer guidance in law, as they usually do want to comply and get it right. At the moment everything is a grey area: I'm frequently involved in conversations with other influencers who are trying to work out how to manage their disclosures and make sure they're doing the right thing. A clear set of rules, and the knowledge that they will be enforced, would help.