Written evidence submitted by Sarah Champion MP (OSB0208)
I welcome the broad aims of this Bill to tackle online harms and abuse, but as currently drafted it does not go far enough in addressing the risks to child safety.
Firstly, the scope of the child safety duty must be as broad as possible. The regulations need to cover any service or website which could be used by a child, including those which are not designed for children. Girls under the age of 18 are using sites like OnlyFans, which distribute indecent material, and we need to be certain the Bill will regulate these platforms and tackle these issues.
Age assurance must be enacted and functioning in practice; we cannot go backwards. The draft Bill repeals Part 3 of the Digital Economy Act 2017, which laid out proposals for age verification that have never been brought into force. In 2019, a study carried out by the BBFC found that the majority of young people's first exposure to pornography was accidental, with over 60% of children aged 11-13 who had seen pornography saying their viewing was unintentional. 83% of parents agreed that age-verification controls should be in place for online pornography. Viewing violent pornography at such a young age can have damaging effects on children, and the Bill should do all it can to minimise occurrences of children viewing pornography.
Furthermore, elements of this Bill would require age verification or age assurance in order to work realistically, so the Government must set out genuine, achievable and tangible plans to deliver this.
It would be beneficial if the scope of the Bill included all sites which host adult content. Currently, only sites hosting user-generated pornography would fall within the Bill's proposals, and this would not be effective in tackling children's viewing of pornography.
The safety duties relating to illegal content are welcome. I strongly believe that the Bill must specifically incentivise platforms to remove child sexual exploitation or abuse (CSEA) content, particularly first-generation material, and incentivising this in legislation will help ensure that its removal is prioritised.
Online cases of arranging or facilitating child sexual offences have been receiving lenient sentences where convictions have taken place. The Bill must ensure that arranging child sexual offences online is taken as seriously in law as it is offline. The Bill currently fails to adequately tackle content that directly facilitates child abuse but does not meet the criminal threshold to be considered child abuse material. The safety duty on illegal content should treat material that directly facilitates abuse with the same severity as illegal content.
I am glad that private messaging is included in this legislation. In the UK, Instagram now accounts for one-third of all grooming offences where the platform used is known, and many of these cases take place over direct messages. However, the NSPCC and other organisations are concerned that the threshold under Clause 63, the "technology warning notice", may be set too high for OFCOM to use the powers effectively. OFCOM would have to prove that persistent and prevalent abuse is taking place before the powers could be used, but this may be difficult to demonstrate. Additionally, all abuse is intolerable and must be taken seriously. Content can be just as harmful and abusive without being prevalent and persistent, and the Bill must reflect this. I believe OFCOM should be able to take action before abuse becomes persistent on a platform.
The Internet Watch Foundation has strong expertise in detecting and removing CSEA content online, and I believe the Bill should encourage OFCOM to work with it and other organisations with this kind of experience to make the regulation of online abuse as effective as possible.
It may also be beneficial for the Bill to help ensure that no child found to be posting sexually explicit or indecent material of themselves is criminalised. This is an approach the Home Office is already aiming to take in tackling self-generated indecent images of children online, and the Bill must be consistent with it to ensure that no child is criminalised, especially when they are likely to have been exploited.
26 October 2021