WEC launches new short inquiry on tackling non-consensual intimate image abuse: Is current legislation protecting victims?
The Women and Equalities Committee (WEC) will launch a new short inquiry on tackling non-consensual intimate image (NCII) abuse with an evidence session in Parliament on Wednesday, 6 November.
Meeting details
MPs on the cross-party Committee, chaired by Labour MP Sarah Owen, will hear from Revenge Porn Helpline campaigners and tech platforms Google and Microsoft during the first session on NCII. Non-consensual intimate image abuse occurs when intimate content such as photos or videos is produced, published, or reproduced without consent.
WEC will examine the impact of non-consensual intimate image abuse on victims, and what steps Google and Microsoft are taking to prevent and tackle NCII. It will also consider the extent to which the Online Safety Act will be effective in mandating the removal of NCII and assess how legislation could be improved.
During the 6 November session, MPs will hear from the Chief Executive of the charity SWGfL (South West Grid for Learning), which runs the Revenge Porn Helpline, about its technologies to help combat non-consensual intimate image abuse and the extent of their adoption by online platforms. The session will also address deepfakes, synthetic NCII and sextortion.
In May this year, WEC held a one-off evidence session with campaigner, broadcaster and TV personality Georgia Harrison, who has appeared on The Only Way Is Essex and Love Island, during which she discussed her own lived experience as a victim of NCII. The Chief Executive of OnlyFans also gave evidence on the platform's approach to dealing with non-consensual content and the impact of the Online Safety Act.
The predecessor committee wrote to Cabinet ministers and Ofcom after the May dissolution announcement stating: “NCII content needs to be treated with the same severity as child abuse material, we need legal provisions to allow for the rapid blocking of known NCII content, akin to how child exploitation content is handled, facilitating more immediate and effective responses.”
The Revenge Porn Helpline, alongside Meta, has developed the StopNCII.org tool, which uses preventative ‘hashing’ technology. Hashing enables victims at risk of non-consensual intimate image abuse to generate a digital fingerprint of their images or videos, which participating platforms then use to identify the images if they are ever posted. Because only the hash is shared, victims retain control of the content, and it never has to leave their device.
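The hash-matching idea can be sketched in a few lines of Python. This is an illustrative simplification, not the StopNCII.org implementation: it uses a cryptographic SHA-256 digest, which only matches exact byte-for-byte copies, whereas production image-matching systems typically use perceptual hashes that also survive resizing and re-encoding. The function and variable names here are invented for the example.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's digital fingerprint.

    Simplified stand-in: a cryptographic hash matches only identical
    copies; real systems generally use perceptual hashing instead.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical shared block list: only hashes are distributed to
# participating platforms -- the images themselves never leave
# the victim's device.
known_ncii_hashes: set[str] = set()

def register_image(image_bytes: bytes) -> str:
    """Victim-side step: hash the image locally and submit the hash."""
    h = fingerprint(image_bytes)
    known_ncii_hashes.add(h)
    return h

def is_blocked(uploaded_bytes: bytes) -> bool:
    """Platform-side step: check an upload against the known hashes."""
    return fingerprint(uploaded_bytes) in known_ncii_hashes
```

The key design point the tool relies on is visible even in this sketch: the set holds opaque digests, so a platform can detect a match without ever receiving or storing the intimate content itself.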