Written evidence submitted by International Justice Mission (OSB0025)

 

International Justice Mission Submission

 

International Justice Mission (IJM) welcomes this call for evidence from the Joint Committee on the Draft Online Safety Bill.

 

IJM supports the Online Safety Bill, which is set to be groundbreaking in holding regulated services accountable for online sexual exploitation of children occurring on their platforms.  The Bill has great potential to make a global impact, with the repercussions for the prevention, disruption and detection of online sexual exploitation of children likely to be felt worldwide.

 

This submission outlines a number of areas in which the Bill could be strengthened in order to further improve protection for vulnerable children.

 

About IJM

 

International Justice Mission is a global organisation that protects people in poverty from violence. IJM partners with local authorities in 24 programme offices in 14 countries to combat slavery, violence against women and children, and police abuse of power against people in poverty. IJM works to help people out of situations of violence and into support, hold perpetrators accountable, and help strengthen public justice systems.

 

IJM has partnered with authorities around the world to help more than 66,000 people out of situations of slavery and violence and has seen slavery decrease by up to 86% in places where we’ve worked.

 

Tackling Online Sexual Exploitation of Children

 

Since 2011, IJM has partnered with the Philippine Government, international law enforcement, and NGOs to combat the trafficking of children by adults to create new child sexual exploitation materials, especially via livestream video, for paying sex offenders abroad. As of July 2021, IJM has supported dedicated law enforcement partners in the Philippines in 248 operations, leading to the rescue of 828 victims or at-risk individuals, the arrest of 293 suspected traffickers, and the conviction of 122 perpetrators, with prosecutions ongoing.

 

According to an IJM-led study released in May 2020, 64% of Philippine law enforcement operations from 2011-2017 were initiated by a foreign law enforcement referral, reflecting the global nature of the crime and the importance of effective law enforcement collaboration and referral sharing between demand and source countries.

 

In 2020, IJM expanded our programming by launching IJM’s Center to End Online Sexual Exploitation of Children. The Center partners with governments, industries, NGOs, and other stakeholders to expose, neutralise, and deter online sexual exploitation of children around the world. Leveraging practices proven effective in IJM’s ongoing programme against this crime in the Philippines, the Center helps:

 

  1. Improve technology and financial sector detection and reporting of livestreaming child sexual exploitation,
  2. Strengthen international collaboration in law enforcement and prosecution, and
  3. Support effective justice system (law enforcement, prosecution, and aftercare) responses in source and demand-side countries, resulting in sustainable protection for children and accountability for perpetrators.

 

Key recommendations

 

Whilst the Bill represents an important step forward, there are certain areas that could be strengthened in order to more effectively counter the production and dissemination of child sexual exploitation and abuse (CSEA) material, most notably the limitation of enforcement powers to tackle CSEA online.

 

Safety duties about illegal content: IJM welcomes this provision, which places a duty on regulated services to actively mitigate and minimise illegal content, including CSEA, on their services. By requiring services to participate in the mitigation and minimisation of illegal content, the Bill actively incentivises regulated services to prevent and detect CSEA.

 

Codes of practice about duties: IJM welcomes this provision for Ofcom to provide regulated services with guidance on the systems and processes needed for them to be compliant with their duties of care. We are particularly interested in the code of practice describing the recommended steps regulated services should take to comply with safety duties about illegal content so far as relating to CSEA content.

 

Enforcement powers: IJM welcomes the enforcement powers given to the independent regulator Ofcom. Considering that self-regulation has not proven effective in countering the proliferation of CSEA, imposing a penalty on persons whose regulated services do not meet their duties of care incentivises proactive steps to combat CSEA online. Penalties are necessary to ensure that regulated services are truly held accountable.

 

Use of technology notices: IJM welcomes the provision in the Bill giving Ofcom the power to provide regulated services with use of technology warning notices which require services to use specified technology if they are not meeting their safety duties about illegal content. We view this as a pragmatic provision which acknowledges the need for technological intervention when CSEA is persistent and prevalent on a platform in order to promote detection. 

 

Prevalent and persistent CSEA content: As previously mentioned, IJM welcomes the prioritisation of countering CSEA content online. Platforms on which CSEA content is prevalent and persistent pose a serious risk to the safety of children the world over; detecting, reporting and removing this content is critical to slowing and stopping the cycle of online CSEA.

In the words of one survivor: "One image is too many. The tools used to scan for child sexual abuse images give us survivors such profound hope that one day we may be out of the spotlight. Not only is the re-sharing of the content harmful to the victims in it, it is also used to groom and normalize the abuse of the next generation of victims. We don't want any more children to have to deal with what we deal with when it can be fixed."

 

IJM recommends that regulated services be required to provide:

i) information on the tools, rules, and systems they have put in place to proactively combat the spread of CSEA content; and

ii) data on the results (i.e. number and rates of illegal images blocked at the upload stage, number and rates of abusive livestreams terminated, and number and rates of first-generation images and videos detected and removed).

 

Regulated service: IJM welcomes the draft Bill’s regulation of user-to-user and search services. This can have a groundbreaking impact in disrupting the prevalence of CSEA content.

 

The draft Online Safety Bill makes important provisions for countering CSEA online by putting in place accountability mechanisms for regulated services. However, without the strengthening of the aforementioned elements of the Bill, a wide margin of error remains, leaving children at risk of abuse and exploitation.

 

Objectives of the draft Online Safety Bill

 

The draft Online Safety Bill is an important step towards making the UK the safest place to be online. IJM welcomes the step change in tackling online sexual exploitation of children that this legislation seeks to drive.

 

In particular, IJM is encouraged by the safety duties about illegal content, especially CSEA. By prioritising tackling CSEA and making provisions for children who are more vulnerable to being sexually exploited online, this draft Bill is forging the way in strengthening child protection online.

 

Duty of care approach

 

IJM agrees with the duty of care approach taken by the draft Bill. It is crucial that regulated services have a social responsibility to protect online users and those at risk of being exploited on their platforms. Without the responsibilities that duties of care place on these services, they would not be held accountable, feeding a culture of impunity in online spaces.

 

Further, the duty of care approach creates flexibility and space for response and innovation in the tech sector as new trends and threats arise. This enables a collaborative approach with the tech industry as regulated services are given the opportunity to take proactive and pre-emptive steps to tackle online sexual exploitation of children on their platforms. Industry can thereby ‘futureproof’ against regulatory risk as it is assessed on the positive actions taken to reduce risk.

 

By allowing this flexibility, the duty of care approach is able to balance holding regulated services accountable with the recognition that technological advances lead to sudden and unknown changes, including in how online sexual exploitation of children is perpetrated online.

 

Systems and processes

 

The draft Bill’s proposed requirement of regular risk assessments for design decisions takes the initial steps towards a safety-by-design approach. Within safety duties, the duty to use proportionate systems and processes to minimise CSEA content is also welcome.

 

However, systems and processes must go further than minimisation and risk reduction. This Bill could take a safety-by-design approach by encouraging services to place child protection at the heart of systems and processes design, incentivising proactive detection of CSEA.

 

IJM advocates for holistic solutions to tackling online sexual exploitation of children through prevention, detection and disruption. Addressing the client-side of this exploitation, which drives the demand for abuse, is critical to this. In our casework, we see first-hand the driving role that offenders take in financially incentivising the proliferation of CSEA content online. The Online Safety Bill should recognise the role of individual online users in creating and distributing CSEA content, as well as work in tandem with other legislation that captures the criminal offences they perpetrate.

 

 

Services in Scope

 

In order to tackle online sexual exploitation of children systematically, it is crucial that government and industry go upstream to disrupt this exploitation at its source. The draft Online Safety Bill applies to user-to-user and search services, making an important step towards disrupting the prevalence of CSEA content. However, without also addressing the role of manufacturers, operating system developers and app developers in creating on-device solutions, the UK will not be able to achieve its aim of becoming the safest place to be online.

 

On-device solutions are varied, but they entail the installation of different device-side artificial intelligence (AI) tools, such as computer vision and machine learning applications, to detect and block CSEA content within the camera frame. These on-device solutions use AI or other software built into the operating systems or apps to detect new CSEA content before it is uploaded and to detect and block the recording or streaming of new CSEA content at the camera level, thereby preventing CSEA content from being created in the first instance. 

 

This can be done by operating system developers (Apple, Google), at the app level, or on the hardware. There is also the potential for on-device solutions to be moved onto apps, such as Facebook Messenger and Skype. This is especially crucial as the production and distribution of first-generation CSEA content and livestreamed abuse is of greatest concern on private channels.

 

There is a thriving and growing online safety tech industry that is developing on-device solutions. The French social network Yubo proactively screens live video, implementing automated prompts to users to change behaviour. This demonstrates the technological feasibility of such solutions. SafeToNet has created on-device software that detects CSEA content in real-time, identifying high-risk images using a machine-learning algorithm. It can also be implemented by social media companies to prevent graphic content from being uploaded and distributed.

 

In terms of privacy, on-device solutions could become the most effective method of preventing the production of CSEA content in the first instance by blocking streaming or image/video capture, while maintaining high levels of privacy for users, as they operate on-device and before content enters private channels. Such automated technologies protect the privacy of users as they are trained to scan only for CSEA content, reducing the need for human eyes to review private content.

 

In September 2021, Apple announced it would delay the implementation of one such on-device solution, which would have scanned images on US users' devices for known child sexual abuse material. The decision to delay the introduction of this software followed concerns raised regarding the protection of privacy.

 

There is a balance to be struck between protecting privacy and developing the tools necessary to tackle online sexual exploitation of children. Both are integral to online safety. Absolute privacy would entail criminal impunity, whilst absolute child protection would entail no private communications. Pre-existing detection tools for CSEA, such as PhotoDNA, do not compromise user privacy, as they identify only known exploitative material. If CSEA material is detected, that person forfeits the expectation of privacy, and their personal online content should be subject to screening.
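To illustrate why matching against known material need not compromise privacy, the sketch below shows the general shape of hash-based detection. This is not PhotoDNA's actual algorithm (which is proprietary); a toy "average hash" stands in for a real perceptual hash, and the pixel data is hypothetical. Only hashes of known illegal images are compared; no new private content is reviewed.

```python
# Conceptual sketch of hash-based matching of known material.
# A real system (e.g. PhotoDNA) uses a robust proprietary perceptual
# hash; this toy "average hash" merely illustrates the principle.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Count differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def is_known(pixels, known_hashes, threshold=2):
    """Compare an image's hash against hashes of known illegal images.
    Only the hash is inspected; the image content itself stays private."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Hypothetical flattened greyscale images: a "known" image,
# a slightly altered near-duplicate, and an unrelated image.
known_image = [10, 200, 30, 180, 90, 240, 15, 170, 60]
near_copy   = [12, 198, 33, 182, 88, 238, 17, 168, 62]
unrelated   = [200, 10, 180, 30, 240, 90, 170, 15, 60]

known_hashes = [average_hash(known_image)]
print(is_known(near_copy, known_hashes))   # -> True (near-duplicate matched)
print(is_known(unrelated, known_hashes))   # -> False (unrelated, not flagged)
```

The threshold allows minor alterations (recompression, small edits) to still match, which is why such tools remain effective against re-shared copies while leaving all other content unexamined.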

 

The production of new CSEA materials, and particularly the livestreaming of abuse, poses significant detection challenges. If we are to ensure early detection of CSEA without compromising the privacy rights of users, then the development of new artificial intelligence image classifiers to detect CSEA is crucial. New AI tools would ensure that no human would have to review private content unless the content detected is likely to be CSEA. This protects the privacy of users while also ensuring that CSEA is detected and disrupted.

 

It is crucial that the central role of manufacturers, operating system developers and app developers in preventing, disrupting and detecting online sexual exploitation is considered. By doing so, the UK would be taking a whole-systems approach to tackling this violent crime. Further, on-device solutions would enable tech companies to avoid regulatory risk by helping prevent CSEA content from becoming prevalent and persistent on their platforms in the first place.

 

User agency

 

The role of users in promoting online safety must be acknowledged. Online sexual exploitation of children is perpetrated by individuals who abuse online spaces to access vulnerable children. However, there are limitations to the role of user agency. Without regulation holding users and services to account, there is unlikely to be system change in online spaces. The Online Safety Bill’s focus on regulated services ensures a holistic systems-based approach to tackling online sexual exploitation of children.

 

In our casework, IJM has seen that the exploitation and abuse of children is hidden from most online users, even when it happens across mainstream social media platforms. It is a crime perpetrated by individuals who actively try to avoid detection. Most CSEA production, especially livestreaming, happens on private channels, meaning that only users actively engaged in CSEA production as offenders are aware of its prevalence in online spaces. These users will obviously not report their own crimes.

 

Further, victims of online sexual exploitation are often under the age of 12. The Canadian Centre for Child Protection identified 46,859 images of unique children, with 78.3% under the age of 12. IJM's latest casework data shows that 195 victims rescued in IJM-supported operations in the Philippines since 2011 were below 6 years old, making up nearly 30% of all victims rescued. This implies that most victims are not users of the platforms through which they are exploited.

 

Whilst educating and equipping children to be safe online, thereby strengthening user agency, may mitigate the risk of some forms of online harm, this cannot be said of many children suffering online sexual exploitation. Not only are children suffering livestreamed abuse very young, but they are also being exploited by an adult often known to them. Therefore, while there may be other areas in which user agency is important in reducing vulnerability, children have limited agency and are easily coerced, manipulated and abused, making protective action necessary.

 

The role of Ofcom

 

IJM welcomes the designation of Ofcom as the independent regulator for the Online Safety Bill and the provisions and powers given to it. These powers allow Ofcom to hold regulated services to account on the minimisation and risk reduction of CSEA content.

 

However, as previously mentioned, the threshold of “prevalent and persistent” CSEA narrows Ofcom’s power to take appropriate enforcement action. It would be more effective to provide Ofcom with different avenues to trigger enforcement action against the presence of CSEA on a service. IJM recommends that “prevalent and persistent” is defined to enable Ofcom to issue penalties if necessary and that other enforcement routes are provided so that Ofcom is able to address exploitation and abuse cases in online spaces, no matter how varied they may be.

 

Further, considering the unique nature of the Bill, it is unclear whether Ofcom currently has the capabilities and capacity needed to enforce it. It is crucial that Ofcom is provided with the resources necessary to implement the Bill, ensuring that it can have the greatest possible impact and make the UK the safest place in the world to be online. The periodic review of the Online Safety Bill should consider not only the powers of Ofcom but also whether the resources available to Ofcom are sufficient to fulfil its role as independent regulator.

 

Collaboration with police services is not only essential for tackling online crimes, but would also alleviate capacity and capability constraints. Policing is under particular strain in countries struggling with under-resourcing and restricted access to skills training. When tackling CSEA, it is crucial that international collaboration between law enforcement, independent regulators and governments works to upskill and build the capacity of law enforcement agencies tasked with countering CSEA online. Intelligence and information sharing, and collaboration between international law enforcement agencies, are required to effectively tackle online sexual exploitation of children and hold perpetrators to account.

 

IJM has seen first-hand the benefits of the UK's international collaboration through the Philippine Internet Crimes Against Children Centre (PICACC), a collaboration between Philippine and Australian law enforcement, the UK's NCA and IJM. Since its inception in February 2019, PICACC has conducted 118 operations, leading to 84 arrests and the rescue of 373 victims of online sexual exploitation. PICACC's work continued despite the challenge of COVID-19 lockdowns, with 46 operations conducted during the height of the pandemic.

 

September 2021