Written evidence from Google [IIA0011]
Dear Chair,
Thank you for the opportunity to share additional information with you as the Committee continues its examination of strategies to combat non-consensual intimate imagery (NCII) both online and offline. We recognize the profound hardship and distress individuals experience when explicit or intimate imagery of them is shared without their permission and is discoverable online. We have addressed each of your questions below.
For specific details about how Search handles this type of content, I refer the Committee to the written evidence we provided on 29 November 2024. That letter remains an accurate account of our policies to combat NCII content. We think it is important, though, to underscore that Google sets clear policies about the content we will remove from Search. These policies aim to create a safe and positive experience. Removing webpage or content URLs from our search results under these policies is an action called “delisting”; we delist URLs when the associated content violates our policies.
Regardless of whether UK law classifies the content as legal or illegal, when NCII content, as defined here, is reported to us, we delist the result from Search, in accordance with our policies.
We have heard clearly from victims of NCII how traumatic it can be to have to repeatedly report the same image when bad actors seek to distribute it across multiple webpages or online services. This is why we apply several layers of technical measures to prevent reported and visually similar NCII from appearing elsewhere in search results. We also have proactive systems that aim to reduce the prevalence of NCII results generally in Search.
When image URLs reported through our NCII reporting tool are found to be violative and are delisted from Search under our NCII policy, we have systems in place to detect and remove duplicates of that imagery from Search, reducing the need for victim-survivors to request removals one by one.
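By way of illustration only: this letter does not describe the underlying mechanism, but one common approach to detecting identical or near-identical copies of an image is perceptual hashing, where visually similar images produce similar fingerprints. The sketch below (in Python, using the Pillow imaging library) is a minimal, hypothetical example of that general technique; the function names, hash method, and threshold are assumptions, not a description of Google's systems.

    # Minimal sketch of duplicate detection via a simple average hash ("aHash").
    # Hypothetical example only; not a description of Google's actual systems.
    from PIL import Image

    def average_hash(image_path, hash_size=8):
        # Downscale to hash_size x hash_size grayscale, then threshold each pixel on the mean.
        img = Image.open(image_path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = "".join("1" if p > mean else "0" for p in pixels)
        return int(bits, 2)

    def hamming_distance(hash_a, hash_b):
        # Number of differing bits between two hashes.
        return bin(hash_a ^ hash_b).count("1")

    def is_likely_duplicate(candidate_hash, delisted_hashes, max_distance=5):
        # Flag the candidate if it is within a small Hamming distance of any hash
        # of previously delisted imagery (threshold chosen purely for illustration).
        return any(hamming_distance(candidate_hash, h) <= max_distance
                   for h in delisted_hashes)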
We provide an option for individuals to request that Search filter explicit results for queries similar to the one included in the NCII removal request. For example, if a user’s removal request is related to the query (e.g., “[name of person] leaked nudes”) and that request is approved, then we aim to filter explicit results for similar name-related queries going forward. This mitigates the need for users to continually submit removal requests. Given the dynamic nature of the web, however, automated systems are not able to catch every explicit result that may appear.
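Again for illustration only: the paragraph above describes the behaviour (filtering explicit results for name-related queries similar to one in an approved removal request), not how it is implemented. A minimal sketch of that idea, with hypothetical function names and a deliberately simplified notion of a “similar query,” might look like this:

    # Hypothetical sketch of query-level filtering after an approved NCII removal.
    def register_approved_request(protected_names, person_name):
        # Remember the name tied to an approved removal so that similar
        # name-related queries are filtered going forward.
        protected_names.add(person_name.lower())

    def filter_results(query, results, protected_names, is_explicit):
        # Drop results classified as explicit when the query refers to a protected name.
        if any(name in query.lower() for name in protected_names):
            return [r for r in results if not is_explicit(r)]
        return results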
In addition to these protections for individual victim-survivors and their content, we use confirmed NCII violations to improve our Search ranking systems more broadly. In December 2023, we further clarified in our spam policies and ranking systems guide how Google handles sites with a high proportion of non-consensual intimate imagery. If we process a high volume of such removals involving a particular site, we use that as a ranking signal and demote other content from that site in our Search results. This feedback loop helps further reduce the prevalence of unreported NCII in search results.
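The letter states the principle (a high volume of confirmed NCII removals for a site becomes a demotion signal) rather than its implementation. Purely as a hypothetical sketch, with an assumed threshold and demotion factor, such a signal could be modelled as a multiplier applied to a site’s ranking score:

    # Hypothetical sketch of a site-level demotion signal; the threshold and
    # demotion factor are assumptions made for illustration.
    def demotion_factor(confirmed_ncii_removals, threshold=100):
        # Sites exceeding the (assumed) removal threshold have their other content demoted.
        return 0.5 if confirmed_ncii_removals > threshold else 1.0

    def adjusted_score(base_score, site, removal_counts):
        # Apply the demotion factor to the site's baseline ranking score.
        return base_score * demotion_factor(removal_counts.get(site, 0))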
Google is committed to fighting online child sexual abuse and exploitation and to preventing our services from being used to spread child sexual abuse material (CSAM). We identify and report CSAM using trained specialist teams and cutting-edge technology, including machine learning classifiers and hash matching, which creates a “hash,” or unique digital fingerprint, for an image or a video so that it can be compared against hashes of known CSAM. When we find CSAM, we report it to the National Center for Missing and Exploited Children (NCMEC), which liaises with law enforcement agencies around the world.
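To illustrate the general idea of hash matching described above (not the specific technology Google uses, which also handles altered copies of imagery): a fingerprint is computed for a file and compared against a set of hashes of known material. A minimal sketch using an exact cryptographic hash:

    # Minimal sketch of exact hash matching; real systems also use robust
    # perceptual matching to catch re-encoded or altered copies.
    import hashlib

    def fingerprint(file_bytes):
        # Compute a digital fingerprint (here, a SHA-256 digest) for the file contents.
        return hashlib.sha256(file_bytes).hexdigest()

    def matches_known_hashes(file_bytes, known_hashes):
        # True if the file's fingerprint appears in the set of known hashes.
        return fingerprint(file_bytes) in known_hashes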
While we have learned important lessons from our work to combat CSAM, not all of them can be applied in the NCII context, regardless of whether UK law classifies NCII content as legal or illegal. Because of the nature and limitations of machine learning classifiers, a proactive monitoring obligation (as many jurisdictions have instituted for CSAM) is simply not technically feasible for NCII. For CSAM, classifiers can detect (a) whether content includes explicit imagery or nudity; and (b) whether the subject of the imagery is possibly underage. NCII, however, is far harder to combat through automated identification of unreported content by machine learning classifiers. No classifier can process imagery and determine, from the image itself, whether consent was given, withheld, or revoked for the creation or distribution of that specific content. In other words, while a classifier can determine from an image whether it is sexually explicit, it cannot determine whether that image was non-consensual; to a classifier, a consensual explicit image is indistinguishable from a non-consensual one. Machine learning classifiers therefore cannot play the same role in detecting unreported NCII that they play with respect to CSAM.
Despite these challenges, and long before the passage of the UK Online Safety Act (OSA), Google maintained and continues to maintain strong policies against both CSAM and NCII. We enforce these policies through a variety of technical and procedural measures, tailored to the challenges intrinsic to each type of violative content, and in compliance with existing laws and regulations. We also partner with non-governmental organizations and industry on programs to share our technical expertise and to develop and share tools that help organizations fight CSAM and NCII.
Thank you, again, for the opportunity to appear before the Committee and share information about the work our company does to combat NCII.
December 2024