Supplementary written evidence from Gail Kent, Global Director, Government Affairs and Public Policy, Search, News and Gemini, Google [IIA006]
By email
29 November 2024
Dear Chair,
Thank you for the opportunity to follow up on our oral evidence earlier this month. We recognize the profound distress victims experience when explicit or intimate imagery of them is shared without their permission and is discoverable online. In this letter, we’ll set out the ways that Google Search plays an active role in preventing the spread of nonconsensual intimate imagery (NCII), but we know solutions to this problem will not be reached by industry alone. Getting this right requires collaboration among companies like ours, victims’ rights groups, and the government, and we welcome the Committee’s work in bringing these elements together.
Efforts to Combat NCII, and Other Types of Image-Based Sexual Abuse on Search
Google’s mission is to organize the world’s information and make it universally accessible and useful. Key to this is striking a balance between open access to information and user safety. We strike this balance through:
● our legal content removal request process
● our ranking systems, which aim to surface high quality information at the top of Search
● our content policies
● empowering users with tools like SafeSearch to filter content.
Google Search uses automated systems to index content from the web. These systems generate the search results and responses to the billions of search requests we process daily. They are specifically designed to prioritize what appears to be the most useful and helpful content on a given topic, and not to surface content that violates our policies. Automation is therefore generally Google’s first line of defense against harmful content, and it allows us to take a preventative approach. In limited and well-defined situations that warrant it, we may also take manual action in accordance with our policies, with humans reviewing cases and blocking content.
We set clear policies about the content and activity that is acceptable on Search. These policies aim to create a safe and positive experience. Our company-wide Terms of Service also include a provision requiring individuals that use our products and services to respect others, including their privacy rights.
Applying our content policies and then removing website URLs from our search results is an action called “delisting”. We delist URLs when the risk of privacy or contact harm to the individual is so great that it outweighs other important considerations like access to information. This applies to categories like CSAM, highly personal information, sites with exploitative removal practices, and non-consensual sexual content.
Since 2015, Search has had a specific NCII policy enabling individuals to request removal of their explicit or intimate imagery if it appears in search results. Last year, we shared an update regarding an expansion of this policy, which now covers any personal, explicit images the subject no longer wishes to be visible in Search. For example, if someone created and uploaded explicit content to a website, then deleted the original content, they can request its removal from Search if it’s being published elsewhere without approval. Separately, in 2018 — years before the recent rise in generative imagery technology — Search introduced a policy enabling people to request the removal of sexually explicit fake imagery from search results.
We know that reporting this content can be one of the most difficult things to navigate for victims who are in trauma or crisis, so we have given significant thought to how to make the process as simple as possible. We have included screenshots of the removals process in the Annex, below. We have sought to address the concerns highlighted to us by victims in the following ways:
● Collaboration: We have dedicated engagement channels with frontline advocacy groups who help individuals report this content. This includes a quarterly training programme with NGOs; we’d welcome UK-based advocacy groups with an interest in our work to attend a session.
● Research: We have invested in specialized user experience research to understand the needs of individuals affected by NCII. This research has led us to expand our policies and adopt a broader understanding of consent, including for commercialized content.
● Simplified tools: We’ve taken a trauma-informed approach to our reporting forms and systems. In collaboration with victims’ organizations, we updated and simplified the forms individuals use to submit removal requests for their personal sexual content on Search, as well as for websites containing their personal information or other material that may be removed under our Search product policies. These reporting tools allow victims, or their authorized representatives, to report content for review under these policies. The form also allows people to submit multiple URLs, so that any content a person identifies in search results can be reported through a single form.
● Ease of access: We also allow victims or authorized representatives to report NCII and other issues directly within the Image Search results page. To use this, individuals can select an image result, click the three-dot menu in the image viewer, and select “Report this result.” The goal is to make it easier for victims or authorized representatives to request removal of NCII they may find on Search.
● Delisting: When content violating our policies is reported, we delist the result from Search.
We know how traumatic it can be for victims of NCII to repeatedly report the same image if bad actors seek to distribute it across multiple webpages or online services. We apply several layers of technical measures to prevent reported and visually similar NCII from appearing elsewhere in
search results and we have proactive systems that aim to reduce the prevalence of NCII results generally in Search.
When an image is removed from Search under our NCII policy, we have systems in place to detect and remove duplicates of that image from Search, reducing the need for victim-survivors to request removals one by one. When image URLs reported via our NCII reporting tool are found to be violative and are subsequently delisted, those same systems detect and remove duplicates of that imagery from Search.
Using our own internal hashing technology, our systems detect and remove duplicates for the vast majority of NCII imagery reported and removed from Search.
● Over 90% of images removed under our personal sexual content policy are found to have duplicates, which are automatically and swiftly removed from Search results to the best of our ability, using the current state-of-the-art hashing technology.
● On average, we proactively identify and remove 30 duplicates per violative NCII image.
While Google makes best efforts to detect manipulated (but visually similar) “near-duplicates”, images can be modified to evade detection by current hash-matching technology. As a result, these de-duplication protections may not catch every manipulated near-duplicate; a simplified sketch of the underlying hash-matching principle is set out below.
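By way of illustration only (Google has not published its internal hashing technology), the following Python sketch shows how perceptual hashing of this kind can work in principle: a reported image is reduced to a compact hash, and candidate images whose hashes differ by only a few bits are treated as duplicates. The hash function, threshold, and helper names are assumptions made for the example, not a description of our systems.

    # Illustrative sketch only: a generic perceptual "average hash", not Google's internal technology.
    # A reported NCII image is hashed once; visually identical copies found elsewhere hash to
    # (nearly) the same value and can be removed without a fresh report for each copy.
    from PIL import Image

    HASH_SIZE = 8           # 8x8 grid -> 64-bit hash
    MATCH_THRESHOLD = 5     # assumed: max differing bits to treat two images as duplicates

    def average_hash(path: str) -> int:
        """Downscale to 8x8 grayscale and set one bit per pixel brighter than the mean."""
        img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        """Number of bits in which two hashes differ."""
        return bin(a ^ b).count("1")

    def is_duplicate(reported_image: str, candidate_image: str) -> bool:
        """True if the candidate is visually near-identical to the reported image."""
        return hamming_distance(average_hash(reported_image),
                                average_hash(candidate_image)) <= MATCH_THRESHOLD

As the caveat above notes, deliberate modifications to an image can push its hash outside any fixed matching threshold, which is why hash-based de-duplication cannot catch every manipulated near-duplicate.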
We provide an option for individuals to request that Search filter explicit results for queries similar to the one included in the NCII removal request. For example, if a removal request relates to a query such as “[name of person] leaked nudes” and that request is approved, we aim to filter explicit results for similar name-related queries going forward. This reduces the need for users to continually submit removal requests. Given the dynamic nature of the web, however, automated systems cannot catch every explicit result that may appear.
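As a simplified illustration of the principle (not our implementation), the Python sketch below treats a name tied to an approved removal request as “protected”, so that later queries containing that name have explicit results filtered by default. The data structures, function names, and the upstream explicit-content classifier are assumptions made for the example.

    # Illustrative sketch only: filtering explicit results for queries similar to one named in an
    # approved NCII removal request. Not Google's implementation.
    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        is_explicit: bool   # assumed to come from an upstream explicit-content classifier

    protected_names: set[str] = set()   # populated when name-related removal requests are approved

    def on_removal_request_approved(person_name: str) -> None:
        """Record the person's name so similar queries are filtered going forward."""
        protected_names.add(person_name.lower())

    def filter_results(query: str, results: list[Result]) -> list[Result]:
        """Drop explicit results when the query mentions a name tied to an approved removal."""
        q = query.lower()
        if any(name in q for name in protected_names):
            return [r for r in results if not r.is_explicit]
        return results

For example, once a request tied to “[name of person] leaked nudes” is approved, a later query such as “[name of person] photos” would have explicit results filtered without a further report being needed.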
In addition to these protections applied to individual victim-survivors and their content, we use confirmed violations of our NCII policy to improve our Search ranking systems more broadly. In December 2023, we further clarified how Google handles sites with a high proportion of non-consensual intimate imagery in our spam policies and ranking systems guide. If we process a high volume of such removals involving a particular site, we use that as a signal and demote other content from that site in our Search results. This feedback loop helps further reduce the prevalence of unreported NCII in search results in the future.
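As a simplified illustration of this kind of feedback loop (not our ranking code), the sketch below keeps a per-site count of confirmed NCII removals and applies a site-wide demotion once a threshold is crossed. The threshold and demotion factor are invented for the example.

    # Illustrative sketch only: demoting other results from sites with a high volume of
    # confirmed NCII removals. Not Google's ranking code; the numbers are assumptions.
    from collections import Counter
    from urllib.parse import urlparse

    confirmed_removals = Counter()   # host -> number of confirmed NCII removals
    REMOVAL_THRESHOLD = 100          # assumed volume at which a site-wide demotion applies
    DEMOTION_FACTOR = 0.1            # assumed multiplier applied to the site's ranking scores

    def record_confirmed_removal(url: str) -> None:
        """Count a confirmed NCII removal against the hosting site."""
        confirmed_removals[urlparse(url).netloc] += 1

    def adjusted_score(url: str, base_score: float) -> float:
        """Demote results from sites that have accumulated many confirmed NCII removals."""
        host = urlparse(url).netloc
        if confirmed_removals[host] >= REMOVAL_THRESHOLD:
            return base_score * DEMOTION_FACTOR
        return base_score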
We have also made progress in addressing synthetic imagery. In 2024, we introduced updates to our ranking systems to help keep this type of content from appearing high up in Search results. For queries that specifically seek synthetic content and include people’s names, we aim to surface high-quality, non-explicit content, such as relevant news articles, when it is available. The updates we have made this year have reduced exposure to explicit image results on these types of queries by over 70%.
Google Search shows information gathered from websites across the web. Even if content is removed from Google Search, it may still exist on the web, which means someone might still find it on the page that hosts it, through social media, on other search engines, or in other ways. For this reason, it is important that such content also be removed from the sites where it is hosted. We provide information to help users understand how to request removals from hosting websites and webmasters, if they feel comfortable doing so. We also provide information on resources individuals can use to obtain further support, including links to non-profit organizations in North America, Asia, and Europe, such as the Revenge Porn Helpline in the United Kingdom.
We continue to partner with academic and gender-based violence experts to consult on our reporting flows, processes, and systems, and to build an approach that is sensitive to the needs of individuals affected by NCII. Some of the groups we have engaged, including StopNCII, are listed as resources on our help center page.
We are deeply committed to addressing non-consensual explicit imagery, synthetic or otherwise. When we identify harmful content such as NCII on, or distributed through, our sites and applications, we take action against it. Our work to combat this malicious content persists, and we look forward to continuing to collaborate with legislators, non-governmental organizations, and affected individuals to develop innovative solutions that address these complex challenges.
Thank you, again, for the opportunity to appear before the Committee and share information about the work our company does to combat NCII.
Sincerely,
Gail Kent
Global Director, Government Affairs and Public Policy, Search, News and Gemini
ANNEX: User Reporting Flow
November 2024