Google has announced a new partnership with StopNCII, a U.K.-based nonprofit that fights the spread of nonconsensual intimate images (NCII), also known as revenge porn.
Through the deal, Google will start using StopNCII’s hashing technology, which creates unique digital fingerprints of intimate photos and videos, to proactively detect and remove this harmful content from Google Search.
StopNCII’s system works by letting adults create a secure hash of their private imagery on their own device. The actual photo or video never leaves the device; only the hash is uploaded. Partner platforms like Facebook, Instagram, TikTok, Snapchat, and now Google can then use these hashes to spot and automatically block matching content.
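In broad strokes, the flow can be sketched as below. This is an illustrative toy, not StopNCII's actual implementation: real systems use perceptual hashing designed to survive re-encoding and minor edits, whereas this sketch uses an exact cryptographic hash purely to show the privacy model, in which only fingerprints (never images) are shared with platforms.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint locally; only this hash is ever shared, never the image."""
    return hashlib.sha256(image_bytes).hexdigest()

# Platform side: a set of hashes submitted by survivors (hypothetical data).
blocked_hashes = {fingerprint(b"example-private-image")}

def should_block(uploaded_bytes: bytes) -> bool:
    """Check an upload against the hash list without the platform seeing the original."""
    return fingerprint(uploaded_bytes) in blocked_hashes

print(should_block(b"example-private-image"))  # True: matching content is blocked
print(should_block(b"unrelated-image"))        # False: non-matching content passes
```

The key design point is that matching happens entirely on hashes, so neither StopNCII nor partner platforms ever receive the intimate imagery itself.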
Google said it already allows people to request removals of NCII and has made ranking changes to make such content harder to find. But the company acknowledged survivors and advocates want stronger protections, given the scale of the open web.
The move comes a year after Microsoft integrated StopNCII’s tool into Bing, making Google a late adopter compared to other major tech platforms like Reddit, Bumble, OnlyFans, and X.
Last year, Google also rolled out updates to make it easier to remove deepfake intimate imagery from Search.