Google to Block Nonconsensual Intimate Images from Search

▼ Summary
– Google has partnered with StopNCII.org to combat nonconsensual intimate imagery (NCII), using StopNCII's hashes to identify and remove such content from search results.
– The hashes are unique identifiers that let services block abusive imagery without storing or sharing the actual source material.
– StopNCII generates the hashes with PDQ for images and MD5 for videos.
– Google has faced criticism for being slower than other companies to adopt this approach and acknowledged the need to reduce the burden on affected individuals.
– Facebook, Instagram, TikTok, and Bumble integrated StopNCII's technology as early as 2022, and Microsoft's Bing followed last September.

Google has launched a major initiative to prevent the spread of nonconsensual intimate images across its search platform. The tech giant is partnering with StopNCII.org to implement a system that will proactively identify and block such harmful content from appearing in search results. This move represents a significant step in protecting individuals from digital abuse and exploitation.
Over the coming months, Google will begin using StopNCII's hashing technology to detect nonconsensual imagery without needing to store or view the actual content. These hashes serve as unique digital fingerprints (PDQ for images, MD5 for videos), enabling platforms to recognize and remove flagged material efficiently. This method prioritizes user privacy while still addressing abusive content at scale.
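To make the hash-matching idea concrete, here is a minimal sketch, assuming a locally held set of MD5 digests, of how a platform might check a video file against a shared hash list. The hash set, file handling, and function names below are illustrative assumptions, not StopNCII's or Google's actual integration.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list: in practice the digests, their format, and the
# matching workflow are defined by the hash-sharing service; this set is
# a stand-in for illustration only.
KNOWN_FLAGGED_MD5_DIGESTS = {
    "d41d8cd98f00b204e9800998ecf8427e",  # placeholder digest, not real data
}

def md5_of_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 hex digest of a file, streaming it in chunks."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: Path) -> bool:
    """Return True if the file's digest matches an entry in the shared list.

    Only the digest is compared; the flagged source material itself is
    never stored or transmitted.
    """
    return md5_of_file(path) in KNOWN_FLAGGED_MD5_DIGESTS
```

The key property is that only digests travel between organizations; the imagery itself never leaves the reporter's device. PDQ, the perceptual hash used for images, is designed to tolerate small edits such as resizing or recompression, which an exact hash like MD5 does not; matching PDQ hashes would rely on a dedicated perceptual-hashing library rather than the exact-match check sketched here.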
Critics have noted that Google has been slower to adopt this approach compared to other industry leaders. Social media platforms like Facebook, Instagram, TikTok, and Bumble integrated StopNCII’s tools as early as 2022, and Microsoft incorporated the system into Bing last September. In a recent blog post, Google acknowledged the need for broader action, stating, “We have also heard from survivors and advocates that given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it.”
This collaboration underscores a growing recognition within the tech industry of the urgent need to combat nonconsensual intimate imagery. By joining forces with established initiatives like StopNCII, Google aims to align its policies with evolving best practices and strengthen protections for users worldwide.
(Source: The Verge)