Cybersecurity

Google Works to Reduce Non-Consensual Deepfake Porn in Search

Search engine downranks sites that frequently host harmful explicit imagery

Google has been adjusting the results for queries specifically seeking deepfake content tied to someone’s name, as well as demoting websites that feature a high volume of pages removed from search for violating policies against explicit fake content.

Photographer: Gabby Jones/Bloomberg

Google is making adjustments to its search engine to reduce the prevalence of sexually explicit fake content near the top of results, responding to the explosion of non-consensual imagery people have created using generative artificial intelligence tools.

When that AI-generated content features a real person’s face or body without their permission, that person can request its removal from search results. Now, when Google decides a takedown is warranted, it will filter all explicit results on similar searches and remove duplicate images, the company said Wednesday in a blog post. The Alphabet Inc. unit also said it had improved its search ranking systems so that explicit fake content would not appear as top results — a change that Bloomberg reported in May was already in the works.