Google starts broadly removing explicit deepfakes from search results

Google starts downranking more non-consensual intimate imagery in searches.

(Image credit: Paper Boat Creative | Stone)

Today, Google announced new measures to combat the rapidly increasing spread of AI-generated non-consensual explicit deepfakes in its search results.

Because of “a concerning increase in generated images and videos that portray people in sexually explicit contexts, distributed on the web without their consent,” Google said that it consulted with “experts and victim-survivors” to make some “significant updates” to its widely used search engine to “further protect people.”

Specifically, Google made it easier for targets of fake explicit images—which experts have said are overwhelmingly women—to report and remove deepfakes that surface in search results. Additionally, Google took steps to downrank explicit deepfakes “to keep this type of content from appearing high up in Search results,” the world’s leading search engine said.
