Google’s anti-deepfake tools for Search are getting better at hiding results and removing images

Jamie F
Updated August 8, 2024
Published August 4, 2024

Google has updated its core Search to make non-consensual deepfake porn harder to find, and has widened the impact of deepfake porn removals requested by the people depicted in it.

With the warp-speed rise in the capabilities of generative AI, realistic deepfake porn videos and images made without the consent of the people depicted have become common online. As well as being degrading and distressing for those who find themselves in deepfake content, such material can be used for criminal activity such as revenge porn and financial coercion.

Google’s new crackdown focuses on both image removal and promoting verified information about the effects of deepfake porn. Google already let users request the removal of an alleged deepfake image of them from Search results; now, when a removal is successful, Search aims to filter all explicit results from searches about that person.

Google’s systems will also now scan for duplicates of an alleged deepfake image once the original has been successfully removed, and remove any copies they find.

The result should, in theory, be a far more effective deepfake removal system that casts Google’s removal net wider. “These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future,” Google said.

Google has also made changes that should make non-consensual deepfake porn generally more difficult to find through its Search function. Now, when you search for deepfake porn along with a person’s name, Google will present what it calls “high-quality, non-explicit content” instead of deepfake porn results.

The idea is that if you search for deepfake porn showing a particular celebrity, you may be presented with news articles about how that celebrity has been affected by or linked to deepfake porn. A morality check, if you will, from solid news outlets.

This news-first measure appeared to be working from the off. When SEXTECHGUIDE searched for a prominent celebrity’s name along with the phrase “deepfake porn”, the top image results came from news articles about that celebrity being the subject of deepfakes. No actual deepfake porn images were visible in the search results.

However, there are still issues. The Faked Up newsletter reported that, following Google’s deepfake Search update, some searches for phrases like “best deepfake nudes” returned promoted results for apps that use AI to generate non-consensual nude images, as well as for AI ‘undresser’ apps.

Faked Up said: “Google is not getting rich off of these websites, nor is it wilfully turning a blind eye. These ads ran because of classifier and/or human mistakes and I suspect most will be removed soon after this newsletter is published.”

Earlier in 2024, Google banned the promotion of deepfake porn through Google Ads and took some measures to make deepfake porn less visible in Search results. The company said it made the new Search updates based on feedback from deepfake experts and victim-survivors.

Google added: “These changes are major updates to our protections on Search, but there’s more work to do to address this issue, and we’ll keep developing new solutions to help people affected by this content.”

While Google’s changes are indeed likely to make non-consensual deepfake porn harder to access online, and to deliver a few unexpected online lectures to those searching for it, non-consensual deepfakes remain a huge problem. The legality of non-consensual deepfake videos, both pornographic and not, is a grey area in many countries and regions, although a growing number of governments are cracking down.

Authorities in England and Wales have moved to make the creation of non-consensual deepfake porn illegal, although ambiguities in the law remain. In Italy, Prime Minister Giorgia Meloni has sued two men for defamation over deepfake porn they allegedly created featuring her, in a move likely designed to send a message that deepfake porn creation could result in hefty civil consequences.

Also earlier in 2024, a German artist and a designer created a digital camera that uses AI tools to instantly generate deepfake nude images of whoever it photographs. The invention was an art project designed to raise awareness of non-consensual porn; the camera was not made available to the public, and its makers said it was never used on anyone without their consent.

Article by
Jamie F is a freelance writer who has contributed to outlets including The Guardian, The Times, The Telegraph, CNN and Vice. He is also the creative force behind the Audible podcast Beast Master.
