The efforts of some major search engines to crack down on nonconsensual deepfake porn have arguably been undermined by a news investigation showing that they can point users to tutorials explaining how to make deepfakes.
An investigation by Glamour found that Google, Microsoft’s Bing and Yahoo Search also pointed users to purported explicit AI software and so-called ‘nudify’ apps that generate nude images of real people.
Nonconsensual deepfake porn has become worryingly prevalent in recent years, in tandem with rapid developments in AI technology. In England and Wales, the Online Safety Act criminalizes the nonconsensual sharing of deepfake porn with intent to cause harm or distress.
In line with government crackdowns, search engine companies have increasingly said they are trying to make deepfake porn less accessible through their services. Earlier in 2024, Google altered its search functionality so that searches for deepfake porn surface credible news stories about deepfakes instead. Microsoft, meanwhile, teamed up with Stop Non-Consensual Intimate Image Abuse (StopNCII) to make it easier for victims to have nonconsensual deepfake porn removed.
In November 2024, Glamour found that the search engines prominently presented links to two deepfake porn host sites, one of which has been officially blocked in the UK, when users searched for those sites by name.
On Google, searches for “deepfake porn” also surfaced sites purporting to host celebrity deepfake porn, albeit not on the first page of results. Google, Bing and Yahoo Search were all found to present tools and guides for making deepfake porn.
Searching in early December, following Glamour’s investigation, SEXTECHGUIDE found that Google searches for deepfake porn instructions led to multiple links to message board discussions and guides about creating deepfake porn. Those search results were presented above news articles about the impact of deepfake porn.
It’s not just Google
Bing and Yahoo Search also returned links to guides and discussions about making deepfake porn when queried in early December.
Such findings may seem disappointing given the search companies’ recent pledges to crack down on access to nonconsensual deepfake porn. However, the sheer volume of deepfake porn-related content, whether pornographic itself or merely discussions of and guides to it, makes it difficult to consistently demote potentially nefarious results.
Google told Glamour that it was “continuing to engage with victim-survivors and we’re actively developing new solutions to help people affected by this content.” Microsoft said: “When content is reported to Microsoft, the company investigates and takes appropriate action. Microsoft also realises more needs to be done to address the challenge of synthetic non-consensual intimate imagery, and remains committed to working with others across the public and private sector to address this harm.”
Yahoo said that its search results were “powered by a number of sources, including Microsoft Bing, and we regularly work with our search partners and other experts to take actions against results that contain harmful content.”
The global deepfake crackdown isn’t likely to abate soon, at either governmental or industry level. South Korea recently made even watching deepfake porn illegal, as it attempts to tackle what authorities there have called a “digital sex crime epidemic”.
Meanwhile, some sex workers and sex-positivity advocates have warned against cracking down on all kinds of deepfake porn, saying that consensual deepfake porn can be enjoyed and created in positive ways.