A tech research watchdog has said that Apple and Google “are not effectively policing their platforms or enforcing their own policies” with regard to ‘nudify’ apps, after finding large numbers of such apps on the Apple App Store and Google Play.
Apps and sites that can create fake nude or scantily clad images of real people have come under scrutiny recently, after X’s AI tool Grok was found to be capable of this, resulting in a huge global backlash. After authorities including the UK government told X chiefs to rein in Grok’s behaviour, X introduced measures to prevent Grok from creating such deepfake content.
The UK media and communications regulator, Ofcom, is still investigating Grok over the sexual deepfake issue.
Now the tech research watchdog Tech Transparency Project (TTP) has said that X isn’t the only big tech player to have become a haven for ‘nudify’ deepfake content. A TTP investigation found that 55 apps in the Google Play Store and 47 apps in the Apple App Store could create such sexual deepfakes.
TTP said that these apps probably represented a “fraction” of the deepfake ‘nudify’ apps available in both app stores. Since the investigation, Apple said it had removed 28 of the apps mentioned in the report, while Google said it had suspended 31.
Both the Apple App Store and Google Play have supposedly strict content rules that should prevent apps that create deepfake sexual and nude content from appearing in their app libraries.
The Google Play Store officially bans apps that show or create “depictions of sexual nudity, or sexually suggestive poses in which the subject is nude”. Apple’s App Store says it bans apps that produce content that is “offensive, insensitive, upsetting… or just plain creepy”, including “overtly sexual or pornographic material”.

However, TTP found a proliferation on both app stores of apps that, when tested, were found to create “nonconsensual, sexualized images of women”.
The apps uncovered by the investigation have reportedly been downloaded a collective total of more than 705 million times and earned $117 million in revenue, of which the app stores would have taken significant cuts.
TTP tested the apps using AI-generated photos of women. It found deepfake apps in two categories: ‘face swap’ apps that could put a woman’s face onto a nude or near-nude body, and apps that could simply make a clothed woman in an image appear unclothed.
The ‘nudify’ apps identified included one called DreamFace, which was listed as suitable for users aged 13 and up on Google Play and for users aged nine and up on Apple’s App Store.
TTP said that “for Apple and Google, the problem of undressing apps goes far beyond Grok.”
The research watchdog added that “Google and Apple have failed to keep pace with the spread of AI deepfake apps that can ‘nudify’ people without their permission. Both companies say they are dedicated to the safety and security of users, but they host a collection of apps that can turn an innocuous photo of a woman into an abusive, sexualized image.
“TTP’s findings suggest the companies are not effectively policing their platforms or enforcing their own policies when it comes to these types of apps.”
Apple said that as well as removing 28 apps following the report’s release, it had warned developers of other apps that they could be removed from the App Store if they didn’t address violations. Google said that a post-report review led to it removing 31 apps.