Meta is suing an AI ‘nudify’ app company as part of what the social media giant says is a crackdown on such apps being advertised on its platforms.
Mark Zuckerberg’s company announced that it was suing Joy Timeline HK Limited, a Hong Kong-based company that runs the CrushAI ‘nudify’ apps, which use AI to create fake nude images of real people.
A recent CBS News investigation found that Meta platforms, including Instagram, Facebook and Threads, were flooded with nudify app adverts, despite such ads being banned by the company.
The investigation found at least hundreds of nudify app adverts in Meta’s ad library and across the company’s social media platforms. One advert uncovered featured a URL that redirected to websites promoting deepfake services, which make images of real people appear as if they are performing sex acts in video.
Meta filed the lawsuit against Joy Timeline HK Limited in Hong Kong, seeking to prevent the company from advertising on its platforms. It is also seeking to recover $289,200 in investigation and content removal expenses from the company.
Meta said that some adverts for nudify apps on its platforms used “benign imagery” as the companies behind them attempted to circumvent Meta’s advert approval processes. Meta added that some nudify companies quickly created new domain names to circumvent website blocks.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it. We’ll continue to take the necessary steps – which could include legal action – against those who abuse our platforms like this,” Meta said.
The company added that it was taking further measures to crack down on nudify app adverts, such as sharing information about nudify companies via the Tech Coalition’s Lantern program. Launched in 2023, the program lets participating companies share information about adverts that one of them has blocked.
Meta also claimed that since the start of 2025 it had run investigations to “expose and disrupt” four networks of advertising accounts that attempted to run nudify app adverts.
The company removed some nudify app adverts following the CBS News investigation. The network found that Instagram was running adverts for AI tools promising the ability to “see anyone naked”, including one nudify advert featuring the slogan, “How is this filter even allowed?”.
“We welcome legislation that helps fight intimate image abuse across the internet, whether it’s real or AI-generated, and that complements our longstanding efforts to help prevent this content from spreading online through tools like StopNCII.org and NCMEC’s Take It Down,” Meta added.
The company said it “championed” the Take It Down Act, which was recently signed into law in the US. The Act makes it illegal to share sexually explicit photos and videos, including AI-generated deepfake adult content, without the consent of the people depicted.
Meta said it was “working to implement” the Take It Down Act.