ClothOff, a deepfake porn generator app that creates nude deepfake images from photos of real people and promises to let users “undress anyone using AI”, claims to be donating funds to “support those affected by AI”.
ClothOff also claims to have collaborated with a secretive AI victim support non-profit organization called AisafeUse Label (known as ASU Label), which has been the subject of an investigation raising questions about its legitimacy.
The ClothOff app garnered global media attention after it was allegedly used to create nonconsensual deepfake nude images of women in various parts of the world. To use ClothOff, a user buys credits on the app and then uploads a photo of a person, which the app ‘nudifies’ to create a fake naked version. The app has no consent verification process.
The app epitomises the kind of technology campaigned against by support groups helping victims of nefarious AI deepfake use, such as revenge porn and extortion. However, an investigation by Bellingcat found that ClothOff claims to be helping victims.
The investigation found that in December 2024 ClothOff declared that it was working with ASU Label and “donating funds to support those affected by AI”. ClothOff urged people to contact ASU Label if they had “experienced problems related to AI”.
What is ASU Label?
Bellingcat’s investigation found no evidence that ASU Label has helped victims of AI image abuse, or even that it is a legitimate organization. The domain for what appears to be the ASU Label website was registered in October 2024, shortly before ClothOff released a message to users about the organization.
ASU Label told Bellingcat that it was registered as a non-profit organization, but searches of non-profit organizations and non-government organizations found no mention of it. ASU Label said it gave “direct support to victims”.
The (supposed) organization said it received donations from ClothOff but would not reveal who ran or owned it, saying only that it was run by “professionals from the fields of AI, law and public advocacy”.
ClothOff said it collaborated with ASU Label “from time to time”, supporting the organization when it made requests about “assistance to individuals” or “proposals for joint research initiatives”.

On its website, which appears to be heavily reliant on AI-generated content, ASU Label says its job is “protecting your rights in the age of AI”. It adds that “more and more people are suffering from the unintended harm caused by AI systems.”
Rather than pointing to the clearly damaging deepfake image abuse, the copy on ASU Label’s site places focus on issues such as job loss, “biased AI decisions” and misinformation. It says that people suffering from AI-related problems like these can contact the organization to get help via legal advocacy, awareness campaigns or connecting with experts.
The secrecy surrounding ASU Label chimes with that surrounding ClothOff itself. In 2024 a Guardian investigation found that the app was connected to people based in Belarus, including individuals who denied involvement with ClothOff despite evidence to the contrary.
If ASU Label is a legitimate victim support organization, working with ClothOff on research projects would be a truly bizarre thing to do. To top things off, ASU Label displays no privacy policy anywhere on its site, nor any indication of how data submitted to it will be handled.
Recently the US Senate passed the Take It Down Act, which criminalizes the publication of nonconsensual intimate images. In the UK, making nonconsensual deepfake porn is set to become illegal, with offenders facing up to two years in prison.