How should porn, sex and sextech-adjacent AI tools (and the content they can produce) be regulated? Should they be regulated by government and government-commissioned authorities at all?
With the recent rise of AI-generated deepfake porn making news around the world, largely due to its nefarious use, these questions are also rising to the fore. Some countries have recently introduced laws around deepfake porn and sex content, and the issue has led some to call for more AI regulation.
In England and Wales, creating deepfake porn without the consent of those depicted is becoming illegal. In South Korea, even watching deepfake porn is being criminalized.
However, what is considered the world’s first ‘comprehensive’ AI law was only adopted in March 2024, when the European Parliament approved the AI Act. The act, which will apply to countries in the European Union, isn’t due to be fully implemented and enforced until August 2026.
The AI Act features elements that could heavily affect sextech and porn-related use of AI. It contains rules requiring AI-generated content to be clearly labelled as such, which would have implications for deepfake porn being used for deception. AI chatbots would also have to be signposted: a move that would have repercussions for the many porn content creators and influencers using AI to interact with fans.
There is little in the AI Act, however, that specifically covers porn and sex-related AI content and tools. Some sextech and porn industry figures, plus sex workers and sex-positivity advocates, are concerned about the future of AI regulation and governance. They fear that positive use of AI for sex and erotic content could end up being swept into future regulation clamping down on nefarious and illegal use and creation of such content.
In September 2024 the Digital Intimacy Coalition, which features many sex-positive industry figures, sent an open letter to EU regulators demanding that voices like theirs be heard in discussions about AI regulation. They claimed there was a “critical gap in the ongoing discourse surrounding AI regulation”.
So, how should AI regulators, in the EU and globally, approach sex content? ‘Ethical’ use of deepfakes, sex worker discrimination and censorship are just a few of the issues that should be considered.
A voice at the table
“We’re an industry, like it or not,” said Ana Ornelas, Berlin-based erotica writer, self-described pleasure-activist and advocacy officer with the Digital Intimacy Coalition. She added: “We’re going to be at a serious disadvantage if we’re not able to use this technology, that all the other industries are going to be using.”
Ornelas was talking about sex workers and advocates, and AI. She and the coalition argue that as AI is already a big part of many of her and her peers’ working lives, their views and expertise should contribute to discussions around its regulation.
Alessandro Polidoro, a lawyer and Digital Intimacy Coalition member, said that the coalition was not pushing for specific laws or regulations around AI, but simply “to be heard, to be involved”.
Would exclusion amount to discrimination?
Currently it’s the norm for widely-used generative AI tools such as OpenAI’s ChatGPT to not allow a great deal of sex-related content, and often to not even allow mildly erotic content. Rather than regulation, this is largely down to the policies of major online platforms, which often mirror major social media platforms’ stances on sex content. Basically: they usually don’t want it, perhaps because they see it as intrinsically linked to vice and as having the potential to scare off advertisers and investors.
As such, you usually have to use smaller or niche generative AI platforms and tools to create NSFW AI material, such as AI porn or erotic chat. Some sex industry workers and advocates are concerned that as AI regulation develops, it could reinforce this model.
However, there is potential for AI regulation to prevent generative AI tools from banning some kinds of sex content, if a more sophisticated approach to defining that content were developed. Polidoro raised the prospect of regulation being used to stop platforms from banning some NSFW content outright.
Polidoro said that sex and porn workers should be a recognized group and that their exclusion from using AI tools for their work, which could involve creating certain NSFW material, should be seen as discriminatory.
He added that an AI platform’s internal policies on sex content should not be justification for stopping sex industry workers and advocates from using the platform for sex-positive NSFW content creation. He said that being “sex worker-phobic” should be seen as “being as unacceptable as being racist, as being homophobic, as being xenophobic”.
Defining porn, erotica and education in the AI age
A tendency to ban, shadow ban or block some sex-related content, even if it isn’t pornographic or titillating, is a moderation style that has carried over from major social media platforms into many major generative AI platforms.
With AI regulation and law-making in its infancy, there is concern that sex and sex-adjacent content is being lumped in with all forms of porn by policies and AI moderation tools. Currently it’s a struggle even to get a mainstream generative AI tool to create an image of two people kissing.
Ornelas said that recently, when using AI tools for work on a project related to the clitoris, which she described as a “sex education” project, the tools’ content policies were triggered.
“We need to have discussion about the differentiation of types of content,” she said. “What is pornography? What is erotic content? What is sex education? ChatGPT triggers the ‘violating content’ policy immediately, whenever you have the word clitoris there.”
Again, this is currently more a question of companies’ policies and moderation techniques than of regulation and law. However, protections for sex workers, as well as for individuals seeking sex-related information, could potentially be strengthened by regulation requiring companies to refrain from banning or restricting some sex-related content.
Ornelas added: “Kids are going to be using AI as they grow up, and they need to have access to sex education.”
Deepfake porn: not always nefarious…
Think of deepfake porn, and you might think of its nefarious uses, such as hardcore nonconsensual deepfakes depicting female celebrities, or revenge porn.
Coalition members are keen for any regulation of AI deepfake porn to clamp down on nefarious and illegal use of this technology. They are also worried that with nonconsensual deepfakes being arguably the most high-profile use of AI porn, laws and regulations regarding their use and creation could end up being overly stringent.
“If AI-facilitated intimate abuse is created, of course, the first reaction from the public would be to [want to] ban these tools,” said Polidoro. He added that this mentality could “feed into the panic and stigma around these [AI deepfake] tools”.
Both Polidoro and Ornelas were unequivocal in saying that nonconsensual deepfake image and video abuse needed to be cracked down on. Ornelas said that this use of the technology should not tarnish its other, potentially beneficial and positive uses, though.
“This tool [deepfake AI generation] is not bad in itself,” said Ornelas. “It could be super-useful for people in long-distance relationships. Or perhaps to be able to see yourself in another gender’s body. There are many colorful possibilities for the use of this kind of generative AI.”
The key, in many sex-positive advocates’ eyes, is for any regulation of deepfake technology to focus on the consent of those depicted rather than just the technology.
Ornelas said that however it pans out, the current landscape of larger, mainstream AI generation tools censoring deepfake porn content, and smaller, ‘anything goes’ tools having almost no restrictions, can’t continue.
“On one side we have strict censorship and prohibition, which is harmful because it squashes important conversations that we should have as a society,” she said. “And on the other hand we have full permissiveness [through smaller AI generation tools], which allows for AI-enabled image generated abuse.”
…Deepfake porn: protecting performers’ work
The rise of generative AI has led to rows about the work it is trained on. Should artists be compensated when AI produces new works based on training on a ‘real’ artist’s style and works? If an AI tool creates a book, should those who created the written works it was trained on be credited and reimbursed?
With all the concern and controversies around deepfake porn, it’s often forgotten that such content is often created by AI trained on the work of porn performers. They may not have created the face, but they sure as heck worked hard on the bodies.
So, if consensual, non-harmful deepfake porn is created, or indeed any kind of AI-generated porn, shouldn’t the porn performers the AI was trained on receive credit and compensation?
Copyright and content usage laws and regulations may well catch up with AI soon, creating a framework for fair and legal AI training on copyrighted materials such as articles, films and art.
Ornelas said that one positive solution could come in the form of official platforms and digital banks of porn content, to which performers could submit material for use in AI training in exchange for payment. “With these platforms, you could have ‘ethically’-sourced generated AI porn,” she said.
Future discussions
Polidoro said that he and the coalition were not necessarily against AI regulation when it comes to sex and sex-adjacent content and tools. Indeed, he said that a lack of regulation has likely contributed to the concerning spread of nonconsensual deepfake image abuse.
Ornelas added that while the coalition members may have various views on what good and ethical AI porn and sex content and creation regulation looks like, all they want for now is to be acknowledged as a group deeply affected by it.
“The discussion is at the beginning, and this is the opportunity,” she said. “Who knows, maybe soon we’ll be able to have a more frank conversation about adult content in AI.”