AI porn, and AI-based artwork and image generators more widely, hit headlines regularly in 2022. As 2023 has kicked off, however, a growing backlash has hit some of the companies building them, fuelled by accusations of artwork theft, the sexualizing of selfies, and the aiding of non-consensual sexual imagery through deepfakes.
AI image generator companies have been sued by artists, AI porn image-sharing communities have been curtailed online, and at least one AI system has been called out for creating content allegedly portraying sexualized images of a child.
With AI image technology developing far faster than legislation around it, the sector has become a particularly shouty town square of ethical and legal argument.
AI porn image group censured
Most major AI image generators don’t allow you to upload nude images or generate porn content, according to their user guidelines. That doesn’t stop many people from trying, and it hasn’t stopped a community of AI porn fans crystallizing around their desire to see more legit AI porn.
Unstable Diffusion, a group that took its name from the popular Stable Diffusion AI software, is exactly that. It has a Discord group where members can share their love of AI porn, and had been raising funds to put towards running its own porn-friendly AI image generator.
However, on December 21, 2022, the Kickstarter campaign was suspended, with Kickstarter releasing a statement saying that it was “on the side of creative work and the humans behind that work”. The statement followed public criticism on social media about the campaign being allowed, related to AI systems being trained on copyrighted work.
It seemed the main sticking point was the wider issue of artists’ copyrighted work being fed into AI systems such as this, rather than the porn aspect itself. Either way, in mid-January 2023 the Kickstarter campaign remained offline and Unstable Diffusion’s Patreon was under review.
Unstable Diffusion said it was not breaching copyright law and has claimed fair use of images, saying: “One of the most untrue statements [leveled at Unstable Diffusion] is regarding the illegal harvesting and usage of works without proper licensing.”
The company said: “While Kickstarter’s capitulation to a loud subset of artists disappoints us, we and our supporters will not back down from defending the freedom to create.”
Lensa accused of generating sexualized child images
One of the darkest elements of the AI image generator backlash is an accusation that Lensa, the hugely popular AI selfie-generating app, has created sexualized images from photos of a child.
The app does not allow photos of children to be submitted to its service for AI image generation, according to its terms and conditions. However, Olivia Snow, research fellow at UCLA’s Center for Critical Internet Inquiry, had little trouble uploading photos of herself as a child to the app.
Writing in Wired, she said that she uploaded “a mix of childhood photos and selfies. What resulted were fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body.”
Snow added: “Similar to my earlier tests that generated seductive looks and poses, this set produced a kind of coyness: a bare back, tousled hair, an avatar with my childlike face holding a leaf between her naked adult’s breasts.”
Lensa was already under fire for allegedly generating sexualized AI images of women from non-sexualized submitted photos, and for racial issues such as allegedly whitening the skin of non-white people in AI images.
Responding to Snow’s claim, Prisma Labs, which owns Lensa, told Jezebel that Snow had “explicitly and intentionally violated our Terms of Use” and refused to comment further.
Artists sue AI image apps
Three artists have sued Stability AI and Midjourney, two of the biggest AI image generation companies, along with portfolio site DeviantArt, for alleged copyright infringement.
Sarah Andersen, Kelly McKernan and Karla Ortiz are seeking damages from the companies in California. Their suit followed much online outrage from artists about AI image generators allegedly using copyrighted works to ‘learn’ and create original images, without crediting or paying the original creators.
Stability AI is the company behind Stable Diffusion. If the artists’ lawsuit is successful, it will potentially have huge implications for the operation of AI image generators like that one, which are typically trained by digitally gorging on vast quantities of online images scraped from the web.
Could 2023 be the year the AI image generation industry goes arse over AI-generated tit? This backlash feels like it’s yet to peak.