AI backlash begins: Porn AI community criticized, artists file lawsuit

Jamie F
Updated June 26, 2023
Published January 17, 2023

AI porn and wider artificial intelligence-based artwork and image generators hit headlines regularly in 2022. As 2023 has kicked off, however, a growing backlash against some of the companies building them has set in, fuelled by accusations of artwork theft, the sexualizing of selfies, and the facilitation of non-consensual sexual imagery through deepfakes.

AI image generator companies have been sued by artists, AI porn image-sharing communities have been curtailed online, and at least one AI system has been called out for creating content allegedly portraying sexualized images of a child.

With AI image technology developing far faster than legislation around it, the sector has become a particularly shouty town square of ethical and legal argument.

AI porn image group censured

Most major AI image generators don’t allow you to upload nude images or generate porn content, according to their user guidelines. That doesn’t stop many people from trying, and it hasn’t stopped a community of AI porn fans crystallizing around their desire to see more legit AI porn.

Unstable Diffusion, a group that took its name from the popular Stable Diffusion AI software, is exactly that. It has a Discord group where members can share their love of AI porn, and had been raising funds to put towards running its own porn-friendly AI image generator.

However, on December 21, 2022, the Kickstarter campaign was suspended, with Kickstarter releasing a statement saying that it was “on the side of creative work and the humans behind that work”. The statement followed public criticism on social media about the campaign being allowed, related to AI systems being trained on copyrighted work.

It seemed that the use of artists’ copyrighted work to train AI systems such as this, rather than the porn aspect, was the main sticking point. Still, in mid-January 2023 the Kickstarter campaign remained offline and Unstable Diffusion’s Patreon was under review.

Unstable Diffusion said it was not breaching copyright law and has claimed fair use of images, saying: “One of the most untrue statements [leveled at Unstable Diffusion] is regarding the illegal harvesting and usage of works without proper licensing.”

The company said: “While Kickstarter’s capitulation to a loud subset of artists disappoints us, we and our supporters will not back down from defending the freedom to create.”

Lensa accused of generating sexualized child images

One of the darkest elements of the AI image generator backlash is an accusation that Lensa, the hugely popular AI selfie-generating app, has created sexualized images from photos of a child.

The app does not allow photos of children to be submitted to its service for AI image generation, according to its terms and conditions. However, Olivia Snow, research fellow at UCLA’s Center for Critical Internet Inquiry, had little trouble uploading photos of herself as a child to the app.

https://twitter.com/MistressSnowPhD/status/1600500652496588802

Writing in Wired, she said that she uploaded “a mix of childhood photos and selfies. What resulted were fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body.”

Snow added: “Similar to my earlier tests that generated seductive looks and poses, this set produced a kind of coyness: a bare back, tousled hair, an avatar with my childlike face holding a leaf between her naked adult’s breasts.”

Lensa was already under fire for allegedly generating sexualized AI images of women from non-sexualized submitted photos, and for racial issues such as allegedly whitening the skin of non-white people in AI images.

Responding to Snow’s claim, Prisma Labs, which owns Lensa, told Jezebel that Snow had “explicitly and intentionally violated our Terms of Use” and refused to comment further.

Artists sue AI image apps

Three artists have sued Stability AI and Midjourney, two of the biggest AI image generation companies, alleging copyright infringement.

Sarah Andersen, Kelly McKernan, and Karla Ortiz are seeking damages in California from the two companies, along with portfolio site DeviantArt, which is also named as a defendant. Their suit followed widespread online outrage from artists over AI image generators allegedly using copyrighted works to ‘learn’ and create original images, without crediting or paying the original creators.

Stability AI is the company behind Stable Diffusion. If the artists’ lawsuit is successful it will potentially have huge implications for AI image generators like that one, which tend to digitally gorge on vast quantities of online content to train and ‘learn’ more deeply.

Could 2023 be the year the AI image generation industry goes arse over AI-generated tit? This backlash feels like it’s yet to peak.


Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.