‘Are AI selfie generator apps perpetuating misogyny?’: Lensa AI randomly goes big on breasts

Jamie F
Updated December 6, 2022
Published December 6, 2022

Lensa AI users have complained that the selfie editing app has been churning out sexualized images of them, despite submitting ordinary, non-nude, non-sexualized photos for processing.

The app became enormously popular recently, and for the week beginning December 6, 2022 topped the Photo & Video chart in Apple's App Store. It creates striking artificial intelligence (AI) images from selfies, often with mystical, sci-fi and fantasy stylings.

However, some users have shared sexualized images the app produced unexpectedly. Twitter user Brandee Barker posted AI images allegedly created with Lensa AI, including one showing a female figure with huge breasts wearing an extremely low-cut top.

“Is it just me or are these AI selfie generator apps perpetuating misogyny? Here’s a few I got just based on my photos of my face,” Barker wrote.

Another Twitter user, Rosa Shores, posted an AI image of a woman with hugely prominent breasts. “You kinda just gotta laugh,” she wrote. “OK AI. What’s up with you?”

https://twitter.com/TheRosaShores/status/1599213777345396738

So, what is indeed up with the AI?

Lensa AI uses the Stable Diffusion AI system to generate images. Researchers recently found, using tools such as Stable Diffusion Explorer, that Stable Diffusion often generates problematic stereotypes, reflecting biases in the enormous set of images it was trained on. For example, when asked to generate images of a janitor, the system primarily returned images of Black janitors.

Given the abundance of sexualized images of women online, such stereotypes of women may be shaping the AI's output in a similar way. Beyond sexualized imagery, people have also complained about racial distortions in AI images generated by Lensa AI.

https://twitter.com/feministnoire/status/1599643071771447298

“It perpetuates racism and sexism – I ended up looking like a white woman in most of the pictures,” Anna Horn wrote on Twitter.

Alleged AI biases such as these could potentially be combated by developers exercising more control over the images their models are trained on. With AI image technology developing fast, this appears to have been a blind spot in its development.

The rapid rise of Lensa AI has fueled other controversies, including whether the app should credit and compensate the artists whose work allegedly contributed to its ability to generate content.


Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
