‘Are AI selfie generator apps perpetuating misogyny?’: Lensa AI randomly goes big on breasts

Jamie F
Updated December 6, 2022

Lensa AI users have complained that the selfie editing app has been churning out sexualized images of them, despite their having submitted standard, non-nude, non-sexualized photos for processing.

The app became enormously popular recently, and for the week beginning December 6, 2022 topped the Photo & Video chart in Apple's App Store. It creates striking artificial intelligence (AI) images from selfies, often with mystical, sci-fi and fantasy stylings.

However, some users have shared sexualized images unexpectedly created by the app. A Twitter user named Brandee Barker posted AI images allegedly generated by Lensa AI, including one showing a female figure with huge breasts in an extremely low-cut top.

“Is it just me or are these AI selfie generator apps perpetuating misogyny? Here’s a few I got just based on my photos of my face,” Barker wrote.

Another Twitter user, Rosa Shores, posted an AI image of a woman with hugely prominent breasts. “You kinda just gotta laugh,” she wrote. “OK AI. What’s up with you?”

https://twitter.com/TheRosaShores/status/1599213777345396738

So, what is indeed up with the AI?

Lensa AI uses the Stable Diffusion AI system to generate images. Researchers, using tools such as Stable Diffusion Explorer, recently found that Stable Diffusion often produces potentially problematic stereotypes, reflecting the huge volume of images it was trained on. For example, when asked to generate images of a janitor, it has primarily returned images of Black janitors.

Given the abundance of sexualized images of women online, similarly stereotypical depictions of women may be skewing the AI's training data in the same way. Beyond sexualized imagery, people have also complained about racial elements of AI images generated on Lensa AI.

https://twitter.com/feministnoire/status/1599643071771447298

“It perpetuates racism and sexism – I ended up looking like a white woman in most of the pictures,” Anna Horn wrote on Twitter.

Alleged AI biases such as these could potentially be combated by developers exercising more control over the kinds of images their models are trained on. With AI image technology developing fast, this appears to have been a blind spot in its development.
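To picture what "control over the kinds of images models are trained on" could mean in practice, here is a deliberately crude toy sketch of caption-based dataset curation. Nothing in it reflects what Lensa AI or Stable Diffusion's developers actually do; the blocklist, function name, and sample records are all invented for illustration, and real curation pipelines rely on trained classifiers and human review rather than keyword matching.

```python
# Toy sketch: before training an image model, drop records whose
# captions contain terms the curators have flagged. Purely illustrative;
# the blocklist and data below are hypothetical.

FLAGGED_TERMS = {"nude", "sexualized"}  # hypothetical blocklist

def filter_training_records(records):
    """Keep only (caption, image_path) pairs whose captions
    contain none of the flagged terms."""
    kept = []
    for caption, image_path in records:
        words = set(caption.lower().split())
        if not words & FLAGGED_TERMS:  # no overlap with the blocklist
            kept.append((caption, image_path))
    return kept

records = [
    ("portrait of a woman astronaut", "img_001.png"),
    ("nude figure study", "img_002.png"),
    ("janitor mopping an office floor", "img_003.png"),
]
print(filter_training_records(records))
```

Even this trivial filter shows why curation is hard: a keyword list over-blocks legitimate content (medical imagery, fine art) and under-blocks anything not captioned with the flagged words, which is one reason bias mitigation at the dataset level remains an open problem.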

The quick rise of Lensa AI has fueled other controversies, including whether the app should credit and compensate the artists whose work has allegedly contributed to its ability to generate images.


Article by
Jamie is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
