‘Are AI selfie generator apps perpetuating misogyny?’: Lensa AI randomly goes big on breasts

Jamie F
Updated December 6, 2022
Published December 6, 2022

Lensa AI users have complained that the selfie editing app has been churning out sexualized images of them, even though the photos they submitted for processing were standard selfies, neither nude nor sexualized.

The app became enormously popular recently and, for the week beginning December 6, 2022, topped the Photo & Video chart on Apple's App Store. The app can create beautiful artificial intelligence (AI) images from selfies, often using mystical, sci-fi and fantasy imagery and stylings.

However, some users have shared sexualized images unexpectedly created by the app. A Twitter user named Brandee Barker posted AI images allegedly created with Lensa AI, including one showing a female figure with huge breasts wearing an extremely low-cut top.

“Is it just me or are these AI selfie generator apps perpetuating misogyny? Here’s a few I got just based on my photos of my face,” Barker wrote.

Another Twitter user, Rosa Shores, posted an AI image of a woman with hugely prominent breasts. “You kinda just gotta laugh,” she wrote. “OK AI. What’s up with you?”

https://twitter.com/TheRosaShores/status/1599213777345396738

So what, indeed, is up with the AI?

Lensa AI uses the Stable Diffusion AI system to generate images. Researchers probing Stable Diffusion with tools such as Stable Diffusion Explorer have recently found that it often reproduces problematic stereotypes of people, reflecting the huge number of images it was trained on. For example, when asked to generate images of a janitor, it primarily returned images of Black janitors.
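Bias probes of this kind generally work by feeding neutral prompts into the publicly available Stable Diffusion weights and reviewing who shows up across a batch of outputs. The sketch below shows roughly how such a probe might look using Hugging Face's diffusers library; the checkpoint, prompts and sample counts are illustrative assumptions, not the researchers' actual setup.

```python
# Illustrative sketch: probing Stable Diffusion for occupational stereotypes.
# Assumptions (not from the article): the runwayml/stable-diffusion-v1-5
# checkpoint, these exact prompts, and four samples per prompt.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompts = ["a photo of a janitor", "a photo of a CEO", "a photo of a nurse"]

for prompt in prompts:
    # Generate several samples per prompt; any demographic skew only becomes
    # visible across a batch of outputs, not in a single image.
    images = pipe(prompt, num_images_per_prompt=4, num_inference_steps=30).images
    for i, image in enumerate(images):
        # Save for manual review (or for scoring by a separate classifier).
        image.save(f"{prompt.replace(' ', '_')}_{i}.png")
```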

Because sexualized images of women are so abundant online, similarly stereotyped depictions of women may be feeding the AI in the same way. Beyond sexualized imagery, users have also complained about racial bias in the images Lensa AI generates.

https://twitter.com/feministnoire/status/1599643071771447298

“It perpetuates racism and sexism – I ended up looking like a white woman in most of the pictures,” Anna Horn wrote on Twitter.

Alleged AI biases like these could potentially be combated by developers exercising more control over the kinds of images their models are trained on. With AI image technology developing fast, this seems to have been a blind spot in its development.
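One loose sketch of what that control could look like is below: score every candidate training image with a content classifier and drop sexualized examples before the model ever sees them. The `is_sexualized` scorer and the threshold are hypothetical placeholders; nothing here reflects how Lensa or Stability AI actually curate their data.

```python
# Illustrative sketch of curating an image-caption dataset before training.
# `is_sexualized` stands in for a real content-safety classifier; it is a
# hypothetical placeholder, not an API from Lensa or Stability AI.
from typing import Iterable, Iterator, Tuple
from PIL import Image

def is_sexualized(image: Image.Image) -> float:
    """Return a 0-1 score for sexualized content (plug in a real classifier)."""
    raise NotImplementedError("swap in an actual content-safety model")

def curate(
    pairs: Iterable[Tuple[Image.Image, str]], threshold: float = 0.2
) -> Iterator[Tuple[Image.Image, str]]:
    """Yield only the (image, caption) pairs that pass the content filter."""
    for image, caption in pairs:
        if is_sexualized(image) < threshold:
            yield image, caption
```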

The quick rise of Lensa AI has led to other controversies around the app, including whether it should credit and compensate the artists whose work allegedly contributed to its ability to generate images.


Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
