‘Are AI selfie generator apps perpetuating misogyny?’: Lensa AI randomly goes big on breasts

Jamie F
Updated December 6, 2022
Published December 6, 2022

Lensa AI users have complained that the selfie editing app has been churning out sexualized images of them, despite their submitting ordinary, non-sexualized photos of themselves for processing.

The app became enormously popular recently, and for the week beginning December 6, 2022 it topped the Photo & Video chart in the Apple App Store. It creates striking artificial intelligence (AI) images from selfies, often with mystical, sci-fi, and fantasy stylings.

However, some users have shared sexualized images unexpectedly created by the app. A Twitter user named Brandee Barker posted AI images allegedly created with Lensa AI, including one showing a female figure with huge breasts wearing an extremely low-cut top.

“Is it just me or are these AI selfie generator apps perpetuating misogyny? Here’s a few I got just based on my photos of my face,” Barker wrote.

Another Twitter user, Rosa Shores, posted an AI image of a woman with hugely prominent breasts. “You kinda just gotta laugh,” she wrote. “OK AI. What’s up with you?”

https://twitter.com/TheRosaShores/status/1599213777345396738

So, what is indeed up with the AI?

Lensa AI uses the Stable Diffusion AI system to generate images. Researchers using tools such as Stable Diffusion Explorer have recently found that Stable Diffusion often generates problematic stereotypes of people, reflecting the vast collection of images it was trained on. For example, when asked to generate images of a janitor, the system has primarily returned images of Black janitors.

Given the abundance of sexualized images of women online, similarly stereotyped depictions of women may be shaping the AI's output in the same way. Beyond sexualized imagery, users have also complained about racial bias in images generated on Lensa AI.

https://twitter.com/feministnoire/status/1599643071771447298

“It perpetuates racism and sexism – I ended up looking like a white woman in most of the pictures,” Anna Horn wrote on Twitter.

Alleged AI biases such as these could potentially be combated by developers exercising more control over the kinds of images their models are trained on. With AI image technology developing fast, this appears to have been a blind spot in its development.

The rapid rise of Lensa AI has prompted other controversies, including whether the app should credit and compensate the artists whose work allegedly contributed to its ability to generate images.


Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
