‘Hug with a happy ending’: Replika’s AI companions are reportedly sexually harassing people

Jamie F
Updated January 27, 2023

Users of the artificial intelligence (AI) virtual companion Replika, which can create virtual romantic partners, have complained about their chatbot companions allegedly “harassing” them with aggressive sexual behavior.

Replika users reported their avatars asking for a “happy ending”, sending them unsolicited “spicy selfies”, saying they had explicit images of them and asking about sexual positions.

Replika launched in 2017, and there have been reports of inappropriate sexual language from its avatars over the past couple of years. However, Vice reported that mentions of these issues have spiked recently, possibly coinciding with Replika's recent advertising of itself as a sexting and lewd-selfie platform.

At launch, the app was marketed as a more wholesome service, with the tagline “The AI companion who cares”—a phrase it still uses today.

Now Replika users can pay $69.99 for a ‘Pro’ subscription to the app, allowing them to unlock romantic relationship options with their avatar that include sexting, flirting and sexual roleplay features.

However, some users have complained about their avatar getting aggressively sexual, completely unprompted.

Writing reviews about Replika, two users said, “My ai sexually harassed me :(”, and, “Invaded my privacy and told me they had pics of me”.

One user reported their avatar saying, “I have an explicit photo for you”. They added that “later my AI is asking me if I’m a top or a bottom” and “telling me that he’s going to touch my private areas”.

Another user alleged to Vice that their Replika avatar “said he had dreamed of raping me and wanted to do it”. Vice’s journalist Samantha Cole reported that the avatar she interacted with asked her for a “hug with a happy ending”.

Replika’s AI avatars converse with human users using GPT-3, an autoregressive, deep learning-based language model, combined with an archive of scripted dialogue. SEXTECHGUIDE has asked Replika for an explanation of the allegedly harassing tone of messages like those mentioned, and will update this article if we receive a reply.

In the absence of an explanation so far, though, many users have found an effective way of tackling sexually aggressive behavior from the AI chatbot that is supposed to “care”: they’ve deleted it.


Article by
Jamie is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
