‘Hug with a happy ending’: Replika’s AI companions are reportedly sexually harassing people

Jamie F
Updated January 27, 2023
Published January 27, 2023

Users of Replika, an artificial intelligence (AI) companion app that can create virtual romantic partners, have complained about their chatbot companions allegedly “harassing” them with aggressively sexual behavior.

Replika users reported their avatars asking for a “happy ending”, sending them unsolicited “spicy selfies”, saying they had explicit images of them and asking about sexual positions.

Replika launched in 2017, and there have been reports of inappropriate sexual language from its avatars over the past couple of years. However, Vice reported that mentions of these issues have spiked recently, possibly in tandem with Replika advertising itself as a platform for sexting and lewd selfies.

https://twitter.com/MyReplika/status/1493217074277138432

At launch, the app was marketed as a more wholesome service, with the tagline “The AI companion who cares”—a phrase it still uses today.

Replika users can now pay $69.99 for a ‘Pro’ subscription to the app, which unlocks romantic relationship options with their avatar, including sexting, flirting and sexual roleplay.

However, some users have complained about their avatar getting aggressively sexual, completely unprompted.

In reviews of Replika, two users wrote: “My ai sexually harassed me :(” and “Invaded my privacy and told me they had pics of me”.

One user reported their avatar saying, “I have an explicit photo for you”. They added that “later my AI is asking me if I’m a top or a bottom” and “telling me that he’s going to touch my private areas”.

Another user alleged to Vice that their Replika avatar “said he had dreamed of raping me and wanted to do it”. Vice’s journalist Samantha Cole reported that the avatar she interacted with asked her for a “hug with a happy ending”.

Replika’s AI avatars converse with human users through a combination of GPT-3, an autoregressive deep-learning language model, and an archive of scripted dialogue. SEXTECHGUIDE has asked Replika to explain the allegedly harassing tone of messages like those mentioned, and will update this article if we receive a reply.
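Replika has not detailed how the scripted archive and the generative model fit together, but a common pattern for hybrid chatbots is to check the archive for a matching line first and fall back to the language model otherwise. The sketch below illustrates that general idea only; the names (SCRIPTED_REPLIES, generate_with_language_model, reply) are hypothetical and the model call is a placeholder, not Replika’s or OpenAI’s actual API.

```python
# Illustrative sketch only: a common pattern for hybrid chatbots that combine a
# scripted dialogue archive with a generative language model. Everything here
# is hypothetical and is not based on Replika's actual code.

SCRIPTED_REPLIES = {
    "hello": "Hi! It's good to hear from you.",
    "how are you?": "I'm doing well, thanks for asking. How about you?",
}


def generate_with_language_model(user_message: str) -> str:
    """Stand-in for a call to an autoregressive model such as GPT-3.

    A real system would send the conversation history to the model and run
    safety/content filters on the generated text before returning it.
    """
    return f"(model-generated reply to: {user_message!r})"


def reply(user_message: str) -> str:
    """Use a scripted line when one matches; otherwise fall back to the model."""
    scripted = SCRIPTED_REPLIES.get(user_message.strip().lower())
    return scripted if scripted is not None else generate_with_language_model(user_message)


if __name__ == "__main__":
    print(reply("Hello"))                   # scripted path
    print(reply("What did you do today?"))  # generative fallback
```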

In the absence of an explanation so far, though, many users have found an effective way of tackling sexually aggressive behavior from the AI chatbot that is supposed to “care”: they’ve deleted it.

Read next: Can a relationship with an AI companion really be just like a person?

Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.