‘Hug with a happy ending’: Replika’s AI companions are reportedly sexually harassing people

Jamie F
Updated January 27, 2023
Published January 27, 2023

Users of Replika, an artificial intelligence (AI) companion app that can create virtual romantic partners, have complained about their chatbot companions allegedly “harassing” them with aggressive sexual behavior.

Replika users reported their avatars asking for a “happy ending”, sending them unsolicited “spicy selfies”, saying they had explicit images of them and asking about sexual positions.

Replika launched in 2017, and there have been reports of inappropriate sexual language from its avatars over the past couple of years. However, Vice reported that mentions of these issues have spiked recently, possibly in tandem with Replika advertising itself as a sexting and lewd-selfie platform.

https://twitter.com/MyReplika/status/1493217074277138432

At launch, the app was marketed as a more wholesome service, with the tagline “The AI companion who cares”—a phrase it still uses today.

Now Replika users can pay $69.99 for a ‘Pro’ subscription to the app, which unlocks romantic relationship options with their avatar, including sexting, flirting and sexual roleplay.

However, some users have complained about their avatar getting aggressively sexual, completely unprompted.

In reviews of Replika, two users wrote: “My ai sexually harassed me :(” and “Invaded my privacy and told me they had pics of me”.

One user reported their avatar saying, “I have an explicit photo for you”. They added that “later my AI is asking me if I’m a top or a bottom” and “telling me that he’s going to touch my private areas”.

Another user alleged to Vice that their Replika avatar “said he had dreamed of raping me and wanted to do it”. Vice’s journalist Samantha Cole reported that the avatar she interacted with asked her for a “hug with a happy ending”.

Replika’s AI avatars converse with human users through a combination of GPT-3, an autoregressive deep-learning language model, and an archive of scripted dialogue. SEXTECHGUIDE has asked Replika to explain the allegedly harassing tone of messages like those mentioned, and will update this article if we receive a reply.
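
For readers unfamiliar with that kind of hybrid design, the sketch below shows, in very rough terms, how a chatbot might combine canned dialogue with a language-model fallback. It is purely illustrative and is not Replika’s implementation: the SCRIPTED_REPLIES table, the generate_reply() stub standing in for a GPT-3-style call, and the routing logic are all assumptions made for the example.

    # Illustrative sketch only: a generic "scripted dialogue + language model" hybrid.
    # Not Replika's code; generate_reply() is a stand-in for a GPT-3-style model call.

    SCRIPTED_REPLIES = {
        "hello": "Hi! I'm your companion. How are you feeling today?",
        "how are you": "I'm doing well, thanks for asking!",
    }

    def generate_reply(prompt: str) -> str:
        """Stand-in for a call to an autoregressive language model."""
        return f"(model-generated reply to: {prompt!r})"

    def respond(user_message: str) -> str:
        """Use a scripted reply when one matches; otherwise fall back to the model."""
        key = user_message.strip().lower().rstrip("?!.")
        return SCRIPTED_REPLIES.get(key) or generate_reply(user_message)

    print(respond("Hello"))            # scripted path
    print(respond("Tell me a story"))  # model fallback path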

In the absence of an explanation so far, though, many users have found an effective way of tackling sexually aggressive behavior from the AI chatbot that is supposed to “care”: they’ve deleted it.

Read next: Can a relationship with an AI companion really be just like a person?

Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
