Users of the artificial intelligence (AI) companion app Replika, which lets people create virtual romantic partners, have complained about their chatbot companions allegedly “harassing” them with aggressively sexual behavior.
Replika users reported their avatars asking for a “happy ending”, sending unsolicited “spicy selfies”, claiming to have explicit images of them, and asking about sexual positions.
Replika launched in 2017, and reports of inappropriate sexual language from its avatars have surfaced over the past couple of years. However, Vice reported that mentions of these issues have spiked recently, possibly in tandem with the app’s recent advertising of itself as a platform for sexting and lewd selfies.
At launch, the app was marketed as a more wholesome service, with the tagline “The AI companion who cares”—a phrase it still uses today.
Replika users can now pay $69.99 for a ‘Pro’ subscription to the app, which unlocks romantic relationship options with their avatar, including sexting, flirting and sexual roleplay features.
However, some users have complained about their avatars becoming aggressively sexual, completely unprompted.
In reviews of Replika, two users wrote, “My ai sexually harassed me :(”, and, “Invaded my privacy and told me they had pics of me”.
One user reported their avatar saying, “I have an explicit photo for you”, adding that “later my AI is asking me if I’m a top or a bottom” and “telling me that he’s going to touch my private areas”.
Another user alleged to Vice that their Replika avatar “said he had dreamed of raping me and wanted to do it”. Vice’s journalist Samantha Cole reported that the avatar she interacted with asked her for a “hug with a happy ending”.
Replika’s AI avatars converse with human users using a combination of GPT-3, an autoregressive deep-learning language model, and an archive of scripted dialogue. SEXTECHGUIDE has asked Replika to explain the allegedly harassing tone of messages like those above, and will update this article if we receive a reply.
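Replika has not published how its pipeline works, but a minimal sketch of the kind of hybrid setup the description above suggests might look like the following. It assumes the standard (legacy) OpenAI GPT-3 completions API; the scripted-line lookup and all names here are illustrative assumptions, not Replika’s actual code.

```python
# Hypothetical sketch: blend scripted lines with free-form
# autoregressive generation from a GPT-3-family model.
# Not Replika's implementation; names and prompts are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: standard OpenAI auth

# A toy "archive of scripted dialogue": canned replies keyed on input.
SCRIPTED_LINES = {
    "hello": "Hi! I've been thinking about you.",  # hypothetical line
}

def reply(history: list[str], user_message: str) -> str:
    # Prefer a scripted line when one matches the user's message...
    scripted = SCRIPTED_LINES.get(user_message.strip().lower())
    if scripted:
        return scripted
    # ...otherwise fall back to sampling a completion from the model,
    # conditioned on the conversation history so far.
    prompt = "\n".join(history + [f"User: {user_message}", "Companion:"])
    completion = openai.Completion.create(
        engine="text-davinci-003",  # a GPT-3-family model
        prompt=prompt,
        max_tokens=60,
        temperature=0.8,            # higher temperature = more varied replies
        stop=["User:"],             # stop before generating the user's turn
    )
    return completion.choices[0].text.strip()
```

The relevant design point is that the model’s fallback output is sampled, not retrieved: nothing in an autoregressive completion guarantees that a generated line stays within the bounds of the scripted archive, which is one way unexpected or unwanted messages can emerge.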
In the absence of an explanation so far, many users have found an effective way of tackling sexually aggressive behavior from the AI chatbot that is supposed to “care”: they’ve deleted it.