
Rise of AI kissing apps creates tangle of ethics issues

Jamie F
Updated March 12, 2025
Published March 12, 2025

A wave of generative AI ‘kissing apps’ has risen to prominence in the first few months of 2025, allowing you to make deepfake videos of two people kissing, with no consent required.

Apps such as Pollo.ai, Vidu, Filmora and AI Video let you create these kissing videos by simply uploading photos of two people (or, in some more disturbing cases, animals). Many apps like these are available in Apple’s App Store and Google Play, and are not flagged as safe for adults only.

Many of the apps are also being advertised on social media platforms, and their rise could mark a new, perhaps somewhat unexpected, phase in the global crackdown against nonconsensual deepfake video and image content.

A tongue-thrust into the mainstream

AI kissing apps do not, as a general rule, create pornographic or explicit content. They don’t generally even create videos that’d be considered NSFW, unless a quick kiss on the lips is seen as egregiously offensive at your workplace.

[Image: Vidu's AI kissing video tool]

As such, they can be hosted on mainstream app stores, and many are also available as web apps. App stores tend to base their approval decisions for banned content on the explicit nature of the visuals or other material an app can produce, rather than on its deeper ethics.

The result, as seen in recent months, has been a flood of fake AI videos of celebrities and world leaders locking lips. Many may view such content as harmless online pranksterism, but many AI kissing videos are extremely realistic, their potential for abuse is great, and routes to recourse seem blurry.

Evading the crackdown

Are we being killjoys by highlighting concerns about these apps, which many people may use for harmless or consensual kissy content?

Well, many of the apps’ makers are brazen about them potentially being used to create kissing videos for relationship fantasies, whether that’s with celebrities or your unattainable crush from down the road. In seconds, it’s possible to harvest a headshot photo from someone’s social media profile then begin circulating a fake video of them getting off with you, or indeed anyone else.

It could be disturbing to be the subject of one of these videos, which could be used for internet shaming or simply to meddle in someone’s relationship. But depending on the country, creating one would be unlikely to break any laws unless the content was used for extortion or another crime.

Deepfake porn laws are tightening across the world: the US Senate has passed the Take It Down Act, which criminalizes nonconsensual intimate images. In the UK, creating nonconsensual deepfake porn is set to become illegal, with offenders facing up to two years in prison.

But AI kissing videos are not porn.

Does this mean that deepfake content laws need to be changed to include such content?

If someone uses an AI kissing video for extortion, it’s tough to argue that this is considerably less nefarious than using deepfake porn for the same purpose. The potential harm to the victim could be much the same.

What’s more likely is mainstream app stores taking a stance on AI kissing apps, perhaps placing more focus on consent standards for apps they feature and banning some accordingly.

This could push AI kissing apps closer to the realm of AI and deepfake porn generators. Some AI kissing apps could, though, introduce some kind of consent system – for example, only generating a kissing video once both individuals have agreed to their social media profile photos being used in the app.

However it pans out, the fast rise of AI kissing apps has shown that deepfake consent issues don’t just arise when the clothes come off.

If you’ve experienced image-based sexual abuse or non-consensual sharing of intimate images, help is available worldwide.

For a complete international directory of support services, visit the Cyber Civil Rights Initiative’s international resources.

Explore the topics in this article: Artificial Intelligence, consent, Deepfake, ethics, Laws
Article by Jamie F, a freelance writer contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
