A wave of generative AI ‘kissing apps’ has risen to prominence in the first few months of 2025, allowing you to make deepfake videos of two people kissing, with no consent required.
Apps such as Pollo.ai, Vidu, Filmora and AI Video let you create these kissing videos by simply uploading photos of two people (or, in some more disturbing cases, animals). Many apps like these are available in Apple’s App Store and Google Play, and are not flagged as adults-only.
Many of the apps are also being advertised on social media platforms, and their rise could mark a new, perhaps somewhat unexpected, phase in the global crackdown against nonconsensual deepfake video and image content.
A tongue-thrust into the mainstream
AI kissing apps do not, as a general rule, create pornographic or explicit content. They don’t generally even create videos that’d be considered NSFW, unless a quick kiss on the lips is seen as egregiously offensive at your workplace.

As such, they’ve been allowed onto mainstream app stores, and are often available as web apps too. App stores tend to base their content bans on how explicit an app’s visuals or other accessible content are, rather than on its deeper ethics.
The result, as seen in recent months, has been a flood of fake AI videos of celebrities and world leaders locking lips. Many may view such content as harmless online pranksterism, but many AI kissing videos are extremely realistic, their potential for abuse is great, and routes to recourse seem blurry.
Evading the crackdown
Are we being killjoys by highlighting concerns about these apps, which many people may use for harmless or consensual kissy content?
Well, many of the apps’ makers are brazen about their potential use for creating kissing videos to fulfill relationship fantasies, whether with celebrities or your unattainable crush from down the road. In seconds, it’s possible to harvest a headshot from someone’s social media profile and begin circulating a fake video of them getting off with you, or indeed anyone else.
It could be disturbing to be the subject of one of these videos, which could be used for internet shaming or simply to meddle in someone’s relationship. But depending on the country, you’d be unlikely to be breaking any laws unless you used the content for extortion or another crime.
Deepfake porn laws are tightening across the world – the US Senate has passed the Take It Down Act, which criminalizes nonconsensual intimate images. In the UK, making nonconsensual deepfake porn is set to become illegal, with offenders facing up to two years in prison.
But AI kissing videos are not porn.
An app store consent reckoning?
Does this mean that deepfake content laws need to be changed to include such content?
If someone uses an AI kissing video for extortion, it’s tough to argue that this is considerably less nefarious than extortion using deepfake porn. The potential harm to the victim could be much the same.
What’s more likely is mainstream app stores taking a stance on AI kissing apps, perhaps placing more focus on consent standards for apps they feature and banning some accordingly.
This could push AI kissing apps closer to the realm of AI and deepfake porn generators. Some AI kissing apps could, though, introduce some kind of consent system – for example, only generating a kissing video once the individuals pictured have consented to their social media profile photos being used in the app.
However it pans out, the fast rise of AI kissing apps has shown that deepfake consent issues don’t just arise when the clothes come off.