The deeply troubling capabilities of deepfake porn video technology have been highlighted by the launch of a new artificial intelligence (AI) app, which allows users to upload photos of people’s faces and have them swapped into porn videos.

The app, which appears to have a small user base, was reported on by MIT Technology Review after it was seen by deepfake researcher Henry Ajder. The app was not named by the publication, and SEXTECHGUIDE is not naming it or any other porn deepfake apps or sites in this article, to avoid giving them web traffic.

Quick and easy-to-use deepfake technology, which allows users to place photos of individuals in videos so it appears as if they were in the original footage, has been around for a few years. However, the fact that this new app offers porn-specific deepfake creation tools has raised concerns about how easily it could be misused to create revenge porn.

The app touts itself as a way to upload a photo of your own face and see it appear in a deepfake porn video, but a photo of any face can be uploaded. The app offers a bank of porn videos into which the face image can be integrated.

It takes only a few seconds to upload a photo to the app and link it to a porn video, with video previews offered to users for free. Users can then pay in cryptocurrency to download the full video.

Most of the porn videos in the app’s bank feature women, but the selection also includes a cache of gay porn videos featuring men. This has raised concerns that the technology could be misused against people living in countries where homosexuality is illegal or persecuted.

After it was contacted by MIT Technology Review, the new deepfake porn app put up a notice on its site saying it was unavailable to new users.

As well as being used for revenge porn, deepfake technology is often used to create fake ‘celebrity’ porn videos. In 2020, TikTok banned deepfake videos, following porn platforms including Pornhub.

Twitter, Reddit and Google are among the other sites and online services to have banned deepfake videos. Videos that Google identifies as deepfakes do not appear in its search results. However, Twitter accounts that lead to private deepfake channels on the chat app Telegram remain active in posting videos to the site.

Also in 2020, South Korea toughened rules on deepfake videos, introducing jail sentences of up to five years for anyone making or distributing them. The move came after floods of deepfake porn videos featuring the faces of South Korean K-pop stars appeared online.

The new deepfake porn app arrives as the online porn industry goes through a reckoning over safety issues such as consent.

Pornhub has removed hundreds of thousands of videos from its site after being accused of hosting illegal content, and has toughened its rules about porn performer consent.

However, as well as being used to make revenge and celebrity deepfake porn, the technology is also being exploited by scammers, who take pictures from people’s Instagram profiles to use in deepfake sex videos. They then bombard the victim with video calls and demand money, threatening to send the images and videos to friends and family.

Read next: TikTok joins the rush to ban deepfake videos