Pornhub has told SEXTECHGUIDE that the site doesn’t use creators’ uploaded content for generative AI training, at a time when conversations about generative AI use in the online porn industry are heating up.
The world’s biggest porn site’s Terms Of Service page declares that “all content you will upload” to Pornhub “may be used for training purposes”. The page was reportedly last modified on June 30, 2025, but Pornhub said those specific terms have been listed for a number of years.
The wording of those terms means that Pornhub may be able to legally use videos uploaded to the site to train AI models, without compensating the creators who made and own the content. Despite the terms not being a new addition, their brief, open-ended wording has led to questions around content ownership and AI.
Some porn creators are concerned about the prospect of their content being used to train generative AI models, which could then generate porn that threatens human creators’ livelihoods.
Pornhub has not specified if or how it would use videos for AI training, but a spokesperson for the site has now told SEXTECHGUIDE: “We do not use any uploaded content for generative AI training”.
The site’s Terms Of Service also has a section suggesting that uploaded content may be used to train AI systems that detect potentially illegal or unethical content.
“This Website reserves the right to hash any uploaded Content for training purposes including, but not limited to, training for identifying, or combating illegal activities and these hashes may be shared with third parties, including law enforcement agencies,” the terms state.
A hash, or ‘digital fingerprint’, can be created from a piece of content and then used to detect when that content appears elsewhere online. The technology has previously been used to flag uploads of known child sexual abuse material (CSAM) that has already been assigned a hash.
Pornhub is one of many sites and social media platforms signed up to Stop Non-Consensual Intimate Image Abuse (StopNCII).
StopNCII maintains a hash bank of hashes generated and submitted by people who believe they are victims of nonconsensual deepfake or revenge porn. If a matching hash is detected on a site signed up to the platform, the content is flagged for moderation and potential removal.
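To illustrate the basic flow, here is a minimal, hypothetical sketch in Python. It uses a plain SHA-256 digest as the ‘fingerprint’, whereas real systems such as StopNCII rely on perceptual hashes that can survive re-encoding and resizing; the function names and hash bank below are illustrative only, not Pornhub’s or StopNCII’s actual implementation.

```python
import hashlib

# Hypothetical, simplified hash bank. In practice, hash-matching systems use
# perceptual hashes that tolerate re-encoding and resizing; a cryptographic
# hash like SHA-256 only matches byte-identical files. This sketch just
# illustrates the flagging flow described above.
hash_bank = set()

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as a stand-in 'digital fingerprint'."""
    return hashlib.sha256(data).hexdigest()

def submit_hash(original_content: bytes) -> str:
    """A person hashes their own content locally and submits only the hash."""
    digest = fingerprint(original_content)
    hash_bank.add(digest)
    return digest

def check_upload(uploaded_content: bytes) -> bool:
    """A participating site checks new uploads against the shared hash bank."""
    return fingerprint(uploaded_content) in hash_bank

# Example flow: only the hash is shared; the content itself never leaves its owner.
submit_hash(b"private-video-bytes")
print(check_upload(b"private-video-bytes"))    # True -> flag for moderation
print(check_upload(b"unrelated-video-bytes"))  # False -> no match
```

The key design point the sketch captures is that the original content never has to be uploaded to the matching service: only its hash is shared and compared.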
Aylo recently revealed that Pornhub’s UK traffic dropped by 77 percent after tougher age verification rules for online porn access were introduced in the country in July, but the company argues that, rather than fewer people accessing adult material, users are simply migrating to sites that have not yet complied with the age verification requirements.