Discord is rolling out age verification for its adult content spaces from March 2026, and a lot of users aren’t thrilled about it. Anyone wanting access to age-restricted servers and channels will need to prove they’re 18 or older, either through age estimation based on their existing Discord activity, or by submitting a face scan or official ID. Given the platform’s recent dodgy security record, that last part is making people especially nervous.
Discord has already tested verification with some users in the UK and Australia, where rules on age verification for adult content were recently tightened. For this early rollout, Discord used the third-party verification platform Persona.
For many users, the prospect of having their ‘real life’ identities linked to their Discord identities has caused concern, with some saying they will leave Discord rather than get verified. There has reportedly been a recent spike in the number of online searches for Discord alternatives. Discord, meanwhile, says that your real identity won’t be linked to your account, but your age will be.
Effect on indie adult creators and communities
Discord allows sexually explicit content in servers and communities categorized as adults-only. These communities span a wide range, taking in everything from user-generated AI porn and OnlyFans creator support to adult gaming and erotic anime.
Discord has attempted to allay privacy and security concerns by stating that facial scans used for age verification never leave the user’s device, and that copies of ID documents are deleted once verification is complete.
However, user trust in the platform was shaken in 2025, when it was revealed that the IDs of around 70,000 Discord users may have been exposed in a breach of 5CA, a third-party customer service provider that Discord worked with.
On a Reddit thread about Discord’s age verification changes, one user wrote: “I am an adult and I am tired of being treated like a child on the internet. I will not be uploading my face or ID to a database that I know is not secure enough to handle this.”
Another Reddit user said they were concerned about their ID being linked to their Discord identity because they use Discord groups for survivors of abuse. “Am I really going to have to associate my legal name with the PTSD groups I’m a part of?” they wrote. “Because of course they’re NSFW, we’re discussing horrifying experiences of abuse.”
A content creator named Annie, who runs the Sex Positive Gaming YouTube channel, said: “Most Western developers in our space have Discords, and I’m not sure how willing our users are going to be with sharing a scan of their face and an upload of their ID.”
Annie added: “My Discord was recently classified as age-restricted even though I don’t allow any NSFW content, and I’m curious to see how many people will still access the Discord once they have provided more personal information.”
The problem is bigger than Discord
Discord’s age verification changes have caused concern far beyond the realms of porn and adult content. Hugely popular content creators have spoken about how many of their followers value keeping their online identities completely separate from their ‘real’, legal ones.
Tobias James Smith, a Minecraft-focused streamer, told BBC News: “I just think it’s kind of a dangerous precedent for social media companies to request 3D scans of your face or official documents without there being any kind of knowledge of how that information is being protected or stored.”
Savannah Badalich, Discord’s head of product policy, said that its products are developed “with teen safety principles at the core” and that it “will continue working with safety experts, policymakers, and Discord users to support meaningful, long term wellbeing for teens on the platform.”
Discord isn’t the only platform facing heat over how it handles younger users. On February 24, the UK’s data watchdog announced that Reddit had been fined £14.47 million ($19.5 million) for using children’s data unlawfully, including failing to properly check the age of its UK users, a reminder that regulators are tightening the screws across the board.
The Information Commissioner’s Office (ICO) estimated that a large number of under-13s were using the platform, despite Reddit not officially allowing them. The ICO said that Reddit’s age checks were “easy to bypass” and that the platform “must do better” with regard to age assurance.
Reddit said it would appeal the fine. A spokesperson for the platform said that the “ICO’s insistence that we collect more private information on every UK user is counterintuitive and at odds with our strong belief in our users’ online privacy and safety.”
One thing’s clear: platforms are being pushed harder than ever to prove they know who’s using them, and users are pushing back just as hard.