Back when the Internet was a sort of lawless land, I remember sneaking into my family’s desktop computer after midnight (Internet telephone rates were cheaper between midnight and 6 am) to secretly access raunchy content – namely, spicy fanfiction and erotica that ranged from sweet first-time fantasies to some hardcore tentacle smut. These times are long gone.
Nowadays, the Internet is where a lot of our goods and services are traded, and this naturally comes with regulation. In the early days of our online endeavours, it was believed that the Internet would eliminate nationalism altogether, and that policing people would be impossible once they existed online without bodies.
Oh, sweet innocence.
However, regulating the digital landscape is no easy feat, and countries have scrambled to work out how to do it – especially as questions keep arising about who is responsible for what is shared online and how to keep the Internet safe. We’ve seen the UK grapple with the rollout of the Online Safety Act, as well as the disastrous consequences of FOSTA/SESTA in the US.
Since 2022, mainland Europe has been under the Digital Services Act (DSA) – a piece of legislation that regulates all the platforms offering services online, which casts an impressively wide net.
Creating governance for something this vast and fast-moving is challenging, and this is specifically why the DSA has tried to create “flexible” legislation that can evolve with the Internet.
Legal Disclaimer
This article is for informational purposes only and does not constitute legal advice. The DSA is a complex piece of legislation, and how it applies to your specific business will depend on factors we can’t assess from here. If you’re unsure whether you’re compliant — particularly if your platform also serves minors or hosts explicit content — talk to a lawyer who understands both EU digital regulation and the adult industry. We’re here to help you understand the landscape, not to replace proper legal counsel.
So, what is the DSA, anyway?
Here, we’ll explore the articles most relevant to the adult and sex tech industries, what to look out for in upcoming policy discussions, and whether you’re better off hiring a compliance expert.
These are some of the foundational blocks of the DSA:
- It is a piece of “framework-law”, meaning it’s concerned with the general outcome more than with specifics;
- It is future-proof, meaning it aims to evolve with the Internet;
- And it is proportionally scalable, meaning those with more reach and impact online should also have more responsibility.
The lineup: Who’s responsible for what
As I’ve said, the DSA aims to be proportionately scalable, so not all platforms are treated equally: responsibilities are attributed depending on reach. The DSA sorts services into four tiers:

- Intermediary services (mere conduits and caching services): small fish, carrying messages upstream; basic duties.
- Hosting services: should be keeping an eye on content, removing illegal material when spotted.
- Online platforms: a larger group, connecting everyone and showing off content; extra transparency duties.
- Very Large Online Platforms (VLOPs): dominant players; full compliance, biggest societal impact.
The DSA and the adult sector: VLOPS and beyond
In the adult sector, four platforms have been designated as VLOPs – and only after the Digital Intimacy Coalition wrote an open letter to the European Commission pointing out that they had been left out, despite being among the most accessed websites operating in Europe today.
Pornhub, XNXX, and XVideos are all VLOPs. Stripchat was initially designated but successfully argued for declassification, claiming its active user numbers fell below the threshold – a manoeuvre other borderline platforms may attempt.
In compliance with the DSA, these platforms were required to publish detailed risk assessments of their services. But of course, the DSA goes beyond the porn giants.
First off, we must note that other VLOPs like Meta products, YouTube, Google, etc. are also under the DSA – and a lot of the censorship and issues faced by the sex tech adult platforms happen on these sites.
For instance, Article 21 provides users with a mechanism to challenge content moderation decisions without resorting to litigation. This means that if your platform is facing censorship on Instagram or being shadow-banned in Google Search, there is more that can be done than the usual blurry recourse channels offered by platforms.
Of course, this is not without its downsides. Because platforms are asked to report on risk, they have an incentive to over-report in order to demonstrate good faith: big enforcement numbers make good headlines and signal regulatory compliance, so when Instagram reports taking down millions of posts for ‘containing CSAM’, there’s a strong incentive to over-remove – and sex tech companies are often collateral damage.
Something else that happens a lot is that big tech passes the responsibility onto users – in my work with the Digital Intimacy Coalition, I have been analysing risk assessment reports from tech giants, and there’s an overarching theme of ‘our systems work fine, the problem is bad-apple users’ — a framing that conveniently shifts blame away from algorithmic bias and onto the people being moderated. If your ad was rejected or your account flagged, this is the logic you’re up against.
Zooming out from VLOPs, we need to understand that no matter how big or small your platform is, the DSA views it as your duty to protect users. This means enabling users to report illegal or harmful content easily, having processes for the swift removal of such content, and protecting user data privacy – especially since sex tech data is highly sensitive.
However, if the service or product you provide is explicit then compliance is going to look a little bit thornier for you. All thanks to Article 28.
The dreaded Article 28: You must be 18 to enter this club
Out of the whole DSA, the article that has become most (in)famous is surely Article 28 – and for good reason. It deals specifically with the protection of minors, and it’s a masterclass in regulatory contradiction.
Here’s the problem: Article 28 requires platforms to protect minors if their content poses a risk. Fair enough. But it also explicitly states that platforms are not obliged to process additional personal data to assess whether someone is actually a minor.
So you must protect children from your content, but you can’t collect the data needed to identify who the children are. So, yes — it’s confusing.
If you’ve been following the chaos around the UK’s Online Safety Act, you’ll understand why there’s some uneasiness in mainland Europe about how this will play out. Age verification is a political lightning rod, and the DSA has essentially handed platforms a mandate and a blindfold simultaneously.
To help sort things out (or at least try to), the European Commission has released additional guidelines on how Article 28 should be applied.
Am I compliant?
In general, if you are offering sex toys but don’t host explicit footage of sexual acts (meaning porn, whether video or image) on your platform, you probably don’t need to worry about age assurance, at least for now.
If you do host explicit material that is not age-gated, now is a good time to act. But remember, the DSA does not oblige platforms to process additional personal data, so take this into consideration when choosing your age assurance provider.
If you are an independent escort and you have a website not hosted by aggregating platforms like OnlyFans or KaufMich, the wording you use might matter. These are murky waters still, so it’s better to err on the side of caution and (unfortunately) use more euphemistic language to safeguard your business.
There are thornier cases, though, in which you might be better off hiring a compliance expert – namely, if your platform grants access to minors (for instance, social media sites, forums, or chatbots/companion services that allow teenagers).
If your services can be used by people under 18 and there is erotic or explicit content involved, it’s worth investing in a compliance expert – ideally one familiar with both DSA requirements and adult industry specifics, as the intersection is niche.
Most importantly, no matter how big or small the platform, the DSA needs to see that there’s an effort being made towards identifying potential risks for users.
Of course, small platforms are not expected to hand in full assessments, but the level of compliance you owe is directly connected to how much risk you pose – and risk is connected to both reach and the kind of service provided.
The DSA isn’t going away, and neither is regulatory attention on adult platforms. Know your risk tier, document your safeguards, and keep an eye on how Article 28 guidance evolves. And with any luck, the tentacle smut will survive.
For further reading, the European Commission has published guidance on Article 28 implementation, and the Digital Intimacy Coalition maintains resources specifically for adult platforms navigating DSA compliance.