As preparations for this spring's enforcement of the Digital Economy Act grind on, there still seem to be more questions than answers as far as age verification (AV) for adult content in the UK is concerned.
The Government’s Regulatory Policy Committee estimates the enforcement of new age-verification requirements for adult content will cost the taxpayer some £4.45 million, rising to an upper limit of £7.9 million, according to a paper published by the committee this month.
This figure is based on similar processes in the gambling industry, which cost nearly £16m in 2014/15. The new regulation will fall under the remit of the British Board of Film Classification (BBFC).
The legislation aims to prevent under-age consumption of pornography by penalising providers of sexual content, many of whom are based outside the UK. It has attracted a fair bit of controversy, especially from rights advocacy groups who see it as a pretext for censorship and a potential invasion of privacy (the Ashley Madison data breach being a case in point), while ISPs' role in policing content is seen as yet another affront to net neutrality.
The paper also suggests that third-party payment partners such as PayPal will have the power to withdraw their services from any companies found to be failing to comply.
There are additional fears that it will pave the way for major operators like MindGeek (the company behind Pornhub) to develop their own costly age-verification systems (likely anchored around credit cards, or possibly some sort of ID card scheme), which smaller, more artisan producers will have little choice but to adopt if they want to continue operating in the UK.
You may remember that we've been highly sceptical about the legislation's likely effectiveness, since there's no shortage of means by which determined, digitally savvy under-18 porn users can bypass standard verification processes to access adult material, via VPNs, Tor or file sharing, for example.
Plus there are legitimate concerns that this new regulation could even make some young people resort to using the dark web to find porn, exposing them to a whole gamut of less quantifiable risks.
Given that the scheme is planned to go live by April this year, there is an alarming lack of clarity from the Department for Digital, Culture, Media and Sport (DCMS) on how AV will be implemented in practice; the proposal paper offers only a vague statement on the "rapid development" and "wide range" of AV systems "to securely verify that [users] are over 18."
Who is going to pay?
Equally worrying is the admission that the costs of AV may be passed on to the end consumer. The paper concedes that, in the "absence of a clear understanding of how the AV process will be implemented in practice, the Department is unable to estimate the costs to the consumer of the measure, or even the approximate scale of these costs."
Obscenity lawyer and Open Rights advocate Myles Jackman foresees wider issues with this latest cost report. "It doesn't consider the impact on the free speech ecosystem," says Jackman.
"Nor does it consider the potential adverse economic impact, which could see perfectly legitimate adult businesses being shuttered, as was the case with the 88 dominatrices who were reported and run out of business by a rival sex worker under the EU's Audiovisual Media Services Directive (AVMSD)," Jackman added.
The report also highlights the lack of persuasive evidence of pornography's harm, an issue explored by the rather more liberal Dutch government.
Then there are other issues to consider, such as that old chestnut: the disparity between the age of consent (16) and the legal age for viewing pornographic content (18). The reality is that sexually active 16- and 17-year-olds are quite likely to continue consulting 'research material' that is legally off-limits.
Jackman is in no doubt about the regulation’s efficacy, or lack thereof.
"This new regime is unworkable in practical terms; it's aspirational at best. For one thing, there's serious doubt that it's capable of doing what it sets out to do. Indeed, it could well have the absolute opposite effect, pushing young people towards the dark web and so on. It's fundamental to strike the right balance between upholding adults' rights and protecting children, of course. The friction point between the two is privacy and security."
So far, it seems that none of the authorities concerned, the ICO, the BBFC or the DCMS, wants to put its head above the parapet and decide what's more important. While the risk of 'mere embarrassment' might not seem like much to some, there are real risks for the 25 million adults in the UK legally watching pornographic material, as the suicides connected to the Ashley Madison affair showed.
It could well set a worrying precedent too. These measures are being trialled on the (relatively easy) target of porn, but they will inevitably be rolled out to apply to social media and other spheres of public life.
‘First they came for the pornographers…’