The UK government has said that the country will become the first to criminalize owning, making or distributing AI tools that are designed to create child sexual abuse material (CSAM).
With British authorities cracking down on child abuse material following the rise of AI image generators and deepfake technology, the UK’s Home Office said that people convicted of making, owning or distributing such AI tools will face up to five years in prison.
The new law is one of several being introduced as part of the government’s Crime and Policing Bill, due to go before Parliament later in February 2025. In 2025, the UK government is also introducing rules requiring online porn sites to implement age verification, to prevent minors from viewing pornographic content.
AI tool crackdown
As well as criminalizing AI tools designed to create CSAM, the UK government will make it illegal to own manuals that teach people how to use these tools to produce abuse imagery, or that otherwise aid them in abusing children. The maximum prison sentence for these offences will be three years.
New laws will also criminalize the operation of websites used to share CSAM, and of those that host CSAM-related advice.
The new laws are also intended to toughen up ‘on the ground’ enforcement. The UK Border Force is set to be given more power to compel members of the public to unlock digital devices such as phones or computers if those individuals are suspected of posing a sexual threat to minors.
Stats behind the change
Recently, AI image and video technologies such as deepfake and ‘nudify’ applications have made it easy for many people to create explicit material, including content featuring virtual minors. The UK government had already announced that creating non-consensual deepfake porn, and preparing the technology to create such content, are set to become criminal offences.
According to the Internet Watch Foundation (IWF) charity, there were 246 confirmed reports of AI-generated child sexual abuse images in 2024, up from 52 in 2023, a year-on-year rise of 380 percent. A single report can contain thousands of CSAM images.
The IWF has warned that AI-generated abuse images of children are being created more frequently and are becoming more widely available on the open web.
Age verification for online porn incoming
The UK government is also tightening age verification requirements for porn sites and platforms in 2025.
It recently announced that sites hosting porn have until the summer to implement age verification for UK users, although some sites are expected to start bringing in age verification processes ahead of that deadline.
In January, the government also said that sites hosting user-generated porn will have to implement age verification processes for UK users by July, to ensure that only those aged 18 or older can access such content. Sites that host or publish their own porn, including some generative AI tools, have been told that they must take steps to implement “robust” age checks “immediately”.
The guidance was issued on January 16 by Ofcom, the UK’s communications regulator, under the Online Safety Act, which is largely designed to protect minors from potentially harmful online content and other dangers.
The government has previously said that companies in breach of the act can be fined up to £18 million ($22 million) or 10 percent of their qualifying worldwide revenue, whichever is greater. Some companies running porn sites have argued that they should not be responsible for verifying users’ ages, and that this should be done at the device level rather than the site level.
A porn reckoning for X and Pornhub?
Ofcom suggested that, for “robust” age verification, companies could use photo ID matching, facial age estimation, credit card checks and third-party digital identity services.
Most major social media companies do not allow nudity or porn on their platforms, but the guidance could have implications for X, which does allow porn. Ofcom confirmed to BBC News that the guidance means social media sites must implement “highly effective checks” on age, which could mean “preventing children from accessing the entire site”.
The guidance also raises the prospect of some of the world’s biggest porn sites being officially blocked in the UK. Aylo, which owns brands including Pornhub, has blocked access to its porn sites in many US states after state-wide age verification rules were introduced.
Aylo has argued that requiring individual sites to implement age verification is not feasible, and has pushed for laws requiring verification to be done through devices rather than on websites. It is still possible to easily access the blocked sites by using a virtual private network (VPN) on your device.
Ofcom’s guidance statement was criticized by the privacy campaign group Big Brother Watch. The organization’s director, Silkie Carlo, said: “Children must be protected online, but many technological age checking methods are ineffective and introduce additional risks to children and adults alike including security breaches, privacy intrusion, errors, digital exclusion and censorship.”