A sex offender in the UK has been banned by a court from using or accessing “AI creating tools”, following his conviction for making more than 1,000 indecent images of children.
The ban, made in a sexual harm prevention order following recommendations from the Crown Prosecution Service, was given to a 48-year-old man named Anthony Dover. It is the first known case in the UK of a sex offender being banned from using generative AI tools, and the Internet Watch Foundation (IWF) has said convictions like this could prove to be “landmark” cases for generative AI tools.
Although news of the AI tool ban was made public in the week beginning April 15, 2024, the sexual harm prevention order was imposed in February. Earlier this month it was announced that creating deepfake porn content without the consent of those depicted in it, with the intent to cause “alarm, humiliation or distress”, will be illegal in the UK. It was already illegal to share such deepfake porn content without consent.
Although Dover was convicted of making illegal images, whether he used AI tools in their creation has not been made public. His sexual harm prevention order did, however, name Stable Diffusion specifically as included in his generative AI software ban. Stable Diffusion has previously been named in UK court cases as being used to create child sex abuse content.
Stability AI, which owns Stable Diffusion, told The Guardian that it banned use of its software for unlawful activity such as the creation of child abuse images. The company added that concerns about such material being made using Stable Diffusion related to a version of the software released before Stability AI took over the software’s exclusive license in 2022.
Dover will be able to use generative AI tools only if he obtains advance permission from the police for the specific tools in question.
Restrictions on internet use, such as bans on using certain types of messaging apps or using ‘incognito’ mode when browsing online, have often been placed on sex offenders in the past. However, with generative AI tools becoming increasingly common and often used in workplaces, restrictions on their use could have wider implications for those slapped with them.
Such bans could, for example, exclude those subject to them from work tasks that involve certain AI tools. Few people are likely to have sympathy for criminals affected in this way, but it does raise questions about the logistics of such bans, as well as about managing offenders’ connections with, and reintegration into, wider society.
The IWF said that convictions like these “should sound the alarm that criminals producing AI-generated child sexual abuse images are like one-man factories, capable of churning out some of the most appalling imagery.”
Dover received a community order and a £200 ($247) fine in addition to the sexual harm prevention order. The case came at a time when explicit AI-generated content, particularly deepfake content, had been pushed to the fore of conversations about online ethics and legalities.
AI-assisted deepfake image and video technology has made it relatively straightforward to make deepfake porn and other explicit content, and sites such as MrDeepFakes have become popular by hosting adult deepfakes. While the creation of sexual abuse imagery featuring children has long been clearly illegal, the legality of creating deepfakes depicting adults was, until recently, ambiguous in many regions.
Around the same time that the UK government announced its recent toughening of deepfake porn laws, Italian Prime Minister Giorgia Meloni sued two men for defamation after they allegedly made deepfake porn videos featuring her face.
The IWF has said that it has observed a “slow but continual” increase in the proportion of child sexual abuse imagery that is AI-generated, although there are as yet no official statistics on its prevalence in UK criminal cases.
Following this sexual harm prevention order, it will be interesting to see whether further generative AI tool bans are imposed, both in cases involving deepfake AI tech generating adult porn content and in those involving child abuse material.
A Crown Prosecution Service spokesperson said: “Where we perceive there is an ongoing risk to children’s safety, we will ask the court to impose conditions, which may involve prohibiting use of certain technology.”