South Korea declares deepfakes a ‘sex crime’ after rise in videos featuring K-pop stars


South Korea has become the latest country to take action on deepfake videos, after research showed that up to 25 percent of all porn deepfakes in circulation featured the country’s beloved K-Pop stars.

Since the technology came of age in 2017, the country has seen a particularly heavy influx of deepfakes, reports Korea JoongAng Daily.

The key to why deepfakes are so eerily accurate lies in the underlying machine learning (ML): a model learns facial features from both the source and target faces, then 'intelligently' maps the source's features onto the host footage.

The more photos the machine can learn from, the better the result; conversely, the greater the difference between the two faces, the less convincing the output.
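The classic face-swap pipeline behind most of these videos is a pair of autoencoders that share one encoder: each decoder learns to reconstruct one person's face, and at swap time person A's face is encoded and then decoded with person B's decoder. A toy sketch of that data flow, purely illustrative, with random linear maps standing in for the trained neural networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 8x8 greyscale "face" and a small latent code.
FACE_DIM, LATENT_DIM = 64, 8

# One shared encoder, one decoder per identity. In a real deepfake these
# are deep convolutional networks trained on thousands of photos; random
# matrices here only illustrate the architecture, not trained weights.
shared_encoder = rng.standard_normal((LATENT_DIM, FACE_DIM))
decoder_a = rng.standard_normal((FACE_DIM, LATENT_DIM))
decoder_b = rng.standard_normal((FACE_DIM, LATENT_DIM))

def encode(face):
    # The shared encoder compresses any face to a latent code that
    # captures pose and expression, not identity.
    return shared_encoder @ face

def swap_face(face_of_a):
    # Encode person A's face, then decode with person B's decoder:
    # decoder B "repaints" A's pose and expression with B's identity.
    return decoder_b @ encode(face_of_a)

face_a = rng.standard_normal(FACE_DIM)
swapped = swap_face(face_a)
print(swapped.shape)  # same size as the input face
```

This also explains why more training photos help: both decoders must see the face from many angles and expressions before the swap looks plausible at every frame of a video.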

Although the technology is not used exclusively for pornography (it's ripe for misuse in propaganda, too), it can be used to create fake sexual content, often featuring celebrities, usually women. According to a 2019 report, 96 percent of all deepfake videos are explicit in nature.

Korea is second only to America in the number of fakes featuring its citizens, and now the country is fighting back.

A revision of the existing 'Act on Special Cases Concerning the Punishment, Etc. of Sex Crimes' has already been passed and will kick in on June 25, 2020. Under the new rules, anyone caught making or distributing deepfake videos or similar (basically, any videos that manipulate the image of a living person against their will or without their knowledge) will be subject to up to five years in prison, or a fine of up to 50 million won ($40,500).

The big difference the law brings is the classification of deepfakes as a sex crime, rather than a simple identity crime.

The big problem is that unless the video was made on Korean soil, there's little that law enforcement can do, except show its teeth and hope it looks scary.

Additionally, the law doesn't punish viewers of such videos, though that's probably for the best, given that you could be forgiven for not realising it was a deepfake in the first place. That's sort of the crux of the problem, though.

Lawmakers have already said that this is just a first step on the road to protecting citizens from deepfakes. But it's an important marker: a country that has seen such a rise in deepfakes is taking charge of trying to control them.

Read Next: DeepNude Shuts Down After Going Viral, but DeepFakes Aren’t Going Away


Chris M

Chris has worked in technology journalism for over a decade, and brings his nerdy expertise to looking at what goes on under the hood of sex tech.
