This week, Yale University student Yiqin Fu attracted rather more attention than she probably expected after a tweet of hers went viral. In it, she describes a Germany-based Chinese developer – and an equally unnamed team – that had “identified 100k porn actresses from around the world, cross-referencing faces in porn videos with social media profile pictures.”
As a result, the developer behind the software was branded “creepy” by some media outlets and came in for widespread criticism.
There could be more nuance to this particular situation, though, as the developer credited with building the software – @BuriedInMemory on Weibo – says he didn’t build it at all.
The original post shared on Weibo says the project was intended to “find resources with a photo”.
In a statement to SEXTECHGUIDE, BuriedInMemory explained that the software’s intended use is as a facial recognition tool (something China has been pretty big on for a while now) for finding and removing revenge porn, and that the private data would never be publicly released.
He also told us that “some 0ldsch00l [sic] hacker developed this programme”, and that he was simply using it.
He also says he recognizes that getting revenge porn removed is an uphill battle that can take a long time. The software, he says, is there to help people build evidence in such cases, and includes a sample DMCA takedown letter to point victims towards the next steps.
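None of this requires exotic technology, either. As a rough illustration of the general matching technique – and emphatically not the code BuriedInMemory describes – an off-the-shelf library such as Python’s open-source face_recognition can compare a face grabbed from a video frame against a profile picture in a handful of lines (the file names below are hypothetical):

```python
# Illustrative sketch only: the general face-matching technique, using the
# open-source face_recognition library. File names are hypothetical and this
# is not the software discussed in the article.
import face_recognition

# Encode the face found in a frame grabbed from a video.
frame = face_recognition.load_image_file("video_frame.jpg")
frame_encodings = face_recognition.face_encodings(frame)

# Encode the face in a social media profile picture.
profile = face_recognition.load_image_file("profile_picture.jpg")
profile_encodings = face_recognition.face_encodings(profile)

if frame_encodings and profile_encodings:
    # Lower distance means more similar faces; the library's default
    # match threshold is a distance of 0.6.
    distance = face_recognition.face_distance(
        [profile_encodings[0]], frame_encodings[0]
    )[0]
    verdict = "likely match" if distance < 0.6 else "no match"
    print(f"Face distance: {distance:.3f} ({verdict})")
```

Scale that comparison up across millions of profile pictures and you have, in essence, the kind of system the tweet describes.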
Well-intentioned dev, still creepy software?
Few people would argue that a tool that helps innocent victims of revenge porn get those videos removed from the offending sites is a bad thing, but as is so often the case with technology, it’s the intent behind its application that makes the difference.
While the motives in this instance may indeed be noble – there’s no general access to the system, and to use it you would have to provide your real eID information and verify yourself with facial recognition, so that you can only check for the existence of your own videos – there are no guarantees that the next use will be.
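In practice, that kind of gate might boil down to something like the following sketch – check that a live selfie matches the registered eID photo before any search is allowed, so users can only ever query their own face. The function and threshold here are assumptions for illustration, not the real system’s API:

```python
# Hypothetical sketch of the access gate described above: a user may only
# search for their own face. This is an assumption about how such a check
# could work, not the actual system's implementation.
import face_recognition

MATCH_THRESHOLD = 0.6  # face_recognition's default distance cutoff

def allowed_to_search(eid_photo_path: str, live_selfie_path: str) -> bool:
    """Return True only if the live selfie matches the registered eID photo."""
    eid_encodings = face_recognition.face_encodings(
        face_recognition.load_image_file(eid_photo_path))
    selfie_encodings = face_recognition.face_encodings(
        face_recognition.load_image_file(live_selfie_path))
    if not eid_encodings or not selfie_encodings:
        return False  # no face detected in one of the images
    distance = face_recognition.face_distance(
        [eid_encodings[0]], selfie_encodings[0])[0]
    return distance < MATCH_THRESHOLD
```

Of course, a check like that is only as trustworthy as whoever runs the server it lives on – which is precisely the point.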
The problem, of course, is that the threat to people’s privacy is a pretty obvious – and valid – concern. There’s also a fairly healthy dollop of misogyny mixed in, from the wording of the tweet (“help others check whether their girlfriends”) to those who would be keen to use the tool to publicly out sex workers.
Whether or not BuriedInMemory built this particular software is almost irrelevant at this point – the technology behind it isn’t novel, and deepfakes are very much a thing already.