A new online anti-revenge porn app, funded by the UK’s Department of Digital, Culture, Media and Sport (DCMS), is set to launch in March 2023.
Minerva, developed by the South West Grid for Learning (SWGfL) charity, will allow people in the UK to keep a digital diary of any potential instances of image-based revenge porn committed against them, using timestamped screenshots, photos and videos.
Once this portfolio of evidence has been collated it can be submitted to the police, with the app offering links to support services such as the SWGfL’s Revenge Porn Helpline.
“Minerva arose out of our frustrations around the limitations of a nine ’til five office helpline to support victims of online abuse,” Sophie Mortimer, Revenge Porn Helpline’s manager, says.
She added: “The online world is omnipresent in our lives, and as such, online abuse knows no boundaries or timeframes. We realized we needed to harness technology to tackle online abuse 24/7 and help improve conviction rates.”
In December 2022 the Online Safety Bill, the UK government’s flagship internet regulation bill that is expected to toughen up management of online porn and other explicit content, returned to parliament.
StopNCII
Meanwhile, TikTok and Bumble have joined Facebook and Instagram in signing up to another anti-revenge porn service, which has been used by over 12,000 people so far.
The video-sharing app and the female-first dating app already have pretty stringent policies banning porn and other explicit content. Their move to sign up with Stop Non-Consensual Intimate Image Abuse (StopNCII) cracks down further on the potential for revenge porn images or videos to be uploaded to them.
How does the anti-revenge porn app work?
If someone is being threatened by the potential release of revenge porn content, or just fears that it may be leaked, they can submit hashes of the images or video to StopNCII’s website.
A hash (also known as a ‘digital fingerprint’) is created on the user’s device through an algorithm that assigns a unique hash value to an image. “Duplicate copies of the image all have the exact same hash value,” says the StopNCII FAQs. These are then added to StopNCII’s hash bank. The original content is never sent to StopNCII – just the hashes.
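To make the "digital fingerprint" idea concrete, here is a minimal Python sketch using a SHA-256 digest. This is purely illustrative: the article does not specify StopNCII's actual algorithm, and the `fingerprint` function here is a hypothetical stand-in. A cryptographic digest like this one gives byte-identical duplicates the exact same hash value, as the FAQ describes, while revealing nothing about the image itself.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Hypothetical stand-in for the on-device hashing step:
    # a cryptographic digest assigns duplicate copies of an
    # image the exact same value, and never exposes the content.
    return hashlib.sha256(image_bytes).hexdigest()

original = b"\x89PNG...image data..."
duplicate = bytes(original)          # an exact copy
different = b"\x89PNG...other image..."

print(fingerprint(original) == fingerprint(duplicate))  # True
print(fingerprint(original) == fingerprint(different))  # False
```

Note that a plain cryptographic digest only matches byte-identical copies; catching a resized or re-compressed version of the same image would require a perceptual hash, which is what image-matching systems of this kind typically rely on in practice.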
If an image or video corresponding to a hash in the bank is uploaded to Facebook or Instagram – and now TikTok or Bumble – the file is flagged for moderation by StopNCII, then potentially removed and blocked.
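The matching step on the platform side can be pictured as a membership check against the shared hash bank. This is a hypothetical sketch under the same SHA-256 assumption as above; the partner platforms' real moderation pipelines are more involved.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Hypothetical stand-in for the on-device hashing step.
    return hashlib.sha256(image_bytes).hexdigest()

# The hash bank holds only user-submitted fingerprints;
# the original images never leave the user's device.
hash_bank = {fingerprint(b"...victim's image bytes...")}

def screen_upload(upload_bytes: bytes) -> str:
    # A partner platform hashes each upload and checks the bank:
    # matches are routed to moderation instead of being published.
    if fingerprint(upload_bytes) in hash_bank:
        return "flag for moderation"
    return "allow"

print(screen_upload(b"...victim's image bytes..."))  # flag for moderation
print(screen_upload(b"...unrelated upload..."))      # allow
```

The design point this illustrates is that platforms never need to hold or exchange the intimate images themselves – comparing fingerprints is enough to block re-uploads of known content.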
Obviously, you can’t create a hash for an image or video that only a potential revenge porn perpetrator has access to, somewhat limiting the service’s effectiveness. However, you could potentially create one immediately after you see it uploaded somewhere online.
StopNCII said that after one year of operation over 12,000 people had created cases with the service, with more than 40,000 hashes generated.
“With more of our lives spent online, non-consensual image abuse is a growing issue that, like other forms of sexual harassment, disproportionately targets and impacts women. We’re proud to be partnering with StopNCII.org to fight against intimate image abuse and ensure that the wider internet is a safer space,” Lisa Roman, Vice President of public policy at Bumble, says in a statement.
TikTok, which has a younger average user base than Facebook, Instagram and Bumble, doesn’t allow use of words such as ‘clitoris’ or ‘orgasm’, let alone sexually explicit imagery. This isn’t for lack of TikTok users trying, though, with platform moderators reportedly having to deal with “millions” of explicit videos submitted for upload.
Julie de Bailliencourt, TikTok’s head of product policy, said: “Our goal at TikTok is to foster a safe and supportive environment for our community, and there’s no place for this kind of malicious behavior or content on our platform. We’re proud to partner with StopNCII.org to strengthen efforts to stop the spread of non-consensual intimate imagery and better support victims.”