Twitter has scrapped plans to launch an OnlyFans-style adult content subscription feature that would have allowed creators to sell content on the social media platform, opening up a potentially valuable new revenue stream for both parties.
An investigation by The Verge found that Twitter dropped the plan, known at various stages as Adult Content Monetization and Adult Creator Monetization (both shortened to ACM), because the platform was unable to stay on top of moderating illegal content, including material involving children.
With porn creators' trust in OnlyFans eroded and few mainstream social media platforms allowing explicit content, Twitter recognized huge potential for porn creators to earn from subscriptions on the platform, with the company taking a cut. According to documents viewed by The Verge, the company saw this as “consistent with Twitter’s principles in free speech and freedom of expression”.
Many adult content creators already use Twitter to promote themselves, often linking out to revenue-generating sites such as OnlyFans, so the social media site had a ready-made user base for ACM. The company was reportedly prepared to accept losing advertisers and facing greater scrutiny from the public and the US government for promoting porn, given the extra income ACM would generate.
Twitter was also reportedly planning to obtain a money transmitter license, allowing the company to handle ACM payments legally. With financial companies such as Mastercard and Visa cutting ties with some porn companies and sites, this could have been another huge benefit for porn creators.
In September 2021, Twitter rolled out Super Follows, a subscription feature allowing users to charge money for non-explicit content. However, the company put the brakes on ACM in spring 2022 when it realized how much strain the feature could add to its already allegedly inadequate content moderation processes, particularly around material involving children.
According to the documents The Verge viewed, a Twitter team working on ACM wrote that “we cannot proactively identify violative content and have inconsistent adult content [policies] and enforcement.”
They added: “We have weak security capabilities to keep the products secure.”
Like many big tech companies, Twitter has used PhotoDNA, a Microsoft-created database of known child sexual exploitation (CSE) images, to flag such material and help crack down on CSE on its platform. Platforms using PhotoDNA are required by law to report CSE findings to the US government-funded National Center for Missing and Exploited Children (NCMEC).
According to Twitter documents, the NCMEC said that of the one million reports of CSE made by Twitter each month, 84 percent involved material not in the PhotoDNA database, suggesting that large amounts of new CSE material on Twitter were going undetected.
Twitter reports showed that the platform’s manual CSE detection processes created backlogs, and that its machine learning tools weren’t detecting new instances of CSE in tweets and videos.
The result is a seriously concerning potential for widespread CSE on Twitter, and a closed-off opportunity for creators in an increasingly tough earning landscape for people making a living from porn.
A Twitter spokesperson told The Verge: “Twitter has zero tolerance for child sexual exploitation. We aggressively fight online child sexual abuse and have invested significantly in technology and tools to enforce our policy. Our dedicated teams work to stay ahead of bad-faith actors and to help ensure we’re protecting minors from harm – both on and offline.”