
New Mexico claims there’s 10x more ‘child exploitative content’ on Instagram and Facebook than Pornhub or OnlyFans

Jamie F
Updated December 12, 2023
Published December 12, 2023
We may earn a commission via links on our site.

The state of New Mexico’s attorney general has claimed in a lawsuit against Meta that “certain child exploitative content” is more prevalent on Facebook and Instagram than on Pornhub and OnlyFans.

Raúl Torrez, the state’s attorney general, is suing Mark Zuckerberg’s company, claiming that the firm’s social media platforms do not do enough to tackle child sexual abuse material (CSAM) and have thereby undermined the health of New Mexico children.

Meta told CNBC that the company uses “sophisticated technology” and reporting measures to help “root out predators” when it comes to CSAM.

The lawsuit comes at a time when online platforms are facing more pressure from authorities to enforce rigorous age-verification processes to prevent minors from accessing porn and other explicit content. The state of New Mexico is suing Meta for engaging in “unfair trade practices” by allowing CSAM to be distributed and facilitating the trafficking of minors.

The state also alleges that the algorithms on Meta’s platforms promote sexual content, and that the company neither enforces proper age verification nor identifies child sexual exploitation networks.

Meta said that “in one month alone, we disabled more than half a million accounts for violating our child safety policies.”

The New Mexico attorney general’s office said it conducted an investigation before filing the lawsuit, which “found that certain child exploitative content is over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans.”

In the lawsuit, the office claimed that “OnlyFans has prohibited certain adult sexual content related to dominating father figures and young girls (“DDLG”) that has the potential to cross over to child sexual exploitation.”

“Yet, investigators found numerous posts and accounts on Instagram that depicted or promoted choking, slapping, tying up, engaging in sex with, and otherwise abusing little girls,” it added.

Meta told CNBC that the company uses “sophisticated technology” and has measures to “hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.”

Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
