New Mexico claims there’s 10x more ‘child exploitative content’ on Instagram and Facebook than Pornhub or OnlyFans

New Mexico is suing Facebook and Instagram parent company Meta for failing to keep child sexual abuse material (CSAM) off its platforms.

The state of New Mexico’s attorney general has claimed in a lawsuit against Meta that Facebook and Instagram feature more “certain child exploitative content” on their platforms than Pornhub and OnlyFans.

Raúl Torrez, the state’s attorney general, is suing Mark Zuckerberg’s company, claiming that the firm’s social media platforms do not do enough to tackle CSAM and in doing so undermine the health of New Mexico children.

Meta told CNBC that the company uses “sophisticated technology” and reporting measures to help “root out predators” distributing CSAM.

The lawsuit comes at a time when online platforms are facing more pressure from authorities to enforce rigorous age-verification processes to prevent minors from accessing porn and other explicit content. The state of New Mexico is suing Meta for engaging in “unfair trade practices” by allowing CSAM to be distributed and facilitating the trafficking of minors.


The state also alleges that the algorithms on Meta’s platforms promote sexual content, and that the company does not enforce proper age verification or identify child sexual exploitation networks.

Meta said that “in one month alone, we disabled more than half a million accounts for violating our child safety policies.”

The New Mexico attorney general’s office said it conducted an investigation before filing the lawsuit, which “found that certain child exploitative content is over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans.”

In the lawsuit, the office claimed that “OnlyFans has prohibited certain adult sexual content related to dominating father figures and young girls (“DDLG”) that has the potential to cross over to child sexual exploitation.”

“Yet, investigators found numerous posts and accounts on Instagram that depicted or promoted choking, slapping, tying up, engaging in sex with, and otherwise abusing little girls,” it added.

In its statement to CNBC, Meta added that it takes measures to “hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.”



Jamie F

Jamie is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.

