
TikTok moderator says ‘millions’ of explicit videos are submitted on the app

Jamie F
Updated December 30, 2021
Published December 30, 2021

The scale of the influx of explicit videos submitted for upload to TikTok has been glimpsed in the details of a new lawsuit against the video-sharing app.

Candie Frazier, who worked as a TikTok video moderator, is suing TikTok and its parent company ByteDance in California. Frazier claims that she suffered “significant psychological trauma” after being exposed to a barrage of explicit content when moderating videos for the app.

In her lawsuit she states: “Every day, TikTok users upload millions of videos to its platform. Millions of these uploads include graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

TikTok’s terms state that users cannot upload videos that promote sexually explicit material or violence, or that contain “a threat of any kind, including threats of physical violence”. Moderators such as Frazier are hired by the company Telus International, which provides content moderation services for TikTok.

Frazier claimed in the suit that, to keep up with the huge flow of videos, she had to watch between three and ten videos simultaneously during her 12-hour shifts, which included an hour for lunch and 15-minute breaks. She alleged that ByteDance “constantly monitors” moderators’ work and “heavily punishes any time taken away from watching graphic videos.”

“Without this court’s intervention, ByteDance and TikTok will continue to injure content moderators”

Candie Frazier’s lawsuit against ByteDance and TikTok

The suit states that “Plaintiff Frazier views videos of the genocide in Myanmar, mass shootings, children being raped, and animals being mutilated.”

It added that Frazier, who lives in Las Vegas, watched videos showing child abuse and child sexual assault as part of her work.

It was alleged that she also saw content showing the following: a smashed-open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; a man eating the head off a rat; a fox being skinned alive; a man falling to his death from a roof, with audio of his body hitting the ground; and school shootings that included the dead bodies of children.

Frazier is seeking unspecified monetary compensation, claiming that the job caused her to suffer anxiety, depression, and post-traumatic stress disorder (PTSD). The lawsuit states that she was also motivated to file against TikTok in the hope that it would push the company to implement safety standards for content moderators.

The lawsuit states: “Without this Court’s intervention, ByteDance and TikTok will continue to injure Content Moderators and breach the duties they owe to Content Moderators who review content on their platform.”

TikTok has over one billion monthly active users and was the most downloaded app in 2020.

Read Next: UK media watchdog targets OnlyFans, TikTok, Snapchat and Twitch with ‘robust’ age verification measures

Article by
Jamie F is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice, among others. He is also the creative force behind the Audible podcast Beast Master.
