TikTok moderator says ‘millions’ of explicit videos are submitted on the app

Updated December 30, 2021

Details of a new lawsuit against TikTok offer a glimpse of the scale of explicit videos submitted for upload to the video-sharing app.

Candie Frazier, who worked as a TikTok video moderator, is suing TikTok and its parent company ByteDance in California. Frazier claims that she suffered “significant psychological trauma” after being exposed to a barrage of explicit content when moderating videos for the app.

In her lawsuit she states: “Every day, TikTok users upload millions of videos to its platform. Millions of these uploads include graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

TikTok’s terms state that users cannot upload videos that promote sexually explicit material or violence, or that contain “a threat of any kind, including threats of physical violence”. Moderators such as Frazier are hired by the company Telus International, which provides content moderation services for TikTok.

Frazier claimed in the suit that she had to watch between three and ten videos simultaneously during her 12-hour shifts, which included a lunch hour plus 15-minute breaks, to keep up with the huge flow of videos. She alleged that ByteDance “constantly monitors” moderators’ work, and “heavily punishes any time taken away from watching graphic videos.”

“Without this court’s intervention, ByteDance and TikTok will continue to injure content moderators”

— Candie Frazier’s lawsuit against ByteDance and TikTok

The suit states that “Plaintiff Frazier views videos of the genocide in Myanmar, mass shootings, children being raped, and animals being mutilated.”

It added that Frazier, who lives in Las Vegas, watched videos showing child abuse and child sexual assault as part of her work.

It was alleged that she also saw content showing the following: a smashed-open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; a man eating the head off a rat; a fox being skinned alive; a man falling to his death off a roof, with audio of his body hitting the ground; and school shootings that included the dead bodies of children.

Frazier is seeking unspecified monetary compensation, claiming that the job caused her to suffer anxiety, depression, and post-traumatic stress disorder (PTSD). The lawsuit states that she was also motivated to file against TikTok in the hope that it would push the company to implement safety standards for content moderators.

The lawsuit states: “Without this Court’s intervention, ByteDance and TikTok will continue to injure Content Moderators and breach the duties they owe to Content Moderators who review content on their platform.”

TikTok has over one billion monthly active users and was the most downloaded app in 2020.

Read Next: UK media watchdog targets OnlyFans, TikTok, Snapchat and Twitch with ‘robust’ age verification measures

Article by Jamie F
