TikTok moderator says ‘millions’ of explicit videos are submitted on the app


The scale of the influx of explicit videos submitted for upload on TikTok has been glimpsed in the details of a new lawsuit against the video-sharing app.

Candie Frazier, who worked as a TikTok video moderator, is suing TikTok and its parent company ByteDance in California. Frazier claims that she suffered “significant psychological trauma” after being exposed to a barrage of explicit content when moderating videos for the app.

In her lawsuit she states: “Every day, TikTok users upload millions of videos to its platform. Millions of these uploads include graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

TikTok’s terms state that users cannot upload videos that promote sexually explicit material or violence, or that contain “a threat of any kind, including threats of physical violence”. Moderators such as Frazier are hired by the company Telus International, which provides content moderation services for TikTok.

Frazier claimed in the suit that she had to watch between three and ten videos simultaneously during her 12-hour shifts, which included a lunch hour plus 15-minute breaks, to keep up with the huge flow of videos. She alleged that ByteDance “constantly monitors” moderators’ work, and “heavily punishes any time taken away from watching graphic videos.”

“Without this court’s intervention, ByteDance and TikTok will continue to injure content moderators”

Candie Frazier’s lawsuit against ByteDance and TikTok

The suit states that “Plaintiff Frazier views videos of the genocide in Myanmar, mass shootings, children being raped, and animals being mutilated.”

It added that Frazier, who lives in Las Vegas, watched videos showing child abuse and child sexual assault as part of her work.

It was alleged that she also saw content showing the following: a smashed-open skull with people eating from it; a woman being kidnapped and beheaded by a cartel; a person’s head being run over by a tank; a man eating the head off a rat; a fox being skinned alive; a man falling to his death from a roof, with audio of his body hitting the ground; and school shootings that showed the dead bodies of children.

Frazier is seeking unspecified monetary compensation, claiming that the job caused her to suffer anxiety, depression, and post-traumatic stress disorder (PTSD). The lawsuit states that she was also motivated to file against TikTok in the hope that it would push the company to implement safety standards for content moderators.

The lawsuit states: “Without this Court’s intervention, ByteDance and TikTok will continue to injure Content Moderators and breach the duties they owe to Content Moderators who review content on their platform.”

TikTok has over one billion monthly active users and was the most downloaded app in 2020.



Jamie F

Jamie is a freelance writer, contributing to outlets such as The Guardian, The Times, The Telegraph, CNN and Vice.
