TikTok is shifting from human-led to automated video moderation and removal in the US and Canada, for videos featuring banned content such as nudity.
The video-sharing app, which previously only removed videos for content violations after they’d been checked by human moderators, is rolling out the automated system in the two countries this month.
Eric Han, TikTok’s head of US safety, said that the system will be imposed for “content categories where our technology has the highest degree of accuracy”. These include violations of TikTok’s policies on “adult nudity and sexual activities”, violent and graphic content, minor safety, and illegal activities and regulated goods.
TikTok is in line with other social media giants such as Facebook and Instagram in not allowing nudity on its platform. Twitter does allow nudity, and looks set to embrace adult content further with its Super Follows function.
TikTok’s move is likely to speed up the platform’s video removal process. In a statement, Han said that it would also mean that human moderators will be exposed to fewer harmful videos.
“In addition to improving the overall experience on TikTok, we hope this update also supports resiliency within our Safety team by reducing the volume of distressing videos moderators view and enabling them to spend more time in highly contextual and nuanced areas, such as bullying and harassment, misinformation, and hateful behaviour,” he said.
TikTok’s low error rate for its automated video removal system is likely to have given the platform’s bosses confidence in the move. It said that only five percent of videos removed by the automated system were taken down in error.
Despite the low error rate, the sheer volume of videos on TikTok means that the move could still result in a large amount of content being taken down in error. In the first three months of 2021, over 8.5 million videos were removed from the app in the US alone.
“While we strive to be consistent, neither technology nor humans will get moderation decisions correct 100 percent of the time, which is why it’s important that creators can continue to appeal their content’s or account’s removal directly in our app,” Han said.
“If their content or account has been incorrectly removed, it will be reinstated, the penalty will be erased, and it will not impact the account going forward. Accrued violations will expire from a person’s record over time.”
TikTok users will be able to appeal video removals directly in the app. As before the automated measures took effect, they can also report potential violations in other users’ videos for review.
Han said: “While no technology can be completely accurate in moderating content, where decisions often require a high degree of context or nuance, we’ll keep improving the precision of our technology to minimize incorrect removals.”
For a TikTok user’s first violation resulting in a video removal, they will receive a warning in the app — unless the warning is for a zero-tolerance subject such as child sexual abuse, in which case they get an automatic ban.
If further violations occur, accounts can be suspended for 24 or 48 hours, or restricted to view-only permissions for up to one week. Persistent rule violations can result in an account being permanently removed.