TikTok removed 81 million videos for violations in Q2, representing less than 1% of uploads

TikTok released a transparency update about content removed from the platform between April 1 and June 30, 2021. The platform says it removed 81,518,334 videos for violating its community guidelines or terms of service during that period, which represents less than 1% of all videos posted. That ratio also implies that more than 8.1 billion videos were uploaded to TikTok over the quarter, averaging out to roughly 90 million videos posted each day.
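For anyone who wants to sanity-check those figures, a quick back-of-the-envelope sketch (assuming a 91-day quarter and treating the sub-1% removal rate as a 1% ceiling) reproduces them:

```python
# Rough check of TikTok's Q2 2021 upload math, assuming a 91-day quarter
# and treating the "less than 1%" removal rate as a 1% ceiling.
removed = 81_518_334              # videos removed in Q2 2021
removal_rate = 0.01               # removals were under 1% of uploads
days_in_quarter = 91              # April 1 through June 30

total_uploads = removed / removal_rate        # lower bound on total uploads
uploads_per_day = total_uploads / days_in_quarter

print(f"Total uploads: at least {total_uploads / 1e9:.2f} billion")  # ~8.15 billion
print(f"Per day: roughly {uploads_per_day / 1e6:.0f} million")       # ~90 million
```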

TikTok rolled out technology in July that automatically removes content violating its policies on minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities. The app uses automation for these categories in particular because, it says, they are the areas where its detection technology is most accurate. TikTok also has human content moderators on its safety team who review reported content.

When the automated technology was announced, TikTok said its false-positive rate for removals (cases where it takes down content that doesn't actually violate the rules) is 5%. That figure roughly matches the transparency report released today: about 4.6 million of the 81 million removed videos were later reinstated. The most common reason for removal, accounting for 41.3% of removed videos, was violation of minor safety guidelines. TikTok says that 94.1% of all removed content was identified and taken down before a user reported it.
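That reinstatement share actually works out slightly above the stated rate; a one-line check (using the report's rounded 4.6 million figure) makes the comparison concrete:

```python
reinstated = 4_600_000      # videos reinstated after removal, per the report
removed = 81_518_334        # total videos removed in Q2 2021

# Share of removals later reinstated, i.e. the observed false-positive rate
print(f"Reinstated share: {reinstated / removed:.1%}")  # ~5.6%, near the stated 5%
```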

Still, within that 5% of falsely removed content lie some troubling examples of failed content moderation, like when popular Black creator Ziggi Tyler pointed out that the app would flag his account bio as “inappropriate” when he included phrases like “Black Lives Matter” or “I am a black man.” Meanwhile, phrases like “supporting white supremacy” weren’t immediately censored.

As platforms like YouTube and Facebook move to ban content with vaccine misinformation, other prominent social media companies like TikTok must decide how to tackle similar issues. TikTok’s community guidelines ban content that’s false or misleading about COVID-19 and vaccines, as well as broader anti-vaccine disinformation.

In Q2, TikTok removed 27,518 videos due to COVID-19 misinformation. TikTok says that 83% of these videos were removed before they were reported by users, 87.6% of these videos were removed within 24 hours of posting, and 69.7% of the videos had zero views. Meanwhile, TikTok’s COVID-19 information hub, developed with information from the WHO and CDC, was viewed over 921 million times globally.

Numbers-wise, it’s no secret that TikTok continues to grow, as the app crossed the 1 billion monthly active user milestone last month. The numbers in these transparency reports also reflect that growth: in the first half of 2020, TikTok removed 104 million videos, which also represented less than 1% of all content posted. That removal rate is consistent with the data published today, but today’s report covers 81 million videos removed in a single quarter, rather than two.
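As a rough comparison (assuming H1 2020's removals were spread evenly across its two quarters), the per-quarter removal volume has grown by more than half between the two reports:

```python
# Per-quarter removal volumes across the two transparency reports,
# assuming H1 2020's removals were split evenly between its two quarters.
h1_2020_removed = 104_000_000     # videos removed in H1 2020 (two quarters)
q2_2021_removed = 81_518_334      # videos removed in Q2 2021 (one quarter)

per_quarter_2020 = h1_2020_removed / 2
growth = q2_2021_removed / per_quarter_2020

print(f"H1 2020 average: {per_quarter_2020 / 1e6:.0f} million per quarter")  # 52 million
print(f"Q2 2021 vs. that average: {growth:.2f}x")                            # ~1.57x
```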