Mahon shared the statistic in a blog post outlining how the company approaches misinformation on its platform. “Misinformation has moved from the marginal to the mainstream,” he wrote. “No longer contained to the sealed-off worlds of Holocaust deniers or 9-11 truthers, it now stretches into every facet of society, sometimes tearing through communities with blistering speed.”
At the same time, the YouTube executive argued that “bad content” accounts for only a small percentage of YouTube content overall. “Bad content represents only a tiny percentage of the billions of videos on YouTube (about .16-.18% of total views turn out to be content that violates our policies),” Mahon wrote. He added that YouTube removes almost 10 million videos each quarter, “the majority of which don’t even reach 10 views.”
Facebook recently made a similar argument about content on its platform. The social network published a report last week claiming that its most popular posts are memes and similarly innocuous content. And, faced with criticism over its handling of COVID-19 and vaccine misinformation, the company has argued that such misinformation isn’t representative of the kind of content most users see.
Both Facebook and YouTube have come under particular scrutiny for their policies around health misinformation during the pandemic. Both platforms have well over a billion users, which means that even a small fraction of content can have a far-reaching impact. And both platforms have so far declined to disclose details about how vaccine and health misinformation spreads or how many users are encountering it. Mahon also said that removing misinformation is only one aspect of the company’s approach. YouTube is also working on “ratcheting up information from trusted sources and reducing the spread of videos with harmful misinformation.”
Editor’s note: This post originally appeared on Engadget.