YouTube is removing content that promotes “harmful or ineffective” cancer treatments or discourages viewers from getting professional medical treatment, the company announced on Tuesday. The change is being made under YouTube’s updated and streamlined medical misinformation guidelines.
The new policy targets content that promotes unproven treatments in place of approved care or as a guaranteed cure, as well as treatments that health authorities have specifically deemed harmful. For instance, a video claiming “garlic cures cancer” or “take vitamin C instead of radiation therapy” would be removed.
“When cancer patients and their loved ones are faced with a diagnosis, they often turn to online spaces to research symptoms, learn about treatment journeys, and find community,” YouTube wrote in a blog post. “Our mission is to make sure that when they turn to YouTube, they can easily find high-quality content from credible health sources. In applying our updated approach, cancer treatment misinformation fits the framework — the public health risk is high as cancer is one of the leading causes of death worldwide, there is stable consensus about safe cancer treatments from local and global health authorities, and it’s a topic that’s prone to misinformation.”
Moving forward, YouTube will apply its medical misinformation policies when content concerns a topic with a high public health risk, contradicts publicly available guidance from health authorities around the world, and is generally prone to misinformation. YouTube says it needs to preserve the important balance of removing egregiously harmful content while ensuring space for debate and discussion.
YouTube says its policies on cancer treatment misinformation go into effect today, with enforcement ramping up in the coming weeks. The company plans to promote cancer-related content from the Mayo Clinic and other authoritative sources.
The platform’s updated policies come a few years after YouTube strengthened its approach to health and vaccine misinformation during the COVID-19 pandemic. In 2020, YouTube removed COVID-19 misinformation videos from its platform. A year later, the company expanded its medical misinformation policies with guidelines banning vaccine misinformation. By that point, the company had already removed more than 1 million videos for COVID-19 misinformation. Under the updated policy, YouTube would also start removing content that spreads misinformation about vaccine safety, the efficacy of vaccines, and the ingredients in vaccines.
Last year, YouTube said it would start cracking down on videos containing abortion misinformation and take down videos deemed unsafe. The company also launched an information panel that provides viewers with information from local and global health authorities under abortion-related videos and above relevant search results.
Earlier this year, YouTube updated its guidelines for dealing with eating disorder content on its platform. Although the platform has long removed content that glorifies or promotes eating disorders, YouTube began prohibiting eating disorder content that depicts behaviors viewers could be prompted to imitate.