YouTube Bans Anti-Vaccine Misinformation
The new set of policies will cover not just the Covid-19 vaccines or those for routine immunizations against measles and hepatitis B, but will also apply to general claims about vaccines, YouTube said.
Sept. 29, 2021, 10:00 a.m. ET
By Davey Alba
YouTube said that it was banning several prominent anti-vaccine activists from its platform, including the account of Robert F. Kennedy Jr. Credit: Clemens Bilan/EPA, via Shutterstock
YouTube said on Wednesday that it was banning several prominent anti-vaccine activists from its platform, including the accounts of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous.
In a blog post, YouTube said that it would remove videos claiming that vaccines do not reduce transmission or contraction of disease, and content that includes misinformation on the contents of the vaccines. Claims that approved vaccines cause autism, cancer or infertility, or that the vaccines contain trackers will also be removed.
The platform, which is owned by Google, has had a similar ban on misinformation about the Covid-19 vaccines. But the new policy expands the rules to misleading claims about approved vaccines such as those against measles and hepatitis B, as well as to falsehoods about vaccines in general, YouTube said. Personal testimonies relating to vaccines, content about vaccine policies and new vaccine trials, and historical videos about vaccine successes or failures will be allowed to remain on the site.
“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board” in policies that bring its users high-quality information, the company said in its announcement.
Misinformation researchers have for years pointed to the proliferation of anti-vaccine content on social networks as a factor in vaccine hesitancy — including slowing rates of Covid-19 vaccine adoption in more conservative states. Reporting has shown that YouTube videos often act as the source of content that subsequently goes viral on platforms like Facebook and Twitter, sometimes racking up tens of millions of views.
YouTube said that in the past year it had removed over 130,000 videos for violating its Covid-19 vaccine policies. But this did not include what the video platform called “borderline videos” that discussed vaccine skepticism on the site. In the past, the company simply removed such videos from search results and recommendations, while promoting videos from experts and public health institutions.
This is a developing story. Check back for updates.