YouTube has said it will remove content that spreads misinformation about all approved vaccines, expanding a ban on false claims about Covid-19 jabs.
Videos that say approved vaccines are dangerous and cause autism, cancer or infertility are among those that will be taken down, the company said.
The policy includes the termination of accounts of anti-vaccine influencers.
Tech giants have been criticised for not doing more to counter false health information on their sites.
YouTube bans all anti-vaccine misinformation
YouTube said on Wednesday that it was banning the accounts of several prominent anti-vaccine activists from its platform, including those of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous.
In a blog post, YouTube said it would remove videos claiming that vaccines do not reduce rates of transmission or contraction of disease, and content that includes misinformation on the makeup of the vaccines. Claims that approved vaccines cause autism, cancer or infertility, or that the vaccines contain trackers, will also be removed.
YouTube is banning prominent anti-vaccine activists and blocking all anti-vaccine content
YouTube is taking down several video channels associated with high-profile anti-vaccine activists including Joseph Mercola and Robert F. Kennedy Jr., who experts say are partially responsible for helping seed the skepticism that’s contributed to slowing vaccination rates across the country.
As part of a new set of policies aimed at cutting down on anti-vaccine content on the Google-owned site, YouTube will ban any videos that claim that commonly used vaccines approved by health authorities are ineffective or dangerous. The company previously blocked videos that made those claims about coronavirus vaccines, but not ones for other vaccines like those for measles or chickenpox.
Managing harmful vaccine content on YouTube
Crafting policy around medical misinformation comes charged with inherent challenges and tradeoffs. Scientific understanding evolves as new research emerges, and firsthand, personal experience regularly plays a powerful role in online discourse. Vaccines in particular have been a source of fierce debate over the years, despite consistent guidance from health authorities about their effectiveness. Today, we’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO.
Our Community Guidelines already prohibit certain types of medical misinformation. We’ve long removed content that promotes harmful remedies, such as claims that drinking turpentine can cure diseases. When the COVID-19 pandemic hit, we built on these policies and worked with experts to develop 10 new policies around COVID-19 and medical misinformation. Since last year, we’ve removed over 130,000 videos for violating our COVID-19 vaccine policies.
Russia threatens YouTube ban for deleting RT channels
Russia has threatened to ban YouTube if it does not reinstate two German-language channels backed by the Russian state that were deleted for violating Covid misinformation guidelines.
YouTube deletes RT’s German channels over Covid misinformation
YouTube has deleted Russian state-backed broadcaster RT’s German-language channels, saying they had breached its Covid misinformation policy.
Russia threatens YouTube block after RT TV’s German channels are deleted
Russia on Wednesday threatened to block Alphabet Inc.’s YouTube after the German-language channels of Russian state-backed broadcaster RT were deleted a day earlier, and said it was weighing possible retaliation against German media.
Why Is YouTube So Much Worse Than Facebook and Twitter at Stopping Misinformation?
On Wednesday, YouTube announced a major escalation in how it deals with content that poses a public health risk: The platform is banning misinformation related to any vaccine approved by local health authorities and the World Health Organization. YouTube’s medical misinformation policies previously prohibited the promotion of harmful untested treatments and false claims about COVID-19, but the Google subsidiary says that the pandemic has spurred it to scrutinize anti-vaccine content in general.