
YouTube cracks down on anti-vaccine misinformation

YouTube said it is removing channels that falsely claim proven vaccines are dangerous, including one belonging to anti-vaccine advocate Sherri Tenpenny./AFP

Sep 30, 2021 - 11:53 AM

SAN FRANCISCO — YouTube announced Wednesday it would remove videos and some high-profile users that falsely claim approved vaccines are dangerous, as social networks seek to crack down on health misinformation around Covid-19 and other diseases.

Video-sharing giant YouTube has already banned posts that spread myths about coronavirus treatments, including ones making inaccurate claims about Covid-19 vaccines shown to be safe.

But the Google-owned site said its concerns about the spread of medical conspiracy theories went beyond the pandemic.

“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,” YouTube said in a statement.

“We’re now at a point where it’s more important than ever to expand the work we started with Covid-19 to other vaccines.”

A YouTube spokesperson said channels of several “well-known vaccine misinformation spreaders will be terminated,” naming Joseph Mercola and Sherri Tenpenny, as well as a channel affiliated with Robert F. Kennedy Jr.

An AFP investigation found that Tenpenny runs a sprawling enterprise based on anti-vaccine activism, disdain for masks and testing, and denials that Covid-19 is real.

Mercola, a Florida-based osteopathic physician, was the subject of a New York Times article titled: “The most influential spreader of coronavirus misinformation online”.

Robert F. Kennedy Jr., meanwhile, had already been blocked from Instagram in February for spreading misinformation about Covid-19 and vaccines.

130,000 videos removed 

YouTube said the expanded policy will apply to “currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO (World Health Organization).”

It will see false claims about routine immunizations for diseases like measles and hepatitis B removed from YouTube.

These would include cases where vloggers have claimed that approved vaccines do not work, or wrongly linked them to chronic health effects.

Content that “falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them” will also be taken down.

“As with any significant update, it will take time for our systems to fully ramp up enforcement,” YouTube added.

It stressed there would be exceptions to the new guidelines, with personal testimonials of negative experiences with vaccines still allowed, so long as “the channel doesn’t show a pattern of promoting vaccine hesitancy.”

YouTube said it had removed more than 130,000 videos since last year for violating its Covid-19 vaccine policies.

On Tuesday, the company told German media that it had blocked the German-language channels of Russia’s state broadcaster RT for violating its Covid misinformation guidelines.

YouTube said it had issued a warning to RT before shutting the two channels down, but the move has prompted a threat from Moscow to block the video site.

It is not the only social media giant grappling with how to deal with the spread of Covid-19 conspiracy theories and medical misinformation in general.

Facebook this month launched a renewed effort to tackle extremist and conspiracy groups, beginning by taking down a German network spreading Covid misinformation.

