YouTube Cracks Down on All Anti-Vaxx Content in an Attempt to Limit Harmful Misinformation about COVID-19 Vaccines

YouTube announced earlier this week that it is banning all anti-vaxx videos and other content spreading misinformation about vaccines and common illnesses on the platform.

Specifically, the platform is determined to remove all content that makes false claims about vaccines that have been proven safe and effective and officially approved by federal health officials.

The move was first reported by the Washington Post and appears intended to keep panic and uncertainty about getting the COVID-19 shots from spreading.

Among the disproven conspiracy theories now forbidden on the platform are claims that vaccines cause cancer, autism, or infertility, or that they contain microchips, all of which are objectively untrue and not a matter of speculation or personal opinion.

This is not the first time YouTube has banned false information about COVID-19 vaccines; it did so a year earlier, in October of 2020.

At the time, however, the company noted that it would still allow open conversations about new vaccine trials and policies, as well as discussions of people’s personal experiences receiving any of the available shots.

A spokesperson for the company also told Insider that YouTube would be removing the channels of some of the most visible anti-vaxxers on the platform, including Robert F. Kennedy Jr., nephew of the late President John F. Kennedy, and author and anti-vaxx activist Joseph Mercola.

As you might be aware, Kennedy Jr. is one of twelve influencers that a report identified as the most prominent and dangerous spreaders of misinformation about the COVID-19 pandemic.

The new update thus expands rules already in place around vaccination, part of a broader effort to combat false information spreading online.

YouTube’s vice-president of global trust and safety, Matt Halprin, shared with the Post that “Developing robust policies takes time. We wanted to launch a policy that’s comprehensive, enforceable with consistency and adequately addresses the challenge.”

YouTube, however, is not the only social media platform implementing such safety policies meant to protect users from harmful and false information.

Twitter and Facebook are likewise working to limit vaccine misinformation on their platforms as much as possible.

Even so, fake content still manages to make its way onto these platforms, sometimes with disastrous results.

One example is the spread of the disproven belief that ivermectin, a drug widely used as a livestock dewormer, is effective in treating or preventing COVID-19 infections.

In reality, the concentrated veterinary formulations of the drug made for large animals are dangerous to human health, and a concerning number of people have ended up in the emergency room with poisoning after believing unreliable online posts about its supposed effectiveness.
