YouTube Bans Anti-Vaccine Misinformation – The New York Times

by Mary Sewell

YouTube said on Wednesday that it was banning the accounts of several prominent anti-vaccine activists from its platform, including those of Joseph Mercola and Robert F. Kennedy Jr., as part of an effort to remove all content that falsely claims that approved vaccines are dangerous.

In a blog post, YouTube said it would remove videos claiming that vaccines do not reduce transmission rates or contraction of the disease and content that includes misinformation on the makeup of the vaccines. Claims that approved vaccines cause autism, cancer or infertility, or that the vaccines contain trackers will also be removed.

The platform, owned by Google, has had a similar ban on misinformation about the Covid-19 vaccines. But the new policy expands the rules to misleading claims about long-approved vaccines, such as those against measles and hepatitis B, as well as to falsehoods about vaccines in general, YouTube said. Personal testimonies relating to vaccines, content about vaccine policies and new vaccine trials, and historical videos about vaccine successes or failures will remain on the site.

“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board” in policies that bring its users high-quality information, the company said in its announcement. In addition to banning Dr. Mercola and Mr. Kennedy, YouTube removed the accounts of other prominent anti-vaccination activists such as Erin Elizabeth and Sherri Tenpenny, a company spokeswoman said.

The new policy puts YouTube more in line with Facebook and Twitter. In February, Facebook said it would remove posts with erroneous claims about vaccines, including assertions that vaccines cause autism or that it is safer for people to contract the coronavirus than to receive vaccinations against it. But the platform remains a popular destination for people spreading misinformation, such as the unfounded claim that the drug ivermectin is an effective treatment for Covid-19.

In March, Twitter introduced its own policy that explained the penalties for sharing lies about the virus and vaccines. But the company allows five “strikes” before it permanently bars people who violate its coronavirus misinformation policy.

The accounts of high-profile anti-vaccination activists like Dr. Mercola and Mr. Kennedy remain active on Facebook and Twitter — although Instagram, which is owned by Facebook, has suspended Mr. Kennedy’s account.

For years, misinformation researchers have pointed to the proliferation of anti-vaccine content on social networks as a factor in vaccine hesitancy — including slowing rates of Covid-19 vaccine adoption in more conservative states. Reporting has shown that YouTube videos often act as the source of content that goes viral on platforms like Facebook and Twitter, sometimes racking up tens of millions of views.

“One platform’s policies affect enforcement across all the others because of the way networks work across services,” said Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation. “YouTube is one of the most highly linked domains on Facebook, for example.”
