YouTube announces new vaccine content guidelines, aimed at stopping ‘harmful misinformation’


TAMPA, Fla. (WFLA) — YouTube announced a new set of vaccine misinformation content guidelines in a morning blog post, creating new rules for content that casts doubts on the safety and efficacy of COVID-19 vaccines.

The post on the company’s blog site is titled “Managing harmful vaccine content on YouTube” and announced what the video site called an expansion of medical misinformation policies.

The new guidelines direct content creators to not post content that “includes harmful misinformation about currently approved and administered vaccines” on topics such as vaccine safety, efficacy of vaccines and the ingredients in vaccines.

It’s a continuation of YouTube’s efforts to “prohibit certain types of medical misinformation.” The company announcement says the guidelines are similar to other content rules they’ve used in the past, such as removing content that “promotes harmful remedies” like drinking turpentine to cure illness.

In the past year, YouTube says it has removed more than 130,000 videos for breaking its COVID-19 vaccine policies, and has created 10 new policies to handle the misinformation problem.

Now, the expanded rules will be targeted at “content that falsely alleges that approved vaccines are dangerous and cause chronic health effects.”

The guidelines on vaccine misinformation

The YouTube guideline page breaks down what it means for each of those three topics and provides examples of the content it refers to.

From the guidelines:

  • Vaccine safety: content alleging that vaccines cause chronic side effects, outside of rare side effects that are recognized by health authorities
  • Efficacy of vaccines: content claiming that vaccines do not reduce transmission or contraction of disease
  • Ingredients in vaccines: content misrepresenting the substances contained in vaccines

As for examples, YouTube cited several types of content not allowed under the updated policy: claims that vaccines have no effect on contracting illness, that they contain substances or devices to track or identify recipients, that they contain items not on their ingredient lists, that they alter someone’s DNA, that they cause autism, or that they cause chronic side effects like cancer or diabetes.

Still, the policy change does state that some content violating the new misinformation rules might be allowed if it includes additional context in the video, audio, title or description. However, that is not blanket permission to post potentially harmful claims simply because context is included.

“This is not a free pass to promote misinformation. Additional context may include countervailing views from local health authorities or medical experts,” according to the policy.

The company also reserves the right to allow exceptions to the misinformation guidelines for content that is made “to condemn, dispute, or satirize misinformation that violates our policies.” YouTube will also allow potential exceptions for people sharing their personal experiences with vaccines, whether firsthand or from members of creators’ families.

That being said, YouTube made it clear that while sharing experiences is allowed, “we recognize there is a difference between sharing personal experiences and promoting misinformation about vaccines. To address this balance, we will still remove content or channels if they include other policy violations or demonstrate a pattern of promoting vaccine misinformation.”

Consequences for violating guidelines

Content creators who violate the new rules will be subject to a 90-day, three-strike policy.

For a first violation, the offending content will be removed and the content creator will receive an email notification. If it’s their first time violating YouTube’s rules, they’ll also get a warning and avoid a channel penalty.

However, if it’s not the first time a channel or creator has violated the site’s community guidelines, YouTube may issue a strike against the channel. A channel that receives three strikes within 90 days will be terminated.

If a single case of violation is severe enough or if the channel itself is “dedicated to a policy violation,” YouTube may terminate the channel upon a first offense.

The company says the new guidelines are also intended to remove content that promotes “vaccine hesitancy,” even if it comes in a personal testimonial.

Explaining the overall goal of the updated guidelines, YouTube’s blog reads in part:

All of this complements our ongoing work to raise up authoritative health information on our platform and connect people with credible, quality health content and sources.

Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high quality information to our viewers and the entire YouTube community.

Copyright 2021 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
