YouTube is expanding its vaccine misinformation policies to include all vaccines approved by health agencies, pursuing more aggressive enforcement against anti-vaccine content and deleting specific accounts.
The social media company will remove any content that “falsely alleges” vaccines approved by the World Health Organization (WHO) are dangerous or ineffective at reducing transmission, as well as content that claims the vaccines contain certain substances, YouTube said in a blog post Wednesday announcing its new guidelines. Examples include claims that vaccines cause autism or infertility, or that they contain microchips and tracking devices, the company said.
YouTube will also remove conspiratorial vaccine content, such as claims that vaccines are part of a depopulation agenda. The tech company said it implemented these new policies after consulting with health experts.
YouTube is also removing specific accounts deemed to have a pattern of sharing vaccine misinformation, YouTube executives told several outlets Wednesday. These accounts include anti-vaccine advocates Joseph Mercola, Erin Elizabeth, Sherri Tenpenny, and Robert F. Kennedy Jr., all of whom were featured in a report titled “The Disinformation Dozen” by the Center for Countering Digital Hate, which the White House cited when urging Facebook to crack down on COVID-19 misinformation.
“There is information, not from us, but information from other researchers on health misinformation that has shown the earlier you can get information in front of someone before they form opinions, the better,” Garth Graham, YouTube’s global head of health care and public health partnerships, told reporters.
YouTube said it would make exceptions for certain types of content that express skepticism about vaccines, such as personal testimonies. However, such content would be removed “if the speaker then goes on to generalize and make calls for all parents not to vaccinate or makes broad claims about vaccines not being safe or effective,” Matt Halprin, the company’s vice president of global trust and safety, told reporters.
The company will also permit content about “vaccine policies, new vaccine trials, and historical vaccine successes or failures” in the interest of scientific discussion and debate.
YouTube has already removed over one million videos for promoting COVID-19 misinformation, including misinformation related to the COVID-19 vaccine. The company attracted controversy in August for suspending Sen. Rand Paul’s account over claims related to the efficacy of masks in curbing the spread of the virus.
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact [email protected]