Danny Molochok / AP
YouTube is cracking down on the spread of disinformation by banning misleading and inaccurate vaccine content.
The platform announced the change in a blog post Wednesday, explaining that its existing community guidelines, which already prohibit the sharing of false medical information, have been extended to cover “currently administered” vaccines that have been confirmed as safe by the World Health Organization and other health authorities.
The site had previously banned content containing false claims about COVID-19 vaccines under its COVID-19 Disinformation Policy. The change extends this policy to a much larger number of vaccines.
“We have regularly seen false claims about coronavirus vaccines turn into misinformation about vaccines in general, and we are now at a point where it is more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said.
YouTube says it has already deleted pages
YouTube said it now bans videos that claim vaccines are not safe or effective or cause other health problems such as cancer and infertility. In its announcement, the company specifically flagged videos that inaccurately describe what ingredients are used in vaccines as well as claims that vaccines contain properties that can be used to “track” those who receive them.
There are a few exceptions: Users are still allowed to share content about their personal experiences with the vaccine, but only if those videos meet the site’s community guidelines and the channel in question does not consistently promote “vaccine hesitancy.”
The new mandate goes into effect immediately, and YouTube has already removed pages known for sharing anti-vaccination sentiment, such as those belonging to Joseph Mercola, Erin Elizabeth and Sherri Tenpenny, as well as Robert F. Kennedy Jr.’s children’s health advocacy organization, CNBC reported.
Company says wider enforcement will take time
But the company, which is owned by Google, has warned that the broader removal of videos could take some time as it works to enforce the policy.
As big tech companies like YouTube and Facebook have tightened their restrictions on vaccine misinformation over the past year, many conspiracy theorists have started to migrate to other, less regulated platforms. Rumble, another video-sharing site, has become a popular choice for far-right groups and others resistant to vaccines, Slate reported in March.
But many conservative pages that spread misinformation about vaccines are still active on YouTube, and their videos continue to garner millions of views.
Editor’s Note: Google is one of the financial backers of NPR.