YouTube on Wednesday announced it is banning “harmful vaccine content” from the platform, a long-awaited shift that vastly expands its existing policy on medical misinformation.
The new rules explicitly ban content that makes false claims about any “currently administered vaccines that are approved and confirmed to be safe and effective” by local health authorities and the World Health Organization. Previously, similar rules applied only to misinformation about COVID-19 vaccines and to videos that uncritically presented unproven COVID treatments like hydroxychloroquine and ivermectin.
In an emailed statement, Matt Halprin, YouTube’s vice president of global trust and safety, said the company didn’t implement the new rules sooner because it was focused on COVID. Its work countering coronavirus falsehoods is now informing its efforts to identify and remove all anti-vaccine content.
“We dedicated a lot of resources to COVID-19 vaccine misinformation, and learned important lessons about developing and enforcing this type of policy at scale. Developing robust policies takes time, as does preparing for enforcement in many different languages, and we wanted to launch a policy that is comprehensive, enforceable with consistency, and adequately addresses the challenge.”
In addition to claims about specific routine immunizations, such as those for measles or hepatitis B, broad claims about vaccines in general are also off-limits. Per YouTube, that includes:
– Content that falsely alleges that approved vaccines are dangerous and cause chronic health effects
– Claims that vaccines do not reduce transmission or contraction of disease, or content that contains misinformation about the substances in vaccines […]
– Content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them.
The company didn’t respond to questions about how offending content will be identified and removed. While artificial intelligence can filter and screen out specific keywords, it tends to be imprecise and can be easily circumvented.
Targeting the most prolific spreaders of false claims may go a long way toward curtailing the content. In March, a study by the Center for Countering Digital Hate found that just 12 people are responsible for 65% of the anti-vaccine content on Facebook, Instagram, and Twitter.
Among the so-called “Disinformation Dozen” are Joseph Mercola, who sells dietary supplements, and Robert F. Kennedy Jr., the son of the late Sen. Robert F. Kennedy and chairman of Children’s Health Defense, an anti-vaccine group.
YouTube said it banned both of their channels, in addition to a number of other high-profile anti-vaccine channels. Notably, both Mercola and Kennedy still have massive presences on Facebook. Despite Facebook’s repeated claims to have curtailed anti-vaccine content, medical misinformation there continues to run rampant.