Facebook bans anti-vax ads — but anti-vax groups remain safe on the platform

In light of flu season and COVID-19 vaccine research, the social media giant draws a soft line in the sand

By Nicole Karlis

Senior Writer

Published October 14, 2020 5:47PM (EDT)

Mark Zuckerberg | Anti-Vax protester (Photo illustration by Salon/Getty Images)

On Tuesday, Facebook announced that it will ban ads that promote an anti-vaccine agenda as the country approaches flu season. While the ban will affect paid advertisements with anti-vaccination messaging, Facebook's move oddly won't affect organically shared content surfacing from the dozens of anti-vax groups known for using the social media platform to radicalize people.

The announcement was made in a blog post in conjunction with news about "a new flu vaccine information campaign." Specifically, Facebook said, as part of a new global policy the tech company will prohibit "ads discouraging people from getting vaccinated."

"We don't want these ads on our platform," Kang-Xing Jin, Facebook's Head of Health, and Rob Leathern, Director of Product Management, wrote in the blog post. "Our goal is to help messages about the safety and efficacy of vaccines reach a broad group of people, while prohibiting ads with misinformation that could harm public health efforts."

Previously, Facebook prohibited ads that promoted "vaccine hoaxes" identified by leading global health organizations, like the World Health Organization (WHO) or the US Centers for Disease Control and Prevention (CDC). The new policy goes further: it will reject any ad that "explicitly" discourages a person from getting a vaccine.

"Enforcement will begin over the next few days," the authors of the announcement said.

However, there are some caveats. Specifically, Facebook ads that advocate for or against legislation or policies about vaccines are still allowed on the social media platform.

"We'll continue to require anyone running these ads to get authorized and include a 'Paid for by' label so people can see who is behind them," the blog post stated. "We regularly refine our approach around ads that are about social issues to capture debates and discussions around sensitive topics happening on Facebook. Vaccines are no different. While we may narrow enforcement in some areas, we may expand it in others."

In other words, it's a loophole. And organic misinformation that promotes anti-vax rhetoric won't be prohibited, either.

Certainly Facebook is no stranger to dealing with anti-vaccination conspiracies on its platforms. In March 2019, the company announced that it would reject ads containing vaccine misinformation, yet some still slipped through the cracks.

According to the announcement, Facebook will be working with organizations like WHO and UNICEF on "public health messaging campaigns to increase immunization rates."

Facebook has been in the news a lot over the last couple of weeks for announcing a series of content moderation policies after long resisting the role of arbiter of truth on its platform. Recently, the company announced bans on Holocaust denialism and on groups promoting the QAnon conspiracy theory, as well as an indefinite ban on political ads after the Nov. 3 election.

However, Facebook continues to resist a zero-tolerance policy, which experts say is needed for the social media company to tackle the misinformation running rampant on its platform. As STAT News recently reported, Facebook could take a cue from Pinterest, which has implemented a zero-tolerance policy on vaccine misinformation.

For years, anti-vaccination advocates have used Facebook to organically spread public health misinformation, and the social media giant has undoubtedly, if inadvertently, contributed to a growing global vaccine-skeptic movement. In May, researchers published a study in Nature showing a rise in followers of pages promoting anti-vaccine rhetoric on the platform between February and October of 2019. The researchers found that pages spreading vaccine misinformation were both more numerous and growing faster than pages sharing factual vaccine content.

On Facebook, there are dozens of groups centered around "stopping" mandatory vaccinations, in which members give each other "health advice." In one exchange in a popular anti-vaccine group, members advised a woman to give her son thyme and elderberries instead of Tamiflu; the boy died from the flu in 2019, according to CBS.

According to a new report by the Centre for Countering Digital Hate, 31 million people follow anti-vaccine groups on Facebook. Its researchers found that these groups often "radicalize" those who are skeptical.

"The 64 groups identified in our research provide spaces for anti-vaccine misinformation to be shared with large audiences with little or no opportunity for scrutiny, challenge or oversight," the report states. "This makes them ripe for the process of radicalisation as posts in line with each group's prevailing values receive approval in the form of likes, while posters expressing contrary views are swiftly removed."

In 2019, the World Health Organization declared "vaccine hesitancy" a public health threat. More troubling, only about half of U.S. adults said they would definitely or probably get a COVID-19 vaccine, according to Pew Research Center.

"There's been a lot of anti-science, anti-public health framing of the pandemic in recent times," Dr. Monica Schoch-Spana, a medical anthropologist and senior scholar with the Johns Hopkins Center for Health Security, said in a statement.

