Though Facebook has made changes to limit advertisements that promote unscientific health claims, its previous policies may have inadvertently promoted anti-vaccine content while suppressing vaccination campaigns, according to new findings from a team that includes University of Maryland researchers.

According to their pioneering study of health-related Facebook ads, published Wednesday in the journal Vaccine, a small group of anti-vaccine ad buyers successfully leveraged the social media platform to reach targeted audiences, while Facebook's own efforts to improve transparency actually led to the removal of ads that promote vaccination and communicate scientific findings.

The researchers from the UMD School of Public Health, the George Washington University and Johns Hopkins University warn that social media misinformation could contribute to increasing “vaccine hesitancy,” identified by the World Health Organization as one of the top threats to global health this year. This growing resistance to vaccination threatens progress against preventable diseases like measles, which has recently seen a 30% increase in cases worldwide.

The team examined more than 500 vaccine-related ads posted on Facebook before the March 2019 policy change and archived in Facebook’s Ad Library, which since late 2018 has catalogued ad content related to important national issues. They found that 54% of ads that opposed vaccination were posted by only two groups funded by private individuals, the World Mercury Project and Stop Mandatory Vaccination, and emphasized the purported harms of vaccination. 

“The average person might think that this anti-vaccine movement is a grassroots effort led by parents, but what we see on Facebook is that there are a handful of well-connected, powerful people who are responsible for the majority of advertisements,” said Amelia Jamison, a faculty research assistant in the Maryland Center for Health Equity, and the study’s first author. “These buyers are more organized than people think.”

In contrast, ads promoting vaccination did not reflect a common or organized theme or funder, and focused instead on promoting vaccination against a specific disease in a targeted population. Examples included ads for a local Walmart’s flu shot clinic or the Gates Foundation campaign against polio. 

Yet because Facebook categorizes ads about vaccines as “political,” the platform ended up rejecting some pro-vaccine messages. That’s because pro-vaccination advertisers don’t view placing an ad for a flu vaccine clinic, for instance, as political, and can accidentally run afoul of Facebook’s stringent policies, the researchers found. The small group of anti-vaccination advertisers, meanwhile, is more organized in its approach to clearing ads with the social media giant.

“By accepting the framing of vaccine opponents—that vaccination is a political topic, rather than one on which there is widespread public agreement and scientific consensus—Facebook perpetuates the false idea that there is even a debate to be had,” said David Broniatowski, associate professor of engineering management and systems engineering at GW, and a principal investigator of the study. “This leads to increased vaccine hesitancy, and ultimately, more epidemics.”

Facebook is a pervasive presence in the lives of many people, meaning its decisions about how to handle vaccine messaging have far-reaching and serious consequences, said Sandra Crouse Quinn, professor and chair of the Department of Family Science at UMD’s School of Public Health, and a principal investigator on the study.

“In today’s social media world, Facebook looms large as a source of information for many, yet their policies have made it more difficult for users to discern what is legitimate, credible vaccine information,” Quinn said. “This puts public health officials, with limited staff resources for social media campaigns, at a true disadvantage, just when we need to communicate the urgency of vaccines as a means to protect our children and our families.”

The data for the study was collected in December 2018 and February 2019, before Facebook updated its advertising policies to limit the spread of vaccine-related misinformation. Thus, the study provides a baseline to compare how the new policies may change the reach of ads from anti-vaccine organizations. The new standards, issued in response to the proliferation of anti-vaccination misinformation that coincided with measles outbreaks across the U.S. in early 2019, included promises by Facebook to block ads that include false content about vaccines and disallow advertisers from targeting anti-vaccine ads to people who might be receptive.

Yet the messengers may simply mutate their messages, virus-like, to avoid the tightening standards. “There is a whole set of ads that focus on themes of ‘freedom’ or ‘choice’ and that elude the Facebook rules around vaccine ads,” Broniatowski said.

The research team will continue to study how anti-vaccine arguments are spreading on Facebook and how the company is responding to demands from public health organizations to clean up its act.

“While everyone knows that Facebook can be used to spread misinformation, few people realize the control that advertisers have to target their message,” said Mark Dredze, John C. Malone associate professor of computer science at Johns Hopkins. “For a few thousand dollars, a small number of anti-vaccine groups can micro-target their message, exploiting vulnerabilities in the health of the public.”