Researchers said the results surprised them. Much of the anti-vaccine content posted on social media platforms such as Facebook and Twitter may appear to be organic, grass-roots discussions led by neighborhood groups and concerned parents, said David A. Broniatowski, an associate professor at George Washington University and one of the authors of the study.
“In fact,” said Broniatowski, who studies group decision-making, “what we are seeing is a small number of motivated interests that are trying to disseminate a lot of harmful content.” That small group of buyers used Facebook’s advertising tools effectively to reach targeted audiences.
The study was conducted before Facebook changed its policies around anti-vaccine advertising, but researchers said it provides a look at how the platform has been used to spread misinformation. The study also provides a baseline for researchers to evaluate how well Facebook’s new policies are working, said Amelia Jamison, a social science researcher at the University of Maryland and another study author.
The report in the journal Vaccine is the first to study anti-vaccine advertisements in Facebook’s advertising archive. The archive, a publicly available and searchable repository, was introduced by Facebook in 2018 to improve transparency around certain forms of advertising considered of “national importance.” The social media giant has repeatedly come under fire for allowing the promotion of anti-vaccine material.
In recent years, false claims on social media about vaccines have led growing numbers of parents to shun or delay getting their children vaccinated. Misinformation and skepticism about the safety of the measles-mumps-rubella vaccine contributed significantly to the nearly year-long measles outbreak in the United States that ended in October. The potentially deadly disease surged to 1,261 cases this year, as of Nov. 7, the highest number in nearly three decades. Anti-vaccine activists also spread misinformation about vaccine-preventable diseases, downplaying their danger.
Earlier this year, The Post reported on a wealthy Manhattan couple who have emerged as major financiers of the anti-vaccine movement. Hedge fund manager and philanthropist Bernard Selz and his wife, Lisa, have contributed more than $3 million in recent years to a handful of activists who have played an outsize role in the anti-vaccine movement.
Robert F. Kennedy Jr. is another major player in anti-vaccine publicity and support. The attorney and nephew of President John F. Kennedy runs the Children’s Health Defense, which is closely aligned with the World Mercury Project. The group’s overall message falsely claims that vaccines are contributing to a vast array of childhood illnesses. In May, Kennedy’s brother, sister and niece publicly criticized him, saying he has helped “spread dangerous misinformation over social media and is complicit in sowing distrust of the science behind vaccines.”
The group Stop Mandatory Vaccination is headed by Larry Cook, who calls himself “an advocate for natural living.” On his website, Cook says he uses donations to pay for Facebook advertising, among other expenses, including his personal bills. “All donations to me go directly to me and into my bank account,” he writes on the site. Many advertisements his group funded featured stories of infants allegedly harmed by vaccines, researchers found.
Broniatowski and colleagues at the University of Maryland and Johns Hopkins University searched Facebook’s Ad Archive, now called the Ad Library, for vaccine-related ads at two points: December 2018 and February of this year. Of 309 relevant advertisements, 163 were pro-vaccine and 145 were anti-vaccine. The messages promoting vaccination had no common or organized theme or funder. They focused on getting people vaccinated against a specific disease, such as ads for a flu vaccine clinic, or were part of broader campaigns such as the Gates Foundation’s effort against polio.
Although the two sides bought similar numbers of ads, 83 different groups promoted vaccination, while just five groups accounted for 75 percent of the anti-vaccine messages. The top two were the World Mercury Project and Stop Mandatory Vaccination.
Many pro-vaccine advertisements were taken down by Facebook, researchers found, because first-time buyers failed to fill out required funding disclosures, inadvertently removing science-based information from the platform.
“So people are getting penalized not for the content but for being unfamiliar with this platform,” Broniatowski said. That creates a bias for organizations with more resources and familiarity with Facebook’s advertising, he said, pointing to the two groups funding the majority of the anti-vaccine messages. “They are very, very, very used to this platform and know how to use it effectively.”
An ad by the Utah Cancer Control Program about cancer prevention with the HPV vaccine, for example, was taken down by Facebook because it had an incomplete disclaimer about its funding, researchers found.
Facebook’s decision to categorize vaccines as an issue of “national importance” also frames the issue as a debate, rather than one on which there is widespread public agreement and scientific consensus, researchers said.
In March, after mounting public pressure, Facebook announced that it would reject ads containing “misinformation about vaccinations” as part of a wider crackdown on vaccine conspiracy theories.
“We tackle vaccine misinformation on Facebook by reducing its distribution and connecting people with authoritative information from experts on the topic,” a Facebook company spokesperson said. “We partner with leading public health organizations, such as the World Health Organization, which has publicly identified vaccine hoaxes — if these hoaxes appear on Facebook, we will take action against them — including rejecting ads.”