
Just 12 People Are Behind Most Vaccine Hoaxes On Social Media, Research Shows

The majority of anti-vaccine claims on social media trace back to a small number of influential figures, according to researchers. (Chandan Khanna / AFP via Getty Images)

The majority of false claims about COVID-19 vaccines on social media trace back to just a handful of influential figures. So why don't the companies just shut them down?

Updated May 14, 2021 at 11:48 AM ET

Researchers have found that just 12 people are responsible for the bulk of the misleading claims and outright lies about COVID-19 vaccines that proliferate on Facebook, Instagram and Twitter.

"The 'Disinformation Dozen' produce 65% of the shares of anti-vaccine misinformation on social media platforms," said Imran Ahmed, chief executive officer of the Center for Countering Digital Hate, which identified the accounts.

Now that the vaccine rollout is reaching a critical stage, in which most adults who want the vaccine have gotten it but many others are holding out, these 12 influential social media users stand to have an outsize impact on the outcome.

After this story published on Thursday, Facebook said it had taken down more of the accounts run by these 12 individuals.

These figures are well-known to both researchers and the social networks. They include anti-vaccine activists, alternative health entrepreneurs and physicians. Some of them run multiple accounts across the different platforms. They often promote "natural health." Some even sell supplements and books.

Many of the messages about the COVID-19 vaccines being widely spread online mirror what's been said in the past about other vaccines by peddlers of health misinformation.

"It's almost like conspiracy theory Mad Libs. They just inserted the new claims," said John Gregory, deputy health editor at NewsGuard, which rates the credibility of news sites and has done its own tracking of COVID-19 and vaccine misinformation "superspreaders."

The claims from the "Disinformation Dozen" range from "denying that COVID exists, claiming that false cures are in fact the way to solve COVID and not vaccination, decrying vaccines and decrying doctors as being in some way venal or motivated by other factors when they recommend vaccines," Ahmed said.

Many of the 12, he said, have been spreading scientifically disproven medical claims and conspiracies for years.

That raises the question: Why have social media platforms only recently begun cracking down on their falsehoods?

Both members of Congress and state attorneys general have urged Facebook and Twitter to ban the "Disinformation Dozen."

"Getting Americans vaccinated is critical to putting this pandemic behind us. Vaccine disinformation spread online has deadly consequences, which is why I have called on social media platforms to take action against the accounts propagating the majority of these lies," Sen. Amy Klobuchar, D-Minn., told NPR.

Social networks crack down on COVID-19 vaccine claims

The companies have stopped short of taking all 12 figures offline entirely, but they have stepped up their fight: They've labeled misleading posts. They've removed falsehoods. In some cases, they've banned people who repeatedly share debunked claims.

After NPR's reporting, Facebook said it had taken additional action against some of the figures identified by the Center for Countering Digital Hate, several of whom operate multiple accounts on the social network's apps. The company said on Thursday it had found new posts violating its rules.

Facebook has now removed 16 accounts from Facebook or Instagram and placed restrictions on 22 others, such as preventing them from being recommended to other users, reducing the reach of their posts and blocking them from promoting themselves through paid ads.

"We reacted early and aggressively to the COVID-19 pandemic by working with health experts to update our misinformation policy to target harmful claims about COVID-19 and vaccines, including taking action against some of the accounts in the CCDH report," spokesperson Kevin McAlister said in a statement. "In total, we've removed more than 16 million pieces of content which violate our policies and we continue to work with health experts to regularly update these policies as new facts and trends emerge."

However, Facebook also disputed the methodology of the center's report, saying it was unclear what criteria the group used to select the social media posts it analyzed.

Twitter said it permanently suspended two of the "Disinformation Dozen" accounts for repeatedly breaking its rules, required other accounts to delete some tweets and applied labels that link to credible information about vaccines and don't allow the tweets to be shared or replied to. Overall, it's removed more than 22,400 tweets for violating its COVID-19 policies.

However, spokesperson Elizabeth Busby said Twitter distinguishes between "harmful vaccine misinformation that contradicts credible public health information, which is prohibited under our policy, and negative vaccine sentiment that is a matter of opinion."

And so the "Disinformation Dozen" are still easy to find on social media.

"Tried and true" tactics

Sometimes they skirt the platforms' rules by using codes.

"Instead of saying 'vaccine,' they may, in a video, hold up the V sign with their fingers and say, 'If you're around someone who has been' — hold up V sign — 'you know, X might happen to you,' " Ahmed said.

Or they take something true and distort it, such as falsely linking a famous person's death to the fact that the celebrity got a vaccine days or weeks earlier.

NewsGuard's Gregory said a "tried and true" tactic of vaccine opponents is "grossly misrepresenting some sort of research, some sort of data to promote whatever narrative they've chosen."

Facebook said it now limits the reach of posts that could discourage people from getting vaccinated, even if the messages don't explicitly break its rules.

But the cat-and-mouse game continues.

Anti-vaccine activists claim censorship

As the social networks have cracked down, some previously prolific spreaders of vaccine misinformation have toned down their posts and have told their followers they are being censored.

Take anti-vaccine activist Robert F. Kennedy Jr., one of the "Disinformation Dozen" identified by the center, who has promoted the long discredited idea that vaccines are linked to autism. During the pandemic, he has shared baseless conspiracy theories linking 5G cellular networks to the coronavirus, and suggested, without evidence, that the death of baseball great Hank Aaron was "part of a wave of suspicious deaths" tied to vaccines.

None of that is true.

Kennedy was kicked off Instagram, which Facebook owns, in February over repeatedly sharing debunked claims.

Yet Facebook did not remove him from its namesake platform. He told NPR the company has flagged some of his posts, however, so he has become more cautious.

"I have to post, like, unicorns and kitty cat pictures on there," he said. "I don't want to give them an excuse."

He also uses it to promote his website and newsletter, where he makes claims he cannot on the social network.

Kennedy said he's never posted misinformation and accused Facebook of censorship. He said the crackdown has cost "hundreds of thousands of dollars" in donations to his organization.

A battle of persuasion

Even as the social media companies have gotten tougher recently on misinformation, researchers worry the persistence of vaccine-related hoaxes will further erode confidence among people who hesitate to get the shot.

That's especially concerning as vaccines roll out for children 12 and up.

In a survey of U.S. parents, Indiana University sociologist Jessica Calarco found that more than a quarter don't plan to vaccinate their kids.

"So many of these moms are turning to Facebook, are turning to Twitter, are turning to other social media platforms" for news and information, she said. "And they're saying, 'Every time I open my phone, I see something different.' "

Even some parents whose kids have had routine childhood vaccines told Calarco they're unsure about COVID-19 jabs.

Facebook this week released survey data showing vaccine acceptance among adults in the U.S. has increased by 10% since January. However, its survey also shows that the top reasons people said they don't want to get vaccinated are worries about side effects and lack of trust in the vaccines or the government — exactly the kind of fears anti-vaccination accounts promote.

The social networks said amplifying credible information from authoritative sources, such as the Centers for Disease Control and Prevention, is just as important as reducing the spread of harmful posts. Both Facebook and Twitter link to public health information in their apps and in the labels they put on misleading posts.

But they now face an uphill battle of persuading the skeptics.

Calarco said many of the parents she spoke with weigh the posts they see on social media "equally against the kinds of expert medical recommendations, expert medical information coming out of things like the CDC."

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.