Facebook purges accounts, groups related to the 'disinformation dozen'

Facebook removed a slew of pages, groups and accounts for spreading COVID-19 vaccine misinformation on its platforms.
By Emily Olsen


Facebook said it removed more than three dozen pages, groups and accounts on Facebook and Instagram for spreading misinformation about COVID-19 vaccines.

The social media giant said the accounts were linked to 12 people accused in a much-cited report by the Center for Countering Digital Hate of disseminating the vast majority of vaccine misinformation on the network.

But Facebook disputed that claim, arguing the analysis was based on a small set of content from only 30 groups.

“They are in no way representative of the hundreds of millions of posts that people have shared about COVID-19 vaccines in the past months on Facebook,” wrote Monika Bickert, vice president of content policy at Facebook.

“Moreover, focusing on such a small group of people distracts from the complex challenges we all face in addressing misinformation about COVID-19 vaccines.”

In its report, Facebook said it also imposed penalties on nearly two dozen additional pages, groups and accounts linked to the 12 people, moving their posts lower in the news feed and excluding them from recommendations to other users.

The company has penalized some web domains as well, so posts containing those links will be pushed down in the news feed even when shared by unrelated accounts.

Facebook said it had removed more than 3,000 accounts, pages and groups, and 20 million pieces of content for spreading misinformation since the beginning of the pandemic.

WHY IT MATTERS

Just over half of the U.S. population is fully vaccinated, according to the Centers for Disease Control and Prevention.

But the vaccines remain the best way to prevent infection or serious outcomes from COVID-19. An analysis by the Kaiser Family Foundation in late July found breakthrough cases, hospitalizations and deaths were still rare among fully vaccinated people.

Earlier this week, U.S. health officials announced plans to distribute booster shots to Americans eight months after their second shot of the Pfizer or Moderna vaccine, in an attempt to hold off the more contagious Delta variant.

“The COVID-19 vaccines that are authorized in the United States have been remarkably effective, even against the widespread Delta variant,” said Surgeon General Dr. Vivek Murthy in a press briefing. “But we know that even highly effective vaccines become less effective over time.”

THE LARGER TREND

The social media giant has drawn a wave of criticism for allowing misinformation to spread on its platforms during the pandemic.

Facebook won’t share its data with the White House, even though administration officials argue that information could help them understand how to overcome vaccine skepticism, according to reporting by The Washington Post.

The company has removed content that violates its rules. Earlier this month, Facebook said it removed 65 Facebook accounts and 243 Instagram accounts for spreading misinformation about the Pfizer and AstraZeneca vaccines.

The company also removed the #VaccinesKill hashtag in July after criticism from the surgeon general and President Joe Biden, who said social media misinformation about vaccines was "killing people." Biden later walked back his comment.

Facebook’s head of health, Kang-Xing Jin, said the company is also trying to promote vaccination on its platforms by helping people access accurate information about the vaccines and share their vaccination status in support.

“One of the most effective strategies we found was amplifying content from people you know and trust who are wearing masks, and that performed better than just information,” Jin said at a HIMSS21 Digital panel.
