Facebook, Instagram to limit spread of vaccine misinformation

Pages, groups and ads whose content includes "verifiable vaccine hoaxes" will take a backseat.
By Dave Muoio

Facebook has detailed its new strategy to curb the spread of vaccine misinformation on both its primary social media platform and Instagram. Broadly, the strategy includes efforts to reduce the prominence of (but not outright ban) certain flagged groups, pages and search results, as well as a firm stance against advertising content that includes false statements about vaccination.

“Leading global health organizations, such as the World Health Organization and the US Centers for Disease Control and Prevention, have publicly identified verifiable vaccine hoaxes,” Monika Bickert, VP of global policy management at Facebook, wrote in a news post on the company’s website. “If these vaccine hoaxes appear on Facebook, we will take action against them.”

Bickert gave the example of a group or page admin posting inaccurate information about vaccination. Once identified, Facebook would cut that group or page from its automated recommendations and limit how often it is surfaced within search results or the News Feed. Similarly, the company said it would not show or recommend this kind of material through Instagram Explore or hashtag pages.

In the advertising space, Bickert said that any ads including vaccine misinformation will be rejected, and that targeting options related to the subject have already been removed. Facebook may also completely disable ad accounts that are repeat offenders.

“We also believe in providing people with additional context so they can decide whether to read, share, or engage in conversations about information they see on Facebook,” Bickert wrote. “We are exploring ways to give people more accurate information from expert organizations about vaccines at the top of results for related searches, on Pages discussing the topic, and on invitations to join groups about the topic. We will have an update on this soon.”

Why it matters

Vaccine-preventable diseases such as measles are on the rise, with a number of identified outbreaks across the US coinciding with recent data from the CDC and academic researchers suggesting an increase in vaccine exemption rates. Many over the years have discussed how social media platforms can help disseminate misleading health information, and the debate came to a head this past week during a Senate panel hearing that specifically highlighted Facebook's role in the growing public health threat.

A conscious effort by the company could limit the spread of such misinformation, and will likely help to curb a broader anti-vaccination movement that relies largely on debunked links between vaccines and developmental disorders, along with an interest in preserving individual liberties at the cost of public health.

What’s the trend

Facebook is the latest in a string of media platforms that have announced new policies clamping down on vaccine misinformation. BuzzFeed News reported in February that YouTube would demonetize anti-vaccine content, citing its broader policy against the monetization of videos with "dangerous and harmful" content. Pinterest took an even harder stance, choosing to disable search results for any terms relating to vaccination after automated efforts to remove the content came up short, The Wall Street Journal reports.

Beyond this issue, the past couple of years have also seen Facebook take a more proactive approach to digital health, with initiatives ranging from blood donation sign-ups to increased surfacing of addiction resources during searches.
