Social Media

How Facebook's Fake News Fight Helped Circulate Vaccine Misinformation

Reports show that social media algorithms reworked to fight fake news are pointing people to vaccine misinformation ahead of factual information.

Social media giants have struggled lately to stop misinformation campaigns on their platforms, and now they're coming under fire again for their possible role in measles outbreaks around the world. According to a recent investigation, Facebook's and YouTube's recommendation tools routinely promote unscientific vaccine misinformation instead of verified, factual sources. That has drawn the attention of lawmakers, who want to know how these companies plan to solve the problem.

The big social media companies do take extra steps to knock down misinformation that can cause real-world harm, but those policies are aimed at politically charged or viral misinformation. Facebook recently told the Washington Post that anti-vaccination content doesn't fall into that category. In early February, YouTube echoed Facebook, saying vaccination videos weren't a primary target of its policies.

As a result, unregulated misinformation is easy to find. A Guardian report found that when users with no friends or likes entered neutral terms like “vaccine” into Facebook's or YouTube's search engines, the top results skewed toward anti-vaccination content.

Media researchers say Facebook appears to be responsible for most of the misinformation, in part because of the steps it took to fight fake news after the 2016 U.S. presidential election. Facebook said it would refocus its Groups feature to steer people toward groups tailored to their interests and to surface content from those groups on their timelines more often. In just two years, Facebook group membership jumped 40% to 1.4 billion monthly users.

Experts say these groups are now harder to regulate or monitor than before, and because Facebook's business model is built on getting people to join more of them, misinformation can spread to hundreds of thousands of people at a time. According to a study by the Credibility Coalition and Health Feedback, Facebook accounted for 96% of shares of the 100 most-engaged health stories on social media, and fewer than half of those stories were considered "highly credible."

U.S. Rep. Adam Schiff has sent letters to Facebook and Google asking how they currently address the problem and what additional steps they plan to take. Specifically, Schiff asked how the companies plan to "distinguish quality information from misinformation or misleading information" in their algorithms.

So far, the companies have responded with only basic steps. YouTube said it would stop recommending what it calls "borderline content" through its autoplay algorithm. Facebook said it would explore "additional measures to best combat the problem," such as reducing the amount of anti-vaccine content that appears in its search results.