A new study suggests the same Russian-intelligence-backed Twitter accounts linked to the group indicted for social media interference in the 2016 presidential election also pushed vaccination misinformation.
Scientists at George Washington University and Johns Hopkins University collected three years' worth of vaccine-related tweets to see how Russian troll activity compared with real users' tweets. They found these troll accounts shared anti-vaccination messages 75 percent more often than the average Twitter user.
The team reviewed more than 250 tweets about vaccination that came from the Internet Research Agency. The Russian government-backed group was indicted for its attempts to interfere in the 2016 U.S. elections with memes, Facebook posts and tweets.
This research shows the group spread pro- and anti-vaccination messages toward the same end: not to endorse one side over the other, but to exploit the ongoing arguments and tension over vaccination.
And this sort of disinformation has measurable health consequences. Studies show exposure to anti-vaccination material is associated with an increased chance of parents declining vaccinations for their kids. And parents who delay or refuse vaccines are significantly more likely to have looked for vaccine information on the internet instead of asking health care providers.
And researchers worry the effects could get worse. If vaccination rates fall, disease can spread more easily, which could lead to outbreaks that might have otherwise been prevented. Europe, for example, is dealing with record numbers of measles cases due to low vaccination rates. Some health workers there say vaccine disinformation has been part of the problem.
Researchers on the new study warn disinformation in the U.S. or elsewhere could have similar effects. Viruses don't respect state lines or national borders: if trust in vaccination falls, the risk to everyone grows.