Russia’s meddling online went beyond the 2016 US presidential election and into public health, amplifying online debates about vaccines, according to a new study.
The recent research project was intended to study how social media and survey data can be used to better understand people’s decision-making process around vaccines. It ended up unmasking some unexpected key players in the vaccination debate: Russian trolls.
The study, published in the American Journal of Public Health on Thursday, suggests that what appeared to be Twitter accounts run by automated bots and Russian trolls masqueraded as legitimate users engaging in online vaccine debates. The bots and trolls disseminated both pro- and anti-vaccine messages between 2014 and 2017.
The researchers started examining Russian troll accounts as part of their study after NBC News published its database of more than 200,000 tweets tied to Russian-linked accounts this year. They noticed vaccine-related tweets among the Russian troll accounts, and some tweets even used the hashtag #VaccinateUS.
These known Russian troll accounts were tied to the Internet Research Agency, a company backed by the Russian government that specializes in online influence operations.
“We started looking at those tweets, and immediately, we were like, ‘These are kind of weird,’ ” said David Broniatowski, an assistant professor in the School of Engineering and Applied Science at George Washington University who was lead author of the study.
“One of the things about them that was weird was that they tried to — or they seemed to try to — relate vaccines to issues in American discourse, like racial disparities or class disparities that are not traditionally associated with vaccination,” Broniatowski said.
For instance, “one of the tweets we saw said something like ‘Only the elite get clean vaccines,’ which on its own seemed strange,” he said. After all, anti-vaccine messages tend to characterize vaccines as risky for all people, regardless of class or socioeconomic status, the researchers wrote in the study.
The consensus among doctors is that vaccines are safe, effective and important for public health, as they help reduce the spread of preventable disease and illness. A Pew Research Center study found last year that the vast majority of Americans support vaccine requirements.
Since the start of the new study, most of the tweets have been deleted as part of Twitter’s efforts to suspend Russian troll accounts, but Broniatowski said that he and his colleagues stored several in their own archives.
The researchers were stunned to find Russian troll accounts tweeting about vaccines, and puzzling out why those accounts would stoke the vaccine debate proved just as confounding.
Why trolls tweet about vaccines
For the study, the researchers collected and analyzed nearly 1.8 million tweets from July 2014 through September 2017.
While examining those vaccine-related tweets, the researchers discovered many bot accounts, including “content polluters,” which are accounts that disseminate malware or unsolicited commercial content. The researchers also uncovered a wide range of hidden online agendas.
When it came to the Russian troll accounts, the researchers found 253 tweets in their sample containing the #VaccinateUS hashtag. Of those, 43% were pro-vaccine, 38% were anti-vaccine, and the remaining 19% were neutral.
By posting a variety of anti-, pro- and neutral tweets and directly confronting vaccine skeptics, trolls and bots “legitimize” the vaccine debate, the researchers wrote in the study.
“This is consistent with a strategy of promoting discord across a range of controversial topics — a known tactic employed by Russian troll accounts. Such strategies may undermine the public health: normalizing these debates may lead the public to question long-standing scientific consensus regarding vaccine efficacy,” they wrote.
Overall, the researchers found that Russian trolls, sophisticated bots and “content polluters” tweeted about vaccination at significantly higher rates compared with average users.
The study has its limits: it's difficult to determine with 100% accuracy who is behind a Twitter account, and “the Internet Research Agency is certainly not the only set of trolls out there,” Broniatowski said.
Additionally, it’s even more difficult to determine an account’s true intent. But the researchers and other experts have some ideas about why Russia might want to fuel America’s vaccine debate.
It may be a strategy to promote political discord, Broniatowski said, adding, “We cannot say that with 100% certainty, because we're not inside their head.”
“The Internet Research Agency has been known to engage in certain behaviors. There’s the one everybody knows about, which is the election. They also tend to engage in other topics that promote discord in American society,” Broniatowski said.
Considering that the agency has stoked hot-button online debates to promote discord before, the new study suggests that its intent could be the same when it comes to fueling vaccine debates.
Historically, the Russian government has not responded to CNN requests for comment regarding accusations of using social media to influence public opinion in the United States.
Between 2014 and 2017, the Internet Research Agency trolls were running many social media experiments to build division among Americans, said Patrick Warren, an associate professor of economics at Clemson University. Warren was not involved in the study but has conducted extensive research on Russian trolls.
The brief use of the #VaccinateUS hashtag among troll accounts could have been an experiment, he said.
“Apparently, they tried to get this hashtag going to get people to fight about vaccines, and it never got picked up,” said Warren, who shares with his colleagues a database of more than 3 million tweets from Internet Research Agency-linked social media accounts.
“I would call that an experiment that they abandoned,” he said of the hashtag.
Warren added that he was not surprised to learn about Russian trolls posting vaccine-related tweets.
“I don’t know if it would seem strange once you understand their goal, which is basically to divide both sides against the middle. They’re going to grab onto all of those social issues. So for example: black lives matter, all lives matter; immigrants are destroying America, immigrants are great for America,” Warren said.
“It’s basically the hot-button political issues of the day. They’re happy to grab onto whatever is salient,” he said. “I think that they want us focused on our own problems so that we don’t focus on them.
“If most of our energies are focused internally with divisions inside of the United States — or divisions between the United States and, say, Europe — that leaves a window open for Russia to expand its sphere of influence.”
Such efforts to spread divisive misinformation, including in the form of public health messaging, are nothing new.
In the 1980s, there was a Soviet campaign to spread false news about the AIDS epidemic in the US. The campaign began with placing an anonymous letter in an obscure newspaper in India, the Patriot, with the headline, “AIDS may invade India: Mystery disease caused by US experiments,” according to a 2009 article in Studies in Intelligence, a journal published by the CIA’s Center for the Study of Intelligence.
Eventually, “[w]ith the end of the Cold War, former Soviet and East German intelligence officers confirmed their services’ sponsorship of the AIDS disinformation campaign,” according to the article.
Messages ‘that aren’t scientifically sound’
Russian trolls could have amplified online vaccine debates in other countries as well, but more research is needed to determine that, said Renee DiResta, who researches disinformation online as the head of policy at Data For Democracy, a volunteer group of scientists and technologists, and who was not involved in the new study.
DiResta pointed to how Italy's Five Star movement and its coalition partner, the far-right League party, have both voiced opposition to compulsory vaccinations. She has also seen some Twitter accounts linked to Russian trolls tweeting in Italian, but she doesn't speak the language and couldn't translate what those tweets say.
“We know that in Italy, the Five Star movement ran on an anti-vaccine platform. I do think it’s worth it to go look at the social media conversation in Italy to see if inauthentic accounts were capitalizing on those divisions or involved in that debate,” DiResta said.
In the meantime, she said, the new study on Russian trolls meddling in US online vaccine debates is an example of growing distrust in science and in public health initiatives such as vaccination programs.
“Both real people and trolls are capitalizing on that mistrust to push conspiracy theories out to vulnerable audiences,” DiResta said.
“This isn’t just happening on Twitter. This is happening on Facebook, and this is happening on YouTube, where searching for vaccine information on social media returns a majority of anti-vaccine propaganda,” she said. “The social platforms have a responsibility to start investigating how this content is spreading and the impact these narratives are having on targeted audiences.”
Anti-vaccine sentiment has taken root in some European countries. Cases of measles have reached a record high in Europe this year, with more cases recorded in the first six months of 2018 than in any other 12-month period this decade, the World Health Organization reported this week. It remains unclear what influence, if any, online vaccine debates have on such sentiment.
More research is needed to determine how these actions by Twitter bots and trolls might impact public health, said Jon-Patrick Allem, a research scientist at the University of Southern California’s Keck School of Medicine, who was not involved in the study but has conducted separate research on social bots and trends.
“There are messages being put out there that aren’t scientifically sound,” Allem said.
“This has the potential to drown out scientifically sound messages from health care providers, and from the public health community in general, on the best way to make a health-related decision,” he said. “When people are looking at these messages, does it matter to them? Does it lead to an attitude change? Subsequently and ultimately, does it lead to a behavior change? Does a person who sees a thread on Twitter discussing the pros and cons about vaccination, does that cause hesitancy for a parent deciding to get their child vaccinated? These are the next sets of questions that will need to be answered.”