Research Finds Bots and Russian Trolls Influenced Vaccine Discussion on Twitter

New Study Discovers Tactics Similar to Those Used During 2016 US Election

August 23, 2018

MEDIA CONTACTS:
Timothy Pierce: [email protected], 202-994-5647
Jason Shevrin: [email protected], 202-994-5631

WASHINGTON (Aug. 23, 2018)—Social media bots and Russian trolls promoted discord and spread false information about vaccines on Twitter, according to new research led by the George Washington University. Using tactics similar to those at work during the 2016 United States presidential election, these Twitter accounts entered into vaccine debates years before the election season was underway. The study, “Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate,” was published today in the American Journal of Public Health.

The team, which also includes researchers from the University of Maryland and Johns Hopkins University, examined thousands of tweets sent between July 2014 and September 2017. It discovered that several accounts, now known to belong to the same Russian trolls who interfered in the U.S. election, as well as marketing and malware bots, tweeted about vaccines and skewed online health communications.

“The vast majority of Americans believe vaccines are safe and effective, but looking at Twitter gives the impression that there is a lot of debate. It turns out that many anti-vaccine tweets come from accounts whose provenance is unclear. These might be bots, human users or ‘cyborgs’ — hacked accounts that are sometimes taken over by bots. Although it’s impossible to know exactly how many tweets were generated by bots and trolls, our findings suggest that a significant portion of the online discourse about vaccines may be generated by malicious actors with a range of hidden agendas,” David Broniatowski, an assistant professor in GW’s School of Engineering and Applied Science, said.

For example, the researchers found that “content polluters” — bot accounts that distribute malware, unsolicited commercial content and disruptive materials — shared anti-vaccination messages 75 percent more than average Twitter users.

“Content polluters seem to use anti-vaccine messages as bait to entice their followers to click on advertisements and links to malicious websites. Ironically, content that promotes exposure to biological viruses may also promote exposure to computer viruses,” Sandra Crouse Quinn, a research team member and professor in the School of Public Health at the University of Maryland, said.

Russian trolls and more sophisticated bot accounts used a different tactic, posting equal amounts of pro- and anti-vaccination tweets. Dr. Broniatowski’s team reviewed more than 250 tweets about vaccination sent by accounts linked to the Internet Research Agency, a Russian government-backed company recently indicted by a U.S. grand jury for its attempts to interfere in the 2016 U.S. elections. The researchers found the tweets used polarizing language linking vaccination to controversial issues in American society, such as racial and economic disparities.

“These trolls seem to be using vaccination as a wedge issue, promoting discord in American society,” Mark Dredze, a team member and professor of computer science at Johns Hopkins, said. “However, by playing both sides, they erode public trust in vaccination, exposing us all to the risk of infectious diseases. Viruses don’t respect national boundaries.”

-GW-