Russian trolls are coming for 2020, smarter than ever, Clemson researchers warn

Many Americans think they know what a Russian troll looks like.

After the 2016 election, voters are more aware of bad actors on social media who might be trying to influence their opinion and their vote on behalf of a foreign government.

But Clemson University professors Darren Linvill and Patrick Warren warn that picture may not be accurate.

“People I know — smart, educated people — send me something all the time and say ‘Is this a Russian? Is this foreign disinformation?’” said Linvill, a communications professor at the Upstate university. “And it’s just someone saying something they disagree with. It’s just someone being racist. That’s not what disinformation looks like.”

Linvill and Warren, who teaches economics, would know. The two compiled a database of roughly 3 million tweets identified as the products of Russian government-backed accounts both before and after the 2016 election.

Now, the researchers say there are no signs that Russia, or other countries, has slowed its efforts to manipulate social media for its own ends. If anything, those efforts are getting more sophisticated.

They highlight the case of @PoliteMelanie, a popular account with 20,000 followers that garnered media attention and thousands of retweets from actual Americans on items both politically snarky and mundane.

She was also, as Linvill and Warren laid out in a Washington Post column, apparently a product of the St. Petersburg-based Internet Research Agency, often identified as a Russian “troll factory.” Melanie’s account has since been suspended by Twitter.

Trolls are not using “crass, vitriolic, vodka-fueled attacks featuring broken English and spreading fake news,” according to the Clemson professors’ research. Instead, trolls are actively working to be your online friend and feeding you just enough (accurate, but slanted) information to reinforce your pre-existing inclinations toward division, anger and mistrust.

The pair’s most recent work focuses on two related phenomena: accounts that have been “touched by a troll,” meaning fake Russian profiles have engaged with them to expand the trolls’ reach, and accounts that are “touching trolls,” actively but unknowingly engaging with fake accounts.

“Just because someone agrees with you doesn’t mean they are on your side,” Warren said. “They’re not trying to start fights. They want to pull you in the way you were already leaning. You have to be willing to check on things even when they agree with your priors.”

Linvill puts it succinctly. “It’s PR and marketing, not Boris and Natasha,” he said.

‘Effing with Americans’

Both professors have continued to monitor online troll activity since the last election, and they say candidates, the government and the media have to be prepared for a renewed effort to interfere with the 2020 election. Warren expects foreign actors will use “some combination of social media and hacking,” just like the targeted leaks of stolen emails from the Democratic National Committee and Clinton campaign chairman John Podesta, to disrupt the 2016 election.

Having actual documents to drive a political conversation, he says, is much more effective than making up false stories.

“Fake news is not so much of a problem,” Warren said. “It hurts you because you get shut down, but also because it establishes a reputation among your co-partisans that you say things that are demonstrably false and you’re not credible.”

The researchers have seen Russian-style tactics adopted by other countries like Iran and Venezuela, but those countries are less interested in influencing Americans’ domestic political conversations.

“They do a lot of defensive tweeting,” Linvill said. “It’s more straightforward propaganda ... about things that are of interest to those countries. I follow Saudi bots that, after the murder of (Washington Post columnist Jamal) Khashoggi, tweeted about how wonderful (Saudi Crown Prince Mohammad bin Salman) is, and just propaganda about ‘don’t pay attention to the man behind the curtain.’

“Russia does that when they need to ... but they are more offensive,” Linvill said. “It’s more just ‘effing’ with Americans.”

‘That’s what they want us to think’

What makes the situation this time around different from 2016 is greater vigilance for evidence of manipulation. Social media companies are somewhat better at identifying and blocking fake accounts.

Twitter is now active and open enough about removing fake accounts that the Clemson professors no longer update their own database, although Linvill believes he can still identify Russian accounts that haven’t been flagged. Facebook, by contrast, has been much less open about how it polices content on its site.

For his part, Warren worries that the U.S. reaction to 2016 won’t be as strong as it could be, partly because President Donald Trump — whose election Russia purportedly sought to secure — has denied and downplayed the extent of Russian interference in the first place.

“There is good evidence that the degree to which it’s been tied up with partisanship has made it harder for the U.S. versus our European allies,” Warren said.

The next, and potentially more disruptive, step would be for American political campaigns to adopt the same tactics — something they already see popping up in races like the contentious 2017 Alabama Senate election.

What’s more, increased vigilance against online disinformation can actually help the bad actors’ cause if it means you distrust your fellow citizens more.

“It’s dangerous to assume you’re talking to a Russian troll, because that’s what they want us to think,” Linvill said. “This has caused us to distrust each other.”

The key to combating this kind of interference, they argue, is greater digital literacy — a better understanding of the sources of information, their motives and the ability to distinguish reliable from unreliable information, especially among less tech-savvy older voters.

“If it’s pulling you in a more extreme direction, it really doesn’t matter if it’s the Russians,” Warren said. Better online tools “help with how you interact with trolls, but it also helps how you interact with grandma.”

Bristow Marchant is currently split between covering Richland County and the 2020 presidential race. He has more than 10 years’ experience covering South Carolina. He won the S.C. Press Association’s 2015 award for Best Series on a toxic Chester County landfill fire, and was part of The State’s award-winning 2016 election coverage.