Russian Twitter bots laid dormant for months before impersonating activists

Twitter accounts deployed by Russia’s troll factory in 2016 didn’t only spread disinformation meant to influence the U.S. presidential election. A small handful tried making a buck, too.

An analysis of 3,836 Twitter accounts and nearly 10 million tweets published Wednesday by Symantec shows how Russian efforts to amplify propaganda went further than previously reported.

The Kremlin’s Internet Research Agency methodically planned its information operations, creating Twitter accounts an average of 177 days before sending a first tweet. Many accounts posed as fake news outlets, spreading content meant to antagonize U.S. citizens across the political spectrum by impersonating users affiliated with pro-Trump and Black Lives Matter movements. Accounts typically remained active for 429 days, mostly from the beginning of 2015 through August 2016, when many suddenly went quiet.

“Most accounts were primarily automated, but they would frequently show signs of manual intervention, such as posting original content or slightly changing the wording of reposted [content], presumably in an attempt to make them appear more authentic and reduce the risk of their deletion,” Symantec engineer Gillian Cleary wrote in a blog post. “Fake news accounts were set up to monitor blog activity and automatically push new blog posts to Twitter. Auxiliary accounts were configured to retweet content pushed out by the main accounts.”

The most influential account, @TEN_GOP, described itself in its biography as the “Unofficial Twitter of Tennessee Republicans. Covering breaking news, national politics, foreign policy and more. #MAGA #2A.” The account was retweeted more than 6 million times, Symantec found. The user @Crystal1Johhnson, which had 3.7 million retweets, appeared more innocuous with the description, “It is our responsibility to promote the positive things that happen in our communities.”

Researchers also found 13 accounts that used monetized URL shorteners like Shorte.st, which promised to pay social media users $14.04 for every 1,000 clicks on a shared link.

Symantec goes on to suggest that, since one fake pro-Trump user had 4,123 followers, it’s “possible this account generated an income of $56.16 per tweet” and ultimately earned $1 million. But that figure rests on an unsound methodology: it would have required every one of the account’s followers to click the link, and many of the people who saw its 8,362 retweets to click it, too.
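The arithmetic behind that critique is straightforward. A minimal back-of-the-envelope check, using only the figures reported above (the $14.04-per-1,000-clicks payout, the $56.16-per-tweet estimate, and the $1 million total), shows how many clicks Symantec's numbers implicitly assume:

```python
# Back-of-the-envelope check of the payout figures cited above
# (illustrative only; all dollar amounts come from Symantec's report).
RATE_PER_CLICK = 14.04 / 1000  # Shorte.st: $14.04 per 1,000 clicks

# Clicks implied by the $56.16-per-tweet estimate -- roughly one
# click from every one of the account's 4,123 followers:
clicks_per_tweet = 56.16 / RATE_PER_CLICK
print(f"clicks needed per tweet: {clicks_per_tweet:.0f}")

# Clicks needed to reach the suggested $1 million total:
clicks_for_million = 1_000_000 / RATE_PER_CLICK
print(f"clicks needed for $1M: {clicks_for_million:,.0f}")  # ~71.2 million
```

At roughly 4,000 clicks per tweet and over 71 million clicks to reach $1 million, the estimate assumes near-total follower engagement on every link, which is why the article calls the methodology unsound.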

The other 12 accounts that used link shorteners were less active, sending between one and 1,192 tweets with a service that might return a profit.

“[T]he fact that such services were only used by a minority of accounts suggests that some employees may have been trying to make money on the side,” Cleary wrote.

Symantec’s research comes amid a steady drumbeat of reports offering more detail on the ways Russian bots littered social media ahead of the election that propelled Donald Trump into the Oval Office. Research unveiled in December by Graphika and the University of Oxford revealed how many accounts sought to deter African-Americans from voting.

The U.S. sought to deter similar activity in the 2018 midterm elections by interrupting the internet access of the Kremlin-backed Internet Research Agency. Meanwhile, the National Security Agency and Cyber Command have made permanent their “Russia Small Group,” an intelligence team meant to disrupt digital threats from Moscow.