The first half of Thursday’s Senate Select Intelligence Committee hearing on Russian disinformation campaigns wasn’t quite as fun as watching James Comey squirm his way around classified intel in the House, but it did provide some valuable context on Russian cyber methods and social media campaigns.
The morning hearing lacked the big names coming a little further down the pike, among them former Trump campaign members Paul Manafort and Roger Stone and Trump son-in-law and senior advisor Jared Kushner, focusing instead on academics and researchers.
While most of the hearing was scene-setting, including a thorough Russian counterintelligence history lesson, some of its more compelling insights came from Clint Watts, a senior fellow with the Foreign Policy Research Institute’s program on national security. Watts, a former FBI special agent who specializes in terrorism and online influence campaigns, spelled out some of the methods Russia uses to create and optimize its disinformation efforts.
“The first thing we need to understand is it’s not all automated and it’s not all human, it’s a combination of the two,” Watts explained to the committee. “You can have someone engaging with you as an individual and using a bot to amplify their message… or [they] can create more personas on Twitter, for example.” This sort of thing saw an uptick in 2014, but Watts says it wasn’t until 2015 that “they tied hacking and influence together for the first time.”
He went on to explain how Russian state actors create believable sock puppet accounts by insinuating themselves into the middle of a demographic they wish to influence. Using Wisconsin as an example, Watts described how such an actor would first “inhale” all of the accounts from a given slice of the population, parsing out details so they can then replicate the prevailing qualities in an average account. “They look exactly like you. It looks like an American from the Midwest or the South.”
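The profiling step Watts describes — sample accounts from a target demographic, then replicate their prevailing traits — can be sketched in a few lines. This is a purely illustrative mock-up: the field names and toy data are invented, and no real platform API or dataset is involved.

```python
# Illustrative sketch only: tally the most common value for each profile
# field across a sample, producing a "typical persona" template of the kind
# Watts describes. All fields and data below are hypothetical.
from collections import Counter

def build_persona_template(profiles):
    """Return the most common value for each field seen across the sample."""
    fields = {field for profile in profiles for field in profile}
    template = {}
    for field in fields:
        values = [p[field] for p in profiles if field in p]
        template[field] = Counter(values).most_common(1)[0][0]
    return template

# Toy sample standing in for "inhaled" Wisconsin accounts.
sample = [
    {"location": "Wisconsin", "team": "Packers", "lang": "en"},
    {"location": "Wisconsin", "team": "Packers", "lang": "en"},
    {"location": "Wisconsin", "team": "Brewers", "lang": "en"},
]
print(build_persona_template(sample))
# → {'location': 'Wisconsin', 'team': 'Packers', 'lang': 'en'} (key order may vary)
```

The point of the sketch is that nothing sophisticated is required: simple frequency counts over scraped profiles are enough to make a fake account blend into its target population.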
They go on to build an audience within that target group, but they run into a bit of a problem when they wish to rebrand the sock puppet with a different identity. “They build an audience [and] they don’t want to get rid of it,” Watts explained. “These accounts are reprogrammed strategically. Then when they play both sides the audience will go with them once they have them.”
Those strategic accounts then work together to create the news of the day. Accounts associated with Russian intelligence “tweet heavily at Trump during times they know he’s online [in order] to push conspiracy theories.”
After the accounts push coordinated waves of the propaganda du jour, the sheer volume of content usually ends up trending. “Once it pushes to the top of the feed, mainstream news pays attention.”
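The amplification dynamic described above comes down to arithmetic: many coordinated accounts each post a modest amount, and the combined volume clears a trending bar that no organic conversation would. The threshold and counts below are made up for illustration; real trending algorithms are proprietary and far more complex.

```python
# Illustrative sketch: coordinated posting volume versus a hypothetical
# trending threshold. Numbers are invented for demonstration only.
def topic_trends(posts_per_account, threshold):
    """Return True if combined post volume meets the trending threshold."""
    return sum(posts_per_account) >= threshold

organic = [3, 1, 2]        # a few genuine users mention the topic
coordinated = [20] * 50    # fifty sock puppets posting twenty times each

print(topic_trends(organic, threshold=500))               # → False
print(topic_trends(organic + coordinated, threshold=500)) # → True
```

Once the manufactured spike crosses that bar, the platform itself does the rest of the distribution, which is what puts the story in front of mainstream newsrooms.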
Even when established news organizations weigh in only to debunk those fake news pushes, the coverage still sets the national conversation for the day and distracts from other news that might work against Russia’s interests, continuing the story’s spread “organically.”
To combat this, Watts thinks that news stories need warning labels or certifications, not unlike nutrition labels on food, “a trusted integer that you can go to” so news consumers would know what they’re getting.
The Senate committee stands in sharp contrast to the corresponding investigation in the House, which this week stalled after erratic behavior from its chair led to a full-fledged partisan meltdown. In the Senate, Republican Chairman Richard Burr and Democratic Vice Chairman Mark Warner appeared Wednesday to put their spirit of bipartisan cooperation on display, and led with opening remarks at Thursday’s hearing that were friendly and collaborative.