October 5, 2017 at 11:58 AM
Facebook has said ads bought by Russian operatives reached 10 million of its users.
But does that include everyone reached by the information operation? Couldn’t the Russians also have created simple — and free — Facebook posts and hoped they went viral? And if so, how many times were these messages seen by Facebook’s massive user base?
The answers to those questions, which social media analyst Jonathan Albright studied for a research document he posted online Thursday, are: No. Yes. And hundreds of millions — perhaps many billions — of times.
“The primary push to influence wasn’t necessarily through paid advertising,” said Albright, research director of the Tow Center for Digital Journalism at Columbia University. “The best way to understand this from a strategic perspective is organic reach.”
In other words, to understand Russia’s meddling in the U.S. election, the frame should not be the reach of the 3,000 ads that Facebook handed over to Congress and that were bought by a single Russian troll farm called the Internet Research Agency. Instead, the frame should be the reach of all the activity of the Russian-controlled accounts — each post, each “like,” each comment and also all of the ads. Looked at this way, the picture shifts dramatically. It is bigger — much bigger — but also somewhat different and more subtle than generally portrayed.
Albright, who also is a faculty associate at Harvard’s Berkman Klein Center for Internet & Society, has been studying fake news and Russian propaganda for months. And over the past week, as the names of some of the 470 Russian-bought pages and accounts have trickled out in news reports, he has been using a Facebook-owned analytics tool, called CrowdTangle, to measure the Russian campaign and also has downloaded the most recent 500 posts for each of them.
For six of the sites that have been made public — Blacktivists, United Muslims of America, Being Patriotic, Heart of Texas, Secured Borders and LGBT United — Albright found that the content had been “shared” 340 million times. That’s from a tiny sliver of the 470 accounts that have been made public. Even if those sites were unusually effective compared to the 464 others, Albright’s findings still suggest a total reach well into the billions of “shares” on Facebook.
The terminology is important here. For the purposes of these metrics, a “share” is essentially how often a post may have made its way into somebody’s Facebook “news feed” — without determining whether any of these users actually read the post. Another metric, called “interactions,” counts something narrower but more important — the number of times individual users acted on what they had read by sharing a post with their Facebook “friends,” hitting the “like” button, making a comment or posting an emoji symbol.
That measurement for those six accounts, Albright’s research showed, was 19.1 million. That means that more people had direct “interactions” with regular posts from just six accounts than saw the ads from all 470 pages and accounts that Facebook has identified as controlled by the Internet Research Agency, the Russian troll farm in St. Petersburg.
Facebook had no immediate comment on Albright’s research. The company has shut down all 470 pages and accounts it has identified as controlled by the Internet Research Agency.
In a blog post on Monday, Elliot Schrage, vice president of policy and communications, wrote, "We’re still looking for abuse and bad actors on our platform — our internal investigation continues. We hope that by cooperating with Congress, the Special Counsel and our industry partners, we will help keep bad actors off our platform."
The other revelation in Albright’s download, including thousands of posts he has put online in an interactive graphic, is that most of them have nothing to do with the Nov. 8 election. Instead they are tailored to fit seamlessly into the ordinary online conversation of their particular audiences — politically activated African Americans, gay women, Muslims and people concerned about illegal immigration, Texan heritage or the treatment of veterans. There is talk of political issues, but relatively little about voting for Republican Donald Trump or against Democrat Hillary Clinton.
That suggests that the Facebook part of the Russian disinformation campaign consisted of at least two steps: The first was to identify voters and sort them into buckets based on the issues they responded to. This was done through the organic posts. The second step was to target voters in these buckets with Russian-bought political ads shaped to their interests, with the intention — in at least some cases — of affecting voting behavior.
“They were working to lead people along and develop a sense of trust,” Albright said.
The tone of the posts varies strikingly by the page. The one seemingly managed by a lesbian is intimate, confidential and chatty, with complaints about parents and teachers not understanding the challenges of being young and gay. The English is nearly flawless. One popular post said simply, "Bi and proud!" with a thumbs-up emoji attached to the end.
The United Muslim posts take pride in their religion, demand respect and seek to distance their faith from terrorism and ISIS. "Share if you believe Muslims have nothing to do with 9/11. Let's see how many people know the truth!" said one that reached 35,275 news feeds.
The Blacktivist posts are assertive and often angry, with many references to police violence against African Americans. Several urge the sharing of a video. One that reached 68,000 news feeds said, "There is a war going against black kids."
The other pages are conservative and anti-immigrant, with particular complaints about the treatment of U.S. veterans.
One, on "Being Patriotic," said, "At least 50,000 homeless veterans are starving dying in the streets, but liberals want to invite 620,000 refugees and settle them among us. We have to take care of our own citizens, and it must be the primary goal for our politicians!" That post reached the news feeds of 724,323 Facebook users.
The most explicitly political of the posts may be from Secured Borders, which routinely refers to Clinton as “Killary” instead of Hillary.
Many of the posts Albright collected went viral, reaching tens or hundreds of thousands of Facebook news feeds, often by posing a question or calling for a response — tools known to people savvy in the use of social media.
“Issues that divide and outrage tend to spread the most, which is a common theme in social networks,” he said.
One final insight from Albright’s research: To the extent there is a discernible political motive in them, the goal seemed less to inspire enthusiasm for one candidate than to dampen support for voting at all. This fits with what many other researchers and investigators have said about the Russian disinformation campaign, that it drove directly at the fractures in American society and sought to widen them.
“A lot of these posts had the intent to get people not to vote,” Albright said. “This is a concerted effort of manipulation. Based on the engagement and reach and the outcome of the election ... I’d say it’s been fairly successful, sadly."