Meanwhile, more players have learned from the Russian example and have started disinformation operations in their own countries, Facebook says. That includes networks of shadowy public relations firms that sometimes do work for both sides within a country, as well as politicians, fringe political groups, and governments themselves, said Nathaniel Gleicher, Facebook’s head of security policy, in a media call.
“It started out as an elite sport, but now we see more and more people getting into the game,” said Gleicher, who added that such efforts increasingly resemble the influence operations conducted before the social media era: “narrower, more targeted, expensive, time-consuming, and with a lower success rate.”
In 2017, Facebook discovered a vast influence operation, in which the Russian Internet Research Agency had subjected 126 million of the platform’s users to political disinformation ahead of the previous year’s election. Since then, the social network has invested resources in policing its service — including hiring more than 10,000 third-party content moderators and subject matter experts — and building algorithms to scan for unwanted content.
The big caveat to the report is that Facebook and other social media platforms see only the nefarious operations that they uncover — and do not know about the broader universe of disinformation that goes undetected.
“I think we should be careful about saying that we know what the denominator is,” Gleicher said.
Some insiders have alleged that Facebook executives ignored disinformation in certain countries despite internal warnings, according to The Washington Post and other news outlets. They claim that hesitancy about penalizing certain politicians and parties, along with a focus on policing the elections, events, and geographies deemed most important, led to problems. Facebook has disputed those allegations.
In recent years, no other social media influence campaign that the company has detected has appeared to achieve the scale of the 2016 Russian operation. But the initial campaign also was unsophisticated in some respects. Posts often included grammatical errors that suggested non-English speakers were writing them, for example.
Since then, operators have had to devise new methods to co-opt the public.
One strategy involves recruiting native speakers; another involves seeking a more targeted audience to manipulate, according to the report. In early 2020, for example, Facebook disrupted a Russian military operation targeting Ukraine that created Facebook profiles of fake people purporting to be journalists. The fake journalists tried to contact and influence policymakers and influential people directly but did not appear to try to build a large Facebook audience, the report said. Russia adopted a similar strategy for a modest disinformation operation in the United States as well, although in that operation actual journalists were recruited under false pretenses to write for fabricated news outlets.
The report reveals significant trends, including how the number of foreign disinformation operations compares to domestic ones (slightly more domestic) and whether most disinformation appeared to be politically or financially motivated (the latter, but it’s not always possible to tell who is paying the shadowy PR firm).
The top countries Facebook identified as originators of most disinformation operations both domestic and foreign were Russia, Iran, Myanmar, the United States and Ukraine.
The countries that were most frequently targeted by foreign disinformation operations were the United States, Ukraine, Britain, Libya and Sudan.
As operations grow more sophisticated, it can become harder to distinguish them from authentic political activity, the report noted. That problem was particularly acute in the 2020 U.S. election, which the report described as “a watershed moment in the recent history of influence operations.”
Russia, Iran and China all tried to influence public debate ahead of the vote, apparently with limited results, the report said. The most elaborate effort involved the Russian Internet Research Agency hiring people in Ghana to impersonate Black Americans discussing politics and issues of race. Facebook also discovered a shadowy network run by people in Mexico who posted about issues of Hispanic pride and the Black Lives Matter movement. The report noted that the FBI later connected this operation to the Russian IRA.
By contrast, domestic disinformation had a much greater impact than foreign. The five U.S.-based operations the company exposed heading into the 2020 election featured domestic political players who were abusing Facebook’s rules.
Four out of the five were on the political right.
One was Rally Forge, a U.S.-based marketing firm that hired a staff of teenagers to sow disinformation and was affiliated with the pro-Trump political action committee Turning Point USA, The Washington Post first reported. The others were groups affiliated with the violent conspiracy theory QAnon, a website dedicated to promoting white identity and criticizing immigration, an “inauthentic” network tied to Trump advisor Roger Stone and the Proud Boys militia group.
In addition, shortly after the election Facebook took down an “inauthentic” network tied to former Trump advisor Steve Bannon. The company did not put this takedown into the report because it did not rise to the level of a full-scale disinformation operation.
One trend the report highlighted was the rise in “perception hacking,” in which the prospect of an influence operation helps cast doubt on the authenticity of public debate.
As the United States headed into the 2018 midterms, Facebook found that Russia’s IRA had created and publicized a website, usaira.ru, complete with an “election countdown” timer, on which the agency claimed to have created nearly 100 fake Instagram accounts.
“These fake accounts were hardly the hallmark of a sophisticated operation, rather they were an attempt to create the perception of influence,” the report noted.