Facebook announced Wednesday that it had removed three networks of accounts it says are associated with Russian oligarch Yevgeniy Prigozhin, the Kremlin-linked businessman U.S. authorities charged with interfering in the 2016 presidential election.

The networks — which included more than 170 Facebook accounts, pages and groups, as well as Instagram accounts, with nearly 1 million followers overall — targeted eight African nations with messages intended to bolster Russia’s political and commercial priorities. Some of the images featured Russian President Vladimir Putin alongside African leaders.

Facebook linked the coordinated influence operations to Prigozhin, who is known as “Putin’s chef” because of his politically connected catering business. He funded and oversaw the Internet Research Agency, based in St. Petersburg, which inundated Facebook, Twitter and other social media with messages designed to help elect President Trump, discourage African Americans from voting and inflame U.S. political discourse, authorities have said.

The African operations showed some similarities to that effort, including the willingness to promote messages on both sides of a political debate, but they also showed new tactics, such as the involvement of local partners in the Russian effort, said Nathaniel Gleicher, Facebook’s head of cybersecurity policy. He said that reliance on locals gives authorities at the national level more opportunities to detect and potentially shut down operations, while also increasing the complexity for the Russians of building and maintaining their disinformation networks.

“This is the first time we’ve seen Russian actors franchising their operations over to Africans … to amplify their messages,” Gleicher said.

He added, “Part of what we’re seeing is these threat actors trying to evolve … but we’re evolving a bit faster, and we’re getting better at detecting them.”

Facebook’s announcement underscored the increasingly global nature of disinformation campaigns and the extent to which Russian operatives have not been chastened by the U.S. response to the 2016 interference campaign.

Rather, the Russians appear to be advancing their tactics and relying on new means of deception, such as using messaging groups, including Facebook-owned WhatsApp, to deliver disinformation. The targets, according to Facebook, were in Madagascar, the Central African Republic, Mozambique, Congo, Ivory Coast, Cameroon, Sudan and Libya.

The announcement also highlighted how disinformation is being carried out by quasi-private groups with shadowy links to nation-state actors. In this case, researchers said, the activity bore links to the Wagner Group, which experts have tied to Prigozhin. The Wagner Group has been characterized as a “semi-state” security organization. The rise of these opaque entities has accompanied Russia’s effort to assert itself on the international stage.

Prigozhin was indicted in 2018 by special counsel Robert S. Mueller III, who reported that the Internet Research Agency had an extensive network of paid human operatives, often called trolls, with ambitions to influence politics in many nations through social media. Efforts to reach Prigozhin through a Washington attorney representing his catering business were not successful.

The closing of the Russian-linked accounts — coming a week after Facebook also closed Instagram accounts targeting Democratic presidential candidate Joe Biden — shows how the company is moving forcefully against foreign sources of disinformation at a time when some critics say it is not doing enough to curb domestic sources. The company has faced particularly withering attacks for a policy, announced last month, allowing U.S. politicians to lie on its platforms without consequence.

Assisting Facebook in its investigation of the Russian disinformation networks were researchers from the Stanford Internet Observatory, headed by former Facebook chief security officer Alex Stamos.

The Stanford researchers said the disinformation network targeting Libyans appeared to be tied to Prigozhin and the Wagner Group. A key piece of evidence was communications about the effort originally collected by the Dossier Center, a project tracking Russian criminal activity that was founded by exiled business executive Mikhail Khodorkovsky.

Stanford researcher Shelby Grossman said the African disinformation operations that Facebook shut down resembled those of the Internet Research Agency in targeting U.S. audiences before and after the 2016 campaign. The messages were highly visual and shrewdly targeted, displaying a facility with issues designed to resonate locally.

But the tactics also showed evolution and a willingness to experiment with new methods for delivering messages and evading detection.

“They’re definitely trying out more things,” Grossman said.

One novel tactic, used in a page supporting Mozambique’s ruling party, was a seemingly benign online contest asking Facebook users to post images of themselves doing good deeds, presumably to increase engagement with the page. “The photo that has the most likes and reposts will be painted on a canvas by a plastic artist, as a gift, and published on our page,” said the post, written in Portuguese.

More familiar Russian disinformation tactics included the spreading of false and misleading reports, in some cases from accounts purporting to represent news organizations.

One Facebook page targeting Sudan, called Radio Africa, ran a post showing a picture of Putin below text reading, “American and British intelligence put together false information about Putin’s inner circle. … A diplomatic military source said that American and British intelligence agencies are preparing to leak false information about people close to the president, Vladimir Putin, and the leadership of the Russian defense ministry.”

The disinformation campaign in Sudan reposted stories from the Russian government’s news outlet Sputnik. The one in Libya in some cases supported aspiring politicians, including the son of late Libyan leader Moammar Gaddafi, who was deposed and killed in 2011.

Kimberly Marten, chair of the political science department at Barnard College and an expert on international security, cautioned against seeing the operations in Africa as pilot projects for interference in Western democracies. The best evidence dates the Wagner Group’s presence on the continent to 2018, she said, meaning that its online activities may have been inspired by Russia’s success in interfering in U.S. and European elections, rather than serving as jumping-off points for operations elsewhere.

“When we look at what Prigozhin is doing — and presumably he’s doing it under Putin’s guidance — he’s throwing spaghetti against the wall and seeing what sticks,” Marten said. “They’ve been doing so much in both the U.S. and Europe that they don’t need to start in Africa to have an impact there. It’s more likely they’re taking techniques that have worked and seeing if they work in Africa.”

Marten also said social media may not be as useful a platform for disinformation in Africa as it is in other parts of the world because of lower literacy rates. What the Wagner Group seems to be doing in Africa, the security scholar said, is cultivating “personal relationships with leaders in a neocolonial model,” propping up their political endeavors with positive stories that may then travel via the rumor mill.

The best response from the United States, she said, would be to continue with multilateral engagement and “not take part in a great power scramble for Africa.”