In October, the Russian government hosted the first Russia-Africa Summit. More than 40 African heads of state arrived in Sochi to “identify new areas and forms of cooperation,” as Vladimir Putin noted in his greeting to participants.
A week later, Facebook announced that it had removed three networks of pages and accounts engaged in a long-term influence operation spanning eight African countries. Facebook, which had proactively identified a majority of the pages, attributed this operation to companies run by Yevgeniy Prigozhin, a man with close ties to Putin. Prigozhin is also the Russian oligarch U.S. authorities accused of bankrolling the Internet Research Agency — which the New York Times referred to as the “notorious Russian troll factory.”
Our team at the Stanford Internet Observatory worked with Facebook to identify and analyze these materials. As we show in a recent white paper, it is no accident that Prigozhin was running influence operations in Africa just as Russia was seeking to increase its influence there. Here's what we found.
This was a big operation
The operation targeted Libya, Sudan, Madagascar, the Central African Republic, Mozambique, Congo, Ivory Coast and Cameroon. We analyzed the networks targeting the first six of these countries, an investigation involving 73 Facebook pages and seven Instagram accounts. These pages had "likes" from more than 1.7 million accounts, though the number of unique users is probably smaller, since some accounts likely liked more than one page. The pages posted at a high rate, publishing more than 8,900 posts in October alone.
There were patterns, and familiar tactics
We saw consistent tactics across these Facebook pages. Some of these tactics were familiar to those who have studied Internet Research Agency activity: pages set up to resemble local news sources, posts duplicated and cross-posted to amplify engagement, and attempts to leverage original, country-specific memes to damage opposition leaders and Russia’s rivals on the continent, such as France.
But other tactics were novel. At times, the Russian disinformation companies employed local citizens as content creators, making it more difficult to trace pages back to their origin. And there were franchise-like centers — the two largest were in Egypt and Madagascar — directing teams of page administrators who produced steady streams of local-language content.
These tactics make disinformation operations harder to detect. Facebook offers a Page Transparency feature that, in most cases, shows users where a page's administrators are located. But a Libyan user who checked the pages targeting Libya would have seen a group of administrators located in Egypt, and none in Russia.
It’s all about gold, oil — and great power politics
What were these pages supposed to accomplish? Russia tends to conduct government-aligned influence operations in service of some geopolitical aim, so we also looked at the larger question of what Russia's, and Prigozhin's, aims in Africa are. Russia has recently expanded its presence on the continent, returning to Soviet-era foreign policy priorities and searching for new economic opportunities to offset the effects of Western sanctions levied after the 2014 annexation of Crimea.
Prigozhin and his companies, which include mining concerns and a private military company known as the Wagner Group, play a key role in this expansion. As Kimberly Marten has argued in an article on the Wagner Group, using Prigozhin's companies to pursue foreign policy aims around the world allows the Russian government to benefit from their actions when it is convenient and to disavow them when it is not. No less significantly, in Russia's informal, connection-based economy, Prigozhin's firms provide a channel for wealth to flow across the line that theoretically separates Russia's state interests abroad from the private interests of citizens like Prigozhin.
The Prigozhin-linked Facebook pages we analyzed are connected to a skein of interests, including mining rights, military contracts, fragile alliances and Russia's foreign policy priorities. Prigozhin's companies work in all of these areas, and our analysis of this influence operation suggests the pages and networks were expressly designed to build support for those companies' activities in the targeted countries.
This support could have many layers. On July 2, a post on one of the pages targeting Madagascar lauded the KRAOMA mine’s new Russian partners. It referred to cooperation with Ferrum Mining, which Russian investigators have tied to Prigozhin. Another Prigozhin project, a website called Afrique Panorama, published an article calling Ferrum’s work with KRAOMA “perfect” and shared it on its Facebook page.
These activities mesh with our understanding of what happened in the run-up to the 2016 presidential election, when Prigozhin’s Internet Research Agency was a crucial, informal-but-deniable part of a broad Russian effort to damage Hillary Clinton’s candidacy.
The stakes are similarly high for both Russia and Prigozhin in Africa, where great wealth and trade relationships hang in the balance. Our findings suggest this influence operation was designed to boost the political figures with whom Russia has aligned itself, such as President Faustin-Archange Touadéra in the Central African Republic.
Detecting disinformation has grown difficult
Our research focuses on the implications this influence operation has for understanding and combating disinformation. What happened in Africa suggests that Russian-sponsored operations are evolving — and likely to be harder to detect in the future as these efforts become more intertwined with the communities they are targeting on open platforms like Facebook.
These operations also target more opaque channels like WhatsApp and Telegram. As damaging as disinformation campaigns have been in the United States, they have the potential to be more harmful in the developing world, where high data costs make it expensive for users to check a suspicious story against other sources.
At the same time, we observed many social media users responding skeptically to the content on these pages. In Mozambique, users greeted a false story denigrating the opposition with comments like "fake news!"
In another example, we noted pages designed to build support for the son of the late Libyan leader Moammar Gaddafi. Libyans responded with comments like "it looks like you never lived in Libya under Gaddafi, otherwise you wouldn't post these imaginary things."
This suggests that the impact of these operations has limits — and that not all social media users passively consume hyperpartisan content.
Shelby Grossman is a research scholar at the Stanford Internet Observatory. She focuses on inauthentic information operations in Africa. Follow her on Twitter @shelbygrossman.