More-Troll Kombat

Graphika Report

Tuesday, December 15, 2020

Graphika & The Stanford Internet Observatory

French and Russian Influence Operations Go Head to Head Targeting Audiences in Africa

On December 15, Facebook announced that it had taken down three separate networks for engaging in “coordinated inauthentic behavior” targeting communities across Africa. One, centered on the Central African Republic (CAR) and Mali, was linked to individuals associated with the French military. The other two, centered respectively on CAR and Libya, were connected to the business and influence operations of Russian oligarch Yevgeniy Prigozhin, founder of the mercenary organization Wagner Group and the Internet Research Agency “troll farm.” The French and Russian operations in CAR tried to expose each other and repeatedly clashed in groups, comments, and cartoon wars.

We have documented the first of the Russian operations in a joint report with Stanford University entitled “Stoking Conflict by Keystroke”; this report focuses on the French and Russian operations that targeted CAR. For brevity, we refer to the operation linked to individuals with ties to the French military as the “French operation,” and to the operation linked to individuals associated with past Internet Research Agency (IRA) activity and with previous operations attributed to entities associated with Russian financier Yevgeniy Prigozhin as the “Russian operation.” It is worth highlighting that Facebook did not attribute the French operation directly to the French Government or the French military, and this report similarly does not offer evidence of institutional involvement by French governmental and military entities.

Facebook’s takedown marks a rare exposure of rival operations from two different countries going head to head for influence over a third country. It underscores how geopolitical sparring on the ground in Africa is playing out in parallel online: not just on Facebook, but also on Twitter, on YouTube, and in long-form news articles written by the operations. Before the takedown, Facebook shared assets with Graphika and the Stanford Internet Observatory for independent analysis.

The clash between the two troll operations in CAR sets this exposure apart. From January 2020 until the takedown, the rival influence operations posted in the same groups, commented on each other’s posts, called each other out as “fake news,” conducted basic open-source analysis to expose each other’s fake accounts, friended each other, shared each other’s posts, and even, according to one source, tried to entrap each other with direct messages. This report is a case study in a battle between rival influence operations; for that reason, we have titled this report, which exposes both operations and their overlap, “More-Troll Kombat.”

The rivalry in CAR was a significant part of both operations’ activity, but it was by no means the only part. Overall, the Russian operation was focused on Southern Africa and CAR; according to Facebook’s statement, it “relied on local nationals from Central African Republic and South Africa.” This is in line with earlier Prigozhin-related operations, similarly exposed by Facebook, ourselves, and others, that co-opted locals, often unwittingly, in Ghana, Nigeria, and the United States. The operation posted heavily about local politics and the forthcoming CAR elections, and praised Russia’s engagement in CAR. It also attacked France and the local United Nations mission. A few Russian assets posted about an alleged “coup attempt” in Equatorial Guinea in July-August 2020.

The French operation was focused on Mali and CAR, and to a lesser extent on Niger, Burkina Faso, Algeria, Côte d’Ivoire, and Chad; according to Facebook’s statement, it was linked to “individuals associated with French military.” In CAR, it posted almost exclusively about Russian interference and Russian trolls. Unlike the Russian operation, it did not post systematically about electoral politics and avoided commenting on the upcoming election and its candidates. In Mali, the French assets mainly posted about the security situation, praising the Malian and French armed forces and attacking the jihadist groups they are combating.

The operations showed significant differences, notably the Russian operation’s reliance on local nationals (witting or unwitting) and the French operation’s avoidance of electoral topics. However, when they clashed in CAR, they resembled one another. Each side trolled the other with insulting videos and memes; each side made false accusations against the other; each side used doctored evidence to support its accusations. Some Russian assets posed as news outlets, while some French ones posed as fact-checkers. Both used stolen profile pictures (and, in the case of the French network, AI-generated profile pictures) to create fake personas for their networks.

This underscores the key concern revealed by Facebook’s latest findings. To judge by its timing, content, and methods, the French operation was, in part, a direct reaction to Facebook’s 2019 exposure of Prigozhin’s troll operations in Africa. Yet its tactics were very similar. By creating fake accounts and fake “anti-fake-news” pages to combat the trolls, the French operators were perpetuating and implicitly justifying the problematic behavior they were trying to fight.

This is damaging in (at least) two ways. For the operators, using “good fakes” to expose “bad fakes” is a high-risk strategy likely to backfire when the covert operation is detected, as noted in a ground-breaking 2018 French government report on information manipulation. More importantly, for the health of broader public discourse, the proliferation of fake accounts and manipulated evidence is likely to deepen public suspicion of online discussion, increase polarization, and reduce the scope for evidence-based consensus.

Covert influence operations like those that targeted CAR are a problem for the health and credibility of democratic debate. Setting up more covert influence operations to counter them is not a solution. 
