A carte blanche by Péter Krekó, Csaba Molnar and Lóránt Győri of the Political Capital Institute, Budapest.
Troll armies have become a key part of the Kremlin’s disinformation playbook. They first gained global attention in 2016, when the Internet Research Agency, run by Putin confidant Yevgeny Prigozhin, employed thousands of people at a St. Petersburg “troll farm” to interfere in key elections, including the US presidential race between Donald Trump and Hillary Clinton. Now the trolls are active again, on an entirely different scale, as part of Russia’s invasion of Ukraine, and they have become harder to spot since the geo-blocking of the major disinformation outlets Sputnik and RT and the restrictions on sharing their content on the world’s biggest social media platforms: Twitter, Facebook and YouTube.
Research published by the British government in May 2022 found that Russia has expanded its army of bots and trolls since invading Ukraine on February 24, 2022. A new army of trolls, linked to Prigozhin, founder of the Russian mercenary group Wagner, has entered the theater of war, the report warned. In addition to targeting mainstream media outlets and politicians, such as then British Prime Minister Boris Johnson and German Chancellor Olaf Scholz, troll activity focused on manipulating public opinion by injecting disinformation into the comment sections of various social media platforms (Facebook, Twitter, TikTok, Telegram). Meta researchers showed in August that the “Cyber Front Z” war trolls are linked to the troll factory run by Prigozhin, and a study published on November 6 found that “hibernated” social media accounts once linked to the IRA were active again, attacking President Biden’s handling of the Ukraine crisis ahead of the midterm elections.
Using a combination of algorithm-based text mining and qualitative analysis, researchers at our think tank, Political Capital, tracked and analyzed the activation of troll accounts and their dissemination strategies in the V4 countries (Hungary, Poland, Czechia and Slovakia), Germany, Italy and Romania following the invasion of Ukraine. Repeated use of stock photos and distinctive posting patterns, such as reposting the same comment verbatim across Facebook threads, revealed their inauthentic behavior. Our team examined repeated texts of at least five words that were posted at least 200 times on social media channels, and unearthed a series of results. First, our study of the V4 countries revealed notable country-specific differences in both activity and narratives. In Hungary and the Czech Republic, for example, we detected a large number of comparable posts aligned with pro-Kremlin narratives. Of the five stories circulating in the two countries, three claimed that (1) Ukraine was committing genocide in the Donbass, (2) neo-Nazis had taken over Ukraine, or (3) Ukraine was not a real state. In Poland, however, such tactics would not work, given the country’s widespread resentment of Russia. Instead, the posts sought to stoke geopolitical insecurity by suggesting that the ruling PiS party had mishandled national security and that NATO cooperation could drag Poland into war. In Germany, troll efforts focused on amplifying a sense of guilt in German public opinion. The main narrative also sought to reframe the war as a conflict between Russia and the West (the US and NATO), emphasizing alleged Western violations of promises made to the Soviet Union and Russia regarding NATO enlargement.
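The repeated-text heuristic described above (texts of at least five words posted at least 200 times) can be sketched in a few lines. This is an illustrative reconstruction, not Political Capital’s actual pipeline; the function names and the normalization step are assumptions:

```python
from collections import Counter

# Thresholds taken from the description above: texts of at least
# 5 words that were posted at least 200 times are flagged.
MIN_WORDS = 5
MIN_REPEATS = 200

def normalize(text: str) -> str:
    """Collapse whitespace and lowercase so trivial edits don't hide a repeat."""
    return " ".join(text.lower().split())

def find_repeated_comments(comments, min_words=MIN_WORDS, min_repeats=MIN_REPEATS):
    """Return {text: count} for comments reposted verbatim above the thresholds."""
    counts = Counter(
        normalize(c) for c in comments if len(c.split()) >= min_words
    )
    return {text: n for text, n in counts.items() if n >= min_repeats}

# Toy corpus with a lowered threshold for demonstration:
corpus = ["Ukraine is not a real state and never was"] * 3 + ["hello there"]
print(find_repeated_comments(corpus, min_repeats=3))
# → {'ukraine is not a real state and never was': 3}
```

In practice such counting would run over scraped comment dumps per country and be followed by the qualitative review the article describes, since a high repeat count alone does not prove coordination.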
Second, we discovered that many fake stories start life in Moscow. Three messages repeated by trolls in Hungary were easily identified as such: “Ukraine does not exist”; “NATO’s new dictatorial world order”; and the “last eight years of genocide in the Donbass”. Of these, the first emerged from an organization linked to the pro-Putin Ukrainian oligarch Viktor Medvedchuk, which cited the separatists’ “news agency” as the source of the information. An AFP investigation revealed that the same content circulated in Greek, German, English and Bulgarian.
The flow of messages from the Kremlin has also increased in some areas. Our investigation detected Kremlin disinformation in popular outlets, including RTL, RTL Aktuell, Sat1 and ZDF Heute, which enjoy large audiences in their respective countries. The narratives have also made their way to sympathetic political administrations. In Hungary, for example, opinion pieces in pro-government mainstream media echoed the false claim of “genocide” or “ethnocide” committed against the Russian or Hungarian minority, a reference to a long-standing diplomatic dispute between Kyiv and Budapest over minority language rights. Politicians from the German AfD, the Romanian Social Democrats, Robert Fico in Slovakia and the Trikolóra party in the Czech Republic have also given oxygen to Kremlin narratives.
…and their mistakes
Ultimately, however, it was the trolls’ own mistakes that allowed us to identify their activity. In several cases, for example, supposedly Slovak Facebook users commented on Czech Facebook pages in Hungarian, and Italian profiles commented on Colombian Facebook pages in Italian. Even more telling were profiles that simultaneously shared overtly pro- and anti-Kremlin narratives. These errors suggest that the Russian operators behind the profiles forgot to switch accounts before moving on to another country. We also found that fake and stolen profiles were most commonly used to spread these stories.
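Language mismatches like these can be surfaced automatically once comments are annotated with a detected language and the country of the page they appeared on. A minimal sketch, with illustrative field names and an assumed language-to-country mapping:

```python
# Flag profiles whose comment language does not match the page's country,
# e.g. a "Slovak" profile commenting in Hungarian on Czech pages.
# The mapping and field names below are illustrative assumptions.
EXPECTED_LANGS = {"CZ": {"cs", "sk"}, "SK": {"sk", "cs"}, "HU": {"hu"}, "CO": {"es"}}

def flag_mismatches(comments):
    """Return profile IDs that comment in a language foreign to the page's country."""
    flagged = set()
    for c in comments:
        expected = EXPECTED_LANGS.get(c["page_country"], set())
        if expected and c["lang"] not in expected:
            flagged.add(c["profile_id"])
    return flagged

comments = [
    {"profile_id": "A", "page_country": "CZ", "lang": "hu"},  # mismatch
    {"profile_id": "B", "page_country": "CZ", "lang": "cs"},  # plausible
]
print(flag_mismatches(comments))  # → {'A'}
```

A real system would need a language-identification step (and tolerance for bilingual regions, which is why Czech pages above accept Slovak as well), but even this crude check catches the operator errors the article describes.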
What the EU must do
The extent of Russian troll activity in Europe should be of concern to political leaders and citizens at large. Inauthentic online influence operations are easy to fabricate and inexpensive to execute.
It is therefore imperative that the EU adopt appropriate legislation, beyond the Digital Services Act, and develop the technical capacity to better recognize inauthentic behavior online. Ultimately, though, the silver bullet lies with the social media companies themselves and their appetite for tackling inauthentic networks transparently. European lawmakers should make it a top priority to ensure that these platforms quickly comply with demands to combat disinformation and, in addition, introduce their own tools to prevent troll activity from spreading.