Putin fans or Kremlin bots?
If, while chatting on a social network or simply scrolling through your feed, you believe you are observing a cross-section of society and its reaction to current events, you are deeply mistaken and have fallen into a carefully laid trap. Social networks have long ceased to be a space of pure freedom; they are now flooded with an army of trolls, bots and paid influencers who make up the infrastructure of online political astroturfing. The term originated in the USA in the mid-1980s and derives from the brand name AstroTurf, a stadium surface that imitates natural grass (a play on "grassroots"). In the same way, social or political astroturfing imitates grassroots initiatives and campaigns in order to falsify "popular opinion": it creates a false impression of prevailing sentiment, simulates a pro-government majority and intimidates doubters with the apparent intensity of its mobilization and determination, much as happens in elections rigged by authoritarian regimes and in the traditional media they control.
As the authors of the published study write, over the last decade in Russia a powerful infrastructure of a special “network autocracy” has been created, implementing “third generation control” strategies aimed not at limiting, but at proactively shaping network content. As the authors show, analyzing the strategies of network astroturfing and the real reaction of users to the war in Ukraine and the “partial mobilization”, social networks today are an arena of confrontation and struggle between the community of real users and the “astroturfing army” created by the regime. Astroturfing strategies in this fight are quite diverse and variable, but online propaganda is not always successful, although it seriously distorts our ideas about “grassroots sentiments.”
(The English version of the first part of this article was originally published by Russia.Post, whom we thank for their cooperation.)
1. “Third generation control”: network astroturfing infrastructure
For the Russian authoritarian regime, the public reaction to the announcement of "partial mobilization" was far from desirable. Within two weeks, some 700,000 people had fled the country to avoid the draft, mass protests had broken out across the country, and approval ratings for the president and key political institutions had declined. To counter this backlash, the Kremlin mobilized both traditional and online media. State media received new "temniks" (internal guidance memos) instructing journalists to emphasize that the "partial mobilization" was very limited and would affect only 1% of reservists, young men with military experience. In addition, the authorities mobilized bloggers who promoted the hashtag #dontpanic and compared the 1% of reservists to be mobilized with 1% of the contents of a woman's cosmetic bag, of a serving of French fries or of a pack of chocolates, in order to downplay the scale of the event.
Once considered a "liberation technology," social media is now increasingly being used by authoritarian governments to spread propaganda, control political narratives, and create versions of "networked authoritarianism." Social networks have become an integral part of the Kremlin propaganda machine. While television remains the primary propaganda tool, it is complemented by vast networks of bots, trolls and paid influencers that are used to steer public opinion and silence critical voices.
Under a model of “third-generation control” —techniques that are used to proactively shape rather than restrict the online sphere—Putin’s regime is attempting to influence online discussions through the use of trolls and bots. Perhaps the most prominent example is the notorious Kremlin-linked Internet Research Agency (IRA), which conducted several information campaigns in an attempt to sow discord in the United States and influence the results of the 2016 election.
How do trolls work? After analyzing accounts associated with the IRA, researchers suggested that "foreign" trolls mainly intervene in discussions around existing disagreements within a society, seeking to increase polarization or sow fear. For example, IRA-linked American accounts assumed the role of citizens with far-right or far-left views to amplify radically opposing viewpoints, or simply spread disinformation about fictitious disasters to sow fear.
The direct impact of trolls is difficult to assess because they behave like real users. However, research based on leaked lists of accounts from troll factories shows that in Russia their work in imposing pro-government narratives in online discussions is not very successful. They are much more effective at distracting citizens from substantive political discussions and reducing the degree of criticism of the government online.
Compared to human-run troll accounts, there are many more bots on social media. For example, researchers estimate that the share of bots among accounts writing about politics in the Russian Twitter sphere during important events, such as the annexation of Crimea in 2014, can reach 80%. Unlike trolls, who participate in discussions to influence users' beliefs, bots perform many other functions. For example, they create the appearance of popularity of officials by following the accounts of ministers, governors, etc. (according to research, from 13% to 63% of the followers of Russian governors on Instagram may be bots). In addition, bots create information noise during important political events such as rallies, making it difficult for potential protest participants to find relevant information. Finally, bots post news headlines and links and manipulate search engine rankings, increasing the visibility of propaganda sources in search queries. Bots are used by various political forces, not just the regime: in 2015–2017, for example, there were approximately equal numbers of pro-Kremlin, anti-Kremlin and neutral bots on Twitter. Similarly, the discussion of the Russian-Ukrainian war on Twitter is currently influenced by both pro-Russian and pro-Ukrainian bots.
Overall, over the past ten years the Kremlin has built a powerful digital infrastructure that includes thousands of regime-controlled and paid bloggers and trolls, as well as automated bots. Following China, Russia has built its own model of "network authoritarianism," based on a complex system of digital political astroturfing that imitates "popular opinion." After the February invasion of Ukraine, this infrastructure was tasked with spreading invasion propaganda. But how much does it really help control the beliefs and sentiments of citizens?
2. Successes and failures of the “partial mobilization” propaganda
To better understand the Kremlin's online astroturfing strategies, we collected messages related to the Russian-Ukrainian war published in Russian media and on social networks. We used the Brand Analytics monitoring system to extract content from social networks and the Scan Interfax monitoring system to extract media content. This yielded a large corpus (47,553 messages from the media and 833,518 from social networks), on which we tracked the dynamics of key terms and message topics during July–August 2022. Our analysis shows that the Kremlin was preparing for mobilization before it was announced, using regime-controlled accounts to portray the invasion as a war with NATO, to rally people around Putin against an existential threat to the Fatherland, and to dehumanize Ukrainians. We saw how the propaganda machine coordinates across its vast networks to counter criticism of the war from skeptics and opponents of the regime.
Preparing the population for mobilization
In our previous report, we analyzed changes in the dynamics and content of Kremlin military propaganda in the first half of 2022. We showed that the initial tsunami of television coverage of the "special military operation" had subsided and stabilized by mid-summer. Further analysis, however, showed that social networks have their own dynamics: there, pro-war users increasingly spread the thesis that the "SVO" is in fact a war and requires more decisive action. These differences are clearly visible in the shifting prominence of the main justifications for the war ("denazification", "demilitarization", "protection of the people of Donbass/the Russian language").
Figure 1a. Dynamics of various justifications for war in social media (normalized frequencies - terms used per 100,000 words at three-day intervals)
Figure 1b. Dynamics of various justifications for war in official media (normalized frequencies - terms used per 100,000 words at three-day intervals)
Figure 1 shows trends in normalized frequencies (terms used per 100,000 words at three-day intervals) and reveals spikes in social media that do not necessarily coincide with spikes in the usage of specific terms in mainstream media. Focusing on social media posts, we found several coordinated attempts in July–August to promote a mobilization-justification narrative containing three main elements: 1) NATO as the real enemy that Russia is fighting, 2) the need for patriotic unity, and 3) the dehumanization of Ukraine and its citizens.
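The normalization used throughout the figures (term occurrences per 100,000 words, aggregated in three-day bins) can be sketched as follows. The toy messages and term list here are illustrative assumptions, not the study's actual data:

```python
from datetime import date

# Toy stand-in for the corpus of timestamped social-media posts
# (illustrative only; the real corpus has 833,518 messages).
posts = [
    (date(2022, 7, 1), "denazification of ukraine and demilitarization"),
    (date(2022, 7, 2), "demilitarization is the stated goal"),
    (date(2022, 7, 4), "protection of donbass residents"),
]

TERMS = {"denazification", "demilitarization"}  # tracked justifications

def normalized_frequencies(posts, terms, start, days=3):
    """Occurrences of tracked terms per 100,000 words in
    consecutive `days`-long bins (the article's normalization)."""
    bins = {}
    for d, text in posts:
        idx = (d - start).days // days          # which three-day bin
        words = text.lower().split()
        hits, total = bins.get(idx, (0, 0))
        hits += sum(w in terms for w in words)
        total += len(words)
        bins[idx] = (hits, total)
    return {i: 100_000 * h / t for i, (h, t) in bins.items()}

print(normalized_frequencies(posts, TERMS, start=date(2022, 7, 1)))
# bin 0 (Jul 1-3): 3 hits / 10 words -> 30000.0; bin 1 (Jul 4-6): 0.0
```

Normalizing by total word count, rather than plotting raw counts, makes spikes comparable across platforms and media whose overall posting volume differs by orders of magnitude.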
Mentions of the NATO threat continue to play an important role in war narratives on social media, even more so than in traditional media. In August, pro-war groups twice launched large-scale efforts to promote video and text messages that used strong language portraying Ukraine as a NATO satellite and Ukrainians as nationalists. The large number of identical messages posted at the same time suggests that they were promoted in a coordinated manner by a network of government-controlled accounts. Since most of them contained mobilizing rhetoric—for example, the idea that the "SVO" is Russia's last stand against the West—their purpose was most likely to lay the groundwork for the announcement of "partial mobilization" on September 21.
Figure 2. Mentions of NATO in the context of the Russian-Ukrainian war (normalized frequencies - term usage per 100,000 words at three-day intervals)
Patriotic Unity
The call for "patriotic unity" is another central pillar of the mobilization narrative. For example, on August 3, 5% of all war-related posts on social media (903 out of 17,196) promoted the video "New Challenges for Russia," created by the pro-war nationalist group Essence of Time. The video accuses the West of attacking Russia and calls for patriotic unity in response to military failures and internal challenges. The spike around August 22–25 reflects discussions around Ukraine's Independence Day, but it also represents another coordinated attempt to shape public opinion: on August 25, 17% of all posts about the war (3,312 out of 20,006) promoted the video "Why Russia Can't Survive a Long Conflict with the West," created by the same group. It argued that the answer to military failures and internal challenges must be sacrifice and patriotic unification; otherwise Russia is doomed.

Dehumanizing the enemy
In addition to presenting the invasion as an existential war to which Russians must respond with patriotic unity, online discussions about the war are replete with dehumanizing language. Dehumanization is a common tool of warring governments: it incites hatred and undermines compassion for the enemy, which makes mobilization easier. We combined the terms used by state media to dehumanize Ukrainians ("Ukrofascists", "Ukrops", "(neo-)Nazis", "terrorists", "fanatics", etc.) into a single "dictionary".

Figure 3. Use of dehumanizing language (normalized frequencies - terms used per 100,000 words at three-day intervals)
Figure 3 shows the trend of messages from July to August and demonstrates the astonishing volume of dehumanization in both traditional media and social networks. Language inciting hatred of the Ukrainian people is even more prominent than the key terms used to justify the war. To reinforce the dehumanizing labels, propaganda uses emotionally charged verbs and adverbs: "frenziedly killed", "a population intoxicated by vile Banderism", etc. This narrative presents Ukrainians as a bloodthirsty, fanatical and brainwashed people, which, evidently, is meant to keep Russians from feeling compassion for their relatives, friends and ordinary citizens in Ukraine.
Although dehumanizing statements are a constant presence on social networks, an anomalous spike on August 12 is clearly visible in Figure 3. It points to coordinated attempts to amplify the anti-Ukrainian narrative through state-controlled accounts or bots. For example, on August 12–13, 16% (2,993 out of 19,220) of messages with dehumanizing language were promoting the nationalist video "Whom Are the Banderites Afraid Of." On August 13–14, 10% of such vocabulary came from the nationalist video "NATO Made Ukraine Its 'Six'" ("six" is Russian criminal slang for a lackey), created by the pro-war nationalist group Essence of Time.
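The dictionary-based matching described above can be sketched in a few lines. Counting hits on stems rather than whole words lets a single pattern catch Russian inflected forms; the stem list here is an illustrative assumption, far shorter than the study's actual dictionary:

```python
import re

# Illustrative stems of dehumanizing labels; matching on stems lets
# one pattern catch Russian case endings and plurals.
DEHUMANIZING_STEMS = ["укрофашист", "неонацист", "террорист", "фанатик"]

# One alternation: each stem must start at a word boundary, then any
# word characters (Python's \w is Unicode-aware, so Cyrillic matches).
PATTERN = re.compile("|".join(rf"\b{s}\w*" for s in DEHUMANIZING_STEMS))

def count_dehumanizing(text: str) -> int:
    """Number of dictionary hits in a single message."""
    return len(PATTERN.findall(text.lower()))

print(count_dehumanizing("Эти укрофашисты и террористы..."))  # -> 2
```

Summing such per-message counts within three-day bins and dividing by the bin's word count yields the normalized series plotted in Figure 3.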
Propaganda failures
However, not everything the propaganda machine promotes for mobilization works. In July 2022, two "temniks" distributed by the Kremlin to coordinate media coverage of the war fell into the hands of Meduza. Among their theses were comparisons of the "SVO" with the baptism of Rus' and justifications of the war as a struggle between Russian Orthodoxy and atheists. We used keywords from these "temniks" to see whether this vocabulary resonated in online discussions.
Figure 4. The language of the Kremlin theses (normalized frequencies - the use of terms per 100,000 words at three-day intervals)
As Figure 4 shows, the official media indeed picked up the themes formulated in the Kremlin "temniks" at the end of July. Online discussions, however, followed the official media agenda only to a very limited extent: the language of the "temniks" appears on social networks half as often, indicating that it did not resonate with users.
In addition, despite the criminalization of public anti-war statements, social network users continue to discuss the war in terms of "Russian invasion", "occupation of Ukrainian territories", "annexation of Crimea/Donbass/parts of Ukraine", "Russian occupation", "aggression against Ukraine", etc. This suggests that state propaganda cannot fully suppress anti-war sentiment online. The frequency of these phrases increases significantly toward the end of summer 2022.
Figure 5. Anti-war rhetoric (normalized frequencies—terms used per 100,000 words at three-day intervals)
"Partial mobilization" became a key element of the Kremlin's military strategy and certainly influenced public sentiment in Russia, becoming an enormous source of stress for Russians. Our analysis shows that even before it was announced, the Kremlin relied on regime-linked nationalist social media users who advocated a universal call to arms; the main elements of their message were the existential threat from NATO, the need for patriotic unity and the dehumanization of the Ukrainian nation.
Analysis of the online propaganda campaign suggests that the Kremlin used social networks to maneuver and compensate for two emerging problems: military failures in Ukraine and the failures of television propaganda. However, as mobilization brought the war closer to ordinary people, anti-war sentiment became even more visible online, and many propaganda clichés failed to resonate with users. It is difficult to say whether these cracks in the Kremlin's war narratives and the "intrusion" of the war into citizens' daily lives are enough to provoke political turbulence that would seriously pressure Kremlin policy.
3. Differences in astroturfing strategies: VKontakte, Odnoklassniki and Telegram
Studies of propaganda narratives on Russian social networks that have emerged in recent months have tended to focus on one platform: Telegram (TG). Compared to other Russian social networks, TG is more diverse: there are many pro-regime and opposition channels. In addition, attention has been drawn to this platform due to the presence of influential war bloggers there - “military correspondents” and radical nationalists. Because they regularly criticize the conduct of a “special military operation,” experts and journalists often view the authors and audience of these channels as those who could potentially challenge the Kremlin’s policies.
However, TG is only a small part of Russia's vast online social media space. Two other Russian social networks—VKontakte (VK) and Odnoklassniki (OK)—have either similar or much larger audiences. Because political discussions vary widely across social platforms, we'll look at them from a comparative perspective.
Using the approach described above with the Brand Analytics monitoring system, we extracted 1,544,918 messages (more than 352,000,000 words) about the Russian-Ukrainian war published by users on three social networks in July–September 2022. TG accounts for 3% of our corpus, VK for 60%, and OK for 37%. This distribution likely reflects differences in the platforms' data protection policies. Given the unequal number of messages, the comparison results should be interpreted with caution.
OK is considered the most pro-government platform and has a much older audience than VK and TG. According to numerous polls, age is positively correlated with support for the regime, the invasion of Ukraine, and "partial mobilization." Our results provide further evidence of the relationship between audience socio-demographic characteristics and political preferences: we show that OK is the most pro-war platform, followed by VK, while TG is the most polarized space, containing both pro-war and anti-war positions.
Given these platform features, one would expect VK and TG to become prime targets for regime-controlled account networks. However, we found that the activity of regime-controlled accounts is more noticeable in OK than in VK and TG.
Ecosystem of Russian social networks
Different platforms differ radically in how they organize online communication. The Russian segment of LiveJournal, and more recently Facebook and Twitter, are known for their heated political discussions, while in their early days OK and VK were largely apolitical. With its strong focus on user safety, privacy and freedom of expression, the Telegram messenger became an important communication and coordination tool during mass protests in Hong Kong, Russia and Belarus.
According to data for 2021, the networks ranked as follows by reach of the Russian audience: VK - 44%, YouTube - 37%, Instagram - 34%, OK - 30%, Telegram - 21%, TikTok - 16%, Facebook - 10%, "My World" (Mail.ru) - 5% and Twitter - 4%. Beyond human users, these platforms are filled with semi-automated and automated accounts: bots (automated accounts that distribute pre-prepared messages) and cyborgs (humans who rely on algorithms to help create and distribute content). After Facebook and Instagram were banned in Russia in the spring of 2022 (though they remained accessible via VPN services), VK, OK and TG strengthened their dominant positions as the main social networks in Russia: the VK audience increased to 62%, the OK audience to 42%, and the TG audience to 55%.
The Odnoklassniki network, sometimes disparagingly called "Odnoglazniki" (a pun on "one-eyed"), is often considered a space for Putin's electorate. Its audience is much older than that of other platforms. As of 2021, 7.4% of OK users were under 24, 16.9% were 24–34, 25.2% were 34–44, and a dominant 49.5% were over 45. The vast majority of public groups on OK are anti-Western, pro-Kremlin communities that form the basis of the "virtual Russian world" not only in Russia but also in other countries with significant Russian-speaking populations (for example, the Baltic states). Although VK is part of the Kremlin's propaganda machine, its audience is much younger: according to 2021 data, a dominant 31.3% of VK users were under 24, 29% were 24–34, 21.8% were 34–44, and only 18% were over 45. Finally, the audience of TG, a relative newcomer to the market, is slightly older than that of VK: as of 2021, 29.6% of TG users were under 24, a dominant 30.6% were 24–34, 21.3% were 34–44, and only 18.5% were over 45. As mentioned, qualitative studies and polls show that age is strongly correlated with support for the regime and for the invasion of Ukraine. Given these clear socio-demographic differences, we expect to see ideological differences in online discussions of the war on OK, VK and TG.
Ideological orientation of discussions in OK, VK and TG
To understand the balance of pro-war and anti-war sentiment across platforms, we compared the frequency of words and phrases of anti-war rhetoric, the distribution of terms used by the regime to justify the invasion, such as “denazification” and “demilitarization,” and the use of the “vocabulary” of dehumanization described above.
Figure 6. Dehumanization vocabulary across platforms (normalized frequencies—terms used per 100,000 words at three-day intervals)
Figure 6 shows that dehumanizing language is prominently present on all three platforms. We record two significant spikes, immediately after September 21 and September 30, coinciding with the announcement of mobilization and the Kremlin's decision to officially annex four regions of Ukraine. Markers of dehumanizing language are most common in posts on OK: on September 30, OK carried more than five times as many dehumanizing expressions as VK and TG, platforms whose socio-demographic profiles are less pro-Kremlin. However, the sudden appearance and disappearance of a huge volume of dehumanizing vocabulary on OK, against a stable level on VK and TG, indicates the artificial nature of the phenomenon. These fluctuations suggest that the language of dehumanization on OK is a product of astroturfing, generated by bots and paid influencers.
To better understand the differences and dynamics of pro-war and anti-war narratives, we examined the use of anti-war vocabulary. Figure 7 shows the aggregate frequencies of keywords and formulas typical of the language of Kremlin opponents criticizing the invasion—for example, "Russian aggression", "annexation", "occupation of Ukrainian territories", "Russian invasion", "occupation of Donbass", "occupation of Crimea", "Russian occupiers", etc. The graphs reflect fluctuations in the use of anti-war expressions on the different platforms from July to September.
Figure 7. Anti-war rhetoric across platforms (normalized frequencies—terms used per 100,000 words at three-day intervals)
As expected, TG is the most anti-war platform of the three. It is followed by VK and OK. Although TG has many pro-Kremlin channels, independent media channels are also widely represented there and generate a significant amount of anti-war content. The least amount of anti-war language is present in OK, the most pro-government network in terms of socio-demographic profile, with an even more significant decrease by the end of September.
The frequency of key terms used by the regime to justify the invasion—"denazification", "demilitarization", "protection of the residents of Donbass and the Russian language"—further demonstrates the artificial nature of pro-war discourse on OK.
Figure 8a. Justification for the invasion: “denazification” (normalized frequencies - the use of terms per 100,000 words at three-day intervals)
Figure 8b. Invasion rationale: "demilitarization" (normalized frequencies - terms used per 100,000 words at three-day intervals)
The terms "denazification" and "demilitarization" disappeared from television in the spring of 2022, as they found no resonance with the population. Nevertheless, they remained part of the pro-war propaganda the regime disseminated on social networks in the summer of 2022. TG remains a relatively free space largely devoid of official rhetoric, while VK users show a moderate tendency to reproduce official narratives about the war. However, the frequency of these terms on TG and VK is consistently low compared to the anomalous peaks on OK, which correspond to a fivefold higher frequency in OK posts than in VK posts. Note that the term "demilitarization," actively reproduced by regime-controlled accounts over the summer, disappeared from astroturfed posts by the fall of 2022. The highly discontinuous pattern of post frequency on this topic (compared to the other platforms) points to its artificial nature.
Discussions for and against mobilization on different platforms
The “partial mobilization” announced by Putin on September 21 proved to be a litmus test for understanding the real reaction of the Russian population to the invasion of Ukraine. It is much easier to support an abstract war waged by professional soldiers somewhere far away than a war that directly invades people's lives - and takes away those lives.
How did users of the various social networks react to the "partial mobilization"? For pro-war users, it was a long-awaited decision that they welcomed. Anti-war users quickly turned the word "mobilization" (mobilizatsiya) into "mogilization" (mogilizatsiya, from mogila, "grave") and began a campaign of resistance. The graphs in Figure 9 show the frequency of three terms—"mogilization," "stop the war," and "cannon fodder"—used by critics of the September 2022 mobilization.
Figure 9a. Anti-war discourse at the end of September 2022: “grave” (normalized frequencies - terms used per 100,000 words at three-day intervals)
Figure 9b. Anti-war discourse at the end of September 2022: “stop the war” (normalized frequencies - terms used per 100,000 words at three-day intervals)
Figure 9c. Anti-war discourse at the end of September 2022: “cannon fodder” (normalized frequencies - terms used per 100,000 words at three-day intervals)
The figure shows that the reaction was much more negative on TG and VK than on OK: all the solid lines corresponding to OK frequencies lie at the bottom of the graph. TG and VK users adopted the terms "grave" ("mogilization") and "cannon fodder," and TG became the main platform for the spread of anti-mobilization sentiment. The frequency of phrases associated with two resistance strategies—"fleeing the country" and "draft dodging"—highlights the differences between the platforms. Unlike on OK, for TG and VK users evading mobilization and fleeing the country were desirable responses to mobilization.
Figure 10a. Reaction to September mobilization: “fleeing the country” (normalized frequencies - terms used per 100,000 words at three-day intervals)
Figure 10b. Reaction to September mobilization: “draft dodging” (normalized frequencies - terms used per 100,000 words at three-day intervals)
As can be seen in Figure 10, TG users were actively discussing leaving the country even before mobilization was announced. After September 21, this constant topic gave way to conversations about how to "avoid the draft." VK users followed the same trend, though on a smaller scale, while the massive OK network showed very little interest in the topic, at least publicly. Considering that 51% of Russians have no savings and 72% do not have a valid foreign passport, it is not surprising that on all three platforms phrases related to evading mobilization carried significantly more weight than the topic of emigration.
Putin fans or Kremlin bots?
The space of social networks in Russia today is characterized by relative freedom and diversity. After the ban on Western platforms in the spring of 2022, VK, OK and TG increased their audiences and became the dominant networks. Because there are significant socio-demographic differences between their audiences, we believe that the differences in how users discuss the war and mobilization on these platforms provide insight into the Kremlin's online communication strategy.

Although the significant presence of influential pro-war bloggers ("military correspondents") on TG has led it to be considered the main "hotbed" of pro-war narratives, we show that the main support for the war is concentrated on the Odnoklassniki network. At the same time, our analysis shows that a disproportionate amount of propaganda is aimed precisely at this platform, the one most loyal to the Kremlin. These data provide a clue to the Kremlin's digital propaganda strategy: rather than trying to win over opponents of the war or users without clear political preferences, the main target of astroturfing appears to be a predominantly pro-war audience. Consistent with both classic studies of media influence and modern research on the effects of propaganda in authoritarian Russia, these findings suggest that the basic strategy of online astroturfing may be similar to that of traditional authoritarian propaganda: its main function is to reinforce the beliefs of those who already support the regime rather than to win new supporters.