TZ riff opening: "Doo, doo, doo, doo... Doo, doo, doo, doo..."
While indirectly funding Western pundits "whose genuinely held beliefs happen to dovetail with the Kremlin’s own narratives" is hardly surprising, tampering with and inciting both sides of the culture war (a good cop/bad cop interplay) -- and choreographing demonstrations to stage homegrown support -- may be less widely appreciated. (I.e., conspiracy fodder on that front usually pins the blame on left- or right-wing billionaires, rather than on a "Dark Axis" nation or Third Rome candidate.)
Paranoia with respect to SF is arguably unwarranted, though. Would either a flesh-and-blood team or fully automated trolls waste their time on fading internet forums -- the oldest online social venue around, apart from Usenet? In this era (compared to the late 1990s and early 2000s), such dinosaurs are overwhelmingly populated by seniors. But OTOH, a voter never retires.
(Why Older Citizens Are More Likely to Vote)
- - - - - - - - - - - - - - -
Russian web brigades
https://en.wikipedia.org/wiki/Russian_web_brigades
INTRO: Russian web brigades, also called Russian trolls, Russian bots, RUbots, olgintsy, Kremlinbots, or Kremlin trolls are state-sponsored anonymous Internet political commentators and trolls linked to the Russian government. Participants report that they are organized into teams and groups of commentators that participate in Russian and international political blogs and Internet forums using sockpuppets, social bots, and large-scale orchestrated trolling and disinformation campaigns to promote pro-Putin and pro-Russian propaganda, with an overtone of opposition to diversity, equity and inclusion, and of support for Russian imperialism.
- - - - - - - - - - - - - - -
The Kremlin’s bots, trolls, and influencers
https://quillette.com/2024/11/24/the-kremlins-bots-trolls-and-influencers-russia-disinformation/
EXCERPTS: Russian bots have promoted content from all sides of the political spectrum. Some bots describe Western society as being in the grips of a totalitarian form of feminism, for example, while others accuse Western feminists of failing their commitment to intersectionality. The Kremlin bots weigh in on vaccine debates, taking only the most hardline positions on either side. They also swelled the follower counts of many pro-Palestinian social media influencers in the wake of the 7 October massacre.
The next layer of Russia’s disinformation campaign consists of troll farms. These tend to be run out of office buildings in Ghana and Nigeria, where Russian agents hire local people to create fake social media accounts in which they masquerade as Western citizens. These anonymous commenters spend their working days sharing pro-Russian propaganda and engaging in online debates—often in the most argumentative, irrational, and inflammatory ways possible.
- - - - - - - - - - - - - - -
Russian disinformation tactics on social media
https://disa.org/russian-disinformation-tactics-on-social-media/
EXCERPTS: The advent of artificial intelligence and automation has supercharged Russia’s disinformation capabilities. Chatbots and AI-powered systems are now deployed to generate and disseminate propaganda content at an unprecedented scale. These sophisticated programs can engage in seemingly natural conversations on platforms like Twitter and Telegram, subtly influencing unsuspecting users.
[...] Beyond online manipulation, Russia also orchestrates real-world events to further its disinformation agenda. Staging fake protests, organizing seemingly grassroots demonstrations, and fabricating incidents provides fodder for propaganda outlets. These staged events, often portrayed as genuine expressions of public sentiment, are then amplified through media channels, creating a distorted picture of reality and fueling narratives of instability and discontent.
- - - - - - - - - - - - - - -
I investigated millions of tweets from the Kremlin’s ‘troll factory’ and discovered classic propaganda techniques reimagined for the social media age
https://theconversation.com/i-inves...es-reimagined-for-the-social-media-age-237712
EXCERPT: But the agency’s tactics went beyond social media provocations, and involved the orchestration of real-world events like protests and rallies, as the US Senate Intelligence Committee has reported. These operations also targeted both sides of the political spectrum with the trolls posing as US political activists to manipulate Americans into organising and promoting events – thus creating a false grassroots campaign which heightened existing societal tensions. My topic analysis revealed the trolls’ focus wasn’t random and prioritised hot-button issues of concern to the US voters, such as the economy, security, and immigration.
- - - - - - - - - - - - - - -
Bots on Russian social media: How network propaganda works
https://re-russia.net/en/expertise/0147/
EXCERPT: A more detailed analysis allows us to distinguish two subgroups within the group of pro-governmental bots — ‘good’ and ‘evil’.
‘Evil’ Kremlebots are the most numerous category; they account for the absolute majority (73%) of all bot comments. In terms of content, they are usually militaristic anti-Westernists, emotionally charged ‘patriots’, and their comments usually have a pronounced negative tone. In contrast, the ‘good’ Kremlinbots are characterised by an almost complete absence of aggression and militarism. They write positive comments about the situation in Russia and also claim that Russia wants peace (which Zelensky is allegedly preventing).
Most likely, most of the ‘good’ Kremlebots are automated (the comments are written for them by a neural network), while there are real people behind the ‘evil’ bots, and their task is to react to ‘wrong’ content and attack the messenger. Thus, the ‘evil’ Kremlebots are the main propaganda agents within the VKontakte network. Their function is not limited to broadcasting the propaganda message, but extends to moderating the discussion, within which they pose as a fierce group of voluntary supporters.