Russia is actively using fake comments on Telegram to shape favorable public opinion among residents of the temporarily occupied territories of Ukraine.
This conclusion is based on a joint investigation by OpenMinds and the Digital Forensic Research Lab (DFRLab).
Researchers uncovered a coordinated network of more than 3,600 accounts powered by generative artificial intelligence. These accounts systematically post targeted comments in local Telegram channels – praising Russian governance, discrediting Ukraine, and creating the illusion of widespread public support for the occupation.
One example highlighted by the researchers involves an account that was active for just a single day – May 11, 2024. Within that short span, it posted an astonishing 1,391 comments across 65 Telegram channels and chats. Twenty-nine of these messages appeared in local channels used by residents of Ukraine’s temporarily occupied territories. The bot was active from 9:49 a.m. to 11:47 p.m. Moscow time, averaging one comment approximately every 36 seconds.
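For readers who want to check the pace implied by those figures, a quick back-of-the-envelope calculation (not part of the investigation itself, just arithmetic on the reported numbers) confirms the roughly 36-second interval:

```python
# Sanity check of the reported posting rate: 1,391 comments between
# 9:49 a.m. and 11:47 p.m. Moscow time on May 11, 2024. All values are
# taken from the figures cited in the article.
from datetime import datetime

start = datetime(2024, 5, 11, 9, 49)
end = datetime(2024, 5, 11, 23, 47)
comments = 1391

active_seconds = (end - start).total_seconds()   # 50,280 s, i.e. just under 14 hours
seconds_per_comment = active_seconds / comments  # ~36.1 s

print(f"One comment roughly every {seconds_per_comment:.0f} seconds")
```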
The content of its messages reflected a broad range of themes aligned with common Russian propaganda narratives. These included praise for Vladimir Putin as the world’s strongest leader, portrayals of NATO as manipulating Ukraine, accusations against President Zelensky of harming his own people, claims that Ukrainians promote Nazi slogans, and admiration for Russia’s growing international partnerships, particularly its expanding ties with Africa.
Roughly 400 of the comments were explicitly pro-Russian, while nearly 1,000 carried anti-Ukrainian or anti-Western messages. Only a small number could be considered neutral.
Unlike rudimentary bots that simply copy and paste identical messages, this account demonstrated a more advanced approach. It adapted its comments to the context of each discussion, responded to previous posts, and adjusted its tone and content to reinforce pro-Russian narratives more convincingly and conversationally.
For example, in one discussion, the account posted: “What a blessing to have such a wise and sensible president!” In another, it blamed Western countries for escalating the conflict, claiming that “everything happening in Ukraine is a conflict orchestrated by the West.” Other comments promoted Russia’s global influence, stating that “logistical barriers between Russia and Africa are being removed, which will significantly boost trade this year.”
This single day of activity is far from an isolated case. The investigation uncovered a broader network of 3,634 automated Telegram accounts that posted pro-Russian comments between January 2024 and April 2025.
Over just five months, this network published more than 316,000 comments in channels linked to the occupied territories. More than 3 million additional posts appeared in various Ukrainian and Russian Telegram groups and chats. On average, each bot posted 84 comments per day, with the most active ones generating over 1,000 daily comments.
Beyond the sheer volume of activity, other signs pointed to generative AI behind these accounts: nonsensical usernames, mismatched or generic profile photos, recycled narratives, and the overly formal or awkward phrasing characteristic of large language models.

Researchers believe that the spread of fake comments promoting specific propaganda narratives on social media is part of Russia’s broader strategy to dominate the information space in the occupied territories. Since the early days of the occupation, Russia has forcibly switched local residents to Russian telecommunications providers, cut off access to Ukrainian media, and launched dozens of Telegram channels disguised as local news outlets.
By 2023, the Telegram ecosystem in the occupied territories had expanded to include over 600 channels. These channels operated in coordination with newly established ministries of information, regional media holdings, and traditional news outlets controlled by Russian intelligence services. Their role extended beyond spreading propaganda – they also aimed to simulate normal life and create the illusion of public support for the occupation.
In this restricted media environment, where Telegram often serves as the primary source of news, comment manipulation assumes added significance. The high volume of seemingly “local” messages expressing gratitude toward Russia or disdain for Ukraine helps to construct what researchers call an “artificial consensus” – a manufactured sense of public opinion that can shape individual beliefs by presenting propaganda as widespread, authentic sentiment.
Some of the bot messages were clearly reactive, coordinated with major events. After the Ukrainian offensive in the Kursk region in the summer of 2024, the bots shifted to praising Russian efforts to provide aid and urged residents not to panic. Following the terrorist attack at the Crocus City Hall concert venue near Moscow, the bots defended Russian security services and blamed Ukraine. During Finland’s and Sweden’s accession to NATO, the bots downplayed the threat.
In the occupied territories, similar surges were observed around specific campaigns rather than events: in March 2024, the bots flooded channels with existential fears about a Third World War and accusations that Ukraine was sabotaging peace talks. In May 2024, they launched a coordinated wave of praise for Putin, portraying him as the savior of the “new Russian regions.”
The full report, including in-depth findings and methodological details, is available on the Atlantic Council’s website.