By Dr. Rolf Fredheim, Belen Carrasco Rodriguez and John Gallagher, for NATO Stratcom Centre of Excellence
Executive Summary
In the period May–July 2019, bots accounted for 55% of all Russian-language messages on Twitter. This sharp increase in automated activity was largely driven by news bots contributing to information effects around stories published by the Kremlin’s propaganda outlet, Sputnik. On VK, the bot presence also grew and now accounts for one quarter of all users. Bots accounted for 17% of English-language messages.
Three military exercises attracted particular attention from Russian-language bots on Twitter and VK: Spring Storm, Baltic Operations (BALTOPS), and Dragon-19. Twitter activity during July was less than half that observed during May–June.
Having studied robotic activity for almost three years, we see a clear pattern: whenever a military exercise takes place, coverage by hostile pro-Kremlin media is systematically amplified by inauthentic accounts. In this issue of Robotrolling we take a closer look at how manipulation has changed over the period 2017–2019 in response to measures implemented by Twitter. Since 2017, spam bots have given way to news bots (accounts promoting fringe or fake news outlets) and to mention-trolls, which systematically direct messaging in support of pro-Kremlin voices and against the Kremlin’s critics.
We present an innovative case study measuring the impact of political social media manipulation on online conversations. Analysis of Russian Internet Research Agency posts to the platform Reddit shows that manipulation caused a short-term increase in the number of identity attacks by other users, as well as a longer-term increase in the toxicity of conversations.