By Jakub Kalensky, for DisinfoPortal
Too often, discussions about countering disinformation revolve solely around social media platforms. Especially for politicians and other individuals in power, placing the focus and blame on the social media platforms is politically convenient: targeting tech companies and demanding that they devise solutions allows governments and intergovernmental organizations to shoulder less of the blame for the problem of disinformation.
However, the disinformation ecosystem extends far beyond the social media environment, and it is far more sophisticated.
A study published this summer by the INFO OPS Poland Foundation is an important reminder of the massive scale of the disinformation machine. The authors analyze in particular the online messaging that aims to inflame tensions between Poland and Ukraine, one of the leitmotifs of pro-Kremlin disinformation in the country.
The breadth and scope of the disinformation ecosystem in a single country, as described by the Polish authors, underscores the severity of the disinformation threat.
“Processes of manipulating information, which are supposed to negatively influence the mutual perception of Polish and Ukrainian societies, have the character of multidimensional activities. Manipulation activities are carried out in the media, Polish-speaking Russian media, social networks, the blogosphere and other dedicated information areas in a virtual information environment,” say the authors. They highlight the continuity of the information operations, as well as the fact that they are frequently accompanied by manipulation activities in the physical dimension. Such was the case with the arson attack committed in the Ukrainian city of Uzhorod by Polish nationalists who were hired for the job by Manuel Ochsenreiter, an aide to a German MP about whom Russian sources boast: “We are going to have an MP in the Bundestag who is ours, and whom we fully control.”
But even in the purely informational dimension, the scale of the problem is much bigger than the reaction of our societies indicates. In the chapter “Russian Advanced Model for the Distribution of Manipulative Messages,” the authors describe the multifaceted ecosystem used to spread disinformation in Poland.
In this model, the starting point is the Polish version of the Kremlin’s disinformation organization Sputnik. “It has a limited range as a website, but this is misleading. Messages originating from Sputnik are changed and replicated multiple times by other means. Sputnik is the beginning of the process of entering of a message into the infosphere.”
Next in line is the amplification of Sputnik’s messages via a network of websites dedicated to spreading pro-Kremlin disinformation (one of the websites given as an example is AlexJones.pl). From these websites, the disinformation spreads further to the blogosphere.
Only after that phase does social media enter the scene for further multiplication. From there, the propaganda messages spread to discussion forums and other places on the web designed for the exchange of opinions.
With every phase, the disinformation messages become increasingly tailored to a specific audience, the ideal result being that the disinformers are able to fine-tune the messages and channels for a specific person, e.g. a particular decision-maker.
Once this informational basis is laid, influential opinion leaders start spreading the desired message via their social media accounts, making use of all the previously spread informational layers. Their messages can be further amplified by online tools and mechanisms, such as botnets. This final step, according to the authors, is the moment when the carefully organized disinformation operations penetrate the “natural information sphere.” With the desired information now present in the “natural environment,” it is possible to try to influence the targeted recipients with renewed force.
However, this is not the end of the information operations: “One of the tools used by the Russians is checking the response to information stimuli – in this case, information, disinformation or manipulation.” Evaluating the reaction of the information environment allows the information aggressors to improve their audience-tailoring for the next operation.
And as has been seen previously, the disinformers boast a very good track record of “mutating” their information viruses by adjusting the messages and channels for a particular audience. If the environment is one in which sources that are too openly pro-Russian are quickly discredited, the information aggressors maneuver to blur the source of the disinformation. A more recent phenomenon is the increasing “regionalization” of disinformation messages, as demonstrated in a recent study of pro-Kremlin disinformation in Belarus by Andrei Yeliseyeu from the East Center.
Focusing on merely one of the many channels used by disinformation peddlers, even one as important as social media, misses the bigger point and distorts the multifaceted nature of the problem. The United States and Europe will need a much more coherent approach if they are to successfully tackle the full scale of hostile disinformation campaigns.