By EUvsDisinfo
Alicia Wanless is the director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace, which aims to foster evidence-based policymaking for the governance of the information environment. In this interview with EUvsDisinfo, she shares her views on what information ecology is and how to approach the challenge of disinformation.
See also the video of the interview.
When we talk about intentional manipulation in the information environment, we usually talk about disinformation. Do you think this term is adequate to describe what we are up against or would you reframe the discussion if you could?
Disinformation is one problem among many in the information environment. It’s also one that is extremely challenging to do anything about.
The information environment is the space where people process information to make sense of the world. To do this, humans develop tools, from alphabets to artificial intelligence, to transform information into artefacts that can be shared, from the spoken word to videos and whatever comes along in the future. The information environment consists of people, the means for processing information, the outputs created with those means, and the interrelationships between these three things, all shaped by surrounding conditions. The information environment is integral to the foundations of democracy, as this is the place where people make decisions. The very legitimacy of democracy is rooted in the ability of people to make free and informed decisions.
The problem with doing anything in the information environment – be it trying to counter disinformation or Foreign Information Manipulation and Interference (FIMI) – is that, when it comes down to it, the most fundamental processing of information happens in our heads. We might be able to see the tools and the outputs of processed information, but at some point, how that processed information is understood comes back to cognition. The very act of intervening in the information environment, even in pursuit of ensuring that people can make free and informed decisions, has the potential to take this agency away from them – or, and this is equally challenging, it can be perceived to be doing so. I think the challenges associated with tackling disinformation highlight this problem.
Disinformation is misleading or false information that is spread intentionally. The intent behind sharing it distinguishes disinformation from misinformation – a distinction that might be academically satisfying but serves little practical purpose when both belief and deliberation occur in a person’s head. From the outside, we might never know whether a person purposely shared information they knew to be untrue, unless they admit to it.
Setting aside intent, there are fundamental issues with the concept of truth that make it a nearly impossible criterion around which to try to manage the information environment. While we might like to think that science can determine truth, the reality is a little murkier.
Truth might be formed through social norms that develop and are accepted by a group over time. Sometimes a truth is built on a preference.
Could you elaborate a bit more on that please?
The truth is that pineapple is delicious on pizza. For those who love it, that is the truth. For those who see it as an aberration, including many Italian friends who regularly send me jokes about my poor taste, the truth is quite different. That’s the funny thing about our relationship with “truth”: our need to have it validated by others compels us to share it. In addition to being information animals, humans are social animals, and we seem to have a deep-seated need for the beliefs we’ve formed from information to be confirmed by others in our society.
Truth is formed by social norms and opinions, but also by processing information, and this act will be influenced by all that came before. For example, for some people the speed at which COVID-19 vaccines were developed and approved was alarming, leading to hesitancy in taking them. For those people, that alarm was based on a comparison with how long approval takes under normal circumstances. Both things can be true: it usually takes years to get a drug approved, and doing so more quickly does not mean it cannot be done safely.
However, the conclusion a person comes to about a new vaccine will be based on their prior knowledge, experience, and trust in the system overseeing the drug approval process. The truth I might develop is that the vaccines are safe, but someone else might not come to the same conclusion. For those who trust science, it might seem possible to have a single ground truth on a topic, but for those who are unfamiliar with science, that argument falls short. And that’s really the crux of the problem. Truth is, for better or worse, personal, and often veers into what we might better call beliefs and opinions.
Truth also rests on an expressed idea, which tends to make it a topic, a specific subject on which a belief has been formed and accepted by the holder of that truth.
On issues of social import, particularly if two truths are perceived to be incompatible with each other, this can lead to information competition, whereby two communities attempt to have their worldview adopted by the greater whole – inherently making the process political. Moreover, if the size of these communities is significant in a society, it is inevitable that politicians will pick up the cause.
Therefore, framing public policy around a concept like disinformation, with all the inherent challenges in determining a person’s truth and intentions, is more like pouring gasoline on a fire than a path to finding a democratic solution to whatever problem the disinformation might be aggravating, particularly in societies that are more heterogeneous and pluralistic. No matter how well-intentioned the efforts to address disinformation might be, the act of doing so will make the people holding the belief feel targeted, especially if they are validated by politicians.
By the time multiple truths about a single topic have taken hold in a society, the very act of trying to bring one side over to the other will be viewed with suspicion, threatening to further erode trust in institutions across belief lines. This means that by the time disinformation becomes a problem, it is likely already too late to do much about it. While disinformation is a problem, if we ever hope to get ahead of it, moving away from a threat-focused approach is the only answer. At some point we must stop focusing on eliminating the things we don’t like and start envisioning what sort of information ecosystem is most conducive to fostering democracy.
Please introduce the concept of information ecology. Should we treat the information environment more like a physical space with natural laws?
Information ecology is an approach to studying the information environment that takes into account the humans, tools, outputs, and the relationships between them. It borrows methods from physical ecology to study information ecosystems, and it connects the various disciplines that research aspects of the information environment to help build an understanding of the whole complex system. There is a lot we can learn from how the physical world is studied.
For example: in the early 1800s, the Prussian naturalist Alexander von Humboldt provided us with the foundation for how we understand climate and ecosystems today. Before his research, European naturalists viewed the environment as something that existed for mankind to exploit without consequences – a space where we could waltz in and take control, a little like how we think the information environment is just waiting to be shaped to our will when needed. Von Humboldt recognised that there were patterns to ecosystems. He started measuring things like plant growth patterns and temperatures and systematising that knowledge to compare across geographies, building up knowledge about the physical world. This led to the science of ecology, the study of the environment and the relations of living things within it.
For decades, climate scientists around the world have been measuring temperatures, water levels, air quality, and more. Taken together, the picture is quite bleak: 97% of actively publishing climate scientists agree that humans are causing climate change. Climate scientists don’t just measure air quality to understand how pollution might be changing the environment; they study a host of factors, and innovations like carbon dating have helped them look far back in time.
If we want to understand the impact of something like disinformation, or what to do about it, we have to take the same kind of systemic approach to studying the information environment that has been taken to the physical environment. We need to develop an approach that connects research from different disciplines while measuring things over time to help chart changes in different factors – we need information ecology. We now know that tracking flora and fauna and measuring temperature and precipitation help us understand ecosystems. If we could do the same in the information environment, it would help us determine the state of an information ecosystem. That, in turn, is important to know before something like a pandemic or the introduction of a new technology happens, so that we can better understand the impact of such developments. It would give us baselines for comparison.
We know that you have a long-standing experience in stakeholder management. What would be your recommendations for multi-stakeholder engagement on the topic of disinformation and information manipulation?
Start with a shared strategic vision: what is it that you are trying to protect in countering information manipulation or disinformation? We focus so much on the threats that I think we often fail to articulate a vision of what we hope to achieve.
In general, we lack a systemic approach in nearly every way we are tackling issues related to the information environment – not just in research, but also in the coordination of various stakeholders. This is bad because it prevents strategy, which requires seeing the system over the longer term. A good strategy is about making the best choices from an array of options within an operating environment. This also means working across stakeholders and mandates. We must move beyond a siloed approach.
There are disconnects between stakeholder types across industry, civil society, and government, and disconnects within single stakeholders. Industry comprises several competing companies, but even internally, each is a collection of individual teams enforcing against different threats – misinformation, fake accounts, misrepresentation, or coordinated inauthentic behaviour, to name a few. These teams know each other, but that doesn’t necessarily mean they work closely together.
Civil society contains a mélange of researchers in NGOs and academia, who come from different disciplines and use different methods and terms, alongside separate initiatives working on data access, data stores, tools, and systematising research – not to mention a host of operational efforts from fact-checking to tracking disinformation. Government is split across departments and agencies, with some teams handling foreign disinformation, others violent extremism, entirely different teams on cyber security, and still others working in strategic communications and media development, to name some. Depending on the set-up, there might also be a regulator who can push other stakeholders to act under existing laws.
Across stakeholder types, let alone within single stakeholders, who knows what the total sum of actors and options for intervention in the information environment is? It’s really hard to be strategic if no one knows the array of options to choose from – and it’s even harder if there is no strategic aim. Given the importance of the information environment to their very legitimacy, democracies must articulate a strategic vision for what they hope to achieve. It isn’t reasonable to hope to eradicate a threat entirely, nor to expect that focusing only on a negative will produce a positive outcome.
This means we have to take a step back and see the system, including the stakeholders and their roles within it. We can’t be strategic until we see the operating environment – in this case, the information environment. This will take coordination across stakeholder types, ideally through an independent mechanism that enables rather than co-opts civil society while facilitating collaboration with more powerful stakeholders like government and industry. That might include drawing on existing frameworks, like those in emergency management, to create shared plans for how a democratic society might choose to respond to a crisis in the context of the information environment.
Has our current information consumption led to a decline in democracy? And why?
That’s hard to answer in the absence of a systemic approach to studying the information environment. Even piecing together data on trends in the different ways people consume information within a single information ecosystem, like Canada or Finland, is extremely challenging. I think we have a tendency to mistake correlation for causation and to look for easy answers for why something like democracy is deteriorating. It’s likely more complex than that, with multiple factors leading to a decline.
Internationally, democracies are backsliding into authoritarianism. It’s easy to blame technology for that, and technology certainly enables tactics from the authoritarian playbook, like disinformation and information manipulation. The problem is that controlling an information ecosystem to stop something like disinformation is also part of the authoritarian playbook. So a slide into authoritarianism doesn’t just happen when governments or politicians pollute the information environment, but also through how they attempt to control their information ecosystem.
Many of the most commonly proposed interventions, like banning bad actors and disinformation from social media platforms, can resemble authoritarian approaches. So is democratic decline caused by technology changing the way we consume information, by politicians deciding that information pollution is a path to office, or by attempts to control an information ecosystem, for better and for worse? It’s likely all of the above, along with a host of other factors, like changing economic conditions or the fallout from the global pandemic. But if we want to get ahead of any of the issues impacting democracy, we need to understand the system in which democracy exists – the information environment. I’m hopeful that the Digital Services Act (DSA) will help in part to get us there, both in terms of transparency reporting and increased data access for researchers, but there is a lot more beyond that which needs to be brought together too.
What are the biggest challenges ahead in the world of foreign information manipulation and interference (FIMI)? And whose responsibility is it to solve them?
The biggest challenge is trust. The elephant in the room when it comes to re-establishing trust in a heavily polluted information environment is the people. We could have the best policy in the world, based on robust evidence, but if the people do not trust it, the effort is all for naught. Yet the discussion around what to do about issues in the information environment is often framed in terms of how to protect people from threats, rather than how to engage them in fostering an information ecosystem that supports democracy.
Citizens, along with all of the stakeholders I mentioned before, have a role in the information environment, and in a democracy all must be engaged in fostering whatever a healthy information ecosystem might be. Maybe that engagement with citizens starts with a wider, whole-of-society exercise that helps identify a strategic vision of what we want to see in the information environment.