By Joan Donovan and Claire Wardle, for Items
Misinformation can be deadly. The Covid-19 pandemic has demonstrated that fact clearly and tragically. While the big tech companies have taken some steps to rein in misinformation and make credible organizations more visible, they lag far behind the pundits, scammers, and extremists who abuse trust on a global scale. It’s time for companies and researchers to take risks, run bold experiments, and rigorously test ways to slow the speed and spread of deadly deceptions.
In June 2020, the World Health Organization (WHO) began a webinar series on the infodemic, where medical researchers, social scientists, journalists, and health professionals discussed how misinformation impacts their ability to deliver timely, relevant, and local information. There is a broad need for a cross-sector alliance to dampen the spread of health misinformation, but in this critical time, expertise is overwhelmed by the flood of misinformation, rumors, and conspiracies.
During the public webinar, Viroj Tangcharoensathien, a representative from the Ministry of Public Health in Thailand, expressed an urgent need for better coordination between key stakeholders and private platforms to remove medical misinformation. Many of the participants in the webinar understood the dangers presented by health misinformation, but collectively they were unable to implement any course of action without the tech companies’ cooperation. The struggle over who is responsible for the distribution of misinformation continues to thwart public health professionals’ attempts to remedy misinformation and triage the infodemic.
We are in the midst of a perfect storm of misinformation, where the wrong information, or the lack of it, can be deadly. Many governments and health authorities are doing everything in their control, but real-time crisis communication in an age of digital platforms has become a lethal challenge. During the webinar, the WHO reiterated that it is critical for researchers to understand the power of hashtags and social media, yet we should not overburden public health officials by expecting them to come up with creative ways to maneuver around misinformation.
Over the past four years, we have conducted path-breaking misinformation research with the support of expert teams and funding from philanthropists. We hollered as platforms evaded accountability, refused to turn over data as promised, and struggled to protect their reputations under intense public scrutiny. Our teams have tracked all types of media manipulation, including false news, faulty products, scams targeting marginalized communities, conspiracies, hoaxes, bots, and imposters, as they swirled across social media platforms.
We know what “normal misinformation” looks like, but the infodemic is different because of its synchronous global scale; it’s bigger, badder, and potentially deadlier. As scholars, we must organize research for action.
During the course of the pandemic, tech companies have implemented changes to direct attention to authoritative content. Google has added a Covid-19 portal at the top of results for related searches. Twitter has significantly strengthened its policies around taking down any “content that directly goes against guidance from authoritative sources of global and local public health information.” Facebook has launched a Coronavirus Information Center so people can receive locally relevant information. WhatsApp has partnered with the WHO and governments around the world to help people search for relevant information on the messaging platform in multiple languages. Apple has partnered with the Centers for Disease Control and Prevention (CDC) to build an app to help people assess their symptoms. Yet, it’s still not enough.
There will never be enough good information to counterbalance bad information. Just as hospitals face unprecedented pressures on the number of beds, masks, and ventilators, well-meaning information portals and chatbots aren’t enough to adequately tackle the scale and potential harm of unmoderated misinformation during this pandemic. For example, people do not go online every day to post “the earth is round,” but there are people who do go online every day to say, “the earth is flat and here’s 10 reasons why.”
Misinformation can spread like a virus, using networks and contact to travel from person to person through clicks, likes, and shares. Its vectors of attack are often online influencers and political pundits with vast hypersocial networks. Health professionals cannot be expected to communicate their research and advisories over social media at the same volume because they often have small and limited networks. As researchers, the challenge ahead of us requires an answer to this question: What must change so internet users get authoritative, timely, relevant, and local information first and foremost?
The design of social media platforms needs an overhaul so that curated knowledge, not just popular content, reaches information seekers first. Every day, people find information online, either because they search for it or because they stumble across it in their feeds. What they find is critical, and yet the public still has relatively little understanding of, or control over, how information appears in their feeds or why it is selected. Some of the most engaging disinformation efforts tap into people’s deepest fears about their own safety and that of their loved ones. That’s in part why the Covid-19 pandemic features the latest swarm of bad and misinformed actors pushing dangerous narratives.
We are deeply alarmed that journalists, governments, and health authorities don’t have the capability or capacity to provide accurate, timely, engaging content targeted at the right people, at the right time, in the right places. Overstretched and underresourced newsrooms and fact-checkers can’t fill these gaps either. Without significant investment and rapid organization and management of the infodemic, we are going to be in serious trouble as this crisis unfolds, not just in the United States, but in places like India, Brazil, and Nigeria, where misinformation also propagates at alarming rates, aided by political networks.
This is not hyperbole. After three months spent deep in the networks monitoring coronavirus information flows, we’ve seen internet users bombarded with an array of hoaxes, scams, fake cures, and malware. This is the tip of the iceberg, and a growing number of misinformed claims are causing concrete harms. The potential of hydroxychloroquine to treat the virus led to so much hoarding and panic buying that people suffering from lupus and other autoimmune diseases struggled to fill their prescriptions. There is an uptick in anti-Islamic sentiment on social media. At the same time, we’re seeing an increase in violence toward Asian people globally. Scams, whether phishing attacks over email or the hawking of false cures to prevent infection, are growing, as the FTC warns. Like the virus itself, these tactics and techniques are being seen in every country, in every language.
In short, opportunists and malicious actors are taking advantage of the current situation, and technology companies are too far behind to protect users from abuse and deception on this global scale. Beyond health misinformation, there must be an overhaul of platform design, along with a concerted effort to support rapid responses led by independent researchers. The coronavirus pandemic exacerbates and exposes problems already apparent in the design of social media platforms and the internet, problems that misinformation researchers have been studying for a long while.
We need a communication response that matches the threat, which includes optimizing social media and search capabilities for trustworthy results bolstered by expertise. Current research efforts must be integrated into platforms and expanded to handle the exponential rise in misinformation. For example, First Draft is working around the clock to tamp down hoaxes and train journalists remotely, while the Shorenstein Center is conducting research on media manipulation and publishing a fast-tracked edition of the Misinformation Review on Covid-19. Our research indicates that rapid response to misinformation on social media and by news organizations can alleviate the strain on professionals in other sectors, such as journalists and public health advocates, who must handle the overflow of inquiries from people seeking clarification on Covid-19. Coordinating across the fields of journalism, critical internet studies, and public health is a crucial step toward these ends.
Stuck in our homes, we rely on social media more than ever. Unfortunately, even after the virus subsides, misinformation is here to stay. We know there is no easy fix for misinformation because the problem is as much social as it is technical. But now is the time to start experimenting, taking risks, and rigorously testing what might actually prevent or slow down the spread of harmful misinformation.
Now is the time to organize research for action. When we say, “Misinformation is Everybody’s Problem Now,” we mean that misinformation crosses academic disciplines and professional sectors, and tears at the fabric of society in ways that will impact generations to come. Medical misinformation, in particular, can have deadly consequences as publics seek out guidance on what they can do to lower the risk of infection. This is a “whole of society” problem, where creative solutions are necessary across research and professional sectors, so that different forms of expertise can inform new strategies for fighting the infodemic. For example, this should include tech companies not just futzing with algorithms, but also working closely with librarians to ensure high-quality knowledge curation is factored into their products.
As researchers, we have learned from this pandemic that when information seekers search online for medical guidance, society cannot shoulder the burden wrought by rampant medical misinformation, scams, and hoaxes. The true cost of misinformation is not just the financial expense of undoing the damage it causes; it also costs us socially and politically, taxing every one of us in different ways.
Joan Donovan, PhD, is research director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School, where she researches media manipulation and disinformation.
Claire Wardle, PhD, is US director at the nonprofit First Draft, where she researches, leads projects, and trains groups on understanding and tackling misinformation.