
Combating Disinformation and Misinformation in the COVID-19 Era

Misinformation, Disinformation & Ineffective Communications Strategies Have Made It Harder to Fight COVID-19

Over the last decade, the world has been plagued with disinformation and misinformation, battling what’s come to be known as “fake news.” In a world where endless information is instantly available, it’s become impossible for social media platforms to keep up with and combat the harmful disinformation and misinformation that exists online. The rise of disinformation and misinformation is largely tied to an increased use of social media over traditional media sites to access and share news.

Disinformation and misinformation become even more dangerous when health and safety are involved, and when health and safety issues become politicized, as they have during the COVID-19 pandemic. According to a study published in Misinformation Review, people who primarily consume news from social media are more likely to believe falsehoods about coronavirus. This has made it harder to distinguish legitimate opinion content from disinformation or misinformation.

Both misinformation and disinformation have been disseminated on social media platforms, but there are key differences between the two. Disinformation is generally considered to be deliberately misleading or biased information, such as the Russian bots that may have been promulgating messages about the pandemic on Twitter, as mentioned in our November blog post. Misinformation, on the other hand, is the spread of false information regardless of intent. The two are often interrelated, such as when a piece of disinformation is unknowingly amplified by social media users (an act of misinformation).

In March, Kang-Xing Jin, Facebook’s head of health, wrote an op-ed discussing this topic, saying “Vaccine conversations are nuanced, so content can’t always be clearly divided into helpful and harmful. It’s hard to draw the line on posts that contain people’s personal experiences with vaccines.”

Unfortunately, social media has provided a dangerous platform for disinformation from the anti-vaccine movement. Lack of direct communication from the government, combined with media companies' limited ability to combat vaccine misinformation for fear of impinging on individuals' right to free speech, has created a perfect storm for anti-vaccination activists and, as a result, an immense threat to public health.

Dr. Peter Hotez, a professor of pediatrics and molecular virology at Baylor College of Medicine and author of “Preventing the Next Pandemic: Vaccine Diplomacy in a Time of Anti-Science”, shared some of his thoughts on vaccine misinformation as a result of the pandemic in a conversation with The New Yorker: “While the White House and Operation Warp Speed did a good job in terms of the scientific rigor and integrity of the clinical trials, they never had a communication plan,” he says. “And that left a big vacuum. So I think those things really combined to work against us in a big way.”

As communicators, we know the impact that a good campaign can have on the mindset of key publics—and we also know the potential harm that can occur in its absence, especially regarding issues of public health.

COVID-19 Disinformation, Misinformation & Impact

The World Health Organization (WHO) was able to foresee misinformation as a problem in the early days of the pandemic and partnered with over 50 digital companies and social media platforms in February 2020 to help circulate vital health and safety information and remove misleading posts to minimize the spread of false information. Additionally, WHO has incorporated social listening into its public health messaging development, reviewing 1.6 million pieces of information weekly on various platforms to categorize information, track trending topics and develop/adapt prompt health messages.
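
The kind of social listening pipeline described above, categorizing incoming posts by topic and tracking which topics are trending, can be illustrated with a minimal sketch. The topic keywords and sample posts below are invented for illustration; they are not WHO's actual taxonomy or tooling.

```python
from collections import Counter

# Hypothetical topic keywords -- illustrative only, not WHO's real taxonomy.
TOPIC_KEYWORDS = {
    "vaccines": {"vaccine", "vaccination", "dose", "shot"},
    "masks": {"mask", "masking", "face covering"},
    "treatment": {"treatment", "cure", "remedy"},
}

def categorize(post: str) -> list[str]:
    """Tag a post with every topic whose keywords it mentions."""
    text = post.lower()
    return [topic for topic, kws in TOPIC_KEYWORDS.items()
            if any(kw in text for kw in kws)]

def trending(posts: list[str]) -> Counter:
    """Count topic mentions across a batch of posts to spot trends."""
    counts = Counter()
    for post in posts:
        counts.update(categorize(post))
    return counts

posts = [
    "Is the vaccine safe for kids?",
    "Do masks actually work?",
    "Got my second dose today!",
]
print(trending(posts))  # vaccines: 2, masks: 1
```

A real system would operate at far larger scale (WHO reviews roughly 1.6 million pieces of information weekly) and would use trained classifiers rather than keyword matching, but the categorize-then-count structure is the same.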

According to WHO, misinformation at the start of the COVID-19 pandemic stemmed from distrust in public health experts, as people debated whether it was a serious disease and whether mask-wearing would protect them.

Despite WHO’s early pushes for media companies and social media platforms to crack down on COVID-19 misinformation, there has not been the same momentum for cracking down on vaccine-related misinformation – and disinformation – until recently.

Anti-Vaccine Misinformation & Impact

There are many varying levels to, and reasons behind, the anti-vaccine movement and vaccine hesitancy. Beyond traditional anti-vaccine rhetoric, such as the emphatically disproven claim that vaccines are correlated with autism, or other conspiracy theories, individuals have concerns that are specific to the COVID-19 vaccine rollout.

According to interviews with a nationally representative sample of over 1,000 people conducted by the Kaiser Family Foundation in February, the most common concern people had about COVID-19 vaccines, cited by 36% of respondents, was side effects, followed by the newness of the vaccines and concerns about effectiveness, safety, access and availability. This aligns with more recent poll data made available in light of the rapid distribution of the vaccine over the last several months. According to a recent CNN poll, seven in ten Americans have either gotten the vaccine or plan to do so—but a quarter of Americans say they will not try to get a shot.

According to a report by the Centre for Countering Digital Hate (CCDH), social media accounts held by anti-vaxxers have grown in followers by over 10 million people since 2019 — and this movement only continues to grow online (it now stands at 59 million followers). Additionally, researchers found that just 12 individuals, known as the “Disinformation Dozen,” produce a majority of the vaccine disinformation on Facebook, Instagram and Twitter. This growth in followership and disinformation from a few sources is cause for concern.

Unfortunately, the COVID-19 pandemic has heightened public fears and given anti-vaccination activists an even greater platform to spread disinformation. Conversations questioning vaccine safety are also being driven by mistrust in the intentions and political/economic motives of the institutions and prominent figures involved in the vaccine rollout, according to research conducted by First Draft on dominant vaccine narratives, disinformation, misinformation and data deficits.

Additionally, there is nuance and complexity to the information ecosystems surrounding vaccine information online. While demand for information on vaccines is high, the supply of credible sources is low, and bad actors exploit that gap, pushing misinformation and disinformation to the forefront.

Significance of Varying Responses to COVID-19 vs. Vaccine Misinformation

In September 2020, Facebook CEO Mark Zuckerberg told Axios that Facebook was not planning to take strong action to condemn anti-vaccination misinformation or disinformation in the ways that it has for the coronavirus pandemic, so as not to infringe on individuals’ right to self-expression. Facebook did not announce a change of stance on the importance of battling vaccine disinformation until February 2021—following announcements from other social media giants including Twitter, TikTok and YouTube at the end of 2020.

In thinking about why there is a distinction between how media companies chose to address and combat disinformation surrounding the COVID-19 pandemic compared with the vaccine rollout, there are a few key considerations. Even with anti-vaccination advertisement bans from companies such as Facebook, or recent further crackdowns to remove groups or individuals spreading disinformation surrounding the vaccine, there are still ethical constraints to limiting speech on the platforms.

While both COVID-19 information and vaccine information are integral to public health, there is a key difference in the rhetoric surrounding the two issues. COVID-19 protocols, such as mask-wearing or social distancing guidelines, while politicized in some ways, still appear to be viewed as essential safety guidance to protect those around you. In contrast, pre-pandemic messaging about vaccines portrayed vaccination as an individual, autonomous choice rather than a public health issue. Because of this difference in rhetoric, it can be harder for media platforms (as well as members of the general public) to differentiate misinformation derived from an individual's personal opinion from disinformation introduced with malicious intent.

Combating Disinformation

The fact is, there are three important and interrelated problems in the misinformation and disinformation landscape, and three potential things we can do about them.

First, there is science literacy and the scientific method itself, which is poorly understood. In the past 18 months, we have witnessed science and clinical research in action, and it hasn’t always been pretty. First masks weren’t needed. Then they were. The SARS-CoV-2 virus spread primarily through respiratory droplets. Then through aerosols. It also spread through surface contact. Then it didn’t.

Scientific information is constantly evolving. It makes communicating about things challenging, because people want definitive answers to questions about what to do to prevent the spread of infectious diseases. Regardless, both consumers and communicators can improve trust by taking a number of steps, like being skeptical and well-educated readers, as we outlined in 2019.

Directly connected to this is the need for credible sources, including our government, to adapt to the information-at-our-fingertips environment and to public expectations. The CDC has recently faced criticism from experts and public health officials who worry that the organization is issuing guidance too slowly. Because some perceive this as indecision and a sign that accurate information is lacking, it has fueled public frustration and skepticism toward the agency's recommendations. There is a lot of room for improvement, and these adjustments are essential to maintaining public trust in regulatory bodies.

Second, the social media platforms that have connected hundreds of millions of users at scale are prone, by virtue of their algorithms, to exploitation by disinformation actors. Whether these are Russian bots, anti-vaccination activists or agents of chaos, there is a set of skills for packaging information so that it spreads widely. Provocative headlines, A/B testing and boosted engagement from affiliated but fake user accounts are tools used by many to maximize the reach of their posts. Disinformation actors are no different.

Related to this are the so-called “filter bubbles” in which our social media accounts exist. At a broader level, these algorithms deliver news based on what has driven a user’s engagement in the past. The result can be a self-reinforcing but isolated version of reality, which limits our exposure to varying points of view and sources.
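
The self-reinforcing loop behind these filter bubbles can be sketched in a few lines. This is a toy model, not any platform's real ranking system: each click raises the user's learned affinity for a topic, and the feed is then re-ranked by that affinity, so whatever the user engaged with last rises to the top.

```python
def rank_feed(posts, preferences):
    """Order posts by the user's learned affinity for their topic."""
    return sorted(posts, key=lambda p: preferences.get(p["topic"], 0),
                  reverse=True)

def click(post, preferences):
    """Engagement feedback: each click boosts affinity for that topic."""
    preferences[post["topic"]] = preferences.get(post["topic"], 0) + 1

posts = [{"topic": "health"}, {"topic": "politics"}, {"topic": "sports"}]
prefs = {}                      # a brand-new user has no preferences yet
click(posts[1], prefs)          # one click on a politics post...
feed = rank_feed(posts, prefs)  # ...and politics now tops the feed
print(feed[0]["topic"])         # politics
```

Real recommender systems are vastly more sophisticated, but the feedback structure (engagement in, more of the same out) is what produces the isolated version of reality described above.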

The third problem is, well… us. Most people are not well equipped to discern disinformation from legitimate information, which makes sense, considering that a few disinformation actors appear to be the source of most widely spread disinformation. The widespread practice of sharing misinformation – that is, passing along false information without malicious intent – is what turns disinformation into such a powerful weapon.

In order to truly crack down on misinformation and disinformation, private digital media companies must algorithmically prioritize credible sources and actively remove the groups, sources and individuals that are spreading falsehoods—beyond performative public statements condemning their actions. Without meaningful change at the platform level, the reward for disinformation actors to exploit these platforms is simply too great.
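
One way to algorithmically prioritize credible sources, sketched here under invented assumptions (the credibility scores below are illustrative, not any platform's real values), is to weight a post's ranking score by the credibility of its source, so that raw virality alone cannot carry an unvetted post to the top of a feed.

```python
# Hypothetical source-credibility multipliers -- illustrative values only.
CREDIBILITY = {"health_agency": 1.0, "news_outlet": 0.8, "anonymous": 0.2}

def score(post):
    """Combine raw engagement with a source-credibility multiplier."""
    return post["engagement"] * CREDIBILITY.get(post["source"], 0.5)

posts = [
    {"source": "anonymous", "engagement": 1000},     # viral but unvetted
    {"source": "health_agency", "engagement": 300},  # modest but credible
]
ranked = sorted(posts, key=score, reverse=True)
print(ranked[0]["source"])  # health_agency: 300*1.0 outranks 1000*0.2
```

The design choice is the point: without some credibility weight, engagement-maximizing ranking rewards exactly the provocative, high-reach posts that disinformation actors are skilled at producing.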

We also need to become better-educated consumers of social media information of all stripes (including misinformation and disinformation). For tips on how you can stop the spread of disinformation, check out this 10-point checklist from the Institute for Public Relations, part of its 2020 IPR Disinformation in Society Report.

Disinformation and misinformation are playing a critical role in the pandemic and the vaccination effort that seeks to end it. By talking openly about these issues and taking tangible steps to safeguard facts, we can prevent the spread of both disinformation and misinformation.
