Rumor Control: a Framework for Countering Vaccine Misinformation

Authors: Matt Masterson, Alex Zaheer, Chase Small, Jack Cable, Jennifer John (Stanford Internet Observatory)

Introduction

There has been a lot of good news recently about the COVID-19 vaccine rollout in the United States. Millions of Americans are getting vaccinated each day, and recent research findings have demonstrated the long-term effectiveness of the vaccines. However, as the United States begins to approach what appears to be a plateau in vaccination rates, we will likely see a transition in the type and pervasiveness of mis- and disinformation about the safety, availability and effectiveness of the COVID-19 vaccines. Those who want the vaccine have largely gotten it, and many of the adults who remain are either hesitant or have difficulty getting access to the vaccine. Trials for children have also begun, with the FDA set to authorize the first vaccine for adolescents imminently. As issues of access are addressed, lingering vaccine hesitancy has the potential to delay the long-awaited end to the pandemic, so understanding persistent vaccine hesitancy is key. Many of the narratives underpinning vaccine hesitancy are predictable – they are themes and messaging borrowed from prior efforts to promote hesitancy about other vaccines, including routine childhood immunizations. The key question: who should address them, and how?

What is Rumor Control?

In the context of countering misinformation, a Rumor Control page is a centralized website offered by a trusted voice sharing facts and information in order to anticipate and respond to emerging narratives. This approach to debunking misinformation draws on literature suggesting that debunking messages coming from rumor control centers can help prevent rumor spread. Psychologists have concluded that messengers that are perceived as having high trustworthiness and expertise are most effective at debunking falsehoods, meaning a debunking approach that aggregates facts from trusted subject matter experts could be ideal. Additionally, in the vaccine context, a study suggests that trustworthy sources of information highlighting expert consensus around known vaccine-related facts and findings can increase general support for vaccines. A Rumor Control site should never be viewed as the only source of truth, but instead, as a distribution channel to drive visitors to additional information about a complicated subject. Operators must recognize that Rumor Control sites should not address all false or misleading narratives: instead, they should address those that are either anticipated to gain widespread traction or have already received such attention on social or traditional media. 

In order to respond to shifting narratives promoting vaccine hesitancy, the operators of a Rumor Control site should identify clear processes for both formulating responses to rumors and consulting subject matter experts. When leveraged effectively, these sites serve as a force multiplier for trusted information sources, enabling other trusted voices to draw on an established Rumor Control site to tailor communications for their own communities. Federal, state and local health officials and healthcare providers have begun a version of this by providing reassurance and correcting misconceptions as vaccine distribution continues, but the effort remains disjointed compared to the highly-networked and coordinated anti-vaccine community. For inspiration on how to better connect the “defenders” here, we look at another recent event that involved a flood of false and misleading information: the 2020 election.

Rumor Control and the 2020 Election

The Rumor Control established during the U.S. 2020 election by the Cybersecurity and Infrastructure Security Agency (CISA) serves as a case study for the potential impact and usage of such a site. In the lead-up to the election, all levels of government and the major social media platforms collaborated to anticipate and debunk election-related narratives. For example, as states made changes to the election process due to COVID-19 restrictions, election officials used both social and traditional media to reach voters directly, explaining when and how to return mail ballots and why election results would take longer to report due to these mail-in ballots.

As local officials launched their individual Rumor Controls across the country, CISA, the lead federal agency for election security, recognized a need to address larger delegitimizing themes at a national level. As narratives or questions emerged, state and local officials used Rumor Control pages as hubs to share facts and official messaging with voters. The need for a national counterpart was highlighted when Iranian actors launched a disinformation campaign that threatened voters with consequences based on their voting preferences. Thus, the CISA Rumor Control page was born, to anticipate and debunk emerging rumors at a national level before they went viral. CISA’s Rumor Control ultimately rose to national attention for championing the role of state and local election officials in securing the election, and today is regarded by election officials as one of the most effective efforts toward countering mis- and disinformation in the 2020 election.

While election-related and vaccine-related misinformation differ in content, efforts to counter one may inform efforts to counter the other, particularly related to synchronization of public messaging. Similar to election communications, public health communication relies on a network of actors at many levels of locality and across various communities. This makes message coordination among all public health communicators elusive, a daunting fact given the pervasiveness and spread of vaccine mis- and disinformation seen to date. In the 2020 election, CISA’s Rumor Control page emboldened state and local election officials to create their own information hubs, often using CISA’s example as a template. Similarly, in the vaccine space, a Rumor Control operator could work to identify pervasive narratives at a local or national level, then collect facts from subject matter experts to debunk these narratives and formalize this process into an operational Rumor Control.

Operating Rumor Control: a Framework

Rumor Control Workflow

Rumor Control efforts should follow pre-established workflows to ensure that only relevant myths are addressed, that proper subject matter experts are consulted in drafting the debunking, and that all relevant communicators amplify the fact after publication so it reaches its intended audience. An initial version of these procedures should be determined before the Rumor Control is created; as public reception of the page is observed and analyzed, communicators should iterate and make adjustments.

Figure 1: Proposed Rumor Control Workflow

Figure 1 shows a potential Rumor Control workflow, which is outlined in more detail below:

  1. Myth Detected: The Rumor Control operator receives reports from community partners such as doctors, vaccine administrators, and community-specific liaisons, or relies on internal detection and monitoring, to determine which narratives are gaining traction in key communities.

  2. Response Threshold Triggered: The Rumor Control team continually monitors the spread of the myth online, regularly evaluating whether or not the thresholds for response have been met.

  3. Fact Drafted and Published: Once the threshold has been met, the Rumor Control operator consults with subject matter experts to determine ground truth and the best way to debunk the observed myth, and publishes the result. 

  4. Fact Amplified: The Rumor Control operator engages with community partners to amplify the accurate information through whichever communication channels best reach the target communities. Given that falsehoods may spread further and faster than facts on social media, this step is critical to ensuring factual information receives as much reach as possible in the appropriate online communities.

  5. Reaction Measured: After the Rumor Control posting is disseminated widely, the operator checks in with community partners and other stakeholders to determine the continued pervasiveness of the debunked myth. While discerning the causal impact of the Rumor Control posting is not easy, this process will yield valuable feedback on the current Rumor Control workflow and identify opportunities for improvement, such as whether thresholds for response should be lowered for future rumors on a certain topic given past spread.
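To make this workflow concrete, the sketch below shows one way an operator might track a single rumor through the five stages as a simple record. This is a minimal illustration in Python under assumed stage labels and field names, not part of any existing Rumor Control tooling.

    from dataclasses import dataclass, field
    from datetime import datetime
    from enum import Enum, auto

    class Stage(Enum):
        # The five workflow stages from Figure 1 (names are illustrative).
        MYTH_DETECTED = auto()
        THRESHOLD_TRIGGERED = auto()
        FACT_PUBLISHED = auto()
        FACT_AMPLIFIED = auto()
        REACTION_MEASURED = auto()

    @dataclass
    class RumorCase:
        # One rumor being tracked by the Rumor Control operator.
        summary: str                        # one-line description of the myth
        reported_by: str                    # e.g. "community liaison" or "internal monitoring"
        stage: Stage = Stage.MYTH_DETECTED
        history: list = field(default_factory=list)

        def advance(self, note: str) -> None:
            # Move the case to the next stage and record when and why.
            stages = list(Stage)
            position = stages.index(self.stage)
            if position + 1 < len(stages):
                self.stage = stages[position + 1]
            self.history.append((datetime.utcnow().isoformat(), self.stage.name, note))

    # Example: a rumor reported by a community partner moves through the workflow.
    case = RumorCase(summary="Vaccines alter DNA", reported_by="community liaison")
    case.advance("Engagement threshold met across two platforms")
    case.advance("Fact drafted with subject matter experts and published")
    case.advance("Shared with community partners for amplification")
    case.advance("Partner check-ins show reduced circulation of the myth")
    print(case.stage.name)  # REACTION_MEASURED

Keeping a simple history like this supports step 5: the record of what triggered the response and how it was amplified is exactly what the operator needs when revisiting thresholds and procedures.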

Strong partnerships with community-specific subject matter experts and liaisons are critical to this workflow. Partners can include state and local government offices, civil society members, NGOs, and individual organizers. Not only should these individuals help in sourcing and assessing the impact of pervasive narratives, but they will also be the core amplifiers of Rumor Control postings to each target audience. Building these partner relationships takes time and should begin as early as possible in the Rumor Control development process.

Effectively Presenting Facts 

An effective Rumor Control page addresses rumors directly and uses language that is accessible to the general public. Each rumor explanation should follow the same format: a factual statement, followed by a single sentence summarizing the rumor, and finally a deeper factual debunking of the rumor. 

  • Start with the facts. Research shows that overall, debunking misinformation does decrease belief in the targeted falsehoods. The most effective debunking strategy is presenting factual information about the topic in question. Draft Rumor Control posts with this in mind, sourcing your facts from trustworthy subject matter experts such as local doctors, hospital associations or health offices.

  • Write in plain, accessible language. Especially in a social media environment, the factual information used to debunk falsehoods should be accessible to the average layperson. Statements should be succinct and visually appealing. Posts to social media promoting Rumor Control should include images and diagrams where available.

  • Link to other trusted sources, not the rumor itself. Avoid linking to instances of the original myth. Instead, link to authoritative sources on the subject that are likely to be widely recognized as independent and trustworthy.

Example Rumor Control Format

Reality: COVID-19 vaccines do not change or interact with your DNA in any way.

Myth: COVID-19 vaccines are designed to change your DNA.

Get the facts: There are currently two types of COVID-19 vaccines that have been authorized and recommended for use in the United States: messenger RNA (mRNA) vaccines and a viral vector vaccine. All COVID-19 vaccines work with the body’s natural defenses to safely develop immunity to disease. The vaccines never enter the nucleus of the cell, which is where our DNA is kept. This means the genetic material in the vaccines cannot affect or interact with our DNA in any way. For more information, see the CDC’s explanation of mRNA vaccines.

Adapted from https://www.cdc.gov/coronavirus/2019-ncov/vaccines/facts.html
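For operators building a site around this format, the sketch below shows one possible way to store and render an entry as a structured record. The RumorControlEntry fields and render method are hypothetical, intended only to illustrate the fact-first ordering recommended above.

    from dataclasses import dataclass

    @dataclass
    class RumorControlEntry:
        # One posting in the "Reality / Myth / Get the facts" format shown above.
        reality: str         # the factual statement, stated first
        myth: str            # a single sentence summarizing the rumor
        facts: str           # the deeper factual debunking
        sources: list        # links to trusted, independent sources, never the rumor itself

        def render(self) -> str:
            # Render the entry as plain text in the recommended order: fact, rumor, debunking.
            return "\n\n".join([
                "Reality: " + self.reality,
                "Myth: " + self.myth,
                "Get the facts: " + self.facts,
                "For more information: " + ", ".join(self.sources),
            ])

    entry = RumorControlEntry(
        reality="COVID-19 vaccines do not change or interact with your DNA in any way.",
        myth="COVID-19 vaccines are designed to change your DNA.",
        facts="The vaccines never enter the nucleus of the cell, which is where our DNA is kept.",
        sources=["https://www.cdc.gov/coronavirus/2019-ncov/vaccines/facts.html"],
    )
    print(entry.render())

Structuring entries this way also makes it easy for partners to repost the same fact-first content across their own channels without reordering or rewording it.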

Recommended Thresholds for Posting to Rumor Control

Operators of a Rumor Control site should only address topics that the average reader has already heard about or is likely to encounter. First Draft, a non-profit organization that works with journalists, academics, and technologists on how to provide accurate information in critical moments, highlights five criteria to consider when deciding whether to report on misinformation. Below, we have placed these thresholds in the context of a Rumor Control website:

Engagement - Rumor Control websites should address misinformation that has received a high level of attention across multiple posts. Operators can search keywords of a rumor on Google, Facebook, or Twitter to assess if it appears in multiple posts with a high cumulative number of engagements. For example, it would not be appropriate to address a false claim made by an individual online that has only been liked a handful of times, as doing so would likely expose people to the rumor who otherwise would never have encountered it. Data & Society have provided guidelines on the danger of amplification when communicating about harmful content.   

Audience - While online anti-vaccine communities and accounts post vaccine misinformation incessantly, Rumor Control sites should only address a rumor found in these groups if it has spread to the general public. Operators should look for instances where an online group that rarely talks about vaccines discusses an anti-vaccine narrative. For instance, when a narrative moves beyond anti-vaccine groups to political forums or parenting groups, this is indicative of more general spread.

Multiple Social Media Platforms - Rumor Control websites should consider whether misinformation has spread across multiple social media platforms. To identify additional content, operators can search different platforms for key terms and URLs associated with the initial post. Content that has spread across multiple major platforms and met the other criteria above has likely gained enough traction to warrant a Rumor Control post.

Influencer/Verified Accounts - Rumor Control websites should address misinformation that has been amplified by verified accounts and other influencers who may have a large and active following that is likely to amplify their message. If the influencer does not generally discuss vaccines, the narrative is likely to pose a greater threat.

Large Media Outlets - Rumor Control websites should address misinformation that has been amplified by large media outlets such as cable news or online newspapers. Even if the media outlets are debunking and countering the misinformation, the rumor will be important to include on the Rumor Control site.

When in doubt, Rumor Control sites should seek the support of mis- and disinformation researchers to assess the pervasiveness of rumors. Research institutions such as those that make up the Virality Project, First Draft, and Project VCTR can be helpful for determining whether a rumor meets the above criteria.
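As a rough illustration of how these criteria might be combined in practice, the sketch below scores a rumor against the five thresholds. The numeric cutoffs and the "any two criteria" rule are placeholder assumptions made for this example; actual thresholds should be set, and revisited, by the Rumor Control operator in consultation with researchers.

    from dataclasses import dataclass

    @dataclass
    class RumorSignals:
        # Observed signals for one rumor; field names and cutoffs are illustrative only.
        total_engagements: int        # cumulative likes/shares/comments across posts
        platforms: set                # social media platforms where the rumor appears
        beyond_antivax_groups: bool   # has it reached general-audience communities?
        verified_amplifiers: int      # verified or influencer accounts sharing it
        large_media_coverage: bool    # covered by cable news or online newspapers?

    def meets_response_threshold(signals: RumorSignals,
                                 min_engagements: int = 10_000,
                                 min_platforms: int = 2) -> bool:
        # Apply the five criteria above; in this sketch, any two satisfied criteria
        # trigger a response. Real cutoffs belong to the operator, not this code.
        criteria = [
            signals.total_engagements >= min_engagements,   # Engagement
            signals.beyond_antivax_groups,                   # Audience
            len(signals.platforms) >= min_platforms,         # Multiple social media platforms
            signals.verified_amplifiers > 0,                 # Influencer/verified accounts
            signals.large_media_coverage,                    # Large media outlets
        ]
        return sum(criteria) >= 2

    # Example: a rumor spreading on two platforms with influencer amplification.
    signals = RumorSignals(total_engagements=25_000,
                           platforms={"facebook", "twitter"},
                           beyond_antivax_groups=True,
                           verified_amplifiers=3,
                           large_media_coverage=False)
    print(meets_response_threshold(signals))  # True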

As vaccination rates in adults plateau and child vaccinations begin, misleading information about vaccines will likely intensify, and so must efforts to counteract it. The Rumor Control model offers an opportunity to use networked influencers – a tactic employed heavily by disinformers – across sectors to amplify facts over rumors. If it is successful in the vaccine space, this could be an important tool in demonstrating that a truly “whole-of-society” response to harmful mis- and disinformation is possible.
