
Fighting the infodemic: the 4 i Framework for Advancing Communication and Trust



Abstract

Background
The proliferation of false and misleading health claims poses a major threat to public health. This ongoing “infodemic” has prompted numerous organizations to develop tools and approaches to manage the spread of falsehoods and communicate more effectively in an environment of mistrust and misleading information. However, these tools and approaches have not been systematically characterized, limiting their utility. This analysis provides a characterization of the current ecosystem of infodemic management strategies, allowing public health practitioners, communicators, researchers, and policy makers to gain an understanding of the tools at their disposal.


Methods
A multi-pronged search strategy was used to identify tools and approaches for combatting health-related misinformation and disinformation. The search strategy included a scoping review of academic literature; a review of gray literature from organizations involved in public health communications and misinformation/disinformation management; and a review of policies and infodemic management approaches from all U.S. state health departments and select local health departments. A team of annotators labelled the main feature(s) of each tool or approach using an iteratively developed list of tags.


Results
We identified over 350 infodemic management tools and approaches. We introduce the 4 i Framework for Advancing Communication and Trust (4 i FACT), a modified social-ecological model, to characterize four levels of infodemic intervention: information, individual, interpersonal/community, and institutional/structural. Information-level strategies included those designed to amplify factual information, fill information voids, debunk false information, track circulating information, and verify, detect, or rate the credibility of information. Individual-level strategies included those designed to enhance information literacy and prebunking/inoculation tools. Strategies at the interpersonal/community level included resources for public health communicators and community engagement approaches. Institutional and structural approaches included resources for journalists and fact checkers, tools for managing academic/scientific literature, resources for infodemic researchers/research, resources for infodemic managers, social media regulation, and policy/legislation.


Conclusions
The 4 i FACT provides a useful way to characterize the current ecosystem of infodemic management strategies. Recognizing the complex and multifaceted nature of the ongoing infodemic, efforts should be made to utilize and integrate strategies across all four levels of the modified social-ecological model.



Background
In today’s interconnected, digitalized world, it has become increasingly apparent that information about a public health event—particularly false or misleading information—can lead to negative health outcomes. The ongoing COVID-19 pandemic has reinforced this fact, prompting the World Health Organization (WHO) to declare a simultaneous “infodemic” or “overabundance of false or misleading information on COVID-19, which poses a grave threat to response efforts and public health” [1]. In the midst of this infodemic, researchers have uncovered associations between exposure to or belief in COVID-19-related misinformation (false or misleading information that is spread unwittingly by those who do not know it is false) and psychological distress, non-adherence to recommended mitigation measures, reduced intent to get vaccinated, and violence against healthcare workers [2,3,4,5]. Disinformation, or false information that is spread deliberately by those seeking to cause harm, is also a growing concern, as there is evidence that state actors may be using disinformation to fuel pernicious debates about public health issues in the United States (US), particularly vaccination [6].

The rise of social media and digital technologies has undoubtedly contributed to the infodemic. Indeed, researchers have found that on Twitter, false information travels faster and more widely than true information [7]. The mechanisms that underlie the viral spread of false information on social media are complex and contested, but scholars have highlighted the role of platform algorithms and echo chambers (online environments in which individuals only see content that aligns with their pre-existing beliefs), both of which may facilitate selective exposure to (potentially false) information [8,9,10].

Once exposed, cognitive and psychological processes dictate whether an individual will believe false information or reject it. Unfortunately, human cognitive processing is subject to inherent biases that can make individuals vulnerable to misinformation/disinformation [11]. There is some evidence that individuals may be more prone to these biases when presented with information in a so-called “filter bubble” (an algorithmically curated information environment) or echo chamber [12], further underscoring the role of social media in the propagation of false information. Mistrust—of governments, individuals in positions of authority, or institutions—has also been implicated in both belief in and the spread of misinformation and disinformation [13, 14], as individuals with high levels of mistrust are likely to reject official information and seek out alternative explanations (which may take the form of conspiracy theories) [15].

While a large and growing body of research is dedicated to understanding both the mechanisms and impact of misinformation and disinformation, fewer efforts have sought to characterize the full spectrum of misinformation/disinformation management strategies [16]. Two recent analyses, for example, characterized only a subset of existing strategies, focusing mainly on psychological and cognitive interventions [17, 18]. Further, these analyses focused on false information more generally, without a specific focus on strategies for managing health-related misinformation/disinformation. Over the past several years, numerous organizations have developed tools and approaches to manage the spread of falsehoods and communicate more effectively in an environment of misleading health claims. The WHO, for example, ran a crowdsourced technical consultation in 2020 on infodemic management strategies, leading to the development of an infodemic management framework [19]. Other groups have developed more local and community-based approaches, including training trusted community messengers to disseminate accurate information about COVID-19 [20]. Additionally, researchers have crafted innovative interventions designed to refute or confer resistance to health-related misinformation/disinformation [21,22,23]. Taken together, these tools and approaches can serve as a resource for public health practitioners and those working in health communications, research, or policy, who will be faced with health-related misinformation and disinformation for years to come. However, because such approaches have not been systematically characterized, practitioners and policy makers are unlikely to be able to take full advantage of them when crafting their own infodemic management strategies.

The aim of this analysis was to characterize the current ecosystem of infodemic management strategies, allowing public health practitioners, communicators, researchers, and policy makers to gain an in-depth understanding of the tools and approaches at their disposal. Specifically, we sought to accomplish two goals: first, in an exploratory review, we identify existing tools and approaches for infodemic management, and second, through a qualitative content analysis of these tools and approaches, we develop a conceptual framework to characterize points of infodemic intervention. This work was conducted as part of a large multi-stage research project exploring effective public health communication strategies to utilize in an environment of false or misleading information and mistrust.


Methods
Search strategy

The research team utilized a multi-pronged search strategy to identify tools and approaches for combatting health-related misinformation and disinformation. First, a scoping review of academic literature indexed in PubMed, Scopus, and Web of Science was conducted using two sets of keywords: one relating to misinformation and disinformation and another relating to management of or solutions to misinformation and disinformation. The search strategy for each database can be seen in Table 1.

Table 1 Search strategy for scoping literature review

To expedite the scoping review, final search results from all three databases were filtered to exclude non-review articles, yielding a total of 413 reviews. These reviews were uploaded to Covidence, a literature review software program. Duplicates were identified and removed, yielding 313 unique reviews. AES then conducted title and abstract screening followed by full-text review. Reviews were included in the final corpus if they were accessible online, available in English, and discussed interventions or strategies for managing/combatting health-related misinformation and disinformation; reviews not meeting all three criteria were excluded. A total of 43 reviews were included in the final corpus. These reviews were re-read in full by AES, who extracted individual tools, strategies, or approaches for managing/combatting health-related misinformation/disinformation and added them to an Excel file.

Next, all members of the research team (AES, AMJ, NH, and SLP) conducted independent searches of gray literature, including publications, reports, and products accessible through web searches. This search was built around a deductive list of organizations involved in misinformation and disinformation management and health communications, including international and intergovernmental organizations, US-based federal agencies, non-governmental organizations, technology and media companies, non-profits, think tanks, and research centers. To supplement the above searches, the research team scanned all U.S. state health department websites for misinformation and disinformation management practices, policies, and tools. In addition to state health departments, the websites of the following large local health departments were also searched for misinformation and disinformation management tools and strategies: the New York City Department of Health and Mental Hygiene, the San Diego County Health Department, Public Health – Seattle and King County, the Baltimore City Health Department, and the Philadelphia Department of Public Health. Team members also engaged in organic searches to identify additional sources. Tools and approaches were added to the Excel file as they were discovered, with care taken not to add duplicates.

Search terms similar to those used in the scoping literature review were used to search the gray literature and health department websites, though many websites did not have advanced search functions. As such, individual keywords or phrases (e.g., “infodemic management” or “misinformation”) were often used to search for relevant tools and approaches. All searches were conducted between October 2022 and January 2023.

Tools and approaches were included in the Excel file if they were focused on addressing misinformation or disinformation related to a health topic (broadly defined). Tools and approaches were not limited by date of development and included those that emerged prior to COVID-19 as well as those that were in development at the time the searches were conducted. Further, tools were not limited by geography or language, but because our research team is based in the US and works in English, US-based and English-language tools are more prominent in the data.

Qualitative data analysis

The main feature(s) of each tool or approach were labelled in Excel using an iteratively developed list of tags. The initial list of tags was informed by the scoping literature review and developed by AES. This list was refined by the research team through group discussions as new tools and approaches were identified. Tools/approaches were coded with relevant tag(s) by the same researcher who entered the tool/approach into the working Excel file. Each entry could be coded with up to 3 tags. The research team held weekly meetings to discuss any coding questions and to revise the tag list as necessary.
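The tagging and tallying workflow described above (each entry coded with up to three tags from an evolving tag list) can be sketched in code. The snippet below is purely illustrative: the study used Excel, and while the tool names and tags are drawn from this article, their pairings here are simplified examples, not the study's actual coding. It shows how tag counts, and counts per 4 i FACT level, could be derived from such a dataset.

```python
from collections import Counter

# Illustrative entries only: each tool/approach carries up to 3 tags,
# mirroring the coding rule described in the Methods.
entries = [
    {"name": "Dear Pandemic", "tags": ["amplifying factual information"]},
    {"name": "VIRA chatbot", "tags": ["filling information voids"]},
    {"name": "EARS Platform", "tags": ["information tracking"]},
    {"name": "Go Viral!", "tags": ["prebunking/inoculation"]},
    {"name": "CRAAP Test", "tags": ["enhancing information literacy"]},
]

# Enforce the "up to 3 tags per entry" coding rule
assert all(1 <= len(e["tags"]) <= 3 for e in entries)

# Hypothetical mapping of tags to 4 i FACT levels
tag_to_level = {
    "amplifying factual information": "information",
    "filling information voids": "information",
    "information tracking": "information",
    "prebunking/inoculation": "individual",
    "enhancing information literacy": "individual",
}

# Tally how many entries carry each tag, then roll tags up to levels
tag_counts = Counter(tag for e in entries for tag in e["tags"])
level_counts = Counter(tag_to_level[t] for t in tag_counts.elements())

print(tag_counts.most_common())  # most frequently applied tags first
print(level_counts)
```

In practice the same roll-up could be done with an Excel pivot table; the point is only that a flat list of tagged entries is sufficient to produce both the per-tag counts reported in the Results and per-level summaries.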


Results
We identified over 350 tools and approaches for managing health-related misinformation and disinformation. Many of the tools did not distinguish between misinformation and disinformation and were designed to combat false information in general (disinformation turns into misinformation once it is believed and propagated by those who believe it, so it is not always necessary or even possible to distinguish between the two [24]).

To characterize the infodemic management strategies identified in the search, we present the 4 i Framework for Advancing Communication and Trust (4 i FACT). The 4 i FACT, which is based on Bronfenbrenner’s ecological systems theory and the widely used social-ecological model (SEM) [25, 26], consists of four levels (information, individual, interpersonal/community, and institutional/structural), each of which contains a subset of the tags used to label individual strategies. A description of the tags in each level is shown in Fig. 1.

Fig. 1 The 4 i Framework for Advancing Communication and Trust (4 i FACT) with types of tools and approaches

Each level of the 4 i FACT is described below, along with a description of the tags contained in each level and examples of the tools and approaches associated with each tag.


Information level
Tags in the information level were used to label tools or approaches that targeted information itself, including accurate information, false information, or lack of information (i.e., information voids).

Amplifying factual information

We identified 108 tools and approaches designed to disseminate or amplify accurate information or otherwise direct individuals to credible sources of information. These approaches often made use of social media to ensure accurate information reached as many people as possible. During the COVID-19 pandemic, for example, the Baltimore City Health Department launched a series of social media campaigns to ensure Baltimore residents had accurate and up-to-date information about the COVID-19 vaccines. The posts for these campaigns, which were written humorously using vernacular and graphics popular on social media, were designed to “go viral” [27]. Other organizations designed their approaches around social media to combat false information spread on these platforms. For example, “Dear Pandemic” is an ongoing effort to provide social media users with easy-to-understand, factual, and practical information about COVID-19 on Facebook and Instagram [28].

Filling information voids

We identified 50 tools/approaches designed to fill information voids. Some of these tools were chatbots that were programmed to answer common questions. VIRA, for instance, is a chatbot developed by the Johns Hopkins International Vaccine Access Center that uses artificial intelligence (AI) to answer common questions about the COVID-19 vaccines [29]. Other approaches relied on human interaction rather than AI. Several state health departments, for instance, including those in Minnesota [30], Georgia [31], and Illinois [32], ran telephone hotlines during the COVID-19 pandemic to answer residents’ questions. Search engine optimization was also used to fill information voids. The WHO and Google, for example, partnered during the COVID-19 pandemic to create an organized search results panel for anyone searching for information about COVID-19 online [33]. The search results panel directs Google users to credible sources of information like the WHO or CDC, thereby ensuring factual responses to search queries.

Debunking false information

We identified 100 tools/approaches designed to fact check or debunk circulating false information. Many of the tools with this tag were traditional fact-checking websites that provided lists of false claims and accompanying refutations or alternative explanations. Some of these websites were dedicated to either specific topics or specific sources of misinformation or disinformation. The #CoronaVirusFacts Alliance, for example, is a website containing a categorized database of fact-checked rumors about COVID-19 [34]. The EUvsDisinfo Database is a collection of debunked disinformation from pro-Kremlin sources. The database contains debunked claims on a variety of topics, including COVID-19, bioweapons, and other geopolitical issues [35].

Information tracking

We identified 44 tools/approaches designed to track circulating information, including false or misleading information. Many of these were social listening tools, which track conversations on social media and often rely on AI and machine learning (ML). The Early AI-Supported Response with Social Listening (EARS) Platform, for example, is a platform developed by the WHO that uses AI to search for COVID-19-related conversations and posts from major social media platforms, allowing users to gain an understanding of how individuals are talking about COVID-19 online [36]. Some of the tools and approaches with this tag facilitated reporting of misinformation. During the 2014–2016 Ebola outbreak, for example, a group of intergovernmental and academic organizations created DeySay, a rumor-tracking messaging system that allowed community members to report Ebola-related rumors via text message. The rumors reported through this system were used to inform relevant debunking materials, allowing public health communicators to refute misinformation in real time [37].

Verification, credibility, and detection

We identified 32 tools designed to detect false information or evaluate content or source credibility. These tools can be split into two broad categories: those designed to verify or rate sources of information, and those designed to confirm the authenticity of information by detecting manipulation or bot-like activity. An example of a tool that falls into the first category is Media Bias/Fact Check, which is a website that rates the bias and credibility of media sources and directs users to news pieces from the "least biased" sources [38]. An example of a tool in the second category is Botometer, which is an online tool that helps users determine whether specific Twitter accounts are likely to be bots [39]. Most of the verification, credibility, and detection tools were automated and relied on AI/ML.


Individual level
Tags in the individual level were used to identify tools and approaches designed to increase individual-level resiliency to misinformation and disinformation.

Enhancing information literacy

We identified 58 tools/approaches designed to encourage or teach individuals to think critically about the information they consume, thereby reducing their susceptibility to false or misleading claims. Some of these approaches were focused on a single type or form of information, such as scientific or health-related information. The San Diego County Health Department, for example, developed an online resource (a webpage with links to other sites) informing users how to find credible scientific information about COVID-19 as well as how to critically evaluate scientific information about the disease [40]. Other tools and approaches were focused on digital or media literacy. For example, the non-profit New America is currently developing Cyber Citizenship, which is a collection of media and digital literacy resources for educators who are interested in helping their students build resilience to misinformation and disinformation online [41]. Other tools and approaches with this tag were designed to enhance information literacy more broadly. Sarah Blakeslee at California State University, Chico, for example, developed the CRAAP Test, which is a tool that helps individuals evaluate the credibility of a source of information based on its Currency, Relevance, Authority, Accuracy, and Purpose (CRAAP) [42].


Prebunking/inoculation
Prebunking, also referred to as inoculation, is a strategy in which individuals are pre-emptively exposed to anticipated false information or common tactics used in misinformation and disinformation campaigns, making them (theoretically) less susceptible to misinformation and disinformation when they come across it [43, 44]. We identified 11 prebunking tools/approaches in this search, many of which were in gamified formats. For example, Go Viral! is an online game developed by the University of Cambridge, UK Cabinet Office, and the WHO. Players of the game learn how to create viral false content using common manipulation tactics. In doing so, they develop “psychological resistance” against future misinformation and disinformation campaigns [45].


Interpersonal/community level
Tags in the interpersonal/community level were used to label tools and approaches that were focused on communication and relationship or trust building at the interpersonal or community level.

Resources for public health communicators

These resources (of which we identified 62) were designed to enhance the credibility and efficacy of public health communication—particularly in the midst of mistrust and misinformation—and included messaging guidance, sharable materials, and toolkits. Many targeted traditional public health communicators, such as health department employees, physicians, or community health workers. The Public Health Communications Collaborative, for example, compiled a collection of toolkits, talking points, messaging, and graphics to help public health leaders communicate credibly and persuasively about COVID-19, along with other health topics [46]. Other resources targeted non-traditional public health communicators, including parents, teachers, and faith leaders. The Public Health Association of British Columbia, for example, partnered with CANVax to develop The COVID-19 Misinformation Toolkit for Kids (and Parents!) at Home, which is a guide for parents outlining how to discuss COVID-19 vaccines with their children [47]. In addition, in 2021, the Office of the U.S. Surgeon General released A Community Toolkit for Addressing Health Misinformation, which provides guidance to teachers, school administrators, healthcare professionals, community members, and faith leaders on understanding, identifying, discussing, and ultimately combatting health-related misinformation [48].

Community engagement

We identified 25 community engagement approaches. These approaches typically involved efforts to identify and train trusted messengers who could communicate accurate health information or encourage protective health-related behavior among members of their communities. For example, the nonprofit Vaccinate Your Family recently developed SQUAD™, which is a program that provides training and mentorship to individuals who want to become vaccine advocates in their communities [49]. Many of the approaches in this level were aimed at overcoming communication barriers (like lack of trust) among hard-to-reach, marginalized, or vulnerable populations. Live Chair Health, for example, is an organization that trains U.S. barbers in health education in order to close the life expectancy gap and overcome medical mistrust among Black men. Recently, they have been training barbers to discuss COVID-19-related issues with their clients, including vaccination [50]. Some of the community engagement efforts we identified were designed to (re)build trust in the healthcare system at large. The International Vaccine Access Center, for example, partnered with local community leaders and other organizations in Baltimore to counter vaccine myths and encourage members of the African American community to get vaccinated against COVID-19. The overarching goal of the program, however, was to “build trust in both vaccination and the broader health system” [51].


Institutional/structural level
Tags in the institutional/structural level were often applied to tools or approaches that were designed to shift the burden of infodemic management from the “demand” side (i.e., focusing on information consumers) to the “supply” side (i.e., focusing on information purveyors).

Policy or legislation

We identified 33 regulatory or legislative approaches, which can be divided into three categories. The first consisted of efforts to regulate online content or otherwise hold individuals or companies legally responsible for sharing false information online. In the US, for example, some have proposed amending Section 230 of the Communications Decency Act, which protects online platforms from legal action based on the content shared by third parties, to make online platforms liable for using their algorithms to promote the spread of health-related misinformation during a public health emergency [52].

The second category of regulatory and legislative approaches consisted of policies designed to enhance digital or media literacy. The government of Singapore, for example, recently released its Digital Readiness Blueprint, which is a national plan for increasing access to and use of digital technology as well as enhancing digital literacy among citizens. One of the aims outlined in the plan is to “strengthen focus on information and media literacy to build resilience in an era of online falsehoods” [53]. In the US, proposed legislation has included efforts to implement a national strategy for information/media literacy education and the development of a commission to oversee information/media literacy in schools [54].

The final category of policy/legislative approaches consisted of policies related to medical boards or licensure in the US. The Tennessee State Medical Board, for example, instituted a policy in 2021 that allows removal of medical licenses from physicians spreading misinformation about the COVID-19 vaccines [55].

Social media regulation

We identified 13 approaches or policies designed to regulate information on social media platforms. These efforts can be divided into two broad categories: soft content moderation and hard content moderation. Soft content moderation generally consisted of efforts to reduce the visibility or amplification of posts containing false information or efforts to alert individuals that certain posts may contain false content. Hard content moderation involved removal of posts or suspension of accounts propagating false information. Meta is an example of a social media company that has employed both soft and hard content moderation. During the COVID-19 pandemic, for example, Meta introduced a series of policies to combat misinformation and disinformation about the virus on Facebook, including using its algorithm to limit the spread of false information and removing posts or accounts responsible for repeatedly sharing misinformation [56].

The remaining institutional/structural-level tools consisted of capacity building tools for those working in health communications, public health, or infodemic research/management.

Managing academic/scientific literature

We identified 3 tools/approaches designed to help academics or public health professionals keep track of emerging or retracted scientific literature. The COVID Contents (CC) Initiative, for example, was an effort undertaken by the Istituto Superiore di Sanità (ISS) in Italy. During the height of the COVID-19 pandemic, ISS established a working group to sift through peer-reviewed papers and pre-prints on COVID-19. The working group compiled their findings into an open-access weekly report called Covid Contents, the aim of which was to provide health professionals with up-to-date and synthesized information about COVID-19 as it emerged [57].

Resources and standards for journalists/fact checkers

We identified 13 resources or standards for journalists/fact-checkers. Some of these were designed to ensure journalists had access to accurate information and adequate resources when reporting on public health emergencies. In 2020, for example, the International Center for Journalists, together with the International Journalists’ Network, launched the Global Health Crisis Reporting Forum, now called the ICFJ Pamela Howard Forum on Global Crisis Reporting, which provided journalists with information about COVID-19 along with other resources to improve their coverage of the pandemic [58]. Other tools and approaches with this tag aimed to improve or bolster the fact checking industry. The International Fact Checking Network (IFCN) Code of Principles, for example, is an effort to promote fact checking in journalism and establish professional standards and codes of conduct for fact checkers across the globe [59].

Resources for infodemic researchers/research

We identified 9 tools/approaches designed to facilitate research on misinformation/disinformation and infodemiology. The Mercury Project, for example, is an effort to fund research that will help “combat the growing global threat posed by low Covid-19 vaccination rates and public health mis- and disinformation” [60].

Resources for infodemic managers

We identified 38 high-level resources for those managing misinformation and disinformation as public health, community, or industry leaders. Many of the tools and approaches with this tag consisted of frameworks, toolkits, or high-level guides outlining how to combat health-related misinformation/disinformation or infodemics more broadly. The U.S. Cybersecurity and Infrastructure Security Agency (CISA), for example, developed the COVID-19 Disinformation Toolkit, which provides information and guidance to state, local, tribal, and territorial officials on misinformation and disinformation related to COVID-19 [61].


Discussion
While not comprehensive, the tools and approaches identified by the research team provide valuable insight into the current ecosystem of infodemic management strategies, which can be characterized using a modified social-ecological model with four levels. The tools/approaches in each level target important components and determinants of health-related misinformation and disinformation, including information itself, individual resiliency, communication and interpersonal/inter-community relationships and trust, and institutional and structural factors. However, each type of approach has accompanying strengths and weaknesses.

In terms of the information-level approaches, there are some practical considerations that are important to acknowledge. Findings from cognitive and psychological research suggest that human information processing is dictated largely by biases and heuristics [62], particularly in conditions of uncertainty [63]. If, for example, information provided to individuals is contrary to pre-established beliefs, such information (factual or not) may simply be dismissed in favor of alternative explanations, a phenomenon referred to as confirmation bias [64]. This bias not only makes individuals vulnerable to false information (particularly if such information conforms with their pre-existing beliefs), but also likely limits the impact of many of the information-level approaches in the database, including amplifying factual information, filling information voids, verification/credibility/detection, and debunking false information. Indeed, there is evidence that debunking false information is extremely challenging when such information aligns with individuals’ pre-existing beliefs [65]. The scale of false information also presents a practical challenge, as new rumors and claims constantly emerge. While the incorporation of artificial intelligence tools can support information-level interventions at scale, such tools introduce an additional set of complications and challenges associated with accuracy, interpretation, and the need for trained or experienced personnel [66].

In contrast to information-level approaches, individual-level approaches are designed to encourage individuals to think more critically about the information they encounter, thereby helping them overcome some of the cognitive biases and heuristics that make them susceptible to false information in the first place. There is some evidence that such approaches can be effective. Prebunking interventions, for example, have been shown to reduce the likelihood that individuals will be persuaded by false information or share it with others [43,44,45, 67]. In addition, there is evidence that information literacy interventions can change the way individuals think about and evaluate the information they consume [68, 69]. However, for such interventions to have real-world impact, individuals must agree to be inoculated and/or undergo information literacy training. This could prove challenging, especially given the ongoing politicization of public health. Enhancing individual-level resiliency will also need to be a continual process, as increasingly savvy actors and misinformation campaigns continue to adapt and evolve. Finally, enhancing science literacy will not necessarily make individuals more trusting of information provided to them by scientists. On the contrary, improving individuals’ knowledge of the scientific process (and of the inherent uncertainties involved in scientific research) may make them more skeptical of scientific information in general [70].

The communication and community engagement approaches identified in this search touch on one of the most important components of and contributors to misinformation and disinformation: lack of trust. By leveraging trusted community messengers and (re)building trust in the healthcare system, these approaches offer promising ways to overcome barriers to communication and reduce the spread and impact of false information. However, identifying messengers and establishing trust with certain communities—particularly those that have experienced marginalization or oppression—will require ongoing investment and resources. Indeed, scholars argue that community engagement should be thought of as a component of disaster preparedness in addition to response [71]. Moreover, community engagement and communication approaches will need to be tailored to the specific information needs of different communities. Information tracking tools may help identify such needs, as well as what kind of false information is circulating at a given time.

The institutional and structural-level approaches—particularly those relating to social media regulation and policy or legislation—are important given that they allow for a more supply-side approach to combatting misinformation/disinformation. Such approaches may be valuable because, as discussed above, cognitive biases make it difficult to prevent individuals from believing false information or to correct it once it has been seen. However, efforts to regulate the supply of false information may have unintended consequences. For example, there is evidence that flagging false content, a form of social media regulation, may make individuals more likely to believe that unflagged content is true [72]. This phenomenon, referred to as the implied truth effect, could be problematic if unflagged content is actually false. In addition, social media regulation could potentially increase conspiratorial beliefs or claims among those whose social media activity is limited by such regulation (e.g., shutting down accounts may convince individuals that they are being lied to or that there is a conspiracy against them) [73, 74]. The very architecture of social media may also undermine efforts to contain misinformation, as financial incentives to keep users engaged continue to prioritize sensational content over more staid, but factual, claims. Structural approaches that require policy change may also be difficult to enact. In the US, political tensions have colored debates over social media policies. It should also be noted that legislative efforts to contain “misinformation” have been used to justify the arrest and detention of journalists and others around the world. In some instances, the arbiter of “truth” may be a government or administration that is hostile to claims that undermine its legitimacy.

Notwithstanding the challenges and limitations described above, each level of the 4 i FACT contains valuable approaches for managing and mitigating the effects of health-related misinformation and disinformation. Similar levels (information, population, system) have been identified in previous work on the evaluation of emergency risk communication [75], which suggests that social-ecological models offer a useful way to characterize points of intervention or evaluation of information-related processes during public health emergencies. Such models are likely useful because they reflect complex realities. Indeed, information (including false information) does not exist in a vacuum, but in a complex system of individuals, communities, and institutions. The 4 i FACT reflects this reality, offering possible points of intervention at each level of the system. However, given their associated limitations, interventions that target only one level of the system are unlikely to be effective on their own. As such, the most effective strategy for combatting health-related misinformation and disinformation will likely be one that is multi-faceted and stretches across multiple (or ideally all) levels of the 4 i FACT.

This research offers a characterization of infodemic management strategies that public health practitioners, communicators, and policy makers can use to guide current and future approaches. However, this study is subject to some limitations. The dataset developed for this study is not an exhaustive list of all past or existing misinformation and disinformation management strategies. Tools and approaches that were not discussed in academic or gray literature, or that were not featured on U.S. state or selected large local health department websites, were likely missed. Moreover, given that searches were conducted in English and only U.S. health department websites were searched, the database is unlikely to be representative of strategies at the international level. Finally, while efforts were made to ensure tools and approaches were described using the most up-to-date and accurate information available, it is possible that some were misinterpreted or mischaracterized. Recognizing that any effort at a comprehensive list would be outdated as soon as it was compiled, the tagging system developed by the research team focuses on broader approaches that continue to resonate, even as the details of specific tools evolve.


The current ecosystem of infodemic management strategies can be characterized using a modified social-ecological model, the 4 i FACT, with four interconnected, nested levels: information, individual, interpersonal, and institutional. Public health practitioners, communicators, and policy makers can use this model, and the approaches contained within it, to inform current and future efforts to combat health-related misinformation and disinformation, which continue to pose a threat to public health. Given the complexity of the information environment and the fact that approaches at each level have associated strengths and limitations, efforts should be made to integrate strategies across all four levels of the 4 i FACT. No single intervention can adequately address all levels of the infodemic, and any comprehensive approach to infodemic management must consider action across all levels.

Availability of data and materials

The datasets generated and analyzed during the current study are not yet publicly available as the research team is still exploring ways to display the data in a searchable format. In the meantime, data are available from the corresponding author on reasonable request.



Abbreviations

WHO: World Health Organization

US: United States

4 i FACT: 4 i Framework for Advancing Communication and Trust

SEM: Social-ecological model

CDC: Centers for Disease Control and Prevention

NASEM: National Academies of Sciences, Engineering, and Medicine

AI: Artificial intelligence

ML: Machine learning

  1. Coronavirus disease 2019 (COVID-19) Situation Report – 100. World Health Organization; 2020. cited 2023 Apr 9. Report No.: 100. Available from:

  2. Rocha YM, de Moura GA, Desidério GA, de Oliveira CH, Lourenço FD, de Figueiredo Nicolete LD. The impact of fake news on social media and its influence on health during the COVID-19 pandemic: a systematic review. J Public Health (Berl). 2021. Cited 2023 Mar 23.

  3. Loomba S, de Figueiredo A, Piatek SJ, de Graaf K, Larson HJ. Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nat Hum Behav. 2021;5(3):337–48.

  4. Bhatti OA, Rauf H, Aziz N, Martins RS, Khan JA. Violence against Healthcare Workers during the COVID-19 Pandemic: A Review of Incidents from a Lower-Middle-Income Country. Ann Glob Health. 2021;87(1):41.

  5. Roozenbeek J, Schneider CR, Dryhurst S, Kerr J, Freeman ALJ, Recchia G, et al. Susceptibility to misinformation about COVID-19 around the world. R Soc Open Sci. 2020;7(10):201199.

  6. Broniatowski DA, Jamison AM, Qi S, AlKulaib L, Chen T, Benton A, et al. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate. Am J Public Health. 2018;108(10):1378–84.

  7. Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science. 2018;359(6380):1146–51.

  8. Cinelli M, De Francisci MG, Galeazzi A, Quattrociocchi W, Starnini M. The echo chamber effect on social media. Proc Natl Acad Sci. 2021;118(9):e2023301118.

  9. Törnberg P. Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLoS One. 2018;13(9):e0203958.

  10. Pariser E. The filter bubble: how the new personalized web is changing what we read and how we think. London: Penguin; 2011. p. 179.

  11. Pantazi M, Hale S, Klein O. Social and Cognitive Aspects of the Vulnerability to Political Misinformation. Polit Psychol. 2021;42(S1):267–304.

  12. Rhodes SC. Filter Bubbles, Echo Chambers, and Fake News: How Social Media Conditions Individuals to Be Less Critical of Political Misinformation. Polit Commun. 2022;39(1):1–22.

  13. De Freitas L, Basdeo D, Wang HI. Public trust, information sources and vaccine willingness related to the COVID-19 pandemic in Trinidad and Tobago: an online cross-sectional survey. Lancet Regional Health - Americas. 2021;1(3):100051.

  14. Melki J, Tamim H, Hadid D, Makki M, Amine JE, Hitti E. Mitigating infodemics: the relationship between news exposure and trust and belief in COVID-19 fake news and social media spreading. PLoS One. 2021;16(6):e0252830.

  15. Pierre JM. Mistrust and misinformation: a two-component, socio-epistemic model of belief in conspiracy theories. J Soc Polit Psychol. 2020;8(2):617–41.

  16. Sundelson AE, Huhn N, Jamison AM, Pasquino SL, Kirk Sell T. Infodemic Management Approaches Leading up to, During, and Following the COVID-19 Pandemic. Johns Hopkins Center for Health Security; 2023 [Cited 2023 Apr 11]. Available from:

  17. Roozenbeek J, Culloty E, Suiter J. Countering misinformation. Eur Psychol. 2023;28(3):189–205.

  18. Kozyreva A, Lewandowsky S, Hertwig R. Citizens versus the internet: confronting digital challenges with cognitive tools. Psychol Sci Public Interest. 2020;21(3):103–56.

  19. Tangcharoensathien V, Calleja N, Nguyen T, Purnat T, D’Agostino M, Garcia-Saiso S, et al. Framework for Managing the COVID-19 Infodemic: Methods and Results of an Online, Crowdsourced WHO Technical Consultation. J Med Internet Res. 2020;22(6):e19659.

  20. Korin MR, Araya F, Idris MY, Brown H, Claudio L. Community-based organizations as effective partners in the battle against misinformation. Front Public Health. 2022;10:853736.

  21. Featherstone JD, Zhang J. Feeling angry: the effects of vaccine misinformation and refutational messages on negative emotions and vaccination attitude. J Health Commun. 2020;25(9):692–702.

  22. Ophir Y, Romer D, Jamieson PE, Jamieson KH. Counteracting Misleading Protobacco YouTube Videos: The Effects of Text-Based and Narrative Correction Interventions and the Role of Identification. Int J Commun. 2020;14:16.

  23. Piltch-Loeb R, Su M, Hughes B, Testa M, Goldberg B, Braddock K, et al. Testing the efficacy of attitudinal inoculation videos to enhance COVID-19 vaccine acceptance: quasi-experimental intervention trial. JMIR Public Health Surveill. 2022;8(6):e34615.

  24. Appelman N, Dreyer S, Bidare PM, Potthast KC. Truth, intention and harm: Conceptual challenges for disinformation-targeted governance. Internet Policy Review. 2022. Cited 2023 Aug 1; Available from:

  25. Bronfenbrenner U. Ecological systems theory. In: Making human beings human: Bioecological perspectives on human development. Thousand Oaks: Sage Publications Ltd; 2005. p. 106–73.

  26. Golden SD, McLeroy KR, Green LW, Earp JAL, Lieberman LD. Upending the Social Ecological Model to Guide Health Promotion Efforts Toward Policy and Environmental Change. Health Educ Behav. 2015;42(1_suppl):8S-14S.

  27. Instagram. 2021 Cited 2023 Apr 9. B’More City Health on Instagram: “You know where there are microchips? In your phone. You know where there aren’t microchips? In your vaccine. Get….” Available from:

  28. Ritter AZ, Aronowitz S, Leininger L, Jones M, Dowd JB, Albrecht S, et al. Dear pandemic: nurses as key partners in fighting the COVID-19 infodemic. Public Health Nurs. 2021;38(4):603–9.

  29. Weeks R, Sangha P, Cooper L, Sedoc J, White S, Gretz S, et al. Usability and Credibility of a COVID-19 Vaccine Chatbot for Young Adults and Health Workers in the United States: Formative Mixed Methods Study. JMIR Hum Factors. 2023;30(10): e40533.

  30. Minnesota COVID-19 Response. Cited 2023 Apr 9. Contact Us. Available from:

  31. Georgia Department of Public Health. Cited 2023 Apr 9. COVID-19 Hotline. Available from:

  32. Illinois Department of Public Health. Cited 2023 Apr 9. COVID-19 Public Communication. Available from:

  33. World Health Organization. Cited 2023 Apr 9. Reaching digital populations everywhere with trusted information. Available from:

  34. Poynter. Cited 2023 Apr 9. CoronaVirusFacts Alliance. Available from:

  35. EUvsDisinfo. Cited 2023 Jul 31. Disinfo Database. Available from:

  36. McGowan BS. World Health Organization’s Early AI-supported Response with Social Listening Platform (WHO EARS). J Med Libr Assoc. 2022;110(2):273–5.

  37. Spadacini BM. Tracking Rumors to Contain Disease: The Case of DeySay in Liberia’s Ebola Outbreak. USAID Impact Blog. 2016. Cited 2023 Apr 9. Available from:

  38. Media Bias/Fact Check. Cited 2022 Oct 10. Media Bias/Fact Check News. Available from:

  39. Botometer by OSoMe. Cited 2022 Oct 10. Available from:

  40. COVID-19 in San Diego. Cited 2023 Apr 9. Finding Credible Sources of Information. Available from:

  41. OER Commons. Cited 2023 Apr 9. Cyber Citizenship. Available from:

  42. Blakeslee S. The CRAAP Test. LOEX Quarterly. 2004 Oct 1;31(3). Available from:

  43. van der Linden S, Leiserowitz A, Rosenthal S, Maibach E. Inoculating the Public against Misinformation about Climate Change. Global Chall. 2017;1(2):1600008.

  44. Roozenbeek J, van der Linden S, Nygren T. Prebunking interventions based on the psychological theory of “inoculation” can reduce susceptibility to misinformation across cultures. Harvard Kennedy School Misinformation Review. 2020;1(2).

  45. Basol M, Roozenbeek J, Berriche M, Uenal F, McClanahan WP, van der Linden S. Towards psychological herd immunity: cross-cultural evidence for two prebunking interventions against COVID-19 misinformation. Big Data Soc. 2021;8(1):20539517211013868.

  46. Public Health Communications Collaborative. Cited 2023 Apr 9. Messaging guidance and Answers to Tough Questions about public health. Available from:

  47. CANVax. Cited 2023 Apr 9. The COVID-19 Misinformation Toolkit for Kids (and Parents!) at Home. Available from:

  48. A Community Toolkit for Addressing Health Misinformation. Office of the U.S. Surgeon General; 2021. Cited 2023 Apr 9. Available from:

  49. Vaccinate Your Family. Cited 2023 Apr 9. Our Programs. Available from:

  50. Live Chair Health. Cited 2023 Apr 9. About. Available from:

  51. Centers for Disease Control and Prevention. 2021. Cited 2023 Aug 3. Baltimore Faith Community Promoting COVID-19 Vaccine Confidence. Available from:

  52. Sen. Klobuchar A [D-MN]. Text - S.2448 - 117th Congress (2021–2022): Health Misinformation Act of 2021. 2021 Cited 2023 Apr 9. Available from:

  53. Digital Readiness Blueprint. Singapore: Ministry of Communications and Information; Cited 2023 Apr 9. Available from:

  54. Rep. Beyer DS. Text - H.R.6971 - 117th Congress (2021–2022): Educating Against Misinformation and Disinformation Act. 2022 Cited 2023 Apr 9. Available from:

  55. Wadhwani A. Medical board meets to review COVID misinformation policy. Tennessee Lookout. 2021. Cited 2023 Apr 9; Available from:

  56. Facebook Help Centre. Cited 2023 Apr 9. COVID-19 policy updates and protections. Available from:

  57. Bertinato L, Brambilla G, Castro PD, Rosi A, Nisini R, Barbaro A, et al. How can we manage the COVID-19 infodemics? A case study targeted to health workers in Italy: Covid 19 Contents. Annali dell’Istituto Superiore di Sanità. 2021;57(2):121–7.

  58. ICFJ International Center for Journalists. Cited 2023 Apr 9. Covering COVID-19: Resources for Journalists. Available from:

  59. IFCN Code of Principles. Cited 2023 Apr 9. Available from:

  60. Press Release: 2022 Mercury Project Grantee Teams. Social Science Research Council (SSRC). 2022. Cited 2023 Mar 1. Available from:

  61. COVID-19 Disinformation Toolkit. Cybersecurity and Infrastructure Security Agency; 2020. Cited 2023 Apr 9. Available from:

  62. Korteling JE, Brouwer AM, Toet A. A neural network framework for cognitive bias. Front Psychol. 2018;3(9):1561.

  63. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185(4157):1124–31.

  64. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998;2(2):175–220.

  65. Lewandowsky S, Ecker UKH, Seifert CM, Schwarz N, Cook J. Misinformation and its correction: continued Influence and successful debiasing. Psychol Sci Public Interest. 2012;13(3):106–31.

  66. García-Marín D, Elías C, Soengas-Pérez X. Big Data and Disinformation: Algorithm Mapping for Fact Checking and Artificial Intelligence. In: Vázquez-Herrero J, Silva-Rodríguez A, Negreira-Rey MC, Toural-Bran C, López-García X, editors. Total Journalism: Models, Techniques and Challenges. Cham: Springer International Publishing; 2022. Cited 2023 Aug 7. 123–35. (Studies in Big Data).

  67. Iles IA, Gillman AS, Platter HN, Ferrer RA, Klein WMP. Investigating the potential of inoculation messages and self-affirmation in reducing the effects of health misinformation. Sci Commun. 2021;43(6):768–804.

  68. Guess AM, Lerner M, Lyons B, Montgomery JM, Nyhan B, Reifler J, et al. A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proc Natl Acad Sci. 2020;117(27):15536–45.

  69. McGrew S, Smith M, Breakstone J, Ortega T, Wineburg S. Improving university students’ web savvy: an intervention study. Br J Educ Psychol. 2019;89(3):485–500.

  70. Howell EL, Brossard D. (Mis)informed about what? What it means to be a science-literate citizen in a digital world. Proc Natl Acad Sci U S A. 2021;118(15):e1912436117.

  71. Schoch-Spana M, Franco C, Nuzzo JB, Usenza C. Community Engagement: Leadership Tool for Catastrophic Health Events. Biosecur Bioterror. 2007;5(1):8–25.

  72. Pennycook G, Bear A, Collins E, Rand DG. The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Headlines Increases Perceived Accuracy of Headlines Without Warnings. Rochester; 2019. Cited 2022 Dec 3. Available from:

  73. Barrett JS, Yang SY, Muralidharan K, Javes V, Oladuja K, Castelli MS, et al. Considerations for addressing anti-vaccination campaigns: How did we get here and what can we do about it? Clin Transl Sci. 2022;15(6):1380–6.

  74. Helm RK, Nasu H. Regulatory Responses to “Fake News” and freedom of expression: normative and empirical evaluation. Hum Rights Law Rev. 2021;21(2):302–28.

  75. Savoia E, Lin L, Gamhewage GM. A conceptual framework for the evaluation of emergency risk communications. Am J Public Health. 2017;107(S2):S208–14.


Acknowledgements

Not applicable.


Funding

Funding for this research was provided by the U.S. Centers for Disease Control and Prevention (CDC), contract number 75D30122C14281, and the National Academies of Sciences, Engineering, and Medicine (NASEM).

Author information

Authors and Affiliations



Contributions

TKS conceptualized the study and oversaw study procedures. All authors contributed to data collection. AES drafted the manuscript. All authors read, edited, and approved the final manuscript.

Corresponding author

Correspondence to Anne E. Sundelson.

Ethics declarations

Ethics approval and consent to participate

The research is based on publicly available documents; therefore, ethics approval and consent were not sought.

Consent for publication

Not applicable.

Competing interests

None to declare.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Sundelson, A.E., Jamison, A.M., Huhn, N. et al. Fighting the infodemic: the 4 i Framework for Advancing Communication and Trust. BMC Public Health 23, 1662 (2023).


  • Misinformation
  • Disinformation
  • Infodemic
  • Fact check
  • Social media
  • Social-ecological model