Human factors methods in the design of digital decision support systems for population health: a scoping review

Abstract

Background

While Human Factors (HF) methods have been applied to the design of decision support systems (DSS) to aid clinical decision-making, the role of HF to improve decision-support for population health outcomes is less understood. We sought to comprehensively understand how HF methods have been used in designing digital population health DSS.

Materials and methods

We searched English documents published in health sciences and engineering databases (Medline, Embase, PsycINFO, Scopus, Compendex, Inspec, IEEE Xplore) between January 1990 and September 2023 describing the development, validation or application of HF principles to decision support tools in population health.

Results

We identified 21,581 unique records and included 153 studies for data extraction and synthesis. We included research articles that had a target end-user in population health and that used HF. HF methods were applied throughout the design lifecycle. Users were engaged early in the design lifecycle, in the needs assessment and requirements gathering and the design and prototyping phases, with qualitative methods such as interviews. In later stages of the lifecycle, during user testing and evaluation and post-deployment evaluation, quantitative methods were more frequently used. However, only three studies used an experimental framework or conducted A/B testing.

Conclusions

While HF methods have been applied in a variety of contexts in the design of data-driven DSSs for population health, few studies have used them to their full potential. We offer recommendations for how HF can be leveraged throughout the design lifecycle. Most crucially, system designers should engage with users early on and throughout the design process. Our findings can support stakeholders to further empower public health systems.

Background

Interactive decision aid systems, such as dashboards, are vital digital interfaces that support decision-makers across diverse sectors like healthcare, energy, and finance [1]. In these dynamic and often unpredictable settings, professionals must make swift and accurate decisions under pressure, where the cost of an error can be substantial. Given that human cognitive and perceptual constraints can lead to decision-making errors, these systems aim to minimize errors and enhance user decision-making.

Human Factors (HF) Engineering, along with its subdiscipline of Human Computer Interaction (HCI), represents a field poised at the intersection of human behavior and system design [2]. It is predicated on the principle of tailoring systems to match user capabilities and characteristics, thereby minimizing the mismatch between humans and the tools they use. This alignment aims to reduce cognitive and physical strain, facilitating improved performance and satisfaction. The methodologies encompass understanding user-system interactions, crafting solutions responsive to user needs, and evaluating these solutions against criteria such as decision-making accuracy, task efficiency, mental workload, and user satisfaction [2].

The emergence of HCI as a distinct field in the late 20th century represents an evolution of the HF tradition, with a specific focus on the interfaces between humans and computers [3]. While the field of HF broadly addresses the design of systems with human users, HCI hones in on the complexities of human interactions with computer systems. HCI researchers examine how individuals interact with computers, striving to make these interactions more intuitive, efficient, and pleasant. This includes studying user behavior, developing new interaction techniques, designing user interfaces, and evaluating user experiences. The relationship between HCI and HF is synergistic; while HF provides the overarching principles of user-centered design and system optimization, HCI applies these principles specifically to the design and evaluation of software systems. Throughout the manuscript we refer to HF in a broad sense, thereby encompassing HCI.

The system design process for software systems begins with a needs assessment and design requirements phase, where user, task, environment, and stakeholder analyses are conducted to define functional, non-functional, user, and regulatory requirements. This is followed by design and prototyping, involving conceptual and detailed design, as well as creating low-fidelity and high-fidelity prototypes to visualize and test concepts. Next, testing and evaluation occur through formative and summative evaluations, including usability testing and user acceptance testing to ensure the system meets requirements. Deployment involves implementation, integration, training, and launching the system. Post-deployment evaluation includes monitoring, maintenance, gathering user feedback, and implementing updates and patches based on feedback and issues, as well as planning for new releases and the system’s end-of-life [4]. Frequently, system designers employ agile methods in the software design process, which emphasizes iterative development, frequent collaboration with stakeholders, and adaptability to change throughout the project lifecycle [5].

In the context of Decision Support Systems (DSS), the contribution of HF is significant. These systems often involve complex user interfaces that must present information in a clear and actionable manner. Human-centered methodologies have advanced DSS for healthcare, aiding clinicians in making better diagnostic and therapeutic decisions and promoting patient safety [6, 7]. Yet, challenges remain in the adoption of DSS in clinical environments due to issues rooted in usability and integration into existing workflows, domains where HF provides essential insights [8,9,10,11]. Absent a strong emphasis on human factors principles such as user interface design and interaction paradigms, users may not adopt these systems.

The intersection of HF and HCI is particularly potent in public health, where decisions affect large populations. Public health officials undertake complex tasks that require synthesizing vast arrays of data, and here, the role of HF is to ensure that DSS are not only functionally aligned with these tasks but are also accessible and engaging for the users. As such, DSS designed for public health need to accommodate broader determinants of health, from socioeconomic factors to healthcare services. This scoping review explores the applications of human factors in the design of evidence-based DSS in population health.

Methods

Our scoping review was based on the methodological framework described by Arksey and O’Malley [12], with refinements by Levac and colleagues [13]. We also followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines for protocols and literature searches (PRISMA-P and PRISMA-S, respectively) to facilitate understanding and transparency [14, 15]. Our detailed study protocol was published in BMJ Open in March 2022 [16]; we briefly describe these methods below.

Search strategy

Our search included peer-reviewed literature databases, manual searches, and grey literature. First, we searched 7 interdisciplinary indexed databases: Ovid MEDLINE, EMBASE, Scopus, PsycINFO, Compendex, IEEE Xplore, and Inspec. Our team included a librarian specialising in health science, and we further consulted with an engineering & computer science librarian to ensure both disciplines were captured. Details on our search strategy for each database can be found in the published protocol [16] and Supplemental Material. The MEDLINE search strategy was validated against a key set of 8 articles [17,18,19,20,21,22,23,24] pre-determined by the authors, and was peer reviewed using PRESS [25] by another librarian not associated with this study, to ensure accuracy and comprehensiveness. We then manually searched the reference lists of included articles and relevant reviews. The search was completed in September 2023.

Our grey literature search started with a pilot review of several public health dashboards for infectious disease surveillance, modeling and forecasting, where we identified that the information presented on these websites was insufficient for the HF aspect of this review. Thus, our grey literature search included full-text conference proceedings papers, identified through Compendex (Engineering Village), IEEE Xplore, and Inspec (Engineering Village).

Eligibility criteria

We sought to describe HF applications in the field of population health; thus we excluded clinical applications, such as those discussing patient safety, monitoring of an individual’s health, or clinical DSS. Since HF applications in healthcare began to emerge in the 1990s [26, 27], our search started in 1990 to capture the potential evolution of HF applications in the public health domain. As detailed in our study protocol [16], we included studies published in English since 1990 that described the development, validation, or application guided by HF principles in the field of population health. Exclusion criteria included articles whose end-user was not public health, articles not related to HF, articles that did not describe a digital evidence-based DSS, as well as conference abstracts, reviews (including commentaries and discussion pieces), and articles not written in English.

Screening process

The search results were integrated into Covidence [28], a systematic review management software, and duplicates were removed. Two reviewers independently screened the title and abstract of all articles according to the inclusion and exclusion criteria. Disagreements were resolved through team discussion and included a third independent reviewer as necessary. Using a similar process, selected articles then underwent full text screening by two independent reviewers, resulting in the final studies for inclusion [16].

Data abstraction and synthesis

As outlined in the published protocol [16], a data abstraction form was developed and pilot-tested by two researchers working independently of each other. The abstracted data were synthesized according to three themes: study characteristics, population health characteristics, and human factors characteristics (Table 1). A reviewer used the form to extract data from each article; a second reviewer verified the extraction.

Table 1 Data abstraction themes and items

We computed descriptive statistics for all extracted items, calculating the total number and percent of all studies in a particular category. We also conducted a narrative synthesis of the included studies and the application of HF in population health.
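As a rough illustration, per-category counts and percentages of the kind described above can be computed with a few lines of Python. The records, field names, and values below are hypothetical placeholders, not taken from the review's actual extraction dataset.

```python
from collections import Counter

# Hypothetical extracted records: one dict per included study, with
# categorical items abstracted during data extraction. All values here
# are illustrative only.
studies = [
    {"discipline": "Public Health", "dss_type": "surveillance"},
    {"discipline": "Public Health", "dss_type": "surveillance"},
    {"discipline": "Multidisciplinary", "dss_type": "predictive modeling"},
    {"discipline": "CS/HCI/HF/Informatics", "dss_type": "surveillance"},
]

def describe(records, item):
    """Return {category: (count, percent of all studies)} for one item."""
    counts = Counter(r[item] for r in records)
    total = len(records)
    return {cat: (n, round(100 * n / total)) for cat, n in counts.items()}

print(describe(studies, "discipline"))
# → {'Public Health': (2, 50), 'Multidisciplinary': (1, 25), 'CS/HCI/HF/Informatics': (1, 25)}
```

The same function can be applied to each abstracted item (e.g., DSS type, setting) to produce the count-and-percent summaries reported in the Results tables.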

Results

Our search yielded 21,581 unique studies, of which 153 studies met our inclusion criteria [19, 21,22,23,24, 29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181]. Figure 1 provides a modified PRISMA flow diagram of our screening workflow. Raw data from the extraction process for the 153 included studies can be found in the Supplementary Materials.

Fig. 1: PRISMA flow chart for screening workflow

Study characteristics

Academic discipline of authors and year of publication

The academic disciplines of the authors were diverse, with the majority being from Public Health (56%). Other disciplines included Multidisciplinary teams (23%), which consisted of researchers from both Public Health and Computer Science/Human-Computer Interaction/Informatics fields. Authors from solely Computer Science/Human-Computer Interaction/Human Factors or Informatics (CS/HCI/HF/Informatics) made up 20%, and those from Geographic Information Science/Geographic Science (GIS/Geographic Science) comprised 1%. The distribution of publications over the years showed that 3% were published between 2000 and 2004, 11% between 2005 and 2009, 24% between 2010 and 2014, 29% between 2015 and 2019, and 32% between 2020 and 2023 (Table 2).

Publication type

The types of publications varied, with peer-reviewed journal articles being the most common (76%). Conference proceedings accounted for 18% of the publications, while other types of publications made up 6% (see Table 2).

Publication venue type

Publications were most frequently found in Public Health (65%) venues. Additional publication venues included Informatics (13%), Computer Science/Engineering (12%), Geospatial (5%), Human Factors/Human-Computer Interaction (HF/HCI) (1%), and other disciplines (10%; see Table 2).

Study location

Most studies were conducted in North America (50%). Other study locations included Europe (16%), Africa (11%), Asia (10%), South America (4%), Oceania (4%), and global or multiple locations (4%; see Table 2).

Table 2 Study characteristics

Population health characteristics

Population health topic area

The studies covered a range of population health topic areas. The most frequently addressed topic was infectious disease, representing 35% of the studies. Public health data and indicators were covered in 14% of studies, while maternal, newborn, and child health were the focus of 10%. Non-communicable diseases were addressed in 10% of the studies, and vaccines and drugs were the topic of 6%. Other areas included injury (4%), mental health (3%), nutrition (3%), and substance abuse (2%). Various other topics were covered in 13% of studies (Table 3).

DSS type

Most of the studies utilized health surveillance tools, accounting for 69% of studies. Program evaluation tools and predictive modeling tools were each used in 8% of the studies. Other types of tools were employed in 14% of the studies (see Table 3).

Population health end-user

The end-users of the population health tools and interventions were predominantly multidisciplinary teams, representing 35% of the studies. Program planners were the end-users in 27% of studies, while public health professionals (not otherwise specified) accounted for 12%. Policy makers were the end-users in 8% of the studies, community health workers in 4%, and academia in 3%. Other end-users were identified in 12% of studies (see Table 3).

Population health setting

The settings refer to where the tools are intended to be used. Multiple levels of public health were the most common setting, reported in 25% of the studies. Local public health units were the intended setting in 17% of the studies, and regional public health in 16%. Public health (not otherwise specified) was the setting in 16% of the studies, while federal public health accounted for 12%. Hospitals were the intended setting in 4% of the studies, community health centres in 3%, and other settings in 8% of studies (see Table 3).

Table 3 Population health study characteristics

Human factors characteristics

Researchers primarily engaged with users during the testing and evaluation phase, followed by the post-deployment evaluation phase, the needs assessment and requirements gathering phase, and the design and prototyping phase (Table 4). The majority of studies (n = 96) involved users at only one point in the design lifecycle; 36 studies engaged users in two phases, 17 studies in three phases, and only 4 studies involved users in all four design lifecycle phases. Detailed results for how users were engaged within each phase are presented in the subsequent sections.

User needs assessment and requirements gathering

During the needs assessment and requirements gathering phase, various methods were employed to engage users and gather necessary information. Interviews were the most frequently used method, cited in 26 studies, with an average sample size of 15 participants, although 19% of these studies did not specify the sample size. Meetings, workshops, and discussions were used in 21 studies, with an average of 13 participants, but a significant number of these studies (67%) did not report sample sizes. Focus groups were conducted in 11 studies, averaging 21 participants, with 36% not specifying sample sizes. Questionnaires were used in 6 studies, with a mean sample size of 27, and all of these studies reported their sample sizes. Observations and the Delphi method were each employed in 5 studies. Observations averaged 10 participants, with 20% not reporting sample sizes, while the Delphi method had a notably higher average of 84 participants, with 60% not specifying sample sizes. Less frequently used methods included usability assessments of baseline tools and task analysis (see Table 4).

Design and prototyping

In the design and prototyping phase, several methods were utilized to engage users and gather feedback. The most frequently used method was design-based workshops, reported in 16 studies, with an average sample size of 25 participants. Expert and stakeholder reviews were conducted in 9 studies, averaging 3 participants. Heuristic evaluations were used in 4 studies, with an average of 4 participants. Focus groups and questionnaires were each employed in 3 studies, with focus groups averaging 7 participants and questionnaires 13 participants. Interviews were conducted in 2 studies, with an average sample size of 18. Informal feedback was gathered in 2 studies, with an average sample size of 5 participants. The Delphi method was used in 1 study, but no information on sample size was provided. Overall, many studies using qualitative methods to engage users in design and prototyping did not report their sample sizes (see Table 4).

User testing and evaluation

In the user testing and system evaluation phase, various methods and measures were employed to assess system performance and user experience. User testing was the most frequently used method, appearing in 49 studies with an average sample size of 16 participants. Of the 49 studies that conducted user testing, 1 study [69] used an experimental framework, and 11 collected quantitative data [21, 47, 48, 65, 69, 117, 132, 138, 146, 162, 177], including: task completion time (8 studies; [21, 47, 48, 65, 117, 138, 146, 177]), task success/accuracy (6 studies; [21, 48, 65, 69, 132, 162]), efficiency (1 study; [69]), and the number of clicks (1 study; [146]). Questionnaires were utilized in 43 studies, with an average sample size of 22 participants, while interviews were conducted in 21 studies, averaging 14 participants. Informal feedback was gathered in 17 studies, with an average sample size of 8 participants, and focus groups were used in 12 studies, with an average of 13 participants. Log data were analyzed in 3 studies, and an experiment was conducted in 1 study [69], with a sample size of 33 participants. The Delphi method was used in 2 studies, with an average sample size of 15 participants. Notably, as with the qualitative methods applied in the design and prototyping phase, many studies using these methods, particularly informal feedback sessions, neglected to specify their sample sizes (see Table 4).

Post-deployment evaluation

In the post-deployment assessment and evaluation phase, various methods were employed to gather feedback and assess system performance after it was deployed for use by end-users. Questionnaires were the most frequently used method, reported in 33 studies with an average sample size of 71 participants. Interviews were conducted in 28 studies, averaging 44 participants, and focus groups were used in 9 studies with an average sample size of 22 participants. User testing was employed in 7 studies, with an average sample size of 11. Quantitative metrics were used in 4 of the 7 studies that conducted user testing and included task success/accuracy (3 studies; [23, 149, 152]), the number of clicks (1 study; [146]), and task completion time (1 study; [146]). Additional methods included log data analysis (5 studies), informal feedback (4 studies), and observations (3 studies, with an average sample size of 15 participants). App issue reporting was used in 2 studies, and experiments or A/B testing in another 2 studies, with the latter having an average sample size of 105 participants. Heuristic evaluations were used in 1 study, with a sample size of 4 participants. Notably, many studies, particularly those relying on qualitative methods such as informal feedback, again did not specify their sample sizes (see Table 4).

Table 4 HF study characteristics. Note that orange bars reflect quantitative methods while green bars represent qualitative methods

Discussion

Have HF methods been used to their full potential?

Over the past 20 years, HF methods have been increasingly applied throughout the design lifecycle of DSS for public health contexts. A variety of qualitative and quantitative methods were used: qualitative methods were used more frequently during the needs assessment and the design and prototyping phases, while quantitative methods were more frequently used in the two evaluation phases, user testing and evaluation and post-deployment evaluation. Indeed, qualitative methods, such as interviews and observations, provide deep, contextual insights into user needs and behaviors, ensuring a user-centered design process. They allow for flexibility and iteration, uncovering unmet needs and fostering empathy, which leads to more inclusive and effective solutions. These methods help designers create systems that truly resonate with and benefit users, which is why they are advantageous in the early phases of the design lifecycle. On the other hand, quantitative methods provide objective, measurable data that allow for statistical analysis and benchmarking, ensuring rigorous evaluation during user testing and post-deployment phases. These methods offer precision, reproducibility, and the ability to identify trends, enabling data-driven decisions and continuous improvement of system performance.

However, our findings indicate that researchers have not been using quantitative human factors methods to their full potential in the two evaluation phases. Importantly, most user testing and evaluation approaches did not collect direct measures of performance with the system. Additionally, only 3 studies [69, 121, 127] employed A/B testing or experimental methods to compare new or current tools with alternatives in public health contexts. Furthermore, no study evaluated whether these DSS help public health professionals make better decisions. As such, despite following some best practices in engaging users in the system design process, there is little evidence for the efficacy of these tools in supporting users in decision-making tasks. Furthermore, a large proportion of studies did not report their sample size, particularly for qualitative methods. Those that reported sample sizes for qualitative studies generally followed best practices (e.g., 6–8 participants per focus group) [182]. Most studies reported sample sizes for quantitative methods, which followed best practices by using larger sample sizes than qualitative methods (e.g., 20+ per questionnaire).

Human factors vs. agile software development

In the field of HF, researchers have thoroughly and rigorously assessed system design in the context of safety-critical systems such as those encountered in the aviation, surface transportation, military, and nuclear domains. However, as demonstrated in this study, this approach is lacking in the design of DSS in public health. This may in part be attributed to several constraints, such as time and resources. For instance, funding opportunities are more limited for public health DSS than in other domains such as military DSS. In turn, this limits the number of public health staff available to develop and systematically evaluate these systems. Against these constraints, agile approaches to development afford user engagement and feedback throughout the design lifecycle; however, they may fall short in providing robust evidence for the efficacy of DSS. Indeed, most studies identified in this review were from researchers in the public health domain. Multidisciplinary teams may open up additional funding opportunities in addition to fostering synergy between public health domain expertise and engineering technical skills.

When should we conduct HF experiments?

In systems design, it is best practice to engage with users throughout the design lifecycle. Encouragingly, we have seen an increase in the engagement of users in the design of DSS for public health. While it may not always be feasible to conduct A/B testing or experimentation, especially under time and funding constraints, some circumstances may warrant a more thorough approach. For example, more rigorous testing may be beneficial in the context of DSS intended to support high-stakes decision-making processes. Additionally, introducing novel technologies, such as artificial intelligence (AI) and machine learning (ML), in public health necessitates thorough testing to validate their efficacy. AI and ML models can potentially enhance the speed and accuracy of epidemiological insights, enabling quicker decision-making during time-critical events like the COVID-19 pandemic [183]. However, HF challenges such as the “black box” that characterises many AI/ML tools can hinder the ability of epidemiologists to explain results and decision-makers to take confident action.
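Where an A/B comparison of two DSS variants is feasible, even a lightweight analysis can yield direct evidence of efficacy. The sketch below, a minimal illustration rather than a prescribed method, compares task-success rates between two hypothetical dashboard variants with a two-proportion z-test; the function name and counts are invented for the example.

```python
from math import erf, sqrt

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test comparing task-success rates of two variants."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of equal success rates.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical A/B test: 40 of 50 users complete the task with the new
# dashboard (A) versus 28 of 50 with the current one (B).
z, p = two_proportion_z_test(40, 50, 28, 50)
print(f"z = {z:.2f}, p = {p:.3f}")
```

In practice such a test would be one component of a fuller experimental design, alongside power analysis, counterbalancing, and complementary measures such as task completion time and mental workload.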

Strengths and limitations

Our scoping review has numerous strengths. Since it was designed to capture studies in both engineering and public health over the last twenty years, it has considerable breadth and comprehensiveness. Importantly, the long review period allowed us to track changes in this area over time. Our search strategy was reviewed by two librarians, from the public health sciences and engineering domains, which also improves the rigour of our search and addresses the challenges posed by differing nomenclature in this interdisciplinary research. We were also able to ensure that each record was reviewed by both a team member from HF engineering and one from public health, with the ability to discuss potential conflicts with a third member of the study team. This approach reduces the likelihood of false positives or negatives in terms of the studies deemed to meet inclusion criteria. Finally, our study protocol was previously peer-reviewed and published [16], and we did not deviate from it.

Our review also has some important limitations. We were only able to include studies published in English and thus we may be under-capturing studies from the Global South. Our review also does not include a full appraisal of methodological quality or risk of bias as such checklists do not exist for the study of HF in health. As DSS continue to be developed for use in clinical medicine and population health, the development of a checklist to guide rigorous Human Factors evaluations may represent a fruitful area of future work, especially by groups such as the EQUATOR network [184]. While it was difficult to summarize all potentially relevant details of our included studies due to space restrictions, we aimed to cover the most salient details for stakeholders in this space. We also present our full table of the 153 studies that met our inclusion criteria in the Supplementary Materials.

Conclusion

While we identified many studies that applied HF methods to design decision support tools for population health, few leveraged HF methods to their full potential. We offered several recommendations for how HF methods can be leveraged at different points within the design lifecycle. The key is to engage with users early on and throughout the design process rather than simply bringing in end-users for usability testing. In terms of testing, there is a need to consider additional metrics beyond usability and tool utility. This includes measuring task performance, mental workload, situation awareness, and, ultimately, the quality of decisions made. Furthermore, there is a greater need for more rigorous evaluations, to generate the level of evidence needed to determine if and how DSS improve public health decision-making. Overall, HF methods have great potential for enhancing the impact of dashboards and data-based decision support tools and efforts are needed to adopt best practices in design and evaluation.

Data availability

All data generated or analyzed during this study are included in this published article and its supplementary information.

Abbreviations

AI: Artificial Intelligence
CS: Computer Science
DSS: Decision Support Systems
HCI: Human Computer Interaction
HF: Human Factors
ML: Machine Learning

References

  1. Bonczek RH, Holsapple CW, Whinston AB. Foundations of decision support systems. Academic; 2014.

  2. Lee JD, Wickens CD, Liu Y, Boyle LN. Designing for people: an introduction to human factors engineering. CreateSpace; 2017.

  3. Harrison S, Tatar D, Sengers P. The three paradigms of HCI. Alt Chi Session at the SIGCHI Conference on human factors in computing systems San Jose, California, USA. 2007. pp. 1–18.

  4. Meister D. Human factors testing and evaluation. Elsevier; 2014.

  5. Salah D, Paige RF, Cairns P. A systematic literature review for agile development processes and user centred design integration. Proceedings of the 18th international conference on evaluation and assessment in software engineering. 2014. pp. 1–10.

  6. Salwei ME, Carayon P, Hoonakker PLT, Hundt AS, Wiegmann D, Pulia M, et al. Workflow integration analysis of a human factors-based clinical decision support in the emergency department. Appl Ergon. 2021;97:103498.


  7. Carayon P, Hoonakker P, Hundt AS, Salwei M, Wiegmann D, Brown RL, et al. Application of human factors to improve usability of clinical decision support for diagnostic decision-making: a scenario-based simulation study. BMJ Qual Saf. 2020;29:329–40.


  8. Karsh B-T. Clinical practice improvement and redesign: how change in workflow can be supported by clinical decision support. Volume 200943. Rockville, MD: Agency for Healthcare Research and Quality; 2009.


  9. Bates DW, Kuperman GJ, Wang S, Gandhi T, Kittler A, Volk L, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inf Assoc. 2003;10:523–30.


  10. Sittig DF, Belmont E, Singh H. Improving the safety of health information technology requires shared responsibility: It is time we all step up. Healthcare [Internet]. 2018;6:7–12. http://www.journals.elsevier.com/healthcare-the-journal-of-delivery-science-and-innovation

  11. Kilsdonk E, Peute LW, Jaspers MWM. Factors influencing implementation success of guideline-based clinical decision support systems: a systematic review and gaps analysis. Int J Med Inf. 2017;98:56–64.


  12. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8:19–32.


  13. Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:1–9.


  14. Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4:1–9.


  15. Rethlefsen ML, Kirtley S, Waffenschmidt S, Ayala AP, Moher D, Page MJ, et al. PRISMA-S: an extension to the PRISMA statement for reporting literature searches in systematic reviews. Syst Rev. 2021;10:1–19.

  16. Vasquez HM, Pianarosa E, Sirbu R, Diemert LM, Cunningham HV, Donmez B, et al. Human factors applications in the design of decision support systems for population health: a scoping review. BMJ Open. 2022;12:e054330. https://bmjopen.bmj.com/content/12/4/e054330

  17. Revere D, Dixon BE, Hills R, Williams JL, Grannis SJ. Leveraging health information exchange to improve population health reporting processes: lessons in using a collaborative-participatory design process. EGEMS (Wash DC). 2014;2:1082.

  18. Pike I, Smith J, Al-Hajj S, Fuselli P, Macpherson A. The Canadian atlas of child and youth injury: mobilizing injury surveillance data to launch a national knowledge translation tool. Int J Environ Res Public Health. 2017;14:982.

  19. de Lima TFM, Lana RM, Carneiro TGS, Codeço CT, Machado GS, Ferreira LS et al. Dengueme: a tool for the modeling and simulation of dengue spatiotemporal dynamics. Int J Environ Res Public Health. 2016;13.

  20. Ola O, Sedig K. Beyond simple charts: design of visualizations for big health data. Online J Public Health Inf. 2016;8.

  21. Yuan M, Powell G, Lavigne M, Okhmatovskaia A, Buckeridge DL. Initial usability evaluation of a knowledge-based population health information system: the Population Health Record (PopHR). AMIA Annu Symp Proc. 2017;2017:1878–84.

  22. Al-Hajj S, Pike I, Riecke BE, Fisher B. Visual analytics for public health: Supporting knowledge construction and decision-making. Proceedings of the Annual Hawaii International Conference on System Sciences. 2013. pp. 2416–23.

  23. Scotch M, Parmanto B, Monaco V. Usability evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT). J Usability Stud. 2007;2:76–95.

  24. Harris JK, Hinyard L, Beatty K, Hawkins JB, Brownstein JS, Nsoesie EO, et al. Evaluating the implementation of a twitter-based foodborne illness reporting tool in the city of St. Louis department of health. Int J Environ Res Public Health. 2018;15:833.

  25. McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.

  26. Leape LL. Human factors meets health care: the ultimate challenge. Ergon Des. 2004;12:6–12.

  27. Cafazzo JA, St-Cyr O. From discovery to design: the evolution of human factors in healthcare. Healthc Q. 2012;15:24–9.

  28. Veritas Health Innovation. Covidence systematic review software. Veritas Health Innovation.

  29. Accorsi P, Lalande N, Fabrègue M, Braud A, Poncelet P, Sallaberry A et al. HydroQual: Visual analysis of river water quality. 2014 IEEE Conference on Visual Analytics Science and Technology (VAST). IEEE; 2014. pp. 123–32.

  30. Adini B, Verbeek L, Trapp S, Schilling S, Sasse J, Pientka K, et al. Continued vigilance: development of an online evaluation tool for assessing preparedness of medical facilities for biological events. Front Public Health. 2014;2:35.

  31. Al-Hajj S, Fisher B, Smith J, Pike I. Collaborative visual analytics: a health analytics approach to injury prevention. Int J Environ Res Public Health. 2017;14.

  32. Ali H, Waruru A, Zielinski-Gutierrez E, Kim AA, Swaminathan M, De Cock KM, et al. Evaluation of an HIV-Related Mortuary Surveillance System - Nairobi, Kenya, two sites, 2015. MMWR Surveill Summ. 2018;67:1–12.

  33. Anderson B, Coulter S, Orlowsky R, Ruzich B, Smedley R, Purvis M, et al. Designing user experiences for policymakers in serious games in the domain of global food security. 2017 Systems and Information Engineering Design Symposium (SIEDS), 28 April 2017. IEEE; 2017. pp. 89–94.

  34. Andersson SR, Hassanen S, Momanyi AM, Onyango DK, Lutukai MN, Chandani YK, et al. Using human-centered design to Adapt Supply chains and Digital Solutions for Community Health Volunteers in nomadic communities of Northern Kenya. Glob Health Sci Pract. 2021;9:S151–67.

  35. Anema A, Druyts E, Hollmeyer HG, Hardiman MC, Wilson K. Descriptive review and evaluation of the functioning of the International Health regulations (IHR) annex 2. Global Health. 2012;8:1.

  36. Azofeifa A, Yeung LF, Duke CW, Gilboa SM, Correa A. Evaluation of an active surveillance system for stillbirths in metropolitan Atlanta. J Registry Manag. 2012;39:13–36.

  37. Bhowmick T, Robinson AC, Gruver A, MacEachren AM, Lengerich EJ. Distributed usability evaluation of the Pennsylvania Cancer Atlas. Int J Health Geogr. 2008;7:36.

  38. Bollaerts K, De Smedt T, Donegan K, Titievsky L, Bauchau V. Benefit-risk monitoring of vaccines using an interactive dashboard: a methodological proposal from the ADVANCE project. Drug Saf. 2018;41:775–86.

  39. Boonchieng W, Tuanrat W, Aungwattana S, Tamdee D, Budda D. Development of a Community-based Geographic Health Information System via Mobile Phone in Saraphi District. J Computers (Taiwan). 2019;30:84–92.

  40. Borges HL, Malucelli A, Paraiso EC, Moro CC. A physiotherapy EHR specification based on a user-centered approach in the context of public health. AMIA Annu Symp Proc. 2007;61–5.

  41. Brownson RC, Kemner AL, Brennan LK. Applying a mixed-methods evaluation to healthy kids, Healthy communities. J Public Health Manag Pract. 2015;21:S16–26.

  42. Butchart A, Peden M, Matzopoulos R, Phillips R, Burrows S, Bhagwandin N, et al. The South African National Non-natural Mortality Surveillance System–rationale, pilot results and evaluation. S Afr Med J. 2001;91:408–17.

  43. Carr ECJ, Babione JN, Marshall D. Translating research into practice through user-centered design: an application for osteoarthritis healthcare planning. Int J Med Inf. 2017;104:31–7.

  44. Cesario M, Jervis M, Luz S, Masoodian M, Rogers B. Time-based geographical mapping of communicable diseases. 2012 16th International Conference on Information Visualisation (IV), July 11–13, 2012. IEEE; 2012. pp. 118–23.

  45. Chirambo GB, Muula AS, Thompson M, Hardy VE, Heavin C, Connor YO et al. End-user perspectives of two mHealth decision support tools: Electronic Community Case Management in Northern Malawi. Int J Med Inf. 2021;145.

  46. Cinnamon J, Rinner C, Cusimano MD, Marshall S, Bakele T, Hernandez T, et al. Evaluating web-based static, animated and interactive maps for injury prevention. Geospat Health. 2009;4:3–16.

  47. Cleland B, Wallace J, Bond R, Muuraiskangas S, Pajula J, Epelde G, et al. Usability evaluation of a co-created big data analytics platform for health policy-making. Human Interface and the Management of Information (HIMI 2019), held as part of the 21st HCI International Conference (HCII 2019), 26–31 July 2019. Springer International Publishing; 2019. pp. 194–207. https://doi.org/10.1007/978-3-030-22660-2_13

  48. Concannon D, Herbst K, Manley E. Developing a data dashboard framework for population health surveillance: widening access to clinical trial findings. JMIR Form Res. 2019;3:e11342. https://doi.org/10.2196/11342

  49. Cox R, Sanchez J, Revie CW. Multi-criteria Decision Analysis Tools for prioritising emerging or re-emerging infectious diseases Associated with Climate Change in Canada. PLoS ONE. 2013;8:e68338.

  50. Cretikos M, Telfer B, McAnulty J. Evaluation of the system of surveillance for enteric disease outbreaks, New South Wales, Australia, 2000 to 2005. N S W Public Health Bull. 2008;19:8–14.

  51. Cruden G, Frerichs L, Powell BJ, Lanier P, Brown CH, Hassmiller Lich K. Developing a multi-criteria decision analysis tool to support the adoption of evidence-based child maltreatment prevention programs. Prev Sci. 2020.

  52. Cunningham PM, Cunningham M, van Greunen D, Veldsman A, Kanjo C, Kweyu E, et al. Implications of baseline study findings from rural and deep rural clinics in Ethiopia, Kenya, Malawi and South Africa for the co-design of mHealth4Afrika. 2016 IEEE Global Humanitarian Technology Conference (GHTC), 13–16 October 2016. IEEE; 2016. pp. 666–74.

  53. Dalle Carbonare S, Cerra C, Bellazzi R. Development and representation of health indicators with thematic maps. Stud Health Technol Inform. 2012;180:220–4.

  54. Fico G, Hernanzez L, Cancela J, Arredondo MT, Dagliati A, Sacchi L, et al. What do healthcare professionals need to turn risk models for type 2 diabetes into usable computerized clinical decision support systems? Lessons learned from the MOSAIC project. BMC Med Inf Decis Mak. 2019;19:163.

  55. Finch CF, Goode N, Salmon PM, Shaw L. End-user experiences with two incident and injury reporting systems designed for led outdoor activities: challenges for implementation of future data systems. Inj Epidemiol. 2019;6:39.

  56. Fisher RP, Myers BA. Free and simple GIS as appropriate for health mapping in a low resource setting: a case study in eastern Indonesia. Int J Health Geogr. 2011;10:15.

  57. Foldy SL, Barthell E, Silva J, Biedrzycki P, Howe D, Erme M, et al. SARS Surveillance project–internet-enabled multiregion surveillance for rapidly emerging disease. MMWR Morb Mortal Wkly Rep. 2004;53:215–20.

  58. Foldy SL, Biedrzycki PA, Baker BK, Swain GR, Howe DS, Gieryn D, et al. The public health dashboard: a surveillance model for bioterrorism preparedness. J Public Health Manag Pract. 2004;10:234–40.

  59. Gagnon M-P, Lampron A, Buyl R. Implementation and adoption of an electronic information system for vaccine inventory management. 49th Annual Hawaii International Conference on System Sciences (HICSS 2016), January 5–8, 2016. IEEE Computer Society; 2016. pp. 3172–8.

  60. Gerrits RG, Klazinga NS, van den Berg MJ, Kringos DS. Figure Interpretation Assessment Tool-Health (FIAT-Health) 2.0: from a scoring instrument to a critical appraisal tool. BMC Med Res Methodol. 2019;19:160.

  61. Gesteland PH, Livnat Y, Galli N, Samore MH, Gundlapalli AV. The EpiCanvas infectious disease weather map: an interactive visual exploration of temporal and spatial correlations. J Am Med Inform Assoc. 2012;19:954–9.

  62. Gourevitch MN, Athens JK, Levine SE, Kleiman N, Thorpe LE. City-Level Measures of Health, Health determinants, and equity to Foster Population Health Improvement: the City Health Dashboard. Am J Public Health. 2019;109:585–92.

  63. Grossberndt S, Bartonova A, Van Den Hazel P. Application of social media in the environment and health professional community. Environ Health. 2012;11:S16.

  64. Guthrie JL, Marchand-Austin A, Lam K, Whelan M, Lee B, Alexander DC, et al. Technology and tuberculosis control: the OUT-TB web experience. J Am Med Inform Assoc. 2017;24:e136–42.

  65. Hawver JE, Rocheleau B, Wyllie TT, Waller KN, Bailey R, Smith MC. Mental health resources and the criminal justice system: assessment and plan for integration in Charlottesville, Virginia - Phase III expansion. 2009 IEEE Systems and Information Engineering Design Symposium (SIEDS '09), April 24, 2009. IEEE Computer Society; 2009. pp. 197–202. https://doi.org/10.1109/SIEDS.2009.5166183

  66. Ha YP, Tesfalul MA, Littman-Quinn R, Antwi C, Green RS, Mapila TO, et al. Evaluation of a mobile health approach to tuberculosis contact tracing in Botswana. J Health Commun. 2016;21:1115–21.

  67. Heidebrecht CL, Wells GA, Tugwell PS, Engel ME. Tuberculosis surveillance in Cape Town, South Africa: an evaluation. Int J Tuberc Lung Dis. 2011;15:912–8.

  68. Hundley VA, Avan BI, Ahmed H, Graham WJ, Group BKW. Clean birth kits to improve birth practices: development and testing of a country level decision support tool. BMC Pregnancy Childbirth. 2012;12:158.

  69. Hu PJ, Zeng D, Chen H, Larson C, Chang W, Tseng C, et al. System for infectious disease information sharing and analysis: design and evaluation. IEEE Trans Inf Technol Biomed. 2007;11:483–92.

  70. Hussain-Alkhateeb L, Olliaro P, Benitez D, Kroeger A, Sewe MO, Rocklov J, et al. Early warning and response system (EWARS) for dengue outbreaks: recent advancements towards widespread applications in critical settings. PLoS ONE. 2018;13:e0196811.

  71. Ilesanmi OS, Fawole O, Nguku P, Oladimeji A, Nwenyi O. Evaluation of Ebola virus disease surveillance system in Tonkolili District, Sierra Leone. Pan Afr Med J. 2019;32:2.

  72. Jaroensutasinee M, Jaroensutasinee K, Jinpon P. Integrated information visualization to support decision-making in order to strengthen communities: design and usability evaluation. Inf Health Soc Care. 2017;42:335–48.

  73. Joshi A, de Araujo Novaes M, Machiavelli J, Iyengar S, Vogler R, Johnson C, et al. A human centered GeoVisualization framework to facilitate visual exploration of telehealth data: a case study. Technol Health Care. 2012;20:457–71.

  74. Joyce K. To me it’s just another tool to help understand the evidence: public health decision-makers’ perceptions of the value of geographical information systems (GIS). Health Place. 2009;15:801–10.

  75. Kadam R, White W, Banks N, Katz Z, Kelly-Cirino C, Dittrich S. Target product profile for a mobile app to read rapid diagnostic tests to strengthen infectious disease surveillance. PLoS ONE. 2020;15:e0228311.

  76. Karavite DJ, Miller MW, Ramos MJ, Rettig SL, Ross RK, Xiao R, et al. User testing an information foraging Tool for Ambulatory Surgical site infection surveillance. Appl Clin Inf. 2018;9:791–802.

  77. Kealey CM, Brunetti GM, Valaitis RK, Akhtar-Danesh N, Thomas H. A severe Acute Respiratory Syndrome extranet: supporting local communication and information dissemination. BMC Med Inf Decis Mak. 2005;5:17.

  78. Keeling JW, Turner AM, Allen EE, Rowe SA, Merrill JA, Liddy ED, et al. Development and evaluation of a prototype search engine to meet public health information needs. AMIA Annu Symp Proc. 2011;2011:693–700.

  79. Kelly GC. A spatial decision support system for guiding focal indoor residual spraying interventions in a malaria elimination zone. Geospat Health. 2011;6:21–31.

  80. Laberge M, Shachak A. Developing a tool to assess the quality of socio-demographic data in community health centres. Appl Clin Inf. 2013;4:1–11.

  81. Liaw S-T, Ansari S, Zhou R, Gao J. A digital health profile & maturity assessment toolkit: cocreation and testing in the Pacific Islands. J Am Med Inf Assoc. 2021;28:494–503.

  82. Livnat Y, Rhyne T-M, Samore MH. Epinome: a visual-analytics workbench for epidemiology data. IEEE Comput Graph Appl. 2012;32:89–95.

  83. Loschen W, Coberly J, Sniegoski C, Holtry R, Sikes M, Happel Lewis S. Event communication in a regional disease surveillance system. AMIA Annu Symp Proc. 2007;483–7.

  84. Loschen W, Seagraves R, Holtry R, Hung L, Lombardo J, Lewis S. INFOSHARE - an Information Sharing Tool for Public Health during the 2009 presidential inauguration and H1N1 outbreak. Online J Public Health Inf. 2010;2.

  85. Maclachlan JC, Jerrett M, Abernathy T, Sears M, Bunch MJ. Mapping health on the internet: a new tool for environmental justice and public health research. Health Place. 2007;13:72–86.

  86. Mansoor H, Gerych W, Alajaji A, Buquicchio L, Chandrasekaran K, Agu E, et al. PLEADES: population level observation of smartphone sensed symptoms for in-the-wild data using clustering. 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021), February 8, 2021. SciTePress; 2021. pp. 64–75.

  87. Margevicius KJ, Generous N, Abeyta E, Castro L, Daughton A, Del Valle SY, et al. The biosurveillance analytics resource directory (BARD): facilitating the use of epidemiological models for infectious disease surveillance. PLoS ONE. 2016;11:e0146600.

  88. Bögl M, Aigner W, Filzmoser P, Lammarsch T, Miksch S, Rind A. Visual analytics for Model Selection in Time Series Analysis. IEEE Trans Vis Comput Graph. 2013;19:2237–46.

  89. Merkord CL, Liu Y, Mihretie A, Gebrehiwot T, Awoke W, Bayabil E et al. Integrating malaria surveillance with climate data for outbreak detection and forecasting: the EPIDEMIA system. Malar J. 2017;16.

  90. Millery M, Ramos W, Lien C, Kukafka R, Aguirre AN. Design of a Community-Engaged Health Informatics Platform with an Architecture of Participation. AMIA Annu Symp Proc. 2015;2015:905–14.

  91. Mukhtar Q, Mehta P, Brody ER, Camponeschi J, Friedrichs M, Kemple AM, et al. Development of the diabetes indicators and data sources Internet Tool (DIDIT). Prev Chronic Dis. 2006;3:A20.

  92. Nagykaldi Z, Mold JW, Bradley KK, Bos JE. Bridging the gap between public and private healthcare: influenza-like illness surveillance in a practice-based research network. J Public Health Manag Pract. 2006;12:356–64.

  93. Ngo TD, Canavati SE, Dinh HS, Ngo TD, Tran DT, Martin NJ et al. Addressing operational challenges of combatting malaria in a remote forest area of Vietnam using spatial decision support system approaches. Geospat Health. 2019;14.

  94. Nguyen LH, LeFevre AE, Jennings L, Agarwal S, Labrique AB, Mehl G, et al. Perceptions of data processes in mobile-based versus paper-based health information systems for maternal, newborn and child health: a qualitative study in Andhra Pradesh, India. BMJ Innov. 2015;1:167–73.

  95. Olingson C, Hallberg N, Timpka T, Lindqvist K. Requirements engineering for inter-organizational health information systems with functions for spatial analyses: modeling a WHO safe community applying use case maps. Methods Inf Med. 2002;41:299–304.

  96. Chen M, Trefethen A, Bañares-Alcántara R, Jirotka M, Coecke B, Ertl T, et al. From data analysis and visualization to causality discovery. Comput (Long Beach Calif). 2011;44:84–7.

  97. Park S, Gil-Garcia JR. Understanding transparency and accountability in open government ecosystems: the case of health data visualizations in a state government. 18th Annual International Conference on Digital Government Research (DG.O 2017), June 7–9, 2017. Association for Computing Machinery; 2017. pp. 39–47.

  98. Patel R, Ahn E, Baldacchino T, Mullavey T, Kim J, Liu N, et al. A mobile app and dashboard for early detection of infectious disease outbreaks: development study. JMIR Public Health Surveill. 2021;7:e14837.

  99. Pelat C, Bonmarin I, Ruello M, Fouillet A, Caserio-Schonemann C, Levy-Bruhl D et al. Improving regional influenza surveillance through a combination of automated outbreak detection methods: the 2015/16 season in France. Eurosurveillance. 2017;22.

  100. Rachmani E, Lin M-C, Hsu CY, Jumanto J, Iqbal U, Shidik GF et al. The implementation of an integrated e-leprosy framework in a leprosy control program at primary health care centers in Indonesia. Int J Med Inf. 2020;140.

  101. Rajamani S, Bieringer A, Sowunmi S, Muscoplat M. Stakeholder Use and Feedback on Vaccination History and Clinical Decision Support for Immunizations Offered by Public Health. AMIA Annu Symp Proc. 2017;2017:1450–7.

  102. Rajvanshi H, Telasey V, Soni C, Jain D, Surve M, Gangamwar V, et al. A comprehensive mobile application tool for disease surveillance, workforce management and supply chain management for Malaria Elimination Demonstration Project. Malar J. 2021;20:91.

  103. Reeder B, Hills RA, Turner AM, Demiris G. Participatory design of an integrated information system design to support public health nurses and nurse managers. Public Health Nurs. 2014;31:183–92.

  104. Reeder B, Turner AM. Scenario-based design: a method for connecting information system design with public health operations and emergency management. J Biomed Inf. 2011;44:978–88.

  105. Rezaei-hachesu P, Samad-Soltani T, Yaghoubi S, GhaziSaeedi M, Mirnia K, Masoumi-Asl H, et al. The design and evaluation of an antimicrobial resistance surveillance system for neonatal intensive care units in Iran. Int J Med Inf. 2018;115:24–34.

  106. Roberton T, Litvin K, Self A, Stegmuller AR. All things to all people: trade-offs in pursuit of an ideal modeling tool for maternal and child health. BMC Public Health. 2017;17:785.

  107. Robinson AC, MacEachren AM, Roth RE. Designing a web-based learning portal for geographic visualization and analysis in public health. Health Inf J. 2011;17:191–208.

  108. Sahar L, Faler G, Hristov E, Hughes S, Lee L, Westnedge C, et al. Development of the Inventory Management and Tracking System (IMATS) to Track the Availability of Public Health Department Medical Countermeasures during Public Health Emergencies. Online J Public Health Inf. 2015;7:e212.

  109. Semwanga AR, Nakubulwa S, Adam T. Applying a system dynamics modelling approach to explore policy options for improving neonatal health in Uganda. Health Res Policy Syst. 2016;14:35.

  110. Sopan A, Noh ASI, Karol S, Rosenfeld P, Lee G, Shneiderman B. Community Health Map: a geospatial and multivariate data visualization tool for public health datasets. Gov Inf Q. 2012;29:223–34.

  111. Sorge J, Klassen B, Higgins R, Tooley L, Ablona A, Jollimore J, et al. Democratizing Access to Community-based survey findings through dynamic data visualizations. Arch Sex Behav. 2021;50:119–28.

  112. Stegmuller AR, Self A, Litvin K, Roberton T. How is the lives Saved Tool (LiST) used in the global health community? Results of a mixed-methods LiST user study. BMC Public Health. 2017;17:773.

  113. Struik LL, Abramowicz A, Riley B, Oliffe JL, Bottorff JL, Stockton L. Evaluating a tool to support the integration of gender in programs to promote men's health. Am J Mens Health. 2019;13.

  114. Studnicki J, Fisher JW, Eichelberger C, Bridger C, Angelon-Gaetz K, Nelson D. NC CATCH: Advancing Public Health Analytics. Online J Public Health Inf. 2010;2.

  115. Sutcliffe A, De Bruijn O, Thew S, Buchan I, Jarvis P, McNaught J, et al. Developing visualization-based decision support tools for epidemiology. Inf Vis. 2014;13:3–17.

  116. Svoronos T, Jillson IA, Nsabimana MM. TRACnet’s absorption into the Rwandan HIV/AIDS response. Int J Healthc Technol Manage. 2008;9:430–45.

  117. Swoboda CM, Griesenbrock T, Gureddygari HR, Aldrich A, Fareed N, Jonnalagadda P. Visualizing opportunity index data using a dashboard application: a tool to communicate infant mortality-based area deprivation index information. Appl Clin Inform. 2020;11:515–27.

  118. Thew SL, Sutcliffe A, de Bruijn O, McNaught J, Procter R, Jarvis P, et al. Supporting creativity and appreciation of uncertainty in exploring geo-coded public health data. Methods Inf Med. 2011;50:158–65.

  119. Tilahun B, Kauppinen T, Kesler C, Fritz F. Design and development of a linked open data-based health information representation and visualization system: potentials and preliminary evaluation. JMIR Med Inf. 2014;2:e31.

  120. Tobgay T, Samdrup P, Jamtsho T, Mannion K, Thriemer K, Ortega L, et al. Performance and user acceptance of the Bhutan febrile and malaria information system: report from a pilot study. Malar J. 2016;15:52.

  121. Tom-Aba D, Toikkanen SE, Glockner S, Denecke K, Silenou BC, Krause G, et al. User evaluation indicates high quality of the Surveillance Outbreak Response Management and Analysis System (SORMAS) after field deployment in Nigeria in 2015 and 2018. Stud Health Technol Inform. 2018;253:233–7.

  122. Travers D, Crouch J, Haas SW, Mostafa J, Waller AE, Schwartz TA, et al. Implementation of Emergency Medical Text Classifier for syndromic surveillance. AMIA Annu Symp Proc. 2013;2013:1365–74.

  123. Turner AM, Reeder B, Wallace JC. A resource management tool for public health continuity of operations during disasters. Disaster Med Public Health Prep. 2013;7:146–52.

  124. Velicko I, Riera-Montes M. The Chlamydia surveillance system in Sweden delivers relevant and accurate data: results from the system evaluation, 1997–2008. Eurosurveillance. 2011;16.

  125. Wang E-H, Zhou L, Watzlaf V, Abernathy PA. A web-based social network analysis system for guiding behavioral interventions delivery in medically underserved communities. 2017 International Conference on Computational Science and Computational Intelligence (CSCI 2017), December 14, 2017. IEEE; 2017. pp. 840–5.

  126. Wang KH, Marenco L, Madera JE, Aminawung JA, Wang EA, Cheung K-H. Using a community-engaged health informatics approach to develop a web analytics research platform for sharing data with community stakeholders. AMIA Annu Symp Proc. 2017;2017:1715–23.

  127. Wongsapai M, Suebnukarn S, Rajchagool S, Kijsanayotin B. Health-oriented electronic oral health record for health surveillance. Stud Health Technol Inform. 2013;192:763–7.

  128. Wu E, Davis A, Villani J, Fareed N, Huerta TR, Harris DR, et al. Community dashboards to support data-informed decision-making in the HEALing communities study. Drug Alcohol Depend. 2020;217:108331.

  129. Cole BL, Yancey AK, McCarthy WJ. A graphical, computer-based decision-support tool to help decision makers evaluate policy options relating to physical activity. Am J Prev Med. 2010;39:273–9.

  130. Yang J-A, Block J, Jankowska MM, Baer RJ, Chambers CD, Jelliffe-Pawlowski LL, et al. An Online Geographic Data Visualization Tool to relate Preterm births to Environmental factors. Prev Chronic Dis. 2019;16:E102.

  131. Kenealy T, et al. A whole of system approach to compare options for CVD interventions in Counties Manukau. Aust N Z J Public Health. 2012;36:263–8.

  132. Geyer NR, Kessler FC, Lengerich EJ. LionVu 2.0 usability assessment for Pennsylvania, United States. ISPRS Int J Geoinf. 2020;9:619. https://doi.org/10.3390/ijgi9110619

  133. Jinpon P, Jaroensutasinee M, Jaroensutasinee K. Integrated information visualization to support decision making for health promotion in Chonburi, Thailand. Walailak J Sci Technol. 2019;16:551–60.

  134. Karlsson D, Ekberg J, Spreco A, Eriksson H, Timpka T. Visualization of infectious disease outbreaks in routine practice. Stud Health Technol Inform. 2013. pp. 697–701.

  135. McGladrey M, Noar S, Crosby R, Young A, Webb E. Creating project CREATE: lessons learned and best practices for developing web-based resources for public health practitioners. Am J Health Educ. 2012;43:341–8.

  136. Mittelstädt S, Hao MC, Dayal U, Hsu M-C, Terdiman J, Keim DA. Advanced visual analytics interfaces for adverse drug event detection. Proceedings of the Workshop on Advanced Visual Interfaces AVI. 2014. pp. 237–44.

  137. Osborn AW, Peters LR. Vaccination Data when the outbreak happens: a qualitative evaluation of Oregon’s Rapid Response Tool. Disaster Med Public Health Prep. 2019;13:682–5.

  138. Parks AL, Walker B, Pettey W, Benuzillo J, Gesteland P, Grant J, et al. Interactive agent based modeling of public health decision-making. AMIA Annu Symp Proc. 2009;2009:504–8.

  139. Pontin D, Thomas M, Jones G, O’Kane J, Wilson L, Dale F, et al. Developing a family resilience assessment tool for health visiting/public health nursing practice using virtual commissioning, high-fidelity simulation and focus groups. J Child Health Care. 2020;24:195–206.

  140. Schooley B, Feldman S, Tipper B. A Unified Framework for Human Centered Design of a Substance Use, Abuse, and Recovery Support System. Advances in Intelligent Systems and Computing. 2020. pp. 175–82.

  141. Sinclair S, Hagen NA, Chambers C, Manns B, Simon A, Browman GP. Accounting for reasonableness: exploring the personal internal framework affecting decisions about cancer drug funding. Health Policy. 2008;86:381–90.

  142. Thew S, Sutcliffe A, Procter R, de Bruijn O, McNaught J, Venters CC, et al. Requirements engineering for E-science: experiences in epidemiology. IEEE Softw. 2009;26:80–7.

  143. Timpka T, Morin M, Jenvald J, Eriksson H, Gursky E. Towards a simulation environment for modeling of local influenza outbreaks. AMIA Annu Symp Proc. 2005:729–33.

  144. Zakkar M, Sedig K. Interactive visualization of public health indicators to support policymaking: an exploratory study. Online J Public Health Inf. 2017;9.

  145. Aburto NJ, Rogers L, De-Regil LM, Kuruchittham V, Rob G, Arif R, et al. An evaluation of a global vitamin and mineral nutrition surveillance system. Arch Latinoam Nutr. 2013;63:105–13.

  146. Rajamani S, Chakoian H, Bieringer A, Lintelmann A, Sanders J, Ostadkar R, et al. Development and implementation of an interoperability tool across state public health agency’s disease surveillance and immunization information systems. JAMIA Open. 2023;6:ooad055.

  147. Akre S, Liu PY, Friedman JR, Bui AAT. International COVID-19 mortality forecast visualization: Covidcompare.io. JAMIA Open. 2021;4:ooab113. https://doi.org/10.1093/jamiaopen/ooab113

  148. Alminana A, Bayeh A, Girma D, Kanagat N, Oot L, Prosser W et al. Early lessons from Ethiopia in establishing a data triangulation process to analyze immunization program and Supply data for decision making. Glob Health Sci Pract. 2022;10.

  149. Alpers R, Kuhne L, Truong H-P, Zeeb H, Westphal M, Jackle S. Evaluation of the EsteR Toolkit for COVID-19 decision support: sensitivity analysis and usability study. JMIR Form Res. 2023;7:e44549.

  150. Altura KAP, Madjalis HEC, Sungahid MDG, Serrano EA, Rodriguez RL. Development of a Web-Portal Health Information System for Barangay. Fujisawa, Japan: Institute of Electrical and Electronics Engineers Inc.; 2023. pp. 544–50.

  151. Ansari B, Martin EG. Integrating human-centered design in public health data dashboards: lessons from the development of a data dashboard of sexually transmitted infections in New York State. J Am Med Inf Assoc. 2023.

  152. Backonja U, Park S, Kurre A, Yudelman H, Heindel S, Schultz M, et al. Supporting rural public health practice to address local-level social determinants of health across Northwest states: development of an interactive visualization dashboard. J Biomed Inform. 2022;129:104051. https://doi.org/10.1016/j.jbi.2022.104051

  153. Burgess H, Gutierrez-Mock L, Moghadassi Y-XH, Lesh M, Krueger N, et al. Implementing a digital system for contact tracing and case investigation during COVID-19 pandemic in San Francisco: a qualitative study. JAMIA Open. 2021;4:ooab093.

  154. Delcher C, Horne N, McDonnell C, Bae J, Surratt H. Overdose Detection Mapping Application Program expansion evaluation: a qualitative study. Criminol Public Policy. 2023;22:491–516. https://doi.org/10.1111/1745-9133.12628

  155. Doyle M, Ainsworth P, Boul S, Lee D. Evaluation of a system for Real-Time Surveillance of Suicide in England. Crisis. 2023;44:341–8.

  156. Agbemafle EE, Kubio C, Bandoh D, Odikro MA, Azagba CK, Issahaku RG, et al. Evaluation of the malaria surveillance system - Adaklu District, Volta Region, Ghana, 2019. Public Health Pract (Oxf). 2023;6:100414.

  157. Filos D, Lekka I, Kilintzis V, Stefanopoulos L, Karavidopoulou Y, Maramis C, et al. Exploring associations between Children’s obesogenic behaviors and the local Environment using Big Data: development and evaluation of the obesity Prevention Dashboard. JMIR Mhealth Uhealth. 2021;9:e26290.

  158. Guimarães EADA, Morato YC, Carvalho DBF, Oliveira VCD, Pivatti VMS, Cavalcante RB, et al. Evaluation of the usability of the immunization information system in Brazil: a mixed-method study. Telemed J E Health. 2021;27:551–60. https://doi.org/10.1089/tmj.2020.0077

  159. Hintermeier M, Gold AW, Erdmann S, Perplies C, Bozorgmehr K, Biddle L. From research into practice: converting epidemiological data into relevant information for planning of regional health services for refugees in Germany. Int J Environ Res Public Health. 2022;19:8049.

  160. Hollis S, Stolow J, Rosenthal M, Morreale SE, Moses L. Go.Data as a digital tool for case investigation and contact tracing in the context of COVID-19: a mixed-methods study. BMC Public Health. 2023;23:1717.

  161. Said SIM, Aminuddin R, Abidin NAZ, Nasir SDNM, Ibrahim AZM. Visualizing COVID-19 Vaccination Rate and Vaccination Centre in Malaysia using DBSCAN Clustering model. 2022 IEEE International Power and Renewable Energy Conference (IPRECON). 2022. pp. 1–6.

  162. Ising A, Waller A, Frerichs L. Evaluation of an Emergency Department Visit Data Mental Health Dashboard. J Public Health Manag Pract. 2023.

  163. Jonnalagadda P, Swoboda C, Singh P, Gureddygari H, Scarborough S, Dunn I, et al. Developing dashboards to address children’s Health disparities in Ohio. Appl Clin Inf. 2022;13:100–12.

  164. Lardi EA, Khader S, et al. The Rotavirus Surveillance System in Yemen: evaluation study. JMIR Public Health Surveill. 2021;7:e27625.

  165. Lechner C, Rumpler M, Dorley MC, Li Y, Ingram A, Fryman H. Developing an online dashboard to visualize performance data: Tennessee newborn screening experience. Int J Neonatal Screen. 2022;8(3):49. https://doi.org/10.3390/ijns8030049

  166. Li Y, Albarrak AS. An informatics-driven intelligent system to improve healthcare accessibility for vulnerable populations. J Biomed Inform. 2022;134:104196. https://doi.org/10.1016/j.jbi.2022.104196

  167. Mansoor H, Gerych W, Alajaji A, Buquicchio L, Chandrasekaran K, Agu E, et al. PLEADES: Population level observation of smartphone sensed symptoms for in-the-wild data using clustering. Virtual, Online: SciTe; 2021. pp. 64–75.

  168. Meidani Z, Moravveji A, Gohari S, Ghaffarian H, Zare S, Vaseghi F, et al. Development and Testing requirements for an Integrated Maternal and Child Health Information System in Iran: A Design thinking Case Study. Methods Inf Med. 2022;61:e64–72.

  169. O’Flaherty M, Lloyd-Williams F, Capewell S, Boland A, Maden M, Collins B, et al. Modelling tool to support decision-making in the NHS Health Check programme: workshops, systematic review and co-production with users. Health Technol Assess. 2021;25:1–233. https://www.journalslibrary.nihr.ac.uk/hta/hta25350/

  170. O’Leary MC, Mayorga KHL, Hicklin ME, Davis K, Brenner MM, et al. Engaging stakeholders in the use of an interactive simulation tool to support decision-making about the implementation of colorectal cancer screening interventions. Cancer Causes Control. 2023.

  171. Praharaj S, Solis P, Wentz EA. Deploying geospatial visualization dashboards to combat the socioeconomic impacts of COVID-19. Environ Plan B Urban Anal City Sci. 2023;50:1262–79. https://doi.org/10.1177/23998083221142863

  172. Rivers Z, Roth JA, Wright W, Rim SH, Richardson LC, Thomas CC, et al. Translating an Economic Analysis into a Tool for Public Health Resource Allocation in Cancer Survivorship. MDM Policy Pract. 2023;8:23814683231153376.

  173. Swift B, Imohe A, Perez CH, Mwirigi L. An in-depth review of the UNICEF NutriDash platform, lessons learnt and future perspectives: a mixed-methods study. BMJ Open. 2023;13:e062684.

  174. Tchoualeu DD, Elmousaad HE, Osadebe LU, Adegoke OJ, Nnadi C, Haladu SA, et al. Use of a district health information system 2 routine immunization dashboard for immunization program monitoring and decision making, Kano State, Nigeria. Pan Afr Med J. 2021;40:2.

  175. Tegegne HA, Bogaardt C, Collineau L, Cazeau G, Lailler R, Reinhardt J, et al. OH-EpiCap: a semi-quantitative tool for the evaluation of One Health epidemiological surveillance capacities and capabilities. medRxiv. 2023.

  176. Tennant R, Tetui M, Grindrod K, Burns CM. Multi-disciplinary Design and implementation of a Mass Vaccination Clinic Mobile Application to support decision-making. IEEE J Transl Eng Health Med. 2023;11:60–9.

  177. Vázquez Noguera JL, Ho Shin H, Sauer Ayala C, Grillo S, Pérez-Estigarribia P, Torales R, et al. Epymodel: a user-friendly web application for visualising COVID-19 projections for Paraguay including under-reporting and vaccination. Communications in Computer and Information Science. 2023. pp. 58–72. https://doi.org/10.1007/978-3-031-36357-3_5

  178. Wells J, Grant R, Chang J, Kayyali R. Evaluating the usability and acceptability of a geographical information system (GIS) prototype to visualise socio-economic and public health data. BMC Public Health. 2021;21:2151.

  179. Zheng S, Edwards JR, Dudeck MA, Patel PR, Wattenmaker L, Mirza M, et al. Building an Interactive Geospatial Visualization Application for National Health Care-Associated Infection Surveillance: Development Study. JMIR Public Health Surveill. 2021;7:e23528.

  180. Yang C, Zhang Z, Fan Z, Jiang R, Chen Q, Song X, et al. EpiMob: interactive visual analytics of Citywide Human mobility restrictions for Epidemic Control. IEEE Trans Vis Comput Graph. 2023;29:3586–601.

  181. Shimpi N, Glurich I, Hegde H, Steinmetz A, Kuester R, Crespin M et al. DentaSeal: a school-based dental sealant efficiency assessment tool to support statewide monitoring and reporting: a field report. Technol Health Care. 2023.

  182. Rabiee F. Focus-group interview and data analysis. Proc Nutr Soc. 2004;63:655–60.

  183. Lu S, Christie GA, Nguyen TT, Freeman JD, Hsu EB. Applications of artificial intelligence and machine learning in disasters and public health emergencies. Disaster Med Public Health Prep. 2022;16:1674–81.

  184. EQUATOR Network. Enhancing the quality and transparency of health research. 2016. https://www.equator-network.org/

Acknowledgements

Not applicable.

Funding

This work was funded by the XSeed 2020–2021 Interdivisional Research Funding Program and the Data Sciences Institute at the University of Toronto. BD is supported by a Canada Research Chair in Human Factors and Transportation. LCR is supported by a Canada Research Chair in Population Health Analytics and the Stephen Family Chair in Community Health from Trillium Health Partners. The funders played no role in study design, data collection, analysis and interpretation of data, or the writing of this manuscript.

Author information

Authors and Affiliations

Authors

Contributions

LCR, BD were responsible for study design. HC, HV, LMD developed the search strategy. HV, RS, EP, LMD conducted the search, screening, and data extraction. HV, EP, LMD, VH, BD, LCR analyzed and interpreted the data. HV and VH drafted the original manuscript. All authors contributed to and approved of the manuscript.

Corresponding author

Correspondence to Laura C. Rosella.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Vasquez, H.M., Pianarosa, E., Sirbu, R. et al. Human factors methods in the design of digital decision support systems for population health: a scoping review. BMC Public Health 24, 2458 (2024). https://doi.org/10.1186/s12889-024-19968-8


Keywords