
Availability of published evidence on coverage, cost components, and funding support for digitalisation of infectious disease surveillance in Africa, 2003–2022: a systematic review

Abstract

Background

The implementation of digital disease surveillance systems at national levels in Africa has been challenged by many factors. These include user applicability and the utility of IT features, but also stable financial support. Funding closely intertwines with implementations in terms of geographical reach, disease focus, and sustainability. However, the practice of evidence sharing on geographical and disease coverage, costs, and funding sources for improving the implementation of these systems on the continent is unclear.

Objectives

To analyse the key characteristics and availability of evidence for implementing digital infectious disease surveillance systems in Africa, namely their disease focus, geographical reach, cost reporting, and external funding support.

Methods

We conducted a systematic review of peer-reviewed and grey literature for the period 2003 to 2022 (PROSPERO registration number: CRD42022300849). We searched five databases (PubMed, MEDLINE over Ovid, EMBASE, Web of Science, and Google Scholar) and websites of WHO, Africa CDC, and public health institutes of African countries. We mapped the distribution of projects by country; identified reported implementation cost components; categorised the availability of data on cost components; and identified supporting funding institutions outside Africa.

Results

A total of 29 reports from 2,033 search results were eligible for analysis. We identified 27 projects implemented in 13 countries, across 32 sites. Of the 32 sites, 24 (75%) hosted pilot projects with a median duration of 16 months (IQR: 5–40). Of the 27 projects, 5 (19%) were implemented for HIV/AIDS and tuberculosis, 4 (15%) for malaria, 4 (15%) for all notifiable diseases, and 4 (15%) for One Health. We identified 17 cost components across the 29 reports. Of these, 11 (38%) reported quantified costs for start-up capital, 10 (34%) for health personnel compensation, 9 (31%) for training and capacity building, 8 (28%) for software maintenance, and 7 (24%) for surveillance data transmission. Of 65 counts of external funding sources, 35 (54%) were governmental agencies, 15 (23%) foundations, and 7 (11%) UN agencies.

Conclusions

The evidence on costing data for the digitalisation of surveillance and outbreak response in the published literature is sparse in quantity, limited in detail, and without a standardised reporting format. Most initial direct project costs are substantially donor-dependent, short-lived, and thus unsustainable.


Background

The adoption of digital systems is increasingly recognised as essential for enhancing infectious disease surveillance and outbreak response [1]. The COVID-19 pandemic has shown the importance of digital systems for enhanced surveillance and outbreak response management at scale [2], and in real time [3,4,5]. Earlier, the West African Ebola outbreak, and subsequently the COVID-19 pandemic, accelerated the design and deployment of many digital tools to support response efforts [6,7,8,9,10].

Since 1998, countries of the World Health Organization Regional Office for Africa (WHO – AFRO) have adopted the Integrated Disease Surveillance and Response (IDSR) strategy [11]. It is a comprehensive, evidence-based strategy for strengthening national public health surveillance and response. It does so by integrating and harmonising the flow of surveillance data from the community through to the national level for early detection of and response to public health threats [11]. With the occurrence of major public health emergencies in Africa, the limitations of the paper-based IDSR for early detection and coordination of emergency response became obvious [10, 12,13,14,15,16,17]. In 2013, with the support of partners, the WHO initiated electronic surveillance, termed “eSurveillance”, to enhance the performance of the IDSR [18]. This enhancement constituted the use of electronic systems to facilitate the rapid collection, early reporting, and analysis of both human and animal health data [11, 18]. In May 2023, the Africa CDC digital transformation strategy (2023–2030) was launched [19]. Among other commitments, the strategy undertakes to support the improvement of member countries’ health system capabilities to quickly detect, investigate, and respond to health threats [19]. This strategy is aligned with Africa’s new public health order adopted by the African Union Commission in 2021 [20]. These efforts notwithstanding, there is currently no consolidated regional eSurveillance and outbreak response management system [21]. In the meantime, individual countries have adopted various digital systems to move forward with their respective IDSR implementations.

The implementation of these digital systems at scale in African countries has been challenged by many factors – key among which are limited geographical reach, narrow disease focus, and unreliable financing [22, 23]. The levels of digitalisation and available funding in African public health systems are particularly low [24, 25]. Funding closely intertwines with implementations in terms of geographical reach, disease focus, and sustainability. The open sharing of evidence among African countries on their respective digitalisation experiences in respect of these factors could contribute to mitigating implementation failures. However, the practice of evidence sharing on geographical and disease coverage, costs, and funding sources for improving the implementation of these systems on the continent is unclear. Thus, even where there is stakeholder interest in appraising the long-term cost implications before undertaking a project, the required evidence or data may not be publicly available in the literature. This hinders realistic costing for successful piloting and scale-ups [26, 27]. The lack of a comprehensive appraisal and forecasting of the cost implications beyond the up-front costs contributes to implementation failures [28,29,30].

So, to assess the availability of published evidence for informing better planning and funding strategies for the digitalisation of surveillance and outbreak response in Africa, our review systematically addressed three thematic questions. First, what is the extent of geographical and disease coverage, and how long have the implemented surveillance systems been in operation? Second, how do published literature and reporting patterns illuminate project cost components, and what are the implications for strategic planning, piloting, and forecasting sustainable implementations at scale? Third, what are the sources of external funding support for African countries’ efforts to digitalise surveillance and outbreak response systems? Ultimately, the answers to these questions provide a gauge of the prevailing practices on documentation and transparency regarding implementation costs and funding sources.

Methods

Study design

We considered digital tools for public health surveillance and/or outbreak response to include smartphone- or tablet-based approaches which are either SMS-, app-, or web-based. We defined the limits of the review by geographical setting – Africa; public health conditions of interest – infectious diseases; purpose of project – surveillance and/or outbreak management; and the period of review – 2003 to 2022. We specified the components of our review question in terms of the “PICo” framework, namely the Population, Interest, and Context [31]. We developed a data extraction spreadsheet for recording relevant data from eligible records. We described and summarised the data in keeping with the review outcomes, namely geographical reach, disease focus, duration of implementation, cost components, and external funding support.

We registered the protocol for this systematic review in PROSPERO (CRD42022300849) (https://www.crd.york.ac.uk/PROSPERO/). We reported the review in keeping with the updated Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines 2020 [32].

Literature search and study selection

Search strategy

We searched five electronic bibliographic databases and performed a manual search of cited references, websites of WHO, Africa CDC, and national institutes of public health of African countries. The electronic databases we searched were PubMed, MEDLINE over Ovid, EMBASE, Web of Science, and Google Scholar. We searched all fields for the period January 1, 2003 to December 31, 2022. We developed a stepwise search strategy using key words in our review starting in PubMed:

Search #1: cost OR cost components

Search #2: digital health tools

Search #3: infectious disease surveillance OR outbreak response

Search #4: Africa OR sub-Saharan Africa OR Low middle income country OR settings

Complete search (#5): #1 AND #2 AND #3 AND #4

Thus, we combined the four searches to obtain the complete set of search terms and results. With these keywords, PubMed generated additional related terms via its Medical Subject Headings (MeSH) feature. Using Boolean operators and truncations, we adapted the search terms from PubMed for the other databases depending on their limits on the number of search terms and their unique search features [Appendix 1].
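As an illustration of the stepwise strategy above, the combination of concept groups can be expressed programmatically. This is a hypothetical sketch (the `build_query` helper is ours, not part of the review's actual workflow), and real database syntax with field tags, truncation, and MeSH expansion differs per database:

```python
# Illustrative sketch only: build_query is a hypothetical helper, not the
# review's actual tooling; actual database query syntax varies.

concept_groups = [
    ["cost", "cost components"],                               # Search #1
    ["digital health tools"],                                  # Search #2
    ["infectious disease surveillance", "outbreak response"],  # Search #3
    ["Africa", "sub-Saharan Africa",
     "low middle income country", "settings"],                 # Search #4
]

def build_query(groups):
    """OR the terms within each concept group, then AND the groups together."""
    ored = ["(" + " OR ".join(f'"{term}"' for term in group) + ")"
            for group in groups]
    return " AND ".join(ored)

print(build_query(concept_groups))
```

Combining the groups this way mirrors the "Complete search (#5): #1 AND #2 AND #3 AND #4" step listed above.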

Inclusion and exclusion criteria

The inclusion criteria hinged on three aspects: the project interests (digital applications involving infectious disease surveillance or outbreak response); location of projects (Africa); and type of publication (peer-reviewed articles, conference proceedings, and grey literature [33] such as project reports on institutional websites and protocols). We excluded records on digital projects for non-communicable diseases, public health supply chain management, health administration, and electronic medical records systems for routine patient care. By publication type, we excluded commentaries, opinion letters, and editorials.

Study selection process

We imported all search results from the electronic databases into the EndNote X9 referencing system [34]. We performed duplicate detection and deletion. Next, we exported the remaining records to the free web version of Rayyan for title and abstract screening [35]. The first author (BBK) and a co-author (MH) performed blinded title and abstract screening. We resolved 14 conflicting screening decisions by discussion and consensus. We obtained full texts of articles that passed the title and abstract screening. Three authors (BBK, MH, BCS) independently performed eligibility assessment of the full texts and agreed on the included records.

Quality assessment of reports

Based on the study designs of included records, we adapted appropriate items from the Critical Appraisal Skills Programme (CASP) checklists for economic evaluations [36], and the Appraisal tool for Cross-Sectional Studies (AXIS) [37] to obtain a 20-item hybrid quality assessment tool (Appendix 2). The tool assessed the reports against six broad quality criteria, namely the clarity of research aims, appropriateness of methods, validity of results, discussion of implications of results, relevance of results for comparable settings, and compliance with ethics. It uses a four-level non-summative scoring system, viz. “Yes”, “Can’t Tell”, “No”, and “Not Applicable”, to assess each quality question.

Data extraction process

We iteratively developed a data extraction spreadsheet in MS-Excel to capture a total of 45 variables in six broad themes. These themes were: title and source of report; purpose of project; features of implemented digital tools; sites of implementation and disease focus; duration of implementation and cost components as captured by reports; and names of funding institutions. Three authors (BBK, MH, and CR) performed data extraction independently and subsequently merged the results in one spreadsheet.

Data management and analysis

In this review, we referred to the results of our literature search as records. A record may contain two or more case studies. We referred to each case study as a report. Where a record reported separate case studies in different countries, we counted each country as a different implementation site. Hence, the total number of implementation sites was the sum of country-level implementations across all the projects.

Geographic reach referred to government administrative divisions of a country such as regions, districts, and sub-districts. The duration of implementation referred to the period for which a digital tool was deployed. We identified cost components as all implementation expenditure items or activity-based categories as they were captured in the reports. We listed all the cost components captured in included reports and organised them into activity- and time-driven expenditure categories. We assessed the availability of data on cost components at four levels. For each report, a cost component was coded as “Unreported” if it was not reported; “Descriptive” if it was reported without any cost figures; “Estimated” if it was reported with a cost estimate; and “Quantified” if it was reported with the full cost of the expenditure. Where cost components were stated as free, such as the use of open-source software, the cost component for software development was considered zero US dollars (USD), and hence categorised as “Quantified”.
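The four-level coding above can be sketched as a small classifier. This is an illustrative assumption of ours (the function name and boolean flags are hypothetical), not the authors' actual extraction tooling:

```python
# Hypothetical sketch of the four-level availability coding for one cost
# component within one report. "Free" items (e.g. open-source software)
# count as a quantified cost of USD 0, per the rule stated in the text.

def code_cost_component(mentioned, has_figure, is_full_cost, is_free=False):
    """Return one of the four availability levels used in the review."""
    if is_free:
        return "Quantified"      # recorded as USD 0
    if not mentioned:
        return "Unreported"
    if not has_figure:
        return "Descriptive"     # mentioned, but no cost figures
    if is_full_cost:
        return "Quantified"      # full expenditure provided
    return "Estimated"           # partial or approximate figure

print(code_cost_component(True, True, False))  # prints "Estimated"
```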

External funding support referred to financial or technical support from countries or institutions other than the governments of beneficiary countries. For projects that received funding from multiple sources, we counted all funders for the tally. Where funds were in the form of a grant, we considered the main source of funding for the analysis.

We performed descriptive statistical analysis. The unit of analysis was a report. We tabulated the key characteristics of the included reports, and the sources of external funding support. We visualised the data on the geographical distribution of reporting countries, the diseases for which the projects were implemented, the durations of pilot projects, and the pattern of reporting on cost components across the four aforementioned levels.
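As an illustration of the descriptive summaries used (median and interquartile range), a minimal sketch with hypothetical durations; the numbers here are invented for demonstration and are not the review's extracted data:

```python
import statistics

# Hypothetical pilot durations in months (invented for illustration only).
durations = [3, 5, 5, 9, 12, 16, 20, 28, 40, 40, 48]

median = statistics.median(durations)
# quantiles(n=4) returns the three quartile cut points (default "exclusive" method)
q1, _, q3 = statistics.quantiles(durations, n=4)
print(f"median = {median} months, IQR: {q1:g}-{q3:g}")
```

Note that quartile values depend on the interpolation method; `statistics.quantiles` defaults to the exclusive method, and spreadsheet software may give slightly different cut points.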

Results

Literature search and evaluation for study inclusion

The searches yielded a total of 2,033 records, including 669 duplicates. After title and abstract screening of the remaining 1,364 unique records, we excluded 1,323 records based on the aforementioned criteria. Of the remaining 41 records, another 16 were excluded based on the full-text assessment, leaving 25 records for inclusion. Owing to multiple case reporting in two of the included records, we obtained a total of 29 reports for analysis (Fig. 1).

Fig. 1

PRISMA flow diagram of record screening and selection process

Study characteristics

The 29 reports described the implementation of a total of 27 projects at 32 implementation sites (Table 1). About 67% (18/27) of the projects used open-source tools. Four of the projects did not have specific names for their tools except general descriptive labels such as mHealth or web-based systems. The median number of reports published per year was 1 (range 0–4).

Table 1 General characteristics of reports on the implementation of digital tools for surveillance or outbreak response in Africa, 2003–2022

Quality of reports

Aside from the two grey reports, the remaining 27 reports, published in peer-reviewed journals, presented their research objectives, methods, and results systematically. Of the 29 reports, 13 (45%) did not discuss the limitations of their studies, 10 (34%) did not include statements on competing or conflicting interests, and 17 (59%) did not report on ethical approval (Appendix 2).

Geographical reach of implementations

The 32 implementation sites were located across 13 countries, with 24 (75%) from the eastern and southern Africa bloc (Fig. 2). Three records reported on implementations across multiple countries (Table 1). One record reported malaria surveillance projects in Kenya and Uganda. A second reported three different but related case studies on the surveillance of ticks and tick-borne diseases in Benin, Kenya, and South Africa. A third reported a surveillance project on human and animal diseases in Burundi, Tanzania, and Zambia.

At the country level, the geographical reach of implementation at the 32 sites included 4 (13%) communities or cities, 17 (53%) districts or sub-districts, 2 (6%) regions or provinces, and 9 (28%) with national coverage.

Fig. 2

Distribution of number of projects reports by countries implementing digital tools for surveillance or outbreak response in Africa, 2003–2022

Purpose and disease focus areas of implementations

Of the 27 projects, 21 (78%) were implemented mainly for surveillance, 5 (19%) for both surveillance and outbreak response, and 1 (4%) (OpenMRS-Ebola) for outbreak response. The implementations covered diseases of humans, animals, and One Health conditions (Fig. 3).

Fig. 3

Diseases for which projects for digital surveillance or outbreak response were implemented in Africa, 2003–2022

Duration of implementations

Of the 32 implementation sites, 24 (75%) hosted pilot projects, 5 (16%) ongoing projects, and 3 (9%) one-time interventions. Durations were reported for 23 of the 24 pilot implementation sites. The median duration of the pilot projects was 16 months (IQR: 5–40) (Fig. 4). All the pilot projects reported some level of successful outcomes for which the implementers recommended scale-ups whilst highlighting challenges to overcome.

Fig. 4

Duration of pilot implementations of digital projects for surveillance or outbreak response in Africa, 2003–2022

Reporting on cost components

The reporting of cost components for implementations varied in detail and approach based on the main objectives of each report. We identified a total of 17 cost components across all the reports (Fig. 5). Among the more frequently quantified cost components across the 29 reports, 11 (38%) reported capital costs (start-up infrastructure and equipment), 10 (34%) health personnel compensation, 9 (31%) training and capacity building, 8 (28%) software maintenance, 7 (24%) data transmission, and 7 (24%) local travel. One report quantified the cost of planning. No report quantified the costs of international travel, remote technical support, or remote project management support of co-implementers in partner institutions outside Africa. Reports also varied in the range of implementation activities whose costs they covered. Some of the granular cost items included rent for office space, general office supplies, utility bills, cleaning services, and the monetary compensation of local health personnel in the form of salaries, allowances, and per diems for meetings, trainings, and field supervisions. The granular cost items pooled across the included reports under the cost components are summarised in Appendix 3.

Of the 29 reports, 10 (34%) provided information on the total direct costs of projects in monetary terms (USD) with varying combinations of aggregates and cost breakdowns [29, 38, 40, 43, 46, 51, 57, 61]. The median total direct cost was USD 57,189.44, the average of USD 50,036.88 (for the RapidSMS system for malaria surveillance in two districts over 11 months in 2009 [40]) and USD 64,342.00 (for the pilot of eIDSR in one district over 14 weeks in 2017 [57]). Overall, the total direct costs for these projects ranged from USD 2,353.27 (for an SMS- and internet-based system implemented over 27 months from 2006 to 2008 in one district [29]) to USD 472,327.00 (for the pilot of an mHealth system over 41 months from 2014 to 2017 in a national capital city [61]).
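The median reported above follows from the standard rule that the median of an even-numbered set is the mean of its two middle values; a minimal arithmetic check, using only the two middle values given in the text:

```python
# The two middle values of the ten reported total direct costs (USD),
# as stated in the text; the remaining eight values do not affect the median.
middle_low = 50_036.88   # RapidSMS malaria surveillance pilot [40]
middle_high = 64_342.00  # eIDSR pilot [57]

median_cost = (middle_low + middle_high) / 2
print(round(median_cost, 2))  # → 57189.44
```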

Fig. 5

Reporting patterns for cost components in the implementation of digital projects for surveillance or outbreak response in Africa, 2003–2022

External project funding support

A total of 39 main external funding institutions supported the implementation of 26 of the 27 projects. Of 65 counts of external funding sources, 35 (54%) were governmental agencies, 15 (23%) foundations, 7 (11%) UN agencies, 3 (5%) non-governmental organisations (NGOs), 3 (5%) industry, and 1 (2%) each scientific societies and the World Bank (Table 2). The lead funders among governmental agencies, based on the number of projects they supported, were the US National Institutes of Health (NIH), the US Centers for Disease Control and Prevention (US-CDC), and the US Agency for International Development (USAID). The lead funders among the foundations were the Bill and Melinda Gates Foundation, the Rockefeller Foundation, and the Wellcome Trust (Table 2). The WHO, UNICEF, FAO, and the World Bank were the global organisations which supported the implementation of digital disease surveillance or outbreak response systems. The three funding sources from industry were Google Inc., MSD (Merck, Sharp & Dohme) Animal Health, and Novartis Pharma AG. The funders mostly collaborated with local and external academic or research institutions, and the responsible government agencies or ministries of beneficiary countries.

Table 2 Types and sources of funding support for implementing digital tools for surveillance or outbreak response in Africa, 2003–2022

Discussion

Most of the implementations were on a pilot basis, and hence limited in geographical reach. They covered single diseases or disease groups from the sets of notifiable diseases. The reports contained limited description and quantification of project cost components. The main sources of external funding support were intergovernmental co-operations and foundations. All the reports were from the sub-Saharan Africa region. This pattern of distribution of project reporting sites on the continent may partly reflect the relatively higher investment in the sub-Saharan Africa region in the last two decades. It could also mirror a response to the increasing frequency of emerging and re-emerging infectious diseases of pandemic potential [62, 63]. The higher number of reports from the eastern and southern Africa bloc also reflects a relatively higher number of projects, arguably attributable to earlier mobile and internet penetration relative to other sub-Saharan African regions [64, 65].

The projects targeted both specific diseases such as malaria and rabies, and disease groups such as HIV/AIDS and tuberculosis, all notifiable diseases, and One Health events. Programmatic interventions for some of the traditional priority diseases such as malaria, HIV/AIDS, and tuberculosis have long existed in Africa. However, digital surveillance for case detection, investigation, and follow-up has become an add-on for accelerating the achievement of disease prevention and control targets [66,67,68]. A case in point of coupling digital interventions to traditional disease control programmes is the WHO Global Taskforce on “Digital health in the TB response” [66]. We observed that about 30% of the reported implementations were either for all notifiable diseases or One Health conditions. This may be informed by the increasing recognition of the high burden of One Health conditions in the WHO-AFRO region, as reported by Talisuna et al. in their comprehensive review and mapping of the spatial and temporal distribution of infectious disease epidemics, disasters, and other potential public health emergencies in the region [69].

Most of the implementations were limited in geographical reach and duration because they were pilot projects. This finding is consistent with the widespread phenomenon where many technical projects are not scaled even after successful piloting, especially in low- and middle-income countries (LMICs) [70,71,72,73]. In an effort to regulate and promote the implementation of eHealth initiatives at scale in Uganda, the Director General of Health Services issued a moratorium in 2012 directing the immediate halting of all projects [74]. The moratorium cited, among other reasons, the need for actors to demonstrate compliance with national requirements, present convincing mechanisms of sustainability, and provide clarity on system ownership. However, Gimbel et al. suggested that the heavy reliance of LMICs on external donors, governments, and the private sector for funding partly explains the increasing trend of pilot projects that do not get scaled up [68].

Most of the reports identified by the present review did not include cost reporting as an outcome. In keeping with the wider implementation literature, reports on digital projects are limited on cost information compared to other outcomes such as acceptability, appropriateness, and feasibility [75,76,77]. Even though a scoping review by Silenou et al. reported the use of digital applications in 28 African countries as of 2021 [6], we obtained eligible reports with information on cost considerations from only 13 countries. About 28% of our included reports provided some form of quantitative cost information. At face value, this finding is less grim given that in their systematic review of 235 implementation studies, Eisman et al. reported that only about 10% provided cost information [78].

The reports included in our review provide varying details of cost information on capital investments, data transmission, training, and software development. Even though the cost information was mostly descriptive, some reported cost analyses of implementation expenditures in various categorisations and levels of detail [29, 38, 40, 43, 46, 51, 57, 61]. All 10 reports that provided total costs of implementation were pilots, limited in geographical scale (districts, sub-districts, and communities), and disease-specific except one [57]. In addition, these projects varied widely in time of initiation, duration, and country of implementation. This heterogeneity and the limited reporting of unit cost data did not allow for meta-analysis. Still, these total costs may be useful for providing historical and situational context. This is especially true for actors who may consider similar projects in future, beyond piloting to sustainable implementation at scale. In our experience as co-implementers of SORMAS in Ghana and Nigeria, we note that the total costs of implementation are underestimated, because many hidden and indirect costs are not accounted for. Further, even the total direct costs are difficult to track and compute because of the multiplicity of both internal and external contributing sources at the various administrative levels of the health system. In the few cases where quantified cost information is reported, even fewer reports provide cost breakdowns in the main text or as supplementary material [38, 40, 46, 57, 61]. This trend of aggregate reporting limits the possibilities for comprehensive economic evaluations of implementations. Shield et al. reported on this in their paper on factors limiting cost-effectiveness analysis [79], and Fukuda and Imanaka reported the same challenge in their assessment of the transparency of cost estimates in economic evaluations of patient safety programmes [80].
Thus, the incomplete reporting and quantification of cost components limit the availability of the raw material needed for sophisticated but useful health economics research, as well as evidence synthesis. Ultimately, this deprives global health actors of guidance on context-relevant, evidence-based implementation strategies.

Aside from the aforementioned difficulty in determining indirect costs, some fairly direct and tangible costs are simply not reported. This not only compounds the challenge of cost underestimation and the limitations for evidence generation and synthesis; it also limits efforts at estimating returns on investment and evaluating business models for implementations. For example, nearly all the implementations in our review benefited from the participation of external co-implementers, but the costs of international travel, remote technical assistance, and project management support were mostly unreported. Given that most of the implementations were pilots, these cost components could constitute a significant start-up cost – a well-recognised early barrier to implementation [77]. This pattern of reporting is consistent across most implementation studies. Aggregate reports on broad categories of cost components such as capital investments, personnel, and transport are more common than granular activity-based expenditures [76, 81, 82]. This trend of limited reporting of implementation costs may suggest a poor culture of systematic documentation or open communication. The bottom line is that this challenge does not allow for comprehensive evaluation of returns on investment. In turn, it hinders the justification for further investment from funders [78, 83]. The limited reporting on costs may be attributed to organisational practices on financial confidentiality. However, Cidav et al. suggest that the lack of clearly defined and standardised costing methods for the planning, execution, and evaluation of implementations could partly explain this phenomenon [76]. Hence, they propose the application of what they describe as “a pragmatic method for costing implementation strategies using time-driven activity-based costing” in conjunction with Proctor et al.’s framework on specifying and reporting implementation strategies [76, 84].

We identified a wide base of external funding support for the implementations. However, about 75% of the projects were funded through grants. The grant durations were shorter than four years, with half of them lasting 16 months or less. The main sources of external funding support were governmental agencies and foundations. We observed that the implementations were fragmented in purpose within countries and among funders. The open sharing of cost information promotes transparency, donor confidence, and a better contextualisation of implementation demands [85,86,87]. On the contrary, the lack of coordination, coupled with limited open reporting on initiatives, contributes to duplication and detracts from incremental progress in overall health system strengthening in Africa [88, 89]. In their systematic review on the politics of disease control in Africa, Chattu et al. reiterated how the fragmented funding of disease-specific interventions inadvertently undermines health system strengthening in developing countries [89, 90]. That said, deployments of digital outbreak response systems such as OpenMRS-Ebola in Sierra Leone as a one-time intervention [49] remain critical strategies for enhancing emergency response to major outbreaks. In this regard, the funding mechanisms and reporting practices of such interventions depend more on the evolution of the emergency, and less on long-term implementation strategies. Still, the latter would be expected for the digitalisation of national surveillance systems through, for example, DHIS2 or SORMAS. For the foreseeable future we expect funding mechanisms and reporting requirements to vary depending on the funder and the purpose of the system. However, to the extent that external funding support is just that – support – the onus of achieving sustainable implementation is on the beneficiary countries.
For example, some of the reports we analysed captured recurrent expenditures such as rent for office space, utility bills, and extra compensation for already employed local health personnel [38, 42, 46, 61]. This suggests that in some cases there is little or no collaboration between external implementing actors and the relevant state agencies. Aside from missing the opportunity to save on some bills, the lack of close collaboration also detracts from the prospects of skill and technology transfer to local personnel.

Even though we did not find literature that focused on the availability of published evidence for the digitalisation of surveillance and outbreak response, our findings on limited transparency in cost reporting may resonate to various extents on other continents. For example, in their article on the dawn of digital public health in Europe [91], the representatives of the European Public Health Association’s digital health section underscored how the exigencies of the COVID-19 pandemic response shifted stakeholder perceptions of digital surveillance tools from “opportunities” to “necessities”. Thus, we may infer with caution that, before then, national and regional actors were unlikely to have instituted deliberate funding and governance procedures, beyond routine institutional requirements, to address limitations in the documentation and public reporting of expenditures. This may still be the case. Similar systematic reviews of current practices on this subject on other continents could offer more insights and allow for comparisons of prevailing practices on evidence sharing so as to exchange lessons.

In sum, our review substantiates the fragmentation of digitalisation efforts in Africa over the last 20 years; highlights the prevailing culture of open reporting of implementation cost data; and underscores the urgency for broad, multi-level stakeholder engagement and resource commitments for operating comprehensive digital systems at scale in the ultimate interest of global health. To address the challenge of fragmentation, funding institutions should consider conducting joint reviews and approaches that align with their visions and mandates so as to minimise the risks of duplication. Such a review could reveal possibilities for synergy, encourage collaboration, benefit from joint governance and transparent reporting, and consolidate the gains of earlier projects. Ultimately, it could increase the scale and sustainability of implementations. To improve the documentation and transparent cost reporting of future projects, we recommend that, as part of routine project reporting, a granular cost reporting template be included as a mandatory appendix for reporting implementation cost breakdowns to funding agencies. Where applicable, further funding release should be contingent on a positive evaluation of cost reporting and transparency. Further, our findings hold implications for improving practices on the funding and implementation of digital systems for public health surveillance in Africa and comparable settings. First, our review demonstrates that actors cannot rely on openly published grey and peer-reviewed literature for evidence on the cost of digitalisation of surveillance in Africa. Second, regarding the high failure rate of projects, our findings suggest that tying external funding support to commitments by national actors to system ownership could deliver projects with improved sustainability potential.
Third, by highlighting the limited reporting of evidence on implementation costs, our findings underscore lost opportunities for improving cost planning and forecasting. Our findings also raise the question of what underlying factors could explain the limited sharing of experiences on implementation costs. This question could be tabled for panel discussions at workshops and conferences on implementation research in digital health and related topics.

For future research, we recommend an extensive review of unpublished institutional financial reports, complemented with multidisciplinary expert input using the Delphi approach [92]. This would allow stakeholders to obtain a comprehensive set of cost components, as well as historical and prevailing cost estimates. These could form the basis for developing a living cost-estimating matrix to guide financial planning, forecasting, and cost reporting for the implementation of digital systems for infectious disease surveillance or outbreak response in different settings. A multi-country qualitative study would also be useful in unravelling the contextual and systemic factors that limit access to cost data for implementation research in general. The reporting on data ownership, confidentiality, and security of the systems we reviewed was not within the scope of our study. Nevertheless, we are convinced that transparent reporting on project implementation that includes these important ethical dimensions would increase the clarity of responsibilities for system governance and promote sustainability.

Limitations

The risk of publication bias is the main limitation of our review, because openly available project implementation information is rarely as complete as unpublished reports. To minimise this bias, we contacted some authors and institutions to request additional information in keeping with our review questions. Also, where the reporting of a cost component was not directly spelt out in a report but could be reasonably inferred, there was a risk of misclassification. To minimise this, we relied on discussion among at least three authors to reach a consensus.

Conclusions

The evidence on costing data for the digitalisation of surveillance and outbreak response in the published literature is sparse in quantity, limited in detail, and without a standardised reporting format. This detracts from incremental learning from past funding pitfalls that would otherwise improve funding strategies for future projects. Most initial direct project costs are substantially donor dependent, short-lived, and thus unsustainable. National public health institutions in Africa and donor partners should consider promoting the standardisation and open reporting of implementation cost data to inform the better design and planning of future digitalisation projects. In keeping with their mandate under the International Health Regulations, African governments should commit financially to the long-term sustainability of digital surveillance systems. Supporting donor partners should also endeavour to engage beyond piloting.

Data availability

All data generated or analysed during this study are included in this published article and its supplementary information files.

Abbreviations

DHIS2:

District Health Information System, version 2

HIMAL EDS:

Highland Malaria Project Early Detection System (for epidemic malaria)

mSOS:

Mobile SMS-based disease outbreak alert system

RCS:

Rabies Case Surveillance

AMED:

Japan Agency for Medical Research and Development

BELSPO:

Belgian Science Policy Office

BMBF:

Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung)

eIDSR:

Electronic Integrated Disease Surveillance and Response

EU:

European Union

FCDO:

Foreign, Commonwealth and Development Office

GDL:

GARC (Global Alliance for Rabies Control) Data Logger

GIZ:

German Corporation for International Cooperation (Gesellschaft für Internationale Zusammenarbeit)

IDRC:

International Development Research Centre of Canada

IPM:

Pasteur Institute of Madagascar (Institut Pasteur de Madagascar)

JICA:

The Japan International Cooperation Agency

KABS app:

Kenya Animal Biosurveillance System application

mHAT app:

Mobile Health and Treat application

MMV:

Medicines for Malaria Ventures

MSD Animal Health:

Merck, Sharp and Dohme Corp Animal Health

MSF:

Doctors Without Borders (Médecins Sans Frontières)

MSP:

Ministry of Health and Prevention (Ministère de la Santé et de la Prévention)

NIH:

National Institutes of Health

NLM:

National Library of Medicine

ODK:

Open Data Kit

OpenMRS-Ebola app:

Open Medical Records System - Ebola application

Rapid SMS:

Rapid Short Messaging Service

REB:

Rabies Epidemiological Bulletin

RVT:

Rabies Vaccine Tracker

SORMAS:

Surveillance, Outbreak Response Management and Analysis System

STG:

SurveyToGo

UBS Optimus Foundation:

Union Bank of Switzerland Optimus Foundation

UNICEF:

United Nations Children’s Fund (originally: United Nations International Children’s Emergency Fund)

US CDC:

United States Centers for Disease Control and Prevention

US HHS:

United States Department of Health and Human Services

US PEPFAR:

United States President’s Emergency Plan for AIDS Relief

US PMI:

United States President’s Malaria Initiative

USAID:

United States Agency for International Development

WAP:

World Animal Protection (formerly The World Society for the Protection of Animals - WSPA)

Web GIS:

Web-based Geographic Information System

WHO:

World Health Organization

References

  1. Kelly C, Kamil-Thomas Z. Digital Health Technologies: Digital Innovations in Public Health. In: Leal Filho W, Wall T, Azul AM, Brandli L, Özuyar PG, editors. Good Health and Well-Being [Internet]. Cham: Springer International Publishing; 2020 [cited 2024 May 7]. pp. 119–30. (Encyclopedia of the UN Sustainable Development Goals). https://doi.org/10.1007/978-3-319-95681-7_70.

  2. Strengholt P. Data Management at Scale. O’Reilly Media, Inc.; 2023.

  3. Budd J, Miller BS, Manning EM, Lampos V, Zhuang M, Edelstein M, et al. Digital technologies in the public-health response to COVID-19. Nat Med. 2020;26(8):1183–92.

  4. Kostkova P, Saigí-Rubió F, Eguia H, Borbolla D, Verschuuren M, Hamilton C, et al. Data and Digital Solutions to support surveillance strategies in the context of the COVID-19 pandemic. Front Digit Health. 2021;3:707902.

  5. Ting DSW, Carin L, Dzau V, Wong TY. Digital technology and COVID-19. Nat Med. 2020;26(4):459–61.

  6. Silenou BC, Nyirenda JLZ, Zaghloul A, Lange B, Doerrbecker J, Schenkel K, et al. Availability and Suitability of Digital Health Tools in Africa for Pandemic Control: Scoping Review and cluster analysis. JMIR Public Health Surveill. 2021;7(12):e30106.

  7. Alao A, Brink R. COVID-19 Digital Technology Response in Sub-Saharan African Countries. In: Building Resilient Healthcare Systems With ICTs [Internet]. IGI Global; 2022 [cited 2024 May 7]. pp. 74–105. https://www.igi-global.com/chapter/covid-19-digital-technology-response-in-sub-saharan-african-countries/298399.

  8. Cilliers L. A digital health ecosystem for Africa during the COVID-19 pandemic. In: Digital Innovation for Healthcare in Covid-19 Pandemic [Internet]. Elsevier; 2022 [cited 2024 May 7]. pp. 39–51. https://www.sciencedirect.com/science/article/pii/B978012821318600013X.

  9. Danquah LO, Hasham N, MacFarlane M, Conteh FE, Momoh F, Tedesco AA, et al. Use of a mobile application for Ebola contact tracing and monitoring in northern Sierra Leone: a proof-of-concept study. BMC Infect Dis. 2019;19(1):810.

  10. Sacks JA, Zehe E, Redick C, Bah A, Cowger K, Camara M, et al. Introduction of Mobile Health Tools to support Ebola Surveillance and contact tracing in Guinea. Glob Health Sci Pract. 2015;3(4):646–59.

  11. Fall IS, Rajatonirina S, Yahaya AA, Zabulon Y, Nsubuga P, Nanyunja M, et al. Integrated Disease Surveillance and Response (IDSR) strategy: current status, challenges and perspectives for the future in Africa. BMJ Glob Health. 2019;4(4):e001427.

  12. Kebede S, Duales S, Yokouide A, Alemu W. Trends of major disease outbreaks in the African region, 2003–2007. East Afr J Public Health. 2010;7(1):20–9.

  13. Kamorudeen RT, Adedokun KA, Olarinmoye AO. Ebola outbreak in West Africa, 2014–2016: epidemic timeline, differential diagnoses, determining factors, and lessons for future response. J Infect Public Health. 2020;13(7):956–62.

  14. Coltart CEM, Lindsey B, Ghinai I, Johnson AM, Heymann DL. The Ebola outbreak, 2013–2016: old lessons for new epidemics. Philos Trans R Soc Lond B Biol Sci. 2017;372(1721):20160297.

  15. Aborode AT, Hasan MM, Jain S, Okereke M, Adedeji OJ, Karra-Aly A, et al. Impact of poor disease surveillance system on COVID-19 response in Africa: Time to rethink and rebuilt. Clin Epidemiol Glob Health. 2021;12:100841.

  16. Mengel MA, Delrieu I, Heyerdahl L, Gessner BD. Cholera outbreaks in Africa. Curr Top Microbiol Immunol. 2014;379:117–44.

  17. Franklin K, Kwambana-Adams B, Lessa FC, Soeters HM, Cooper L, Coldiron ME, et al. Pneumococcal meningitis outbreaks in Africa, 2000–2018: systematic literature review and Meningitis Surveillance database analyses. J Infect Dis. 2021;224(12 Suppl 2):S174–83.

  18. World Health Organization - Africa Region. eSurveillance implementation in the context of Integrated Disease Surveillance and Response in the WHO African Region. WHO Africa; 2015.

  19. Africa CDC. Digital Transformation Strategy to revolutionize and strengthen Public Health systems across the continent [Internet]. Africa CDC. [cited 2024 May 7]. https://africacdc.org/news-item/africa-cdc-digital-transformation-strategy-to-revolutionize-and-strengthen-public-health-systems-across-the-continent/.

  20. Call To Action. Africa’s New Public Health Order [Internet]. Africa CDC. [cited 2024 May 7]. https://africacdc.org/news-item/call-to-action-africas-new-public-health-order/.

  21. Boenecke J, Brinkel J, Belau M, Himmel M, Ströbele J. Harnessing the potential of digital data for infectious disease surveillance in sub-saharan Africa. Eur J Pub Health. 2022;32(Supplement3):ckac131569.

  22. Wolfe CM, Hamblion EL, Dzotsi EK, Mboussou F, Eckerle I, Flahault A, et al. Systematic review of Integrated Disease Surveillance and Response (IDSR) implementation in the African region. PLoS ONE. 2021;16(2):e0245457.

  23. Odekunle FF, Odekunle RO, Shankar S. Why sub-saharan Africa lags in electronic health record adoption and possible strategies to increase its adoption in this region. Int J Health Sci (Qassim). 2017;11(4):59–64.

  24. Oleribe OO, Momoh J, Uzochukwu BS, Mbofana F, Adebiyi A, Barbera T, et al. Identifying Key challenges facing Healthcare systems in Africa and potential solutions. Int J Gen Med. 2019;12:395–403.

  25. Adigun L, Dusenbury C, Schoub BD. Public health in Africa–the role of national public health institutes. S Afr Med J. 2007;97(11):1036–9.

  26. Larsen-Cooper E, Bancroft E, Rajagopal S, O’Toole M, Levin A. Scale matters: a cost-outcome analysis of an m-Health intervention in Malawi. Telemed J E Health. 2016;22(4):317–24.

  27. Mason C, Lazenby S, Stuhldreher R, Kimball M, Bartlein R. Lessons Learned from Implementing Digital Health Tools to address COVID-19 in LMICs. Front Public Health. 2022;10:859941.

  28. Beck EJ, Harling G, Gerbase S, DeLay P. The cost of treatment and care for people living with HIV infection: implications of published studies, 1999–2008. Curr Opin HIV AIDS. 2010;5(3):215–24.

  29. Chang LW, Kagaayi J, Nakigozi G, Serwada D, Quinn TC, Gray RH, et al. Cost analyses of peer health worker and mHealth support interventions for improving AIDS care in Rakai, Uganda. AIDS Care. 2013;25(5):652–6.

  30. Iribarren SJ, Cato K, Falzon L, Stone PW. What is the economic evidence for mHealth? A systematic review of economic evaluations of mHealth solutions. PLoS ONE. 2017;12(2):e0170581.

  31. Curtin University. Systematic reviews: formulating the Research question - PICo framework. Curtin University Library; 2022.

  32. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.

  33. Paez A. Grey literature: an important resource in systematic reviews. J Evid Based Med. 2017.

  34. The EndNote Team. In: EndNote X, editor. EndNote. Philadelphia, PA: Clarivate; 2013.

  35. Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan-a web and mobile app for systematic reviews. Syst Rev. 2016;5(1):210.

  36. Critical Appraisal Skills Programme. CASP Economic Evaluation Checklist [online]. 2018.

  37. Downes MJ, Brennan ML, Williams HC, Dean RS. Development of a critical appraisal tool to assess the quality of cross-sectional studies (AXIS). BMJ Open. 2016;6(12):e011458.

  38. Mueller DH, Abeku TA, Okia M, Rapuoda B, Cox J. Costs of early detection systems for epidemic malaria in highland areas of Kenya and Uganda. Malar J. 2009;8:17.

  39. Randrianasolo L, Raoelina Y, Ratsitorahina M, Ravolomanana L, Andriamandimby S, Heraud JM, et al. Sentinel surveillance system for early outbreak detection in Madagascar. BMC Public Health. 2010;10:31.

  40. Asiimwe C, Gelvin D, Lee E, Ben Amor Y, Quinto E, Katureebe C, et al. Use of an innovative, affordable, and open-source short message service-based tool to monitor malaria in remote areas of Uganda. Am J Trop Med Hyg. 2011;85(1):26–33.

  41. Madder M, Walker JG, Van Rooyen J, Knobel D, Vandamme E, Berkvens D, et al. e-Surveillance in animal health: use and evaluation of mobile tools. Parasitology. 2012;139(14):1831–42.

  42. Rajatonirina S, Heraud JM, Randrianasolo L, Orelle A, Razanajatovo NH, Raoelina YN, et al. Short message service sentinel surveillance of influenza-like illness in Madagascar, 2008–2012. Bull World Health Organ. 2012;90(5):385–9.

  43. Rajput ZA, Mbugua S, Amadi D, Chepngeno V, Saleem JJ, Anokwa Y, et al. Evaluation of an android-based mHealth system for population surveillance in developing countries. J Am Med Inf Assoc. 2012;19(4):655–9.

  44. Githinji S, Kigen S, Memusi D, Nyandigisi A, Wamari A, Muturi A, et al. Using mobile phone text messaging for malaria surveillance in rural Kenya. Malar J. 2014;13:107.

  45. Mwabukusi M, Karimuribo ED, Rweyemamu MM, Beda E. Mobile technologies for disease surveillance in humans and animals. Onderstepoort J Vet Res. 2014;81(2):E1–5.

  46. Mtema Z, Changalucha J, Cleaveland S, Elias M, Ferguson HM, Halliday JEB, et al. Mobile Phones as Surveillance Tools: Implementing and evaluating a large-scale intersectoral surveillance system for rabies in Tanzania. PLoS Med. 2016;13(4):e1002002.

  47. Karimuribo ED, Mutagahywa E, Sindato C, Mboera L, Mwabukusi M, Kariuki Njenga M, et al. A smartphone app (AfyaData) for innovative one health disease surveillance from community to National Levels in Africa: intervention in Disease Surveillance. JMIR Public Health Surveill. 2017;3(4):e94.

  48. Kipanyula MJ, Sanga CA, Geofrey AM, Fue KG. On piloting web-based rabies surveillance system for humans and animals: web-based rabies Surveillance System. Maximizing Healthcare Delivery and Management through Technology Integration. IGI Global; 2016. pp. 305–23.

  49. Oza S, Jazayeri D, Teich JM, Ball E, Nankubuge PA, Rwebembera J, et al. Development and Deployment of the OpenMRS-Ebola Electronic Health Record System for an Ebola Treatment Center in Sierra Leone. J Med Internet Res. 2017;19(8):e294.

  50. Toda M, Njeru I, Zurovac D, Kareko D, O-Tipo S, Mwau M, et al. Understanding mSOS: a qualitative study examining the implementation of a text-messaging outbreak alert system in rural Kenya. PLoS ONE. 2017;12(6):e0179408.

  51. El-Khatib Z, Shah M, Zallappa SN, Nabeth P, Guerra J, Manengu CT, et al. SMS-based smartphone application for disease surveillance has doubled completeness and timeliness in a limited-resource setting - evaluation of a 15-week pilot program in Central African Republic (CAR). Confl Health. 2018;12:42.

  52. Maraba N, Hoffmann CJ, Chihota VN, Chang LW, Ismail N, Candy S, et al. Using mHealth to improve Tuberculosis case identification and treatment initiation in South Africa: results from a pilot study. PLoS ONE. 2018;13(7):e0199687.

  53. Mohammed A, Franke K, Boakye Okyere P, Brinkel J, Bonačić Marinovic A, Kreuels B, et al. Feasibility of Electronic Health Information and Surveillance System (eHISS) for disease symptom monitoring: a case of rural Ghana. PLoS ONE. 2018;13(5):e0197756.

  54. Coetzer A, Scott TP, Noor K, Gwenhure LF, Nel LH. A Novel Integrated and Labile eHealth System for Monitoring Dog rabies vaccination campaigns. Vaccines (Basel). 2019;7(3):108.

  55. Singh Y, Jackson D, Bhardwaj S, Titus N, Goga A. National surveillance using mobile systems for health monitoring: complexity, functionality and feasibility. BMC Infect Dis. 2019;19(Suppl 1):786.

  56. Martin DW, Sloan ML, Gleason BL, de Wit L, Vandi MA, Kargbo DK, et al. Implementing Nationwide Facility-based Electronic Disease Surveillance in Sierra Leone: lessons learned. Health Secur. 2020;18(S1):S72–80.

  57. Sloan ML, Gleason BL, Squire JS, Koroma FF, Sogbeh SA, Park MJ. Cost Analysis of Health Facility Electronic Integrated Disease Surveillance and Response in one District in Sierra Leone. Health Secur. 2020;18(S1):S64–71.

  58. Moore C, Scherr T, Matoba J, Sing’anga C, Lubinda M, Thuma P, et al. mHAT app for automated malaria rapid test result analysis and aggregation: a pilot study. Malar J. 2021;20(1):237.

  59. Njenga MK, Kemunto N, Kahariri S, Holmstrom L, Oyas H, Biggers K, et al. High real-time reporting of domestic and wild animal diseases following rollout of mobile phone reporting system in Kenya. PLoS ONE. 2021;16(9):e0244119.

  60. Grainger C. A Software for Disease Surveillance and Outbreak Response-insights from Implementing SORMAS in Nigeria and Ghana. Germany: Federal Ministry for Economic Cooperation and Development (BMZ); 2020.

  61. Turimumahoro P, Tucker A, Gupta AJ, Tampi RP, Babirye D, Ochom E, et al. A cost analysis of implementing mobile health facilitated Tuberculosis contact investigation in a low-income setting. PLoS ONE. 2022;17(4):e0265033.

  62. Betjeman TJ, Soghoian SE, Foran MP. mHealth in Sub-saharan Africa. Int J Telemed Appl. 2013;2013:482324.

  63. Makuta I, O’Hare B. Quality of governance, public spending on health and health status in Sub Saharan Africa: a panel data regression analysis. BMC Public Health. 2015;15:932.

  64. Lee S, Cho YM, Kim SY. Mapping mHealth (mobile health) and mobile penetrations in sub-saharan Africa for strategic regional collaboration in mHealth scale-up: an application of exploratory spatial data analysis. Global Health. 2017;13(1):63.

  65. Déglise C, Suggs LS, Odermatt P. SMS for disease control in developing countries: a systematic review of mobile health applications. J Telemed Telecare. 2012;18(5):273–81.

  66. World Health Organization (WHO). Digital health in the TB response: scaling up the TB response through information and communication technologies. Geneva, Switzerland: World Health Organization; 2014. Accessed 25 September 2019.

  67. Burton T. Is the digital divide hampering the malaria response in Africa? United Nations Development Programme (UNDP), Global Fund Partnership/Health Implementation Support 2019.

  68. Gimbel S, Kawakyu N, Dau H, Unger JA. A Missing Link: HIV-/AIDS-Related mHealth interventions for Health workers in low- and Middle-Income Countries. Curr HIV/AIDS Rep. 2018;15(6):414–22.

  69. Talisuna AO, Okiro EA, Yahaya AA, Stephen M, Bonkoungou B, Musa EO, et al. Spatial and temporal distribution of infectious disease epidemics, disasters and other potential public health emergencies in the World Health Organisation Africa region, 2016–2018. Global Health. 2020;16(1):9.

  70. Tomlinson M, Rotheram-Borus MJ, Swartz L, Tsai AC. Scaling up mHealth: where is the evidence? PLoS Med. 2013;10(2):e1001382.

  71. Huang F, Blaschke S, Lucas H. Beyond pilotitis: taking digital health interventions to the national level in China and Uganda. Global Health. 2017;13(1):49.

  72. Greve M, Brendel AB, van Osten N, Kolbe LM. Overcoming the barriers of mobile health that hamper sustainability in low-resource environments. J Public Health. 2022;30(1):49–62.

  73. Kuipers P, Humphreys JS, Wakerman J, Wells R, Jones J, Entwistle P. Collaborative review of pilot projects to inform policy: a methodological remedy for pilotitis? Aust New Z Health Policy. 2008;5:17.

  74. McCann D. A Ugandan mHealth moratorium is a good thing. ICTworks. 2012.

  75. Nwaozuru U, Obiezu-Umeh C, Shato T, Uzoaru F, Mason S, Carter V, et al. Mobile health interventions for HIV/STI prevention among youth in low- and middle-income countries (LMICs): a systematic review of studies reporting implementation outcomes. Implement Sci Commun. 2021;2(1):126.

  76. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15(1):28.

  77. Willmeroth T, Wesselborg B, Kuske S. Implementation outcomes and indicators as a New Challenge in Health Services Research: a systematic scoping review. Inquiry. 2019;56:46958019861257.

  78. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433.

  79. Shields GE, Wilberforce M, Clarkson P, Farragher T, Verma A, Davies LM. Factors Limiting Subgroup Analysis in cost-effectiveness analysis and a call for transparency. PharmacoEconomics. 2022;40(2):149–56.

  80. Fukuda H, Imanaka Y. Assessment of transparency of cost estimates in economic evaluations of patient safety programmes. J Eval Clin Pract. 2009;15(3):451–9.

  81. O’Beirne M, Reid R, Zwicker K, Sterling P, Sokol E, Flemons W, et al. The costs of developing, implementing, and operating a safety learning system in community practice. J Patient Saf. 2013;9(4):211–8.

  82. Filene JH, Brodowski ML, Bell J. Using cost analysis to examine variability in replications of an efficacious child neglect prevention program. J Public Child Welf. 2014;8(4):375–96.

  83. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in Healthcare: A Research Agenda. Front Public Health. 2019;7:3.

  84. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  85. Gaitonde R, Oxman AD, Okebukola PO, Rada G. Interventions to reduce corruption in the health sector. Cochrane Database Syst Rev. 2016;2016(8):CD008856.

  86. The Global Fund Office of the Inspector General G. New OIG Advisory Report on grant implementation in Western and Central Africa identifies room for improvement. The Global Fund; 2019.

  87. Wierzynska A, Steingrüber S, Oroxom R, Bauhoff S. Recalibrating the anti-corruption, transparency, and accountability formula to advance public health. Glob Health Action. 2020;13(sup1):1701327.

  88. Peters BG. The challenge of policy coordination. Policy Des Pract. 2018;1(1):1–11.

  89. Chattu VK, Knight WA, Adisesh A, Yaya S, Reddy KS, Di Ruggiero E, et al. Politics of disease control in Africa and the critical role of global health diplomacy: a systematic review. Health Promot Perspect. 2021;11(1):20–31.

  90. Béhague DP, Storeng KT. Collapsing the vertical-horizontal divide: an ethnographic study of evidence-based policymaking in maternal health. Am J Public Health. 2008;98(4):644–9.

  91. Wong BLH, Maaß L, Vodden A, van Kessel R, Sorbello S, Buttigieg S, et al. The dawn of digital public health in Europe: implications for public health policy and practice. Lancet Reg Health Eur. 2022;14:100316.

  92. McPherson S, Reese C, Wendler MC. Methodology update: Delphi Studies. Nurs Res. 2018;67(5):404–10.

Acknowledgements

We are grateful to Dr. Vanessa Melhorn, Dr. Marieke Ahlborn, and Elisabeth Bastkowski for their administrative support.

Funding

The work is part of the doctoral research of the first author which is funded by intramural funds of the Helmholtz Centre for Infection Research (HZI), the German Federal Ministry of Education and Research, the Helmholtz research association, the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101003480, and the Regional Programme Support to Pandemic Prevention in the Economic Community of West African States Region implemented by the Deutsche Gesellschaft für Internationale Zusammenarbeit GmbH - GIZ (project number 14.2510.7-005.00) on behalf of the German Federal Ministry for Economic Cooperation and Development and the European Union, and through financial support of the Helmholtz Association of German Research Centres [grant number SO-094].

Open Access funding enabled and organized by Projekt DEAL.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualisation of study: BBK, BL, GK.Design of study: BBK, KW, BL, GK, AMH, EK, CJKT, MH, CR.Writing of review protocol: BBK, MH, BL, AMH.Literature search: BBK, MH.Literature screening and selection: BBK, MH, BCS.Data extraction: BBK, MH, CR, JA.Data management and analysis: BBK, MH, BCS, CJKT.Writing first draft of manuscript: BBK.Review of draft manuscript: MH, AMH, EK, KW, BCS, CJKT, CR, JA, BL, GK.Review of final version of manuscript: All authors.

Corresponding author

Correspondence to Basil Benduri Kaburi.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Supplementary Material 3

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Kaburi, B.B., Harries, M., Hauri, A.M. et al. Availability of published evidence on coverage, cost components, and funding support for digitalisation of infectious disease surveillance in Africa, 2003–2022: a systematic review. BMC Public Health 24, 1731 (2024). https://doi.org/10.1186/s12889-024-19205-2

Keywords