Development of a simple and effective online training for health workers: results from a pilot in Nigeria

Abstract

Background

Health workers (HWs) in Africa face challenges accessing and learning from existing online training opportunities. To address these challenges, we developed a modular, self-paced, mobile-ready and work-relevant online course covering foundational infection prevention and control (IPC) concepts. Here, we evaluate the first pilot of this course, conducted with HWs in Nigeria.

Methods

We used a learner-centered design and prototyping process to create a new approach to delivering online training for HWs. The resulting course comprised 10 self-paced modules optimized for use on mobile devices. Modules presented IPC vignettes in which learning was driven by short assessment questions with feedback. Learners were recruited by distributing a link to the training through Nigeria-based email lists, WhatsApp groups and similar networks of HWs, managers and allied professionals. The course was open to learners for 8 weeks. We tracked question responses and time on task with platform analytics and assessed learning gains with pre- and post-testing. Significance was evaluated with the Wilcoxon signed-rank test, and effect size was calculated using Cohen’s d.

Results

Three hundred seventy-two learners, with roles across the health system, enrolled in the training; 59% completed all 10 modules and earned a certificate. Baseline knowledge of foundational IPC concepts was low, as measured by pre-test scores (29%). Post-test scores were significantly higher at 54% (effect size 1.22, 95% confidence interval 1.00-1.44). Learning gains were significant both among learners with low pre-test scores and among those who scored higher on the pre-test. We used the Net Promoter Score (NPS), a common user experience metric, to evaluate the training. The NPS was +62, which is slightly higher than published scores of other self-paced online learning experiences.

Conclusions

High completion rates, significant learning gains and positive feedback indicate that self-paced, mobile-ready training that emphasizes short, low-stakes assessment questions can be an effective, scalable way to train HWs who choose to enroll. Low pre-test scores suggest that there are gaps in IPC knowledge among this learner population.

Background

Infection prevention and control

There is increasing recognition of the importance of infection prevention and control (IPC) as a key component of epidemic preparedness and response and as a critical tool for keeping health workers (HWs) safe [1,2,3]. HWs have higher rates of infection than the general population, particularly during epidemics, and weak IPC programs as well as gaps in emergency preparedness contribute to this risk [1]. The COVID-19 pandemic has further highlighted the association between IPC and HW safety [4, 5]. In the early months of the pandemic, HWs were three times more likely than members of the general population to have COVID-19 [6]. While effective strategies for improving IPC programs and practices are an active area of research, an emerging consensus indicates that multiple synergizing approaches, including training, are required to bring about improvements in IPC [7,8,9]. Notably, HWs who received IPC training were at lower risk of developing COVID-19 [6]. Effective and continuous training is particularly important to keep HWs updated on rapidly evolving IPC guidance in the context of COVID-19 [10]. Ongoing training will also be needed to address new epidemics of other infectious agents.

The World Health Organization (WHO) defines HWs broadly to encompass health service providers as well as managers and support personnel [11]. A wide array of different HWs (including doctors, nurses, administrators, laboratory staff and community health workers) play a role in IPC implementation and require some foundational IPC knowledge to do their jobs safely and effectively [7]. In addition, officials and HWs at all levels of the health system (national, subnational, district and local) play a role in IPC. However, IPC training is far from universal. A recent assessment of 88 countries’ national IPC programs found that just over half provide in-service IPC training, while fewer than half provide pre-service IPC training [12]. In a recent survey comparing HWs’ access to personnel trained in IPC across the six different WHO regions, fewer than 40% of respondents in the African region reported that they had access to trained personnel “always” or “often,” the lowest of any WHO region [13]. Although such surveys indicate that many HWs do not currently receive IPC training, less is known about the IPC knowledge of individual HWs across different countries and in different positions in the health system. Integrating assessments into training may be helpful to better define gaps in IPC knowledge and skills and develop targeted interventions to address those gaps.

Online learning: limitations and solutions

Many organizations replaced in-person training with online learning in response to restrictions on gathering and travel imposed during the COVID-19 pandemic. This shift required an expansion of online learning opportunities for HWs, which currently include massive open online courses (MOOCs) [14], webinars [15] and synchronous virtual communities of practice [16]. However, accessing, engaging with and learning in online environments can be challenging even under optimal circumstances. Though MOOCs and other online courses are lauded for their potential to deliver learning at scale, their completion rates are low [17, 18]. Moreover, high rates of learner multitasking online can impede focused learning [19]. While these findings hint at the impediments to achieving depth of impact and scale through online learning [17], a growing body of research points to best practices in online course design that improve engagement and learning. Simple and organized online learning experiences lead to less frustration and sustain learners’ sense of self-efficacy [20], while focusing on only essential content improves knowledge retention [21]. Assessment that is designed to provide feedback and promote learning rather than to evaluate learners (formative assessment) is a powerful way to activate learning offline and online [22, 23]. The addition of short explanations after assessment questions improves learners’ perceptions of the learning experience [24]. Course design factors also influence learner engagement and retention. For example, shorter online courses and those with deadlines have higher completion rates [25, 26].

Online learning for HWs in Africa

In both online and face-to-face learning, content- and learner-specific factors influence course completion and learning outcomes. When learners believe content has career utility or relevance to their daily professional lives, they are more likely to persist [27], and HWs report greater interest in online learning that is relevant to their work [28]. The efficacy of different online learning approaches is context-dependent: working adults who prioritize flexibility prefer self-paced learning, but this modality may disadvantage learners who are less motivated or need more structure. Moreover, online learning is not equally suited to all of the knowledge, skills and abilities that HWs need to develop [29]; complex clinical skills, for example, can be challenging to teach and assess in a virtual setting [28]. Online learning may be a useful tool to address existing gaps in IPC training, but more work needs to be done to delineate which IPC knowledge, skills and behaviors can be effectively taught online.

Residents of low- and middle-income countries face additional barriers to learning online. In Africa, most internet users connect via mobile broadband rather than fixed connections, and mobile network coverage is still growing rapidly. As of 2020, an estimated 77% of the African population was within range of a 3G wireless signal [30], though access to the internet and to mobile devices is likely higher among HWs [13]. Beyond the reach and quality of internet connections, other factors (including unreliable electrical service, the costs of wireless data and devices, and gaps in digital literacy) further limit access to and use of internet resources [30, 31]. This environment favors simple, mobile-compatible, user-friendly web applications that do not consume much data.

Although the objective of most training is to develop learners' knowledge and skills, more accessible metrics, such as participation, completion and learner satisfaction, are evaluated more frequently [32]. Only a fraction of studies evaluating online medical education in low- and middle-income countries report learning gains [33]. Comparatively few studies assess fully online training of in-service HWs in Africa, perhaps in part because the continent is under-represented in the medical education research literature [34]. In one app-based training for nurses and midwives in Rwanda, learners received a half-day in-person introduction to the app and were coached on its use during frequent site visits [35]. In a training on Ebola virus disease in Nigeria, HWs accessed the training via tablets placed in health facilities [36]. Both studies reported significant short-term learning gains, though neither assessed long-term knowledge retention. Interventions that involve providing learners with devices and in-person support may be logistically complex and costly to scale up, challenges that are heightened by COVID-19-related travel restrictions and safety measures. These cases highlight the need for more research on the efficacy and limitations of fully online training for learners in Africa.

In this work, we hypothesized that online learning for HWs in Africa could be improved by designing learning experiences that play to the strengths of online learning and proactively address the barriers faced by this population of learners.

Methods

Training development process

We used a learner-centered design thinking process [37] to develop this training. Our process involved understanding learners' needs and the challenges they face, defining solutions to address those needs (Table 1), and developing and iteratively testing those solutions. Drawing on our experience delivering a large-scale HW training program during the COVID-19 pandemic [1], conversations with colleagues and HWs in Africa, and our own experiences with online learning, we developed features to address the learning needs of HWs in Africa, with a focus on workers in primary health care facilities. We took a broad view of needs, encompassing the myriad challenges that learners might face in accessing online learning, remaining engaged and retaining knowledge.

Table 1 Features that address the learning needs of HWs in Africa

We developed a prototype that met these design parameters and shared this module and written descriptions of the overall training structure with team members, colleagues and HWs in Nigeria. We then adapted the prototype based on their feedback. Notably, we changed the assessments to give learners more control over their progress and adjusted the target word count of the modules based on the time it took these users to complete them. We also changed the way we released content to learners. Our initial design implemented spaced repetition [38], with modules released on a schedule. Based on comments received in the prototyping process, we removed this constraint to allow learners to access any module at any time.

The final prototype (Fig. 1) comprised 10 modules, each designed to require 10 to 20 min of work. All modules and supporting materials were in English. The first and last modules were the pre-test and post-test, respectively; these tests had no time limit and contained the same questions. The intervening modules each presented a brief vignette, set in a primary health care facility, in which learners answered questions and received feedback as a means of learning about IPC concepts in an interactive and narrative fashion. We did not find an existing learning management system suitable for this prototype, so we adapted other platforms to fit our purposes: a web-based survey tool (SurveyMonkey) to deliver the modules, an enterprise email service (Mailchimp) to communicate with learners, and automation software (Zapier) to pass learner data to Mailchimp.

Fig. 1 Course structure and screenshots of the prototype. Learners could access the modules in any order, though linear progression was encouraged. Screenshots are shown in mobile view, but all modules were also compatible with computers and tablets

Content and assessment development

We followed a "backward course design" process [39] for content development. We defined the target audience as HWs in Nigeria, with a focus on those working in primary health care facilities; notably, these facilities are not typically staffed by physicians [40]. Team members from Resolve to Save Lives, AFENET-Nigeria and the National Primary Health Care Development Agency (NPHCDA) of Nigeria developed and reviewed the course. We created a sequence of modules to ensure consistent coverage of all objectives and content, and then wrote the modules based on that sequence. The pre-/post-test included questions drawn from, or similar to, questions covered in other modules, and learners could review correct answers after completing both the pre-test and the post-test. We adhered to WHO guidance for technical content and ensured that the content was consistent with NPHCDA guidance. Research indicates that simple content adaptations may promote local relevance of learning materials [41]; to increase relevance to the target audience, we gave all characters in the vignettes names that are common in Nigeria. Images and videos used in the course were open-access or developed in-house. Three medical doctors on the team (based in the United States, Rwanda and Nigeria), who collectively have decades of experience working in IPC, reviewed all content for accuracy and alignment with NPHCDA standards.

Evaluation of learner perceptions

The Net Promoter Score (NPS) question – “How likely is it that you would recommend this training to a friend or colleague?” – has been used to evaluate user experience across a wide variety of industries, including education [42] and health care [43]. We asked this as an optional question at the end of each module with an 11-point response scale, ranging from not at all likely (0) to extremely likely (10). Responses were binned into promoters (9 and 10), neutral (7 and 8) and detractors (6 and below). The NPS was calculated as the percentage of promoters minus the percentage of detractors [43].
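As a concrete illustration, the calculation can be sketched in a few lines of R (the language we used elsewhere for analysis). This is a minimal example with made-up responses; the variable names are illustrative and not part of the study's code.

    # Hypothetical responses to the 0-10 NPS question
    responses <- c(10, 9, 9, 8, 10, 6, 7, 9, 10, 3)

    # Bin responses: promoters (9-10), neutral (7-8), detractors (0-6)
    pct_promoters  <- mean(responses >= 9) * 100
    pct_detractors <- mean(responses <= 6) * 100

    # NPS = percentage of promoters minus percentage of detractors;
    # it ranges from -100 to +100
    nps <- pct_promoters - pct_detractors   # 60 - 20 = +40 here

For the example vector above, 60% of responses are promoters and 20% are detractors, giving an NPS of +40.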

Learner recruitment and communications

The training launched on April 6, 2021 and was available through June 1, 2021, a period of 8 weeks. We shared the link to the training through several Nigeria-based networks of HWs that we were part of, including WhatsApp groups and email lists. We also disseminated the link via partner organizations' social media platforms (Twitter and Facebook) and through an IPC bulletin distributed to various networks of HWs in Nigeria. Although the training was designed specifically for HWs and allied professionals, no restrictions were placed on participation: any person with the link could participate. Learners could participate anonymously, or they could consent to provide their name and email address in each module to work towards a certificate of participation. The requirement to complete all 10 modules was made clear to all participants on the course landing page and in other course materials. Participants received a confirmation email upon completing each module, and we emailed certificates to learners who completed all 10 modules; those who did not complete all modules did not receive certificates. To complete a module, learners had to answer all of its questions. In modules 2-9, learners received feedback after each multiple-choice question indicating whether they had selected the correct response, along with an explanation of the correct answer; they had the option to repeat questions they answered incorrectly. We did not require learners to repeat questions, and there was no required pass rate for the modules. We sent weekly email reminders to learners who had not completed the training, using Mailchimp to automate these reminders and customize them based on course progression. Learner support was provided via email, with responses typically sent within 24 h.

Data collection and analysis

We collected all data in SurveyMonkey and module completion was automatically recorded in Mailchimp. Invalid emails and duplicate enrollments were cleaned in Mailchimp and not included in enrollment and certification statistics. We analyzed all other data in Microsoft Excel and R version 3.4.4. If a learner completed the same module more than once, we only considered the first response to any given module. To enroll and work towards a certificate, learners had to complete a module and submit a valid email address with that module. Some respondents did not consent to share an email, but they could complete the modules anonymously for informational purposes. As a result, these responses were not identifiable and we excluded them from the analysis. To minimize survey fatigue, we kept the number of demographic questions to a minimum, focusing only on country and job role. Learners indicated whether or not they lived in Nigeria and selected their job role from a list; they could select more than one role and write in a role under “Other.” We coded these “Other” responses based upon keywords and key phrases. We evaluated learning gains among those who completed the pre-test and post-test in sequence (six learners who completed the post-test before the pre-test were excluded from the learning gains analyses). Each question on the pre-/post-test received equal weight, with a maximum score of one and a minimum score of zero. Partial credit was awarded for multiple answer (checkboxes) questions. We conducted qualitative analysis of a pre-/post-test question designed to elicit multiple strategies for IPC improvement by coding responses according to the WHO multimodal strategies framework [7]. Two individuals independently coded the responses, scoring each strategy as “mentioned” or “not mentioned.” The inter-rater reliability (Cohen’s kappa) was 0.83. In cases of discrepancies between the two coders, the given strategy was scored as “not mentioned.” To summarize support requests, we manually reviewed all emails we received from learners enrolled in the course from the course start date to 8 weeks after the course close date.
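To make the inter-rater reliability step concrete, the following base-R sketch computes Cohen's kappa for two coders' binary codes and applies the conservative resolution rule described above. The data are hypothetical; this illustrates the calculation rather than reproducing the study's actual analysis script.

    # Hypothetical binary codes from two independent coders
    # (1 = strategy "mentioned", 0 = "not mentioned")
    coder1 <- c(1, 0, 1, 1, 0, 1, 0, 0, 1, 1)
    coder2 <- c(1, 0, 1, 0, 0, 1, 0, 1, 1, 1)

    # Observed agreement: proportion of items coded identically
    p_o <- mean(coder1 == coder2)

    # Chance agreement, computed from each coder's marginal rates
    p1  <- mean(coder1)
    p2  <- mean(coder2)
    p_e <- p1 * p2 + (1 - p1) * (1 - p2)

    # Cohen's kappa: agreement corrected for chance
    kappa <- (p_o - p_e) / (1 - p_e)

    # Discrepancies resolved conservatively: a strategy counts as
    # "mentioned" only when both coders coded it as mentioned
    consensus <- as.integer(coder1 == 1 & coder2 == 1)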

Pre-test scores and learning gains were not normally distributed (Shapiro-Wilk test), so we used the nonparametric t-test equivalents (the Wilcoxon rank-sum test and Wilcoxon signed-rank test) to compute inferential statistics for pre-/post-test scores. We used Cohen’s d to calculate effect sizes and chi-square tests to evaluate differences in the number of strategies named in open response questions. This project received a “non-human subjects research” determination from an IRB in the United States and an “exempt” determination from an IRB in Nigeria.
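The core of this testing procedure can be sketched in base R as follows. The score vectors are hypothetical, and because the manuscript does not specify which paired-design variant of Cohen's d was used, the pooled-standard-deviation form shown here is one common choice, not necessarily the exact formula applied.

    # Hypothetical paired pre-/post-test scores (proportion correct)
    pre  <- c(0.20, 0.35, 0.25, 0.30, 0.40, 0.15, 0.28, 0.33)
    post <- c(0.50, 0.60, 0.45, 0.55, 0.70, 0.40, 0.52, 0.58)

    # Check normality of the learning gains; a small P-value
    # supports using nonparametric tests
    shapiro.test(post - pre)

    # Wilcoxon signed-rank test for paired pre-/post-test scores
    wilcox.test(post, pre, paired = TRUE)

    # The rank-sum form, wilcox.test(x, y), would be used to compare
    # two independent groups (e.g., completers vs. non-completers)

    # Cohen's d using the pooled standard deviation
    pooled_sd <- sqrt((var(pre) + var(post)) / 2)
    d <- (mean(post) - mean(pre)) / pooled_sd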

Results

Learner roles

All participants who responded to demographic questions reported living in Nigeria. Learners could select more than one role, and an open-response "Other" option was also available. Learners held a wide array of roles in the health system, including doctors, nurses, administrators, Ministry of Health officials and non-governmental organization (NGO) employees (Table 2). Common open responses in the "Other" category included "community health worker," "medical laboratory scientist," "environmental health officer" and "routine immunization officer." Collectively, these results show that the training reached an array of professionals from across the health system, in positions that require different levels of formal education and knowledge of IPC.

Table 2 Learner roles. Learners could self-identify with more than one role in a multiple-response question that included an open response "Other" field. "Other" responses (indicated with an asterisk) were coded and grouped according to key words. We also manually coded and grouped "Other" responses that could not reasonably be attributed to health-sector roles (e.g., "administrative assistant," "student" and "self-employed")

Course usage

Three hundred seventy-two learners enrolled in the training; of these, 220 (59%) completed all 10 modules and earned a certificate. Most attrition occurred early in the course (Table 3). The mean number of modules completed per learner was 6.8, and the mean delay between completion of the pre-test and post-test was 5.8 days. Because learners could skip modules, some completed many modules but did not earn a certificate. The median time to complete each module ranged from 8.5 min to 18.6 min, and the median time on task to complete all 10 modules was 138.6 min, or just under 2.5 h. Fourteen learners (4% of those enrolled) made a total of 18 support requests. The most common requests (eight in total) related to certificates (e.g., asking when certificates would be sent, requesting that a certificate be re-sent or requesting a certificate in a different file format).

Table 3 Module topics and completion rates. The main topical focus of each module is given, along with the percentage of enrolled learners who completed that module. Certain themes, such as standard precautions and administrative controls, were integrated into many different modules. Learners could skip any module so there was no module that all enrolled learners completed

IPC knowledge and learning gains

The first and final modules of the course included a pre-test and post-test, respectively. The average pre-test score was 29% among the learners who completed the pre-test before the post-test (N = 222). Within the intervening modules (2-9), when learners answered formative assessment questions incorrectly, they received feedback on their answers and had the opportunity to try questions again. Learners provided a correct initial response to multiple-choice formative assessment questions 46% of the time. When learners answered incorrectly on their first attempt, they chose to repeat the question 78% of the time.

Learners spent less time on the post-test (median of 9.9 min) than on the pre-test (median of 18.6 min). However, average post-test scores (Fig. 2) improved to 54%, representing a learning gain of 86% (P < 0.001) with an effect size of 1.22 (95% confidence interval: 1.00-1.44). This indicates that scores improved by more than one standard deviation from the pre-test to the post-test, a large effect size for an educational intervention [44]. We separately analyzed the post-test scores of learners who scored below the median on the pre-test and of those who scored at or above the median; both groups demonstrated substantial and significant improvements. Moreover, there was no significant difference (P = 0.51) between the pre-test scores of learners who completed the post-test (mean pre-test score: 29%) and those who did not (mean pre-test score: 27%).
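For clarity, the 86% learning gain reported above is consistent with the improvement expressed relative to the pre-test mean:

    \[
    \text{relative gain} = \frac{\bar{x}_{\text{post}} - \bar{x}_{\text{pre}}}{\bar{x}_{\text{pre}}} = \frac{0.54 - 0.29}{0.29} \approx 0.86
    \]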

Fig. 2 Pre-/post-test performance. Learners who completed both the pre- and post-test were grouped by pre-test score. Learners who scored below the median and at or above the median both had significant learning gains

We also analyzed pre-test scores, post-test scores and learning gains by learner role, focusing on the most commonly reported roles (those with > 5% of enrolled learners). All of the groups analyzed had significant learning gains (Table 4).

Table 4 Pre- and post-test scores for the most common learner roles. Post-test scores were significantly different from pre-test scores in all of the groups (**P < 0.001; *P < 0.01)

There was one open-response question in the pre-/post-test, in which learners were asked to name different facility-level approaches that could be taken to improve IPC. The responses were coded manually as described in Methods. Only 14% of learners mentioned two or more strategies on the pre-test; this increased to 25% on the post-test (P < 0.005).

Learner feedback

We asked the NPS question ("How likely is it that you would recommend this training to a friend or colleague?") at the end of modules 2-9 (all modules excluding the pre-/post-test). The mean rating was 8.95 out of 10, corresponding to an NPS of +62, with a response rate of 46%.

Discussion

We developed a new approach to training HWs in Nigeria that directly addresses the challenges they face in accessing online learning in general and IPC training in particular. The course reached hundreds of HWs who held diverse roles across the health system, with a high completion rate, positive feedback and significant learning gains. This demonstrates an effective and safe approach to train frontline HWs at scale in the context of COVID-19-related travel restrictions and safety measures.

Low baseline knowledge of IPC

Low average scores on the pre-test and on learners’ first attempts of formative assessment questions suggest that this population had low baseline knowledge of IPC. In light of the recommendation that all HWs receive IPC training [7, 8], this indicates that more work needs to be done to build HW knowledge of foundational IPC concepts. The field would benefit from standardized IPC curricula and assessments that could be adapted to different country contexts and different levels of the health system [3]. Just as international standards endorsed by normative bodies contributed to quality improvement in medical education [45], a certification mechanism (for individual HWs, health facilities or both) would help policymakers identify gaps in IPC and develop targeted solutions. Moreover, the field would benefit from standardized monitoring and evaluation tools that would complement improved training programs.

Completion and learning outcomes

The course had a high completion rate relative to industry standards. We are aware of few reports of completion rates for optional, free, time-bound, fully online courses aimed at HWs in Africa. One course, offered to HWs worldwide, had a reported completion rate of 36% [46]. There are many more reports on MOOC completion rates, which range from under 5% to 13% [17, 18], though rates are higher for some courses and platforms [14]. Some learners skipped modules or completed modules more than once; it would be useful to evaluate learners' reasons for this in the future.

There was a substantial and statistically significant improvement in scores from the pre-test to the post-test, suggesting that learners’ knowledge of IPC increased by completing the course. Some learners completed the course quickly, but the learning gains we observed cannot reasonably be attributed to short-term recall of the questions themselves, as the average time elapsed between the pre-test and the post-test was about 6 days. Moreover, scores improved regardless of learners’ prior knowledge of IPC, and there was no significant difference in pre-test scores between those who did and did not complete the course. This is encouraging, given that low prior subject knowledge is associated with poor outcomes in other online courses [47]. Learners holding many different roles (e.g., doctors, nurses, administrators, community health workers) experienced learning gains in the course. Larger sample sizes and additional demographic information (including data on gender and education level) would facilitate a deeper understanding of differences in IPC knowledge among different groups of HWs.

Learners’ perceptions of the course

The NPS of this course (+62) is slightly higher than NPS ratings of other self-paced free online learning experiences, such as Duolingo (+52) [48] and Coursera (+56) [42]. This indicates that learners were satisfied with the learning experience and were likely to recommend it to others (an NPS above +50 is considered "excellent" [49]). The field of online learning lacks a consensus tool for rapid and standardized assessment of learner experience [42]; the NPS may be a useful metric for comparing learner satisfaction across online learning experiences, and it would be even more useful if widely adopted, allowing comparison against accepted benchmarks.

Limitations

This training and its evaluation have some limitations. We did not address every challenge experienced by HWs in Africa who are trying to access online learning: the training required learners to have an internet connection and an email account, and it was offered only in English. It remains to be seen whether this approach can scale effectively to train HWs in other countries, at other levels of health care delivery and in other languages. We have no means of independently validating self-reported learner roles, and we do not have a good way to estimate how many learners saw the course. To improve the robustness and independence of the learning gains evaluation, future courses should use pre-/post-tests that do not share questions with the formative assessments. Moreover, we did not conduct long-term follow-up to assess retention of IPC knowledge, which is as important an outcome as short-term learning. We also did not assess behavior change, a downstream metric that is more challenging to capture but also more relevant to important public health outcomes [50]. Though post-test scores improved, they remained low, which supports the concept that multiple reinforcing approaches are needed to sustain improvements in IPC knowledge and practice.

Conclusions

This work demonstrates how organizations can take specific steps to understand their learners’ goals and barriers and develop solutions that build upon evidence from the science of learning. The barriers we identified to online learning among HWs in Nigeria are likely to be applicable in other contexts, and the solution we developed may be effective in different sectors and for other populations of learners. Moreover, this work suggests that immersion in formative assessment is an effective and enjoyable way for HWs to learn. Future work should evaluate long-term knowledge retention, the potential of the training to be scaled up to reach more HWs and the feasibility of implementation in other languages. The lessons learned in this pilot have informed a second phase of implementation of this course in multiple countries and in two languages. Our evaluation also points to foundational IPC knowledge gaps, suggesting a need for additional assessments and training for HWs in different roles within the health system.

Availability of data and materials

De-identified data are available from the corresponding author upon request.

Abbreviations

HWs: Health workers

IPC: Infection prevention and control

MOOCs: Massive open online courses

NGO: Nongovernmental organization

NPHCDA: National Primary Health Care Development Agency

NPS: Net Promoter Score

WHO: World Health Organization

References

  1. Patel LN, Kozikott S, Ilboudo R, Kamateeka M, Lamorde M, Subah M, et al. Safer primary healthcare facilities are needed to protect healthcare workers and maintain essential services: lessons learned from a multicountry COVID-19 emergency response initiative. BMJ Glob Health. 2021;6:e005833.

  2. World Health Organization. Resolution WHA74.14: Protecting, safeguarding and investing in the health and care workforce. [Internet] Geneva: World Health Organization; 2021. Available from: https://apps.who.int/gb/ebwha/pdf_files/WHA74/A74_R14-en.pdf.

  3. Allegranzi B, Kilpatrick C, Storr J, Kelley E, Park BJ, Donaldson L. Global infection prevention and control priorities 2018–22: a call for action. Lancet Glob Health. 2017;5:e1178–80.

  4. Shaw A, Flott K, Fontana G, Durkin M, Darzi A. No patient safety without health worker safety. Lancet. 2020;396:1541–3.

  5. Resolve to Save Lives. Protecting health care workers: a need for urgent action. [Internet]. New York: Resolve to Save Lives; 2021. Available from: https://preventepidemics.org/wp-content/uploads/2021/01/RTSL_Protecting-Health-Care-Workers.pdf.

  6. World Health Organization. COVID-19 Weekly Epidemiological Update. [Internet] Geneva: World Health Organization; 2021 Jan 31. Available from: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20210202_weekly_epi_update_25.pdf.

  7. Storr J, Twyman A, Zingg W, Damani N, Kilpatrick C, Reilly J, et al. Core components for effective infection prevention and control programmes: new WHO evidence-based recommendations. Antimicrob Resist Infect Control. 2017;6:6.

  8. World Health Organization. Minimum requirements for infection prevention and control programmes. [Internet]. Geneva: World Health Organization; 2019. Available from: https://www.who.int/publications/i/item/9789241516945.

  9. Barrera-Cancedda AE, Riman KA, Shinnick JE, Buttenheim AM. Implementation strategies for infection prevention and control promotion for nurses in sub-Saharan Africa: a systematic review. Implement Sci. 2019;14:111.

  10. Abbas S, Sultan F. Infection control practices and challenges in Pakistan during the COVID-19 pandemic: a multicentre cross-sectional study. J Infect Prev. 2021;22(5):205–11.

  11. World Health Organization. WHO Charter - Health worker safety: a priority for patient safety. [Internet] Geneva: World Health Organization; 2020. Available from: https://apps.who.int/iris/handle/10665/339287.

  12. Tartari E, Tomczyk S, Pires D, Zayed B, Coutinho Rehse AP, Kariyo P, et al. Implementation of the infection prevention and control core components at the national level: a global situational analysis. J Hosp Infect. 2021;108:94–103.

  13. Desai AN, Ramatowski JW, Lassmann B, Holmes A, Mehtar S, Bearman G. Global infection prevention gaps, needs, and utilization of educational resources: a cross-sectional assessment by the International Society for Infectious Diseases. Int J Infect Dis. 2019;82:54–60.

  14. Utunen H, Kerkhove MDV, Tokar A, O’Connell G, Gamhewage GM, Fall IS. One year of pandemic learning response: benefits of massive online delivery of the World Health Organization’s technical guidance. JMIR Public Health Surveill. 2021;7:e28945.

  15. Wilson K, Dennison C, Struminger B, Armistad A, Osuka H, Montoya E, et al. Building a virtual global knowledge network during COVID-19: the infection prevention and control global webinar series. Clin Infect Dis. 2021. https://doi.org/10.1093/cid/ciab320.

  16. Struminger B, Arora S, Zalud-Cerrato S, Lowrance D, Ellerbrock T. Building virtual communities of practice for health. Lancet. 2017;390:632–4.

  17. Reich J, Ruipérez-Valiente JA. The MOOC pivot. Science. 2019;363:130–1.

  18. Jordan K. Massive open online course completion rates revisited: Assessment, length and attrition. IRRODL [Internet]. 2015; 16(3). Available from: http://www.irrodl.org/index.php/irrodl/article/view/2112.

  19. Lepp A, Barkley JE, Karpinski AC, Singh S. College students’ multitasking behavior in online versus face-to-face courses. SAGE Open. 2019;9:2158244018824505.

  20. Simunich B, Robins DB, Kelly V. The impact of Findability on student motivation, self-efficacy, and perceptions of online course quality. Am J Dist Educ. 2015;29:174–85.

  21. Brame CJ. Effective educational videos: principles and guidelines for maximizing student learning from video content. CBE Life Sci Educ. 2016;15:es6.

  22. Szpunar KK, Khan NY, Schacter DL. Interpolated memory tests reduce mind wandering and improve learning of online lectures. PNAS. 2013;110:6313–7.

  23. Roediger HL, Karpicke JD. The power of testing memory: basic research and implications for educational practice. Perspect Psychol Sci. 2006;1:181–210.

  24. Thomas MP, Türkay S, Parker M. Explanations and interactives improve subjective experiences in online courseware. IRRODL [Internet]. 2017; 18(7). Available from: http://www.irrodl.org/index.php/irrodl/article/view/3076.

  25. Rodriguez BCP, Armellini A, Nieto MCR. Learner engagement, retention and success: why size matters in massive open online courses (MOOCs). Open Learn. 2020;35:46–62.

  26. Ihantola P, Fronza I, Mikkonen T, Noponen M, Hellas A. Deadlines and MOOCs: how do students behave in MOOCs with and without deadlines. In: 2020 IEEE Frontiers in Education Conference (FIE); 2020. p. 1–9.

  27. Andersen L, Ward TJ. Expectancy-value models for the STEM persistence plans of ninth-grade, high-ability students: a comparison between black, Hispanic, and white students. Sci Educ. 2014;98:216–42.

  28. Bin Mubayrik HF. Exploring adult learners’ viewpoints and motivation regarding distance learning in medical education. Adv Med Educ Pract. 2020;11:139–46.

  29. Regmi K, Jones L. A systematic review of the factors – enablers and barriers – affecting e-learning in health sciences education. BMC Med Educ. 2020;20:91.

  30. International Telecommunication Union. Digital trends in Africa 2021: information and communication technology trends and developments in the Africa region, 2017-2020. Geneva: International Telecommunication Union; 2021.

  31. Abdulmajeed K, Joyner DA, McManus C. Challenges of online learning in Nigeria. In: Proceedings of the Seventh ACM Conference on Learning @ Scale. New York: Association for Computing Machinery; 2020. p. 417–20.

  32. Otto D, Bollmann A, Becker S, Sander K. It’s the learning, stupid! Discussing the role of learning outcomes in MOOCs. Open Learn. 2018;33:203–20.

  33. Barteit S, Guzek D, Jahn A, Bärnighausen T, Jorge MM, Neuhann F. Evaluation of e-learning for medical education in low- and middle-income countries: a systematic review. Comput Educ. 2020;145:103726.

  34. Thomas MP. The geographic and topical landscape of medical education research. BMC Med Educ. 2019;19:189.

  35. Nishimwe A, Ibisomi L, Nyssen M, Conco DN. The effect of an mLearning application on nurses’ and midwives’ knowledge and skills for the management of postpartum hemorrhage and neonatal resuscitation: pre–post intervention study. Hum Resour Health. 2021;19:14.

  36. Otu A, Ebenso B, Okuzu O, Osifo-Dawodu E. Using a mHealth tutorial application to change knowledge and attitude of frontline health workers to Ebola virus disease in Nigeria: a before-and-after study. Hum Resour Health. 2016;14:5.

  37. McLaughlin JE, Wolcott MD, Hubbard D, Umstead K, Rider TR. A qualitative review of the design thinking framework in health professions education. BMC Med Educ. 2019;19:98.

  38. Kerfoot BP, Fu Y, Baker H, Connelly D, Ritchey ML, Genega EM. Online spaced education generates transfer and improves long-term retention of diagnostic skills: a randomized controlled trial. J Am Coll Surg. 2010;211:331–337.e1.

  39. Wiggins G, McTighe J. Understanding by design. Expanded 2nd ed. Alexandria: ASCD; 2005.

  40. World Health Organization. Primary health care systems (PRIMASYS): case study from Nigeria. [Internet] Geneva: World Health Organization; 2017. Available from: https://www.who.int/alliance-hpsr/projects/alliancehpsr_nigeriaprimasys.pdf?ua=1.

  41. Adam M, Chase RP, McMahon SA, Kuhnert K-L, Johnston J, Ward V, et al. Design preferences for global scale: a mixed-methods study of “glocalization” of an animated, video-based health communication intervention. BMC Public Health. 2021;21:1223.

  42. Palmer K, Devers C. An evaluation of MOOC success: net promoter scores. In: Association for the Advancement of Computing in Education (AACE); 2018. p. 1648–53.

  43. Koladycz R, Fernandez G, Gray K, Marriott H. The net promoter score (NPS) for insight into client experiences in sexual and reproductive health clinics. Glob Health Sci Pract. 2018;6:413–24.

  44. Maher JM, Markey JC, Ebert-May D. The other half of the story: effect size analysis in quantitative research. CBE Life Sci Educ. 2013;12:345–51.

  45. Blouin D, Tekian A. Accreditation of medical education programs: moving from student outcomes to continuous quality improvement measures. Acad Med. 2018;93:377–83.

  46. Vaysse C, Chantalat E, Beyne-Rauzy O, Morineau L, Despas F, Bachaud J-M, et al. The impact of a small private online course as a new approach to teaching oncology: development and evaluation. JMIR Med Educ. 2018;4:e9185.

  47. Kennedy G, Coffrin C, de Barba P, Corrin L. Predicting success: how learners’ prior knowledge, skills and activities predict MOOC performance. In: Proceedings of the Fifth International Conference on Learning Analytics And Knowledge. New York: Association for Computing Machinery; 2015. p. 136–40.

  48. Vesselinov R, Grego J. Duolingo Effectiveness Study; 2012.

  49. Jolles MW, de Vries M, Hollander MH, van Dillen J. Prevalence, characteristics, and satisfaction of women with a birth plan in the Netherlands. Birth. 2019;46:686–92.

  50. Stevenson R, Moore DE Jr. Ascent to the summit of the CME pyramid. JAMA. 2018;319:543–4.

Acknowledgments

We would like to thank all of the HWs who participated in the training and provided their feedback on the prototype.

Funding

The Resolve to Save Lives Health Care Worker Training Initiative was funded by Bloomberg Philanthropies and the Stavros Niarchos Foundation.

Author information

Authors and Affiliations

Authors

Contributions

MPT developed the training, conducted analysis and wrote the manuscript. SK co-developed the training, reviewed and revised the course materials and conducted analysis. MK, RAA, EA, BGB, KB, OM, LP, and AEB developed and edited training materials and contributed to the training design. JA and CL supervised the project and this evaluation, revised the manuscript, and contributed to the training design. All authors reviewed the results and this manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Marshall P. Thomas.

Ethics declarations

Ethics approval and consent to participate

The training was reviewed in the US by the BRANY SBER IRB (#IRB00010793) and in Nigeria by the National Health Research Ethics Committee of Nigeria (#NHREC/01/01/2007-24/03/2021). It received a non-human subjects research determination from BRANY and an exempt determination from NHREC. This project was performed in accordance with international norms, including the Declaration of Helsinki. All participants consented to participate in the training before sharing any personally identifiable information.

Consent for publication

Not applicable.

Competing interests

MPT is lead consultant at Enact Academy, which designs and develops online trainings. All other authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Thomas, M.P., Kozikott, S., Kamateeka, M. et al. Development of a simple and effective online training for health workers: results from a pilot in Nigeria. BMC Public Health 22, 551 (2022). https://doi.org/10.1186/s12889-022-12943-1
