Evaluation of a public COVID-19 dashboard in the Western Cape, South Africa: a tool for communication, trust, and transparency

Abstract

Background

Public health dashboards have been used in the past to communicate and guide local responses to outbreaks, epidemics, and a variety of other health conditions. During the first year of the COVID-19 pandemic, dashboards proliferated, but their availability and quality differed across the world. This study aimed to evaluate the quality, access, and end-user experience of one such dashboard in the Western Cape province, South Africa.

Methods

We analysed retrospective aggregate data on viewership over time for the first year after the launch of the dashboard (30 April 2020 – 29 April 2021) and conducted a cross-sectional survey targeting adult users of the dashboard one year after the initial launch. The self-administered, anonymous questionnaire, with a total of 13 questions, was made available via an online digital survey tool for a two-week period (6 May 2021 – 21 May 2021).

Results

After significant communication by senior provincial political leaders, adequate media coverage, and two waves of COVID-19, the Western Cape public COVID-19 dashboard attracted a total of 2,248,456 views during its first year. The majority of these views came from Africa/South Africa, with higher median daily views during COVID-19 wave periods. A total of 794 participants responded to the survey questionnaire. The devices reportedly used to access the dashboard differed significantly between occupational status groups, with students tending toward mobile devices whilst employed and retired participants tended toward desktop computers/laptops. Frequency of use increased with increasing age, with 65.1% of those > 70 years old viewing the dashboard daily. Overall, 76.4% of respondents reported that the dashboard influenced their personal planning and behaviour. High Likert score ratings were given for clarity, ease of use and overall end-user experience, with no differences seen across the various age groups surveyed.

Conclusion

The study demonstrated that both the availability of data and an understanding of end-user needs are critical when developing and delivering public health tools that may ultimately garner public trust and influence individual behaviour.

Background

Public health dashboards have been used in the past to communicate, manage and guide local responses to outbreaks, epidemics, and a variety of health conditions [1]. The concept of a ‘visualization dashboard’ is ubiquitous, though, and is found across almost every industry [2]. It is formally defined as “a predominantly visual information display that people use to rapidly monitor current conditions that require a timely response to fulfill a specific role”. This definition, therefore, spans the full spectrum by which information is shared, from single-page static reports to the more recent evolution of dynamic, interactive, and engaging digital dashboards.

Over the last decade, public health officials have increasingly employed dashboards to guide their response to outbreaks and epidemics [1]. Several examples of this utility include the Somalia Polio outbreak of 2013–2014, the Ebola epidemic of Liberia, Sierra Leone, and Guinea in 2014–2016, and surveillance of the HIV epidemic in New York [3,4,5]. These examples illustrated the value of leveraging intelligence to rapidly respond to worsening adverse conditions, particularly in the context of an epidemic, through constant and reliable local monitoring [1].

More recently, the ongoing COVID-19 pandemic provided fertile soil within which dashboards proliferated. Judson et al. (2022) reported on the variability in COVID-19 reporting systems in Africa and noted that most African countries (98.1%) had some form of official national COVID-19 reporting system in place. Data were communicated in the form of situation reports, press releases, and official dashboards through websites and social media [6].

Beyond access and availability, several features emerged that determined whether the data made available, particularly via dashboards, were actionable. These features included providing data close to the end-user’s locale, breaking down the population into subgroups, linking time trends to policy decisions, managing the type and volume of data being shown, knowing the audience and their information needs, and using visual cues and storytelling as appropriate [7]. Both the availability and quality of public dashboards, therefore, play an important role in achieving impact on individual decision-making and behaviour change, particularly as it pertains to an African setting.

In the Western Cape, South Africa, the province’s Premier officially launched a COVID-19 dashboard on 30 April 2020 in an online digital conference [8]. The launch highlighted that the dashboard would be a means to ensure transparency and accountability throughout the pandemic [9]. Furthermore, the Premier stated that the dashboard would provide data at a subdistrict level, in the belief that this would ensure citizens could protect themselves and their families based on the most recent and accurate COVID-19 information at hand [10]. The web-based desktop dashboard was designed, developed, and launched seven weeks after the first provincial case was identified, at a time when the province had recorded a total of only 2,371 COVID-19 cases. The aim of this study was to evaluate the quality, access and end-user experience of the public COVID-19 dashboard that emerged in the Western Cape province, South Africa [11].

Methods

Study design and population

The study was conducted by health officials from the provincial government of the Western Cape. The data used in this evaluation study came from two sources. Firstly, we obtained retrospective aggregate data on viewership over time for the first year since the launch of the COVID-19 dashboard (30 April 2020 – 29 April 2021) from background analytics running on the dashboard. This included the number of daily views and the country of origin of viewers, which was further aggregated and mapped to the respective continental regions. We compared viewership over time by comparing the frequency of views across provincial COVID-19 wave periods, comprising the initial two weeks after launch (30 April – 14 May 2020), wave 1 (15 May – 31 August 2020), trough 1 (1 September – 15 October 2020), wave 2 (16 October 2020 – 28 February 2021), and trough 2 (1 March – 29 April 2021). Secondly, we conducted a cross-sectional survey targeting adult users of the public Western Cape COVID-19 dashboard one year after the initial launch. These primary data were collected via an online digital survey tool accessible on the dashboard through an opt-in manner of delivery and available for a two-week period between 6 and 21 May 2021. Recruitment was performed through purposive sampling, with an invitation to participate presented as an on-screen pop-up when users engaged with the dashboard. The minimum sample size for the cross-sectional study, calculated using the Kish Leslie formula [12], was 246 participants, based on an assumed proportion of ~ 20% of all regular viewers accessing the dashboard once a week or more. This combined approach of web traffic analysis and an online survey was based on Albert and Tullis’ (2013) work on measuring the user experience [13]. Both approaches provide quantitative insight into participant behaviour and attitudes in a relatively short space of time and are ideal for geographically dispersed participants, as was the case for the present study’s target sample.
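As a worked illustration only (not the authors’ calculation), the single-proportion form of the Kish Leslie formula, n = Z²p(1 − p)/d², reproduces the reported minimum of 246 when the conventional 95% confidence level and a 5% absolute precision are assumed alongside the stated ~ 20% proportion; these two parameters are assumptions, as they are not given in the text.

```python
from math import ceil

from scipy.stats import norm


def kish_sample_size(p: float, margin: float = 0.05, confidence: float = 0.95) -> int:
    """Minimum sample size for a single proportion: n = Z^2 * p * (1 - p) / d^2."""
    z = norm.ppf(1 - (1 - confidence) / 2)  # 1.96 at a 95% confidence level
    return ceil(z ** 2 * p * (1 - p) / margin ** 2)


# Assumed proportion (~20%) of regular viewers accessing the dashboard weekly or more,
# with a hypothetical 5% margin of error and 95% confidence level.
print(kish_sample_size(p=0.20))  # -> 246
```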

Survey tool

The self-administered, anonymous questionnaire (see Supplementary File 1) comprised a total of 13 questions. Demographic variables included age, gender, occupational status, occupation, and location. Access and frequency of use were operationalized as categorical variables covering the devices used to access the COVID-19 dashboard and how often it was accessed. Quality and impact of the dashboard were operationalized through several variables: a binary variable for impact on personal planning and behaviour, categorical variables for personal purpose and the dashboard element most frequently used, five-point Likert scale questions on ease of use and on clarity and understanding, and a ten-point Likert scale for the overall rating of end-user experience. Two Likert scale lengths were used: odd-numbered scoring allowed for a neutral midpoint, while even-numbered scoring required a choice between a positive and a negative end-user experience [13]. Importantly, the question on the dashboard element most frequently used was supplemented with images of the relevant dashboard elements to ensure understanding of what was being referred to. All questions were mandatory to ensure data completeness.

Data processing and statistical analysis

Data were analysed using RStudio with data visualizations supplemented by DATAtab, Datawrapper and Power BI. We conducted descriptive and bivariate analyses to describe the population sample. We performed descriptive analyses of the frequency of viewers per continental region as well as of survey participant characteristics. To compare viewership over time across COVID-19 wave periods, we applied one-way ANOVA to normally distributed periods and the Kruskal–Wallis test to non-normally distributed periods, and we used the Mann–Whitney U test to compare the non-normally distributed weekday and weekend viewership. We used the chi-square test or Fisher’s exact test to assess the associations between reported device most often used and occupational status, reported frequency of use and age category, and reported impact on personal planning and dashboard element most often used. We used one-way ANOVA to test for mean differences in Likert scale responses between age groups. Age was aggregated into four categories for the purposes of bivariate analysis. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) standard was used to ensure that all relevant reporting was reflected [14].
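As an illustrative sketch only (the analysis itself was performed in RStudio, and every variable name and value below is hypothetical), the same family of tests can be expressed with Python’s scipy.stats: a Shapiro–Wilk check guiding the choice between one-way ANOVA and the Kruskal–Wallis test across wave periods, a Mann–Whitney U test for weekday versus weekend views, and chi-square/Fisher’s exact tests for the categorical associations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical daily view counts grouped by COVID-19 wave period.
views_by_period = {
    "wave_1": rng.normal(8950, 2000, 109),
    "trough_1": rng.normal(3469, 800, 45),
    "wave_2": rng.lognormal(8.4, 0.4, 136),
}

# Shapiro-Wilk normality check per period guides the choice of test.
normal = {k: stats.shapiro(v).pvalue > 0.05 for k, v in views_by_period.items()}

groups = list(views_by_period.values())
if all(normal.values()):
    stat, p = stats.f_oneway(*groups)    # one-way ANOVA for normally distributed groups
else:
    stat, p = stats.kruskal(*groups)     # Kruskal-Wallis test otherwise

# Weekday vs weekend viewership (non-normal): Mann-Whitney U test.
weekday, weekend = rng.poisson(5142, 260), rng.poisson(4104, 104)
u_stat, u_p = stats.mannwhitneyu(weekday, weekend)

# Categorical associations (e.g. device type by occupational status):
# Fisher's exact test for a 2x2 table, chi-square for larger tables.
table = np.array([[55, 45], [30, 70]])   # hypothetical counts
odds_ratio, fisher_p = stats.fisher_exact(table)
chi2, chi_p, dof, expected = stats.chi2_contingency(table)
```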

Ethical considerations

The study was approved by the University of Cape Town Human Research Ethics Committee (HREC REF 260/2021). Informed consent was obtained from all respondents by signed digital agreement prior to opting in to complete the self-administered survey questionnaire.

Results

Viewership metrics

During the first year after launch (30 April 2020 – 29 April 2021), the provincial COVID-19 public dashboard attracted a total of 2,248,456 views. The breakdown by continental region (Fig. 1) shows that 95.5% of these views came from Africa, followed by Europe and Northern America with 54,853 and 17,041 total views respectively. These figures represent total page views and could not distinguish unique viewers. The pattern of daily viewership differed across the various wave periods (Fig. 2). The first two weeks post launch saw the highest median number of daily views at 13,903 (IQR: 13,007 – 17,820). Interest waned over the Wave 1 period, with a median daily viewership of 8,950 (IQR: 6,819 – 10,596), until a plateau in the Trough (1) period. Both Trough (1) and Trough (2) periods saw similar median daily views of 3,469 and 3,140 respectively (p = 0.516). Median daily views in Wave 2 were 4,779 (IQR: 3,520 – 6,711), with a maximum of 10,984 coinciding with the peak of COVID-19 cases in January 2021. The distribution of viewership by day of week over the first year showed higher interest during weekdays, with the lowest median daily views on Saturdays and Sundays. The weekday group had a higher median number of views (5,142) than the weekend group (4,104; p < 0.001).
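For readers who want to reproduce period-level summaries of this kind from a daily analytics export, the following sketch (a hypothetical illustration using placeholder counts, not the authors’ code or data) groups a simple date/views table by the wave periods defined in the Methods and by a weekday/weekend flag, returning medians and interquartile ranges.

```python
import pandas as pd

# Hypothetical daily analytics export: one row per date with total views.
views = pd.DataFrame({
    "date": pd.date_range("2020-04-30", "2021-04-29", freq="D"),
    "views": 5000,  # placeholder count; real values come from the dashboard analytics
})

# Label each date with its COVID-19 wave period as defined in the Methods.
bins = pd.to_datetime(["2020-04-30", "2020-05-15", "2020-09-01",
                       "2020-10-16", "2021-03-01", "2021-04-30"])
labels = ["launch", "wave_1", "trough_1", "wave_2", "trough_2"]
views["period"] = pd.cut(views["date"], bins=bins, labels=labels, right=False)

# Median and interquartile range of daily views per period.
period_summary = (views.groupby("period", observed=True)["views"]
                  .quantile([0.25, 0.5, 0.75]).unstack())

# Median daily views for weekdays versus weekends.
views["weekend"] = views["date"].dt.dayofweek >= 5
weekend_medians = views.groupby("weekend")["views"].median()
```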

Fig. 1 COVID-19 public dashboard views by continental region

Fig. 2 Number of views per day by COVID-19 wave periods and date. ¹The Shapiro–Wilk test indicated non-normal distributions for the initial two weeks after launch (p = 0.014) and Wave 2 (p < 0.001); a Kruskal–Wallis test showed a significant difference in the number of views per day across these non-normally distributed wave periods (p < 0.001). ²The Shapiro–Wilk test indicated normal distributions for Wave 1 (p = 0.129), Trough (1) (p = 0.105), and Trough (2) (p = 0.073); a one-way analysis of variance (ANOVA) showed a significant difference in the number of views per day across these normally distributed wave periods (F = 164.87, p < 0.001)

Sociodemographic characteristics

A total of 794 individuals responded to the survey invitation during the two-week recruitment period one year after the initial launch. Table 1 shows the sociodemographic characteristics of the participants (n = 794). Nearly two thirds (63.3%) of respondents were between 40 and 69 years old, with similar proportions across male and female groups [60–69 years old (22.4%), 40–49 years old (21.3%), and 50–59 years old (19.6%)]. Most participants (63.0%) were employed, followed by retirees (28.3%); far fewer reported being unemployed or students (5.0% and 3.7% respectively). Most respondents reported their country of origin as South Africa (96.1%), with smaller numbers from other countries [United States (0.9%), Germany (0.4%) and the United Kingdom (0.3%)]. The devices most often used to access the dashboard were a desktop or laptop (65.5%), followed by mobile phones (26.7%) and tablets (7.8%).

Table 1 Survey participant characteristics (n = 794)

Bivariate analysis of survey results

Reported devices used by occupational status

Comparing the devices used to access the dashboard with the occupational status of participants (Fig. 3), students had the highest tendency to use mobile devices of any occupational category (55.2%). Conversely, 68.1% of retirees, the highest proportion across all occupational groups, used desktop computers/laptops. Unemployed participants were somewhat similar to the student group in terms of mobile device usage (41.5%), although they still most often accessed the dashboard through a desktop computer/laptop (53.7%).

Fig. 3 Proportion of devices used by occupational status* (n = 794). *p-value from Fisher’s exact test = 0.001

Reported frequency of use by age category

We found that participants in the older age categories viewed the dashboard more frequently (Fig. 4). Of those aged > 70 years old, 65.1% reported viewing the dashboard daily, with a further 17.9% viewing it a few times a week. Frequency of viewing declined in younger age groups, with daily viewership of 61.7% in 50 to 69 year olds, 43.7% in 30 to 49 year olds, and 29.5% in those aged 29 years or younger (p < 0.001).

Fig. 4 Proportion of viewers by age group and frequency of viewing* (n = 794). *p-value from Fisher’s exact test < 0.001

Reported impact on personal planning and behaviour by element most often used

Overall, 76.4% of all respondents reported that the dashboard influenced their personal planning and behaviour (Fig. 5). Respondents who used the additional analyses graphs most often were more likely to report that the dashboard impacted on their planning and behaviour (89.7%) compared to those who used the GIS map and graph (67.0%; p = 0.006).

Fig. 5 Reported impact on planning and behaviour by element most often used* (n = 794). *p-value from chi-square test = 0.006

Clarity and understanding, ease of use and overall rating scores by age group

Median Likert-scale ratings of end-user experience were 4/5 for clarity and understanding, 4/5 for ease of use and 8/10 for overall rating with no difference by age category (Table 2).

Table 2 Clarity and understanding, ease of use and overall rating scores by age group (n = 794)

Discussion

The Western Cape public COVID-19 dashboard was well used in the first year of the pandemic, attracting 2,248,456 views, predominantly by South Africans, and end-user experiences were positive.

Firstly, it is important to acknowledge that the delivery of the dashboard was well communicated to the public by senior political figures at the outset, only seven weeks after the first case was identified in the Western Cape, South Africa. There was significant media coverage, and median daily viewership was highest in the first two weeks after launch. Secondly, most dashboard viewers were from Africa/South Africa. This was reflected both in the aggregate viewership over the first year and in the profile of respondents from the cross-sectional survey. This is reassuring as it reaffirms the statement from Ivanković et al. [7] that a feature of a highly actionable dashboard is providing data close to the end-user’s locale. Interestingly, viewership from North America and Europe was also similarly represented, in proportion, in both the viewership data and the survey respondent sample.

Furthermore, the pattern of viewership over the wave 1, wave 2, and trough periods provides insight into differentiated need based on public concern at the time. The increase in daily viewership over the wave 2 period, in particular, indicates that the dashboard was used as a trustworthy source of information, with greater utility during a COVID-19 wave than between waves. The pattern of access also showed an interesting weekday preponderance, with lower access over weekends. This may relate to access to the devices needed to view the dashboard, as most respondents reported using a computer/laptop to view it, or to differences in weekday versus weekend behaviour patterns regarding the use of analytic tools.

The association between device most often used and occupational status revealed an interesting phenomenon and reflected characteristics of the underlying population groups. Students accessed the dashboard via mobile devices far more than any other occupational group. This may indicate both comfort with using a mobile device for health information and epidemic surveillance, and greater access to a mobile device than to a computer/laptop [15]. A similar picture of increased mobile device access was seen for unemployed participants, although this may reflect device access more than comfort in use. Retirees and employed participants displayed a different pattern of access, with the majority opting for a desktop computer/laptop.

The association between frequency of viewing and age group provided insight into the interest levels of the various population groups and, potentially, their motivation for access. There was a consistent pattern of increasing frequency of viewership with increasing age band, with the highest frequency of access among those aged > 70 years old. This is, in fact, contrary to research reporting a digital divide in which younger groups had greater access to, and more frequent use of, COVID-19 digital tools than elderly populations [16]. During the early months of the COVID-19 pandemic there was growing evidence supporting the hypothesis of increased disease severity with increasing age [17]. This was confirmed and communicated with local evidence supporting the hypothesis [18]. The increased frequency of viewership in the older age bands in this study may, therefore, reflect either the underlying concern of those most at risk of severe disease or the relative time available to this group to access digital tools.

The fact that just over three quarters of respondents reported that the dashboard had an impact on their personal planning and behaviour speaks to the utility of tools like this amid a pandemic. Overlaying reported impact with the element most often used provided an indication of where the most value potentially lies. The results of this study showed that the additional analyses page had the strongest association with a positive impact on personal planning and behaviour. Examining what this area of the dashboard contains makes clear that it fulfils many of the criteria for a highly actionable dashboard as described by Ivanković et al. [7], including managing the type and volume of data being shown and providing sufficient visual cues along with graphs to better explain what is being seen. The engagement necessary to reach this part of the dashboard also suggests that public users are comfortable navigating interactivity in design.

In terms of quality metrics, the dashboard was given relatively high scores for clarity and understanding, ease of use, and overall rating of end-user experience. The study examined whether this experience of quality differed by age group but found no such difference, with both elderly and young users reporting a similarly positive experience.

This evaluation provides insight into the need to develop public health data tools in the context of a pressing health priority, as well as into end-user characteristics and needs as they pertain to access, understanding, and the quality of dashboard tools used to communicate relevant health data. This understanding has implications for public health officials and is pertinent as we transition out of the COVID-19 era and look to apply some of the lessons learnt during COVID-19 to other communicable and non-communicable diseases.

This study has several limitations. Firstly, although we did not specifically enquire about end-users’ socio-economic status, we did ask their employment status, and the proportions did not appear to align with the demographic characteristics of the Western Cape. Considering that the adult population of the province is ~ 4.8 million with an unemployment rate of 25.8%, viewership was potentially skewed towards higher-income, employed cohorts, and was low relative to the full adult population [19]. This speaks to the role of dashboards and their relative importance in a context where many people may not have the capacity to access them or to plan their personal behaviour in relation to them. This does not negate the importance and value of the dashboard, but it is important to acknowledge a context that is markedly different from a high-income country, where one might expect most of the population to access and use tools such as this. Secondly, due to the nature of online recruitment, the sampling approach could not avoid potential subjective selection or nonresponse bias; in other words, those who responded may have differed from those who did not. Thirdly, the cross-sectional survey was conducted one year after the initial launch. This may confer selection bias, both by targeting only those still making use of the dashboard one year later and finding value in it, and by favouring those who tend to use it more frequently than others. Fourthly, there may have been a combined selection and information bias, since the survey was conducted during a trough period; the type of recall and type of participant may have differed had it been administered during a wave. Lastly, the dashboard was not designed for easy mobile device use, which may have biased responses regarding the type of device most often used. A more mobile-friendly interface may have shifted responses towards a different modality of access.

Conclusion

The need for research into the delivery of intuitive, easy-to-understand, and timely health data for public use is growing rapidly. This is an important consideration as we move toward a post-COVID-19 era and need to leverage the lessons learnt for an African context in which data availability is often sparse. The present study’s findings demonstrate that both the availability of data and an understanding of end-user needs are required to develop and deliver quality, appropriate health data tools that may ultimately garner public trust, ensure accountability, and influence individual behaviour.

Availability of data and materials

The datasets used and analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

COVID-19:

Coronavirus Disease 2019

GIS:

Geographic Information System

HIV:

Human Immunodeficiency Virus

IQR:

Interquartile Range

References

  1. Lechner B, Fruhling A. Towards public health dashboard design guidelines. In International Conference on HCI in Business 2014 Jun 22. Springer, Cham. pp. 49–59. Available at: https://doi.org/10.1007/978-3-319-07293-7_5. Accessed 12 Dec 2021

  2. Sarikaya A, Correll M, Bartram L, Tory M, Fisher D. What do we talk about when we talk about dashboards? IEEE Trans Vis Comput Graph. 2018;25(1):682–92. Available at: https://ieeexplore.ieee.org/abstract/document/8443395. Accessed 12 Dec 2021.

  3. Kamadjeu R, Gathenji C. Designing and implementing an electronic dashboard for disease outbreaks response-Case study of the 2013–2014 Somalia Polio outbreak response dashboard. Pan Afr Med J. 2017;27(Suppl 3):22. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5745940/. Accessed 12 Dec 2021.

  4. Brooks JC. Incident management systems and building emergency management capacity during the 2014–2016 Ebola epidemic—Liberia, Sierra Leone, and Guinea. MMWR Suppl. 2016;65. Available at: https://www.cdc.gov/mmwr/volumes/65/su/su6503a5.htm. Accessed 12 Dec 2021

  5. Braunstein SL, Coeytaux K, Sabharwal CJ, Xia Q, Robbins RS, Obeng B, Daskalakis DC. New York City HIV care continuum dashboards: using surveillance data to improve HIV care among people living with HIV in New York City. JMIR Public Health Surveill. 2019;5(2):e13086. Available at: https://publichealth.jmir.org/2019/2/e13086/. Accessed 12 Dec 2021.

  6. Judson SD, Torimiro J, Pigott DM, Maima A, Mostafa A, Samy A, Rabinowitz P, Njabo K. COVID-19 data reporting systems in Africa reveal insights for future pandemics. Epidemiol Infect. 2022;150:e119.

  7. Ivanković D, Barbazza E, Bos V, Fernandes ÓB, Gilmore KJ, Jansen T, Kara P, Larrain N, Lu S, Meza-Torres B, Mulyanto J. Features constituting actionable COVID-19 dashboards: descriptive assessment and expert appraisal of 158 public web-based COVID-19 dashboards. J Med Internet Res. 2021;23(2):e25682. Available at: https://www.jmir.org/2021/2/e25682. Accessed 14 Dec 2021.

  8. Chambers, D. New dashboard tracks Western Cape’s rocketing Covid-19 infections. Sowetan Live. 2020. Available at: https://www.sowetanlive.co.za/news/south-africa/2020-04-30-new-dashboard-tracks-western-capes-rocketing-covid-19-infections/. Accessed 02 Feb 2022

  9. Willams, M. Winde says infections will rise as Western Cape launches Covid-19 ‘stats dashboard’. News24. 2020. Available at: https://www.news24.com/news24/southafrica/news/winde-says-infections-will-rise-as-western-cape-launches-covid-19-stats-dashboard-20200430. Accessed 02 Feb 2022

  10. Qukula, Q. Where in the province has Covid-19 spread? WC govt launches dashboard [Internet]. Cape Talk. 2020 Apr. Available at: https://www.capetalk.co.za/articles/382334/where-in-the-province-has-covid-19-spread-wc-govt-launches-dashboard. Accessed 02 Feb 2022

  11. Western Cape Government. Covid-19 Cases Dashboard. South Africa. 2022 [updated 2022 September 15]. Available at: https://coronavirus.westerncape.gov.za/vaccine/covid-19-cases-dashboard. Accessed 05 Oct 2022

  12. Charan J, Biswas T. How to calculate sample size for different study designs in medical research? Indian J Psychol Med. 2013;35(2):121–6. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3775042/. Accessed 12 Nov 2021.

  13. Albert B, Tullis T. Measuring the user experience: collecting, analyzing, and presenting usability metrics. Newnes; 2013. Available at: https://www.sciencedirect.com/science/article/pii/B9780124157811000066. Accessed 15 Dec 2021

  14. Vandenbroucke JP, Elm EV, Altman DG, Gøtzsche PC, Mulrow CD, Pocock SJ, Poole C, Schlesselman JJ, Egger M, Strobe Initiative. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE): explanation and elaboration. Ann Intern Med. 2007;147(8):W-163. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2020496/. Accessed 06 Oct 2022.

  15. Cilliers L, Viljoen KL, Chinyamurindi WT. A study on students’ acceptance of mobile phone use to seek health information in South Africa. Health Inf Manag J. 2018;47(2):59–69. Available at: https://doi.org/10.1177/1833358317706185. Accessed 08 Oct 2022.

  16. Martins Van Jaarsveld G. The effects of COVID-19 among the elderly population: a case for closing the digital divide. Front Psychiatry. 2020:1211. Available at: https://doi.org/10.3389/fpsyt.2020.577427. Accessed 10 Oct 2022.

  17. Romero Starke K, Petereit-Haack G, Schubert M, Kämpf D, Schliebner A, Hegewald J, Seidler A. The age-related risk of severe outcomes due to COVID-19 infection: a rapid review, meta-analysis, and meta-regression. Int J Environ Res Public Health. 2020;17(16):5974. Available at: https://www.mdpi.com/1660-4601/17/16/5974. Accessed 29 Mar 2022.

  18. Boulle A, Davies M-A, Hussey H, et al. Risk factors for COVID-19 death in a population cohort study from the Western Cape Province, South Africa. Clin Infect Dis. 2020. Available at: https://academic.oup.com/cid/article/73/7/e2005/5899044?login=true. Accessed 24 Mar 2022

  19. Stats SA. Quarterly Labour Force Survey Quarter 2: 2021. 2021. Available at: http://www.statssa.gov.za/publications/P0211/P02112ndQuarter2021.pdf. Accessed 22 Apr 2022

Acknowledgements

The authors would like to acknowledge the following organizations, without which this dashboard and its accompanying research would not have been realized:

The Western Cape Department of Health, The Western Cape Provincial Health Data Centre (Health Intelligence Directorate), The Western Cape Health Information Management Directorate, The Western Cape Centre for e-Innovation (Department of the Premier), The University of Cape Town School of Public Health and Family Medicine, and The Bill & Melinda Gates Foundation.

This work was supported, in whole or in part, by the Bill & Melinda Gates Foundation [INV-1164272, INV-004657]. Under the grant conditions of the Foundation, a Creative Commons Attribution 4.0 Generic License has already been assigned to the Author Accepted Manuscript version that might arise from this submission.

Funding

AB received support from the Bill and Melinda Gates Foundation (INV- 1164272, INV-004657).

Author information

Contributions

MI conceptualized the study, analysed, and interpreted the cross-sectional participant data and was a major contributor in writing the manuscript. EM and HH assisted in tool development, data collection, manuscript review and interpretation of findings. MP, TJ, MM, MS, TM, JLB, PD and AH assisted in tool development, data generation, pilot data collection as well as manuscript review. IL and MH assisted with developmental revisions of the tool and manuscript review. AB and MAD supported with academic supervision throughout the process, study conceptualization, data availability and manuscript revisions and review. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Muzzammil Ismail.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the University of Cape Town Human Research Ethics Committee (HREC REF 260/2021). All methods were carried out in accordance with ethical guidelines and regulations as determined by the UCT HREC. Informed consent was obtained from all respondents by signed digital agreement prior to opting in to complete the self-administered survey questionnaire.

Consent for publication

Not applicable. No individual or identifiable person data is included.

Competing interests

The authors declare that there are no known competing interests as it pertains to the research submitted.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Supplementary Table 1.

Questionnaire with possible response options. Supplementary Table 2. Number of views by continental region. Supplementary Table 3. Number of views per day by COVID-19 period (30 April 2020 – 29 April 2021). Supplementary Figure 1. Number of views per day by COVID-19 wave period. Supplementary Figure 2. Median number of views by day of week. Supplementary Figure 3. Number of views per day by weekday vs weekend*. Supplementary Table 4. Reported devices used by occupational status* (n=794). Supplementary Table 5. Reported frequency of use by age category* (n=794). Supplementary Table 6. Reported impact on personal planning and behaviour by element most often used* (n=794). Supplementary Table 7. Reported purpose of dashboard for participants* (n=1279).

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Ismail, M., Morden, E., Hussey, H. et al. Evaluation of a public COVID-19 dashboard in the Western Cape, South Africa: a tool for communication, trust, and transparency. BMC Public Health 22, 2453 (2022). https://doi.org/10.1186/s12889-022-14657-w

Keywords