Table 2 Steps of the evaluation process provided by the identified evaluation approaches, along with the absence or presence of the different practical elements retrieved from the analysis

From: Surveillance systems evaluation: a systematic review of the existing approaches

| References | Organisation | Steps | Practical evaluation elements present | Practical evaluation elements absent |
| --- | --- | --- | --- | --- |
| [5] | Structured roadmap | Context of the surveillance system; Evaluation questions; Process for data collection and management; Findings; Evaluation report; Following up | List of evaluation attributes (13); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (only general questions); Lack of attributes’ selection matrix |
| [18] | Structured roadmap - Worksheets (checklist) | - | Methods and tools for the assessment: questionnaire and worksheets; Visual representation of the results: bar and radar charts | No case study presentation; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of attributes’ selection matrix |
| [19] | Structured roadmap - Application guide | Resources assessment; Indicators; Data sources assessment; Data management assessment; Data quality assessment; Information dissemination and use | Methods and tools for the assessment: scoring guide; Visual representation of the results (graphs) | No case study presentation; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of attributes’ selection matrix |
| [20] | Structured roadmap | Plan to evaluate; Prepare to evaluate; Conduct the evaluation; Dissemination and use of the results | List of evaluation attributes (10); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (only general questions); Lack of attributes’ selection matrix |
| [21] | Structured roadmap | Preparation for the evaluation; Documentation and evaluation of the surveillance system; Evaluation of the capacity of the surveillance system; Outcome of the evaluation | Type/knowledge of evaluator(s): Ministry of Health (national, provincial or district levels); List of evaluation attributes (8); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [22] | General roadmap | Initial evaluation; Intermediate evaluation; Final evaluation | List of evaluation attributes (16); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [23] | General roadmap | Usefulness of the activities and outputs; Technical performance; Fulfilment of contract objectives | Type/knowledge of evaluator(s): three to four evaluators (5 years of expertise in surveillance on communicable diseases for the team leader, plus a laboratory expert and an expert in epidemiology); List of evaluation attributes (7) | No case study presentation; Lack of visual representation of the results; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [24] | General roadmap | System description; Outbreak detection; System experience; Conclusions and recommendations | List of evaluation attributes (9); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [25] | Structured roadmap - Questionnaire | Usefulness of the operation; Quality of the outputs; Development of the national surveillance system; Technical performance; Structure and management | Type/knowledge of evaluator(s): experts in international surveillance on communicable diseases; List of evaluation attributes (6) | No case study presentation; Lack of visual representation of the results; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [26] | General roadmap | Engage the stakeholders; Describe the surveillance system; Evaluation design; Performance of the surveillance system; Conclusions and recommendations; Findings and lessons learned | List of evaluation attributes (10); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [27] | Structured roadmap - Questionnaire - Scoring guide - Worksheets | Design the evaluation; Implement the evaluation; Finalisation | Case study presentation (cf. Table 1); Visual representation of the results through diagrams (pie charts, histogram, radar chart); Type/knowledge of evaluator(s): requires little knowledge and experience related to surveillance; List of evaluation attributes (10) and performance indicators; Methods and tools for the assessment: questionnaire, scoring guide and worksheets | Lack of definitions of evaluation attributes; Lack of attributes’ selection matrix |
| [28] | Structured roadmap - Application guide | Scope of evaluation; Surveillance system characteristics; Design the evaluation; Conduct the evaluation; Report | Case study application (cf. Table 1); Visual representation of the results through colour-coding (green, orange, red); Type/knowledge of evaluator(s): “Anyone familiar with epidemiological concepts and with a reasonable knowledge of the disease under surveillance”; List of evaluation attributes (22); Definitions of evaluation attributes; Attributes’ selection matrix | Lack of methods and tools for the assessment (only references provided) |
| [29, 32] | Structured roadmap - Questionnaire - Scoring guide | Description of the surveillance system; Identification of the priority objectives; Building of dashboard and indicators; Implementation and follow-up; Updates and audit | Case study presentation (cf. Table 1); Provides performance indicators | Lack of visual representation of the results; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [30] | General roadmap | Priority setting; Scientific basis and relevance; Analytic soundness and feasibility; Interpretation and utility | Provides performance indicators | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [31] | General roadmap | Text analysis; Program conceptual model; Comparison; Validation | Case study presentation (cf. Table 1) | Lack of visual representation of the results; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |