ROUTINE DATA QUALITY ASSESSMENT TOOL (RDQA)
ROUTINE DATA QUALITY ASSESSMENT TOOL (RDQA)
GUIDELINES FOR IMPLEMENTATION
DRAFT SEPTEMBER 28, 2007
The Global Fund to Fight AIDS, Tuberculosis and Malaria; Office of the Global AIDS Coordinator, U.S. President's Emergency Plan for AIDS Relief; USAID; WHO; UNAIDS; MEASURE Evaluation; others?
Acknowledgments

This tool was developed with input from a number of people from various organizations. Those most directly involved in its development included Karen Hardee of MEASURE Evaluation/JSI, Ronald Tran Ba Huy of The Global Fund to Fight AIDS, Tuberculosis and Malaria, Cyril Pervilhac and Yves Souteyrand of the World Health Organization, and David Boone of MEASURE Evaluation/JSI. The tool also benefited from feedback on the DQA for Auditing from participants in workshops and meetings held in Nigeria, Ghana, Senegal, South Africa, Vietnam, Switzerland and the U.S.

For more information, please contact:
- Karen Hardee or David Boone, MEASURE Evaluation/JSI, KHardee@jsi.com, DBoone@jsi.com
- Ronald Tran Ba Huy, The Global Fund, Ronald.Tranbahuy@Theglobalfund.org
- Cyril Pervilhac, WHO, pervilhacc@who.int
Contents

1. Background and Objectives
2. The Link Between the M&E System and Data Quality
3. Methodology
4. Implementation
5. Ethical Considerations
1- BACKGROUND AND OBJECTIVES

Programs are increasingly expected to provide high quality data to national governments and donors to measure success in reaching ambitious goals related to diseases such as Acquired Immunodeficiency Syndrome (AIDS), Tuberculosis (TB) and Malaria. In the spirit of the M&E component of the Three Ones,[1] the Stop TB Strategy and Roll Back Malaria, a number of multilateral and bilateral organizations have collaborated to develop tools to help Programs and projects improve the data management and reporting aspects of their M&E systems and to improve overall data quality.[2] The Routine Data Quality Assessment (RDQA) Tool, a greatly simplified version of the DQA for Auditing, is presented here. The RDQA provides a methodology to assess the ability of Programs/projects to collect and report quality data, to verify a sample of reported data, and to develop an action plan based on the findings. The RDQA can be used flexibly for a number of purposes, including initial assessment of M&E capacity and routine or periodic monitoring. For example, it can be used by Programs and/or partners as well as individual project M&E units for monitoring and evaluation, by donors who want to spot-check data quality, or even by individual program sites that want to verify that their reporting is accurate and timely. In addition, the RDQA can help programs/projects prepare for external data quality audits.[3]
Objective of the RDQA

The objectives of the RDQA are to:
- Verify rapidly that appropriate data management systems are in place;
- Verify rapidly the quality of reported data for key indicators at selected sites; and
- Develop an action plan for strengthening the M&E system and improving data quality.

2- LINK BETWEEN THE M&E SYSTEM AND DATA QUALITY

The RDQA has been developed around a multidimensional concept of data flowing through a Program/project M&E system that operates at three (or more) levels, and the seven dimensions of data quality that can pose challenges at each level of the system. Further, the RDQA identifies seven functional areas that need to be addressed to strengthen the data management and reporting system and to improve the quality of the data the system produces.

[1] The Three Ones principles were established in 2004 at a meeting co-hosted by UNAIDS, the United Kingdom and the United States, at which key donors reaffirmed their commitment to strengthening national AIDS responses led by the affected countries themselves. The Three Ones include: one agreed HIV/AIDS Action Framework that provides the basis for coordinating the work of all partners; one National AIDS Coordinating Authority, with a broad-based multisectoral mandate; and one agreed country-level Monitoring and Evaluation System ( Nov. 27, 2006).
[2] The two main tools are the Monitoring and Evaluation Systems Strengthening Tool (MESST) and the Data Quality Assessment Tool (with versions for auditing and for routine assessment).
[3] The U.S. Government has undertaken a number of program audits of PEPFAR programs that have included data quality, and the Global Fund is undertaking data quality audits on a subset of its grants each year.
A- Levels of the M&E System

Data collected, aggregated and reported to measure indicators flow through an M&E system that begins with the recording of an encounter between a client and a program staff member, a commodity distributed, or a person trained. Data are collected on source documents (e.g. patient records, client intake sheets, registers, training registers, commodity distribution logs, etc.). The data from source documents are aggregated and sent to a higher level (e.g. a district, a partner or prime recipient, or a sub-partner or sub-recipient) for further aggregation before being sent to the next level, culminating in aggregation at the highest level of a program (e.g. the M&E Unit of a National Program, the Principal Recipient of a Global Fund grant, or the SI Unit of a USG program). The data from countries are frequently sent to international offices for global aggregation to show progress in meeting goals related to health initiatives. Figure 1 illustrates this data flow in a data management and reporting system that includes service sites, districts and a national M&E Unit. Each country and program/project may have a different data flow. Challenges to data quality can occur at each of these levels.

Figure 1. Data Flow Through an Illustrative M&E System
[Figure: monthly reports of ARV numbers from five Service Delivery Points (45, 20, 75, 50 and 200, each backed by source documents) are aggregated into three district monthly reports (totals 65, 75 and 250), which are aggregated into the national monthly report (total 390).]

B- Data Quality Dimensions

The RDQA is grounded in the components of data quality: Programs/projects need accurate and reliable data that are complete, timely and precise, and that are maintained under conditions of confidentiality when appropriate (see Table 1). Furthermore, the data must have integrity and be maintained with appropriate levels of confidentiality to be considered credible.
Table 1. Data Quality Dimensions

Main dimensions of data quality:

Accuracy: Also known as validity. Accurate data are considered correct: the data measure what they are intended to measure. Accurate data minimize error (e.g., recording or interviewer bias, transcription error, sampling error) to a point of being negligible.

Reliability: The data generated by a program's information system are based on protocols and procedures that do not change according to who is using them and when or how often they are used. The data are reliable because they are measured and collected consistently.

Sub-dimensions of data quality:

Precision: The data have sufficient detail. For example, an indicator may require the number of individuals who received HIV counseling & testing and received their test results, by sex of the individual. An information system lacks precision if it is not designed to record the sex of the individual who received counseling and testing.

Completeness: The information system from which the results are derived is appropriately inclusive: it represents the complete list of eligible persons or units and not just a fraction of the list.

Timeliness: Data are timely when they are up-to-date (current) and when the information is available on time. Timeliness is affected by: (1) the rate at which the program's information system is updated; (2) the rate of change of actual program activities; and (3) when the information is actually used or required.

Integrity: Data have integrity when the systems used to generate them are protected from deliberate bias or manipulation for political or personal reasons.

Confidentiality: Clients are assured that their data will be maintained according to national and/or international standards for data.
This means that personal data are not disclosed inappropriately, and that data in hard copy and electronic form are treated with appropriate levels of security (e.g. kept in locked cabinets and in password-protected files).

C- Functional Areas to Strengthen Data Management and Reporting and Data Quality

To address data quality challenges throughout the data management and reporting system, it is important to focus on the key functional areas that programs/projects can address. Table 2 shows these functional areas and the questions that need to be answered in determining the strength of the M&E system.
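Two of the dimensions above, completeness and timeliness, lend themselves to simple automated checks when reports are received electronically. The following sketch is illustrative only; the field names and the deadline rule are assumptions, not part of the RDQA tool itself:

```python
# Illustrative check of two data quality dimensions from Table 1:
# completeness (all required fields present) and timeliness (submitted
# by the reporting deadline). Field names and the deadline are assumed.
from datetime import date

REQUIRED_FIELDS = ("site", "period", "value", "submitted")

def check_report(report, deadline):
    """Return (complete, on_time) for a single monthly report."""
    complete = all(report.get(field) is not None for field in REQUIRED_FIELDS)
    on_time = complete and report["submitted"] <= deadline
    return complete, on_time

# A report submitted two days before a (hypothetical) deadline.
report = {"site": "SDP 1", "period": "2007-09", "value": 45,
          "submitted": date(2007, 10, 3)}
print(check_report(report, deadline=date(2007, 10, 5)))  # (True, True)
```

Checks like this do not replace the on-site review described below; they only flag reports that warrant follow-up.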
Table 2. M&E Functional Areas and Key Questions to Address Data Quality
(The dimension of data quality addressed by each question is shown in parentheses.)

I. M&E Capabilities, Roles and Responsibilities
   1. Are key M&E and data-management staff identified with clearly assigned responsibilities? (Accuracy, Reliability)

II. Training
   2. Have the majority of key M&E and data-management staff received the required training? (Accuracy, Reliability)

III. Indicator Definitions
   3. Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points? (Accuracy, Reliability)

IV. Data Reporting Requirements
   4. Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required? (Accuracy, Reliability, Timeliness, Completeness)

V. Data Collection and Reporting Forms and Tools
   5. Are there standard data-collection and reporting forms that are systematically used? (Accuracy, Reliability)
   6. Are data recorded with sufficient precision/detail to measure relevant indicators? (Accuracy, Precision)
   7. Are data maintained in accordance with international or national confidentiality guidelines? (Confidentiality)
   8. Are source documents kept and made available in accordance with a written policy? (Ability to assess Accuracy, Precision, Reliability, Timeliness, Integrity and Confidentiality)

VI. Data Management Processes and Data Quality Controls
   9. Does clear documentation of collection, aggregation and manipulation steps exist? (Accuracy, Reliability)
   10. Are data quality challenges identified and are mechanisms in place for addressing them? (Accuracy, Reliability)
   11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports? (Accuracy, Reliability)
   12. Are there clearly defined and followed procedures to periodically verify source data? (Ability to assess Accuracy, Precision, Reliability, Timeliness, Integrity and Confidentiality)

VII. Links with National Reporting System
   13. Does the data collection and reporting system of the Program/project link to the National Reporting System? (To avoid parallel systems and undue multiple reporting burden)
Answers to these 13 questions can help highlight data quality challenges and the related aspects of the data management and reporting system. For example, if data accuracy is an issue, the RDQA can help assess whether reporting entities are using the same indicator definitions and whether they are collecting the same data, on the same forms, using the same instructions. The RDQA can help assess whether roles and responsibilities are clear (e.g. all staff know what data they are collecting and reporting, when, to whom and how) and whether staff have received relevant training. It can highlight whether indicator definitions are standard, etc. These are referred to in the RDQA as the functional areas of the M&E system related to collecting, aggregating and reporting data to measure indicators. Figure 2 shows the conceptual links between data quality, the levels of the M&E system and the functional areas that are embedded in the RDQA.
3- METHODOLOGY

The RDQA Checklist

The RDQA has a checklist (currently in Word, eventually also in Excel) with sections for service sites, intermediate sites and a central M&E Unit. The checklist for each of these levels includes questions to assess the data management and reporting system, and instructions for recounting data and comparing the recounts with data that have been previously reported. A copy of the Word version of the RDQA checklist is found in Annex 1.

1- Assessment of Data Management and Reporting Systems

The purpose of assessing the data management and reporting system is to identify potential challenges to data quality created by the design and implementation of data management and reporting systems. The system assessment portion of the checklist is arranged into the seven functional areas, with 13 key summary questions that are critical to evaluating whether the Program/project's data management system is well designed and implemented to produce quality data (Table 2 above).

2- Verifying Data

The purpose of this part of the RDQA is to assess, on a limited scale, whether service delivery and intermediate aggregation sites are collecting and reporting data to measure the indicator(s) accurately, completely and on time, and to cross-check the reported results with other data sources.
4- IMPLEMENTATION

The RDQA for self-assessment and improvement can be implemented in four steps:

Step 1 - PREPARATION
Determine the purpose, the levels of the system to be included, the indicator(s) and data sources, and the time period of the RDQA. Determine the facilities/sites to be visited and notify them.

A. Purpose

The RDQA checklist can be used for:
- Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data.
- Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking a certain time period's worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess the functioning of the entire Program/project's M&E system.
- Periodic assessment by donors of the quality of data being provided to them (this use of the RDQA could be more frequent and more streamlined than official data quality audits that use the DQA for Auditing, but less frequent than routine monitoring of data).
- Preparation for a formal data quality audit.

The RDQA is flexible for all of these uses. Countries and programs are encouraged to adapt the checklist to fit local program contexts.

B. Levels of the M&E System to be Included

Part of determining the scope of the RDQA is to decide which levels of the M&E system will be included in the assessment: service sites, intermediate aggregation sites and a central M&E unit. The levels will depend on the program that is the subject of the RDQA and should be determined once the appropriate levels have been identified and mapped (e.g. there are 100 sites providing the service in 10 districts). Reports from sites are sent to districts, which then send reports to the M&E Unit of the national program.
In some cases, the data flow map will include more than one intermediate level (e.g. regions, provinces or states, or multiple levels of program organizations).

C. Indicator(s), Data Sources and Reporting Period

The RDQA is designed to assess the M&E systems and to verify data related to indicators that are reported to programs or donors. Therefore, it is important to select one or more indicators, or at least program areas, to serve as the subject of the RDQA. This choice will be based on the list of indicators reported by the Program/project. For example, a program or project
focusing on the program area of treatment for HIV may report indicators of numbers of people on ART. Another program or project may focus on meeting the needs of orphans or vulnerable children; the indicators for that project would therefore be from the OVC program area. A malaria Program/project might focus on providing insecticide-treated bed nets (ITNs), on treating people for malaria, or on both of those activities. A Program/project may therefore report one indicator or (more likely) multiple indicators to measure success. Table 3 shows an example of program areas: the service delivery areas related to AIDS, TB and Malaria covered under Global Fund grants. Other donors may have somewhat different program areas.

Table 3. Illustrative Example of Program Areas (Global Fund Service Delivery Areas for AIDS, TB and Malaria)

HIV/AIDS
- Prevention: Condom Distribution; Testing and Counseling; PMTCT; STI Diagnosis and Treatment; Behavioral Change Communication - Community Outreach
- Treatment: Antiretroviral Treatment (ART) and Monitoring; Prophylaxis and Treatment for Opportunistic Infections
- Care and Support: Support for the chronically ill and families; Support for orphans and other children (OVC)

MALARIA
- Prevention: Insecticide-Treated Nets; Vector control (other than ITNs); Malaria Prevention in Pregnancy (IPT); Behavioral Change Communication - Community Outreach
- Treatment: Prompt, Effective Antimalaria Treatment; Home-based Management of Malaria

TB
- DOTS

For each program area, a number of indicators are measured through various data sources. For example, under the disease TB, in the program area Treatment, the international community has agreed on the harmonized indicator: Number of new smear-positive TB cases that successfully complete treatment. The data source for this indicator is facility-based and
the source documents are the district TB register along with the facility register and patient treatment cards. As another example, related to the disease AIDS, under the U.S. President's Emergency Plan for AIDS Relief (PEPFAR), a program area is Orphans and Vulnerable Children, and an indicator is: Number of OVC served by OVC programs (disaggregated by male, female, primary direct and supplemental direct). The data sources for this indicator will be community-based organizations that serve OVC, and the source documents will be client records (intake forms, daily logs, registers, etc).

The RDQA can be implemented for one or more indicators, but it is important to keep in mind that data come from various sources, most notably facility-based, community-based and commodity distribution-based data sources. When planning the RDQA, it is important to determine the data sources that will need to be assessed related to the indicator(s) and program area(s).

The RDQA is designed to assess data related to indicators during a specific (selected) time period, generally a reporting period. Using a specified reporting period gives a reference against which to compare the recounted data, and is the recommended method for the RDQA. If sites are large or reporting periods long (e.g. six months), the staff or team assessing the data might decide to review source documents for one day, one week or one month and to cross-check the data with registers or laboratory or pharmacy records.

D. Determine and Notify Sites to be Visited

The types of sites included will depend on the scope of the RDQA. Once the scope and levels have been set, the actual sites to be visited should be selected purposively. The list of sites may include those:
- Due for routine M&E monitoring during a given supervision cycle.
- Selected based on the indicator(s) under review or based on some other characteristic (e.g. all NGO sites offering the related service, or all implementing partners offering services in a program area, such as community-based prevention programs for AIDS, TB or Malaria).
- Selected sites of a new implementing partner/sub-partner or new principal recipient/sub-recipients.

Sites should be notified prior to the visit for the data quality assessment. This notification is important so that appropriate staff are available to answer the M&E systems questions in the checklist and to facilitate the data verification by providing access to relevant source documents.

Step 2 - SITE VISITS
Assess the Program/project's data management and reporting system at all or selected levels: the M&E Unit, any intermediate aggregation level and selected service sites. Verify data for the selected indicator(s) from source documents and compare with reported results.
Part 1 of the RDQA checklist, the systems assessment, should be administered at each of the levels of the M&E system that is included in the RDQA. There is a specific worksheet or part of the checklist for each level included in the RDQA (e.g. worksheets for service sites, intermediate aggregation sites and the M&E Unit). The questions should be asked of the staff member(s) at each level who are most familiar with the data management and reporting system. Table 2 above shows the seven functional areas of the M&E system that are included in the assessment of the data management and reporting system. Relevant questions are asked at each level of the M&E system.

Part 2 of the RDQA, data verifications, should also be filled out for each level included. At service sites, this includes recounting the number of people, cases or events recorded during the reporting period as contained in the relevant source documents. This number is compared to the number reported by the site during the reporting period (on a summary report sent to the next level). At the intermediate aggregation level, recounting includes re-aggregation of the numbers from reports received from all service delivery points and a comparison of that number with the aggregated result contained in the summary report prepared by the intermediate aggregation level and submitted to the next level. At the M&E Unit, recounting includes re-aggregating reported numbers from all reporting entities and comparing them to the summary report prepared by the M&E Unit for the reporting period. At this level, the reports should also be reviewed to count the number that are available, on time and complete, all measures of data quality.

Alternatively, the recount can be simplified by comparing the recounted results in the relevant register to the summary report.
Using this method, a sample of names from the register can be selected using the methodology for the cross-checks in Part 2 of the checklist. Those entries on the register can be compared to the same information on the source documents (e.g. name, age, sex, diagnosis, treatment date(s), treatment regimen, etc., as relevant for the indicator). If errors are found, the methodology of recounting from source documents (described in the paragraph above) should be used to thoroughly check the data.

Summary statistics that can be calculated from the RDQA include:

Service Sites:
- Strength of the Data Management System: the seven functional areas can be assessed using the 13 summary questions listed in Table 2 and appended to each checklist (note: in the Excel version to come, this will be automatically generated).
- Service Site Data Verification Rate: the recounted number as a percentage of the reported number. Under 100% indicates over-reporting and over 100% indicates under-reporting of results.

Intermediate Aggregation Sites:
- Strength of the Data Management System: assessed using the 13 summary questions listed in Table 2 and appended to each checklist (note: in the Excel version to come, this will be automatically generated).
- Intermediate Aggregation Site Data Verification Rate: the recounted number as a percentage of the reported number. Under 100% indicates over-reporting and over 100% indicates under-reporting of results.
- % of Reports that are: Available; On time; Complete.

M&E Unit:
- Strength of the Data Management System: assessed using the 13 summary questions listed in Table 2 and appended to each checklist (note: in the Excel version to come, this will be automatically generated).
- M&E Unit Data Verification Rate: the recounted number as a percentage of the reported number. Under 100% indicates over-reporting and over 100% indicates under-reporting of results.
- % of Reports that are: Available; On time; Complete.

Step 3 - ACTION PLAN
Based on the findings from the RDQA, prepare an action plan for strengthening the data management system and for improving the quality of data.

Part 3 of the RDQA has a template for an action plan that is based on the findings from the RDQA and includes follow-up actions, responsibilities, timelines and resource needs.

Examples of findings related to the design and implementation of the Program/project's data collection, reporting and management systems include:
- The lack of documentation describing aggregation and data manipulation steps.
- Unclear and/or inconsistent directions provided to reporting sites about when or to whom report data are to be submitted.
- The lack of designated staff to review and question submitted site reports.
- The lack of a formal process to address incomplete or inaccurate submitted site reports.
- The lack of a required training program for site data collectors and managers.
- Differences between program indicator definitions and the definitions cited on the Program/project's data collection forms.
- The lack of standard data collection forms.

Examples of findings related to verification of data produced by the data management system could include:
- A disconnect between the delivery of services and the filling out of source documents.
- Incomplete or inaccurate source documents.
- Data entry and/or data manipulation errors.
- Misinterpretation or inaccurate application of the indicator definition.
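The summary statistics described above are simple to compute once the recounted and reported figures are in hand. The sketch below is a hypothetical illustration; the figures and field names are invented for the example and are not drawn from the tool:

```python
# Hypothetical sketch of two RDQA summary statistics: the data verification
# rate (recounted result as a percentage of the reported result) and the
# percentage of reports that are available, on time and complete.

def verification_rate(recounted, reported):
    """Under 100% suggests over-reporting; over 100% suggests under-reporting."""
    return 100.0 * recounted / reported

def report_percentages(reports):
    """reports: list of dicts with boolean 'available', 'on_time', 'complete'."""
    n = len(reports)
    return {key: 100.0 * sum(r[key] for r in reports) / n
            for key in ("available", "on_time", "complete")}

# Example: a site reported 75 clients, but source documents support only 68.
rate = verification_rate(68, 75)
print(f"Verification rate: {rate:.1f}%")  # below 100%, i.e. over-reporting

reports = [
    {"available": True,  "on_time": True,  "complete": True},
    {"available": True,  "on_time": False, "complete": True},
    {"available": False, "on_time": False, "complete": False},
]
print(report_percentages(reports))
```

In the planned Excel version these figures would be generated automatically; the calculation is the same.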
Step 4 - FOLLOW UP
Implement the activities included in the RDQA action plan and follow up with relevant sites/locations to ensure implementation.
The purpose of the RDQA is to strengthen M&E systems for collecting, managing and reporting data related to indicators. It will be important to follow up to ensure that the strengthening measures identified in the action plan are actually carried out.

5- ETHICAL CONSIDERATIONS

Data quality assessments must be conducted with the utmost adherence to the ethical standards of the country. While those undertaking the RDQA may require access to personal information (e.g. medical records), under no circumstances should any personal information be disclosed in relation to the conduct of the assessment.
Annex 1: RDQA Checklist (with sheets for Service Sites, Intermediate Aggregation Sites and the Central M&E Unit)
ROUTINE DATA QUALITY ASSESSMENT
Service Delivery Points Assessment and Verification Sheet

This ROUTINE DATA QUALITY ASSESSMENT (RDQA) framework comprises three complementary checklists that enable assessment of the quality of data at each level of the data collection and reporting system (i.e., M&E Unit, Intermediate Aggregation Levels and Service Delivery Points). This questionnaire should be used at Service Delivery Points. It is divided into three parts:

PART 1 - Systems Assessment. Assessment of the systems in place to collect, check and report data to the next level of aggregation (e.g. District, Province or National).
PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for a given time period.
PART 3 - Action Plan. Summary of key findings and needed improvements, identifying issues to be followed up at the various levels of the M&E system.

Background Information
- Name of Program/project:
- Name of Funding Agency (if appropriate):
- Name of Central M&E Unit:
- Indicator(s) Selected:
- Reporting Period Verified:
- Documentation Reviewed:
- Date of Verifications:

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data collection and reporting system at Service Delivery Sites. Answer each question Yes or No, with comments; the relevant functional area* is shown in brackets.

1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit). [I]
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff. [I]
3. All relevant staff have received training on the data management processes and tools. [II]
4. The M&E Unit has provided written guidelines to the Service Delivery Point on reporting requirements and deadlines. [IV]
5. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. [I, II, V]
6. The source documents and reporting forms/tools specified by the M&E Unit are consistently used by the Service Delivery Point. [V]
7. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system). [V]
8. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc).
9. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
10. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
11. The data collected on the source documents have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics). [III, V]
12. Relevant personal data are maintained according to national or international confidentiality guidelines.
13. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
14. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
15. When applicable, the data are reported through a single channel of the national reporting system. [VII]

* The 7 functional areas are:
I. M&E Capabilities, Roles and Responsibilities
II. Training (of M&E staff)
III. Indicator Definitions
IV. Data Reporting Requirements
V. Data Collection and Reporting Forms and Tools
VI. Data Management Processes and Data Quality Controls
VII. Links with National Reporting System

PART 2 - Data Verifications

A- Documentation Review: Review the availability and completeness of all indicator source documents for the selected reporting period. (Yes / No, with comments)

1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.
B- Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain any discrepancies. (% or number, with comments)

4. Recount the number of people, cases or events recorded during the reporting period by reviewing the source documents. [A]
5. Copy the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the % difference between recounted and reported numbers. [A/B]
7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?

Note: Recounting can be simplified by recounting from registers and comparing to summary reports. The registers can be verified using cross-checks with source documents (e.g. client records). See C- below for cross-checks.

C- Cross-check reported results with other data sources: For example, cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period, to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).

8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for any discrepancy observed?
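The bidirectional cross-check described in C- above can be sketched in code. This is a hypothetical illustration under stated assumptions; the record identifiers (e.g. "PT-001") and the sample size are invented, not part of the checklist:

```python
# Hypothetical sketch of the Part 2 C- cross-check: randomly sample patient
# cards and verify them against the register, then sample the register and
# verify against the cards, i.e. check in both directions.
import random

def cross_check(cards, register, sample_size=20):
    """Return (IDs on sampled cards missing from the register,
               IDs in the sampled register missing from the cards)."""
    card_sample = random.sample(cards, min(sample_size, len(cards)))
    missing_in_register = [c for c in card_sample if c not in register]
    register_sample = random.sample(register, min(sample_size, len(register)))
    missing_in_cards = [r for r in register_sample if r not in cards]
    return missing_in_register, missing_in_cards

cards = [f"PT-{i:03d}" for i in range(1, 41)]  # 40 patient treatment cards
register = cards[:38]                          # register missing two entries
missing_from_register, missing_from_cards = cross_check(cards, register)
print("On cards but not in register:", missing_from_register)
print("In register but not on cards:", missing_from_cards)
```

Any identifier turned up by such a check would prompt the thorough recount from source documents described in B- above.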
PART 3 - Recommendations for the Service Site

Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.

(Description of Action Point | Requirement or Recommendation (please specify) | Timeline)

Systems Assessment Questions by Functional Area (for reference)

Functional Areas:
I. M&E Capabilities, Roles and Responsibilities
II. Training
III. Indicator Definitions
IV. Data Reporting Requirements
V. Data Collection and Reporting Forms and Tools
VI. Data Management Processes and Data Quality Controls
VII. Links with National Reporting System

Summary Questions:
- Are key M&E and data-management staff identified with clearly assigned responsibilities?
- Have the majority of key M&E and data-management staff received the required training?
- Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points?
- Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?
- Are there standard data-collection and reporting forms that are systematically used?
- Are data recorded with sufficient precision/detail to measure relevant indicators?
- Are data maintained in accordance with international or national confidentiality guidelines?
- Are source documents kept and made available in accordance with a written policy?
- Does clear documentation of collection, aggregation and manipulation steps exist?
- Are data quality challenges identified and are mechanisms in place for addressing them?
- Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
- Are there clearly defined and followed procedures to periodically verify source data?
- Does the data collection and reporting system of the Program/project link to the National Reporting System?
ROUTINE DATA QUALITY ASSESSMENT
Intermediate Levels Assessment and Verification Sheet

This ROUTINE DATA QUALITY ASSESSMENT (RDQA) framework comprises three complementary checklists that enable assessment of the quality of data at each level of the data-collection and reporting system (i.e., M&E Unit, Intermediate Aggregation Levels and Service Delivery Points). This questionnaire should be used at the Intermediate Aggregation Levels (e.g., districts, regions, provinces). It is divided into three parts:

PART 1 - Systems Assessment. Assessment of the systems in place to aggregate data from sub-reporting levels (e.g., Service Delivery Points) and report results to the next level (e.g., the National M&E Unit).
PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for a given time period.
PART 3 - Action Plan. Summarize key findings and needed improvements. Identify issues to be followed up at the various levels of the M&E system.

Background Information
- Name of Program/project
- Name of Funding Agency (if appropriate)
- Name of Central M&E Unit
- Indicator(s) Selected
- Reporting Period Verified
- Documentation Reviewed
- Date of Verifications

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the data-collection and reporting system at Intermediate Aggregation Levels (e.g., districts, regions, provinces).

Data Management System Questions (Functional Area*; Yes/No; Comments)

1. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points). (I)
2. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit). (I)
3. All relevant staff have received training on the data management processes and tools. (II)
4. The M&E Unit has provided written guidelines to the Intermediate Aggregation Level on reporting requirements and deadlines. (IV)
5. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. (I, II, IV, V)
6. The source documents and reporting forms/tools specified by the M&E Unit are consistently used by all reporting levels. (V)
7. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system). (V)
8. Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
9. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
10. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
11. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
12. Relevant personal data are maintained according to national or international confidentiality guidelines.
13. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
14. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
15. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
16. If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
17. When applicable, the data are reported through a single channel of the national reporting system. (VII)

* The 7 functional areas are: I. M&E Capabilities, Roles and Responsibilities; II. Training (of M&E staff); III. Indicator Definitions; IV. Data Reporting Requirements; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes and Data Quality Controls; VII. Links with National Reporting System.

PART 2 - Data Verifications

D- Re-aggregate reported numbers from all Service Delivery Points: Reported results from all Service Delivery Points should be re-aggregated and the total compared to the number contained in the summary report prepared by the Intermediate Aggregation Site. (% or No.; Comments)

4. What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [A]
5. Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [B]
6. Calculate the % difference between recounted and reported numbers. [A/B]
7. What are the reasons for any discrepancy observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?
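The re-aggregation check in section D above can be sketched as a short calculation. This is illustrative only; the summary figure and Service Delivery Point counts below are invented:

```python
# [A] aggregated result in the Intermediate Aggregation Site's summary report
summary_reported = 410

# [B] results re-aggregated from the individual Service Delivery Point reports
site_reports = [100, 150, 150]
reaggregated = sum(site_reports)

# Step 6: ratio of reported to re-aggregated numbers [A/B]
ratio = summary_reported / reaggregated
print(f"reported/re-aggregated: {ratio:.1%}")  # prints "reported/re-aggregated: 102.5%"
# A result above 100% suggests over-reporting at the aggregation step;
# the reasons (step 7) still have to be established from the documents.
```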
E- Review availability, completeness and timeliness of reports from all Service Delivery Points: How many reports should there have been from all Service Delivery Points? How many are there? Were they received on time? Are they complete? (% or No.; Comments)

1. How many reports should there have been from all Service Delivery Points? [A]
2. How many reports are there? [B]
3. Calculate % Available Reports. [B/A]
4. Check the dates on the reports received. How many reports were received on time (i.e., by the due date)? [C]
5. Calculate % On-time Reports. [C/A]
6. How many reports were complete (i.e., containing all the required indicator data*)? [D]
7. Calculate % Complete Reports. [D/A]

* For a report to be considered complete, it should at least include (1) the reported count relevant to the indicator; (2) the reporting period; (3) the date of submission of the report; and (4) a signature from the staff member who submitted the report.

PART 3 - Recommendations for the Intermediate Aggregation Level

Based on the findings of the systems review and data verification at the intermediate aggregation site, please describe any compliance requirements or recommended strengthening measures, with an estimate of the length of time the improvement measure could take. (See the systems assessment questions by functional area in the reference table below for a review of the system.) Action points should be discussed with the Program.

(Description of Action Point | Requirement or Recommendation (please specify) | Timeline; add more rows as needed)
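The availability, timeliness and completeness percentages in section E above are three ratios over the expected number of reports [A]. A brief sketch with invented counts:

```python
expected = 20   # [A] reports due from all Service Delivery Points
received = 18   # [B] reports actually on file
on_time = 15    # [C] reports received by the due date
complete = 16   # [D] reports containing all required indicator data

print(f"% available: {received / expected:.0%}")  # [B/A] -> 90%
print(f"% on time:   {on_time / expected:.0%}")   # [C/A] -> 75%
print(f"% complete:  {complete / expected:.0%}")  # [D/A] -> 80%
```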
Systems Assessment Questions by Functional Area (for reference)

Functional Areas:
I. M&E Capabilities, Roles and Responsibilities
II. Training
III. Indicator Definitions
IV. Data Reporting Requirements
V. Data Collection and Reporting Forms and Tools
VI. Data Management Processes and Data Quality Controls
VII. Links with National Reporting System

Summary Questions:
- Are key M&E and data-management staff identified with clearly assigned responsibilities?
- Have the majority of key M&E and data-management staff received the required training?
- Are there operational indicator definitions meeting relevant standards that are systematically followed by all service points?
- Has the Program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?
- Are there standard data-collection and reporting forms that are systematically used?
- Are data recorded with sufficient precision/detail to measure relevant indicators?
- Are data maintained in accordance with international or national confidentiality guidelines?
- Are source documents kept and made available in accordance with a written policy?
- Does clear documentation of collection, aggregation and manipulation steps exist?
- Are data quality challenges identified and are mechanisms in place for addressing them?
- Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
- Are there clearly defined and followed procedures to periodically verify source data?
- Does the data collection and reporting system of the Program/project link to the National Reporting System?
ROUTINE DATA QUALITY ASSESSMENT
M&E Unit Assessment and Verification Sheet

This ROUTINE DATA QUALITY ASSESSMENT (RDQA) framework comprises three complementary checklists that enable assessment of the quality of data at each level of the data-collection and reporting system (i.e., M&E Unit, Intermediate Aggregation Levels and Service Delivery Points). This questionnaire should be used at the central M&E Unit. It is divided into three parts:

PART 1 - Systems Assessment. Assessment of the design and implementation at the M&E Unit of the data management and reporting system.
PART 2 - Data Verifications. Review of the availability, completeness and accuracy of reported data for a given time period.
PART 3 - Follow-up Recommendations and Action Plan. Summarize key findings and needed improvements. Identify issues to be followed up at the various levels of the M&E system.

Background Information
- Name of Program/project
- Name of Funding Agency (if appropriate)
- Name of Central M&E Unit
- Indicator(s) Selected
- Reporting Period Verified
- Documentation Reviewed
- Date of Verifications

PART 1 - Systems Assessment

This assessment is designed to help identify potential risks to data quality linked to the design and implementation of the data-management and reporting system.

Questions (Functional Area; Yes/No; Comments)

1. There is a documented organizational structure/chart that clearly identifies positions that have data management responsibilities at the M&E Unit (specify which unit, e.g., MoH, NAP, GF, World Bank). (I)
2. All staff positions dedicated to M&E and data management systems are filled. (I)
3. A senior staff member (e.g., the Program Manager) is responsible for reviewing the aggregated numbers prior to the submission/release of reports from the M&E Unit. (I)
4. There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness and confidentiality) received from sub-reporting levels (e.g., regions, districts, service points). (I)
5. There is a training plan which includes staff involved in data collection and reporting at all levels in the reporting process. (II)
6. All relevant staff have received training on the data management processes and tools. (II)
7. The M&E Unit has documented and shared the definition of the indicator(s) with all relevant levels of the reporting system (e.g., regions, districts, service points). (III)
8. There is a description of the services that are related to each indicator measured by the Program/project. (III)
9. The M&E Unit has provided written guidelines to all reporting entities (e.g., regions, districts, service points) on reporting requirements and deadlines. (IV)
10. If multiple organizations are implementing activities under the Program/project, they all use the same reporting forms and report according to the same reporting timelines. (V)
11. The M&E Unit has identified a standard source document (e.g., medical record, client intake form, register, etc.) to be used by all service delivery points to record service delivery. (V)
12. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels. (V)
13. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools. (I, II, V)
14. The data collected by the M&E system have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc., if the indicator specifies disaggregation by these characteristics). (V)
15. There is a written policy that states for how long source documents and reporting forms need to be retained. (V)
16. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system). (V)
17. The M&E Unit has clearly documented the data aggregation, analysis and/or manipulation steps performed at each level of the reporting system.
18. Feedback is systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness).
19. There are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
20. There is a written back-up procedure for when data entry or data processing is computerized.
21. If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
22. Relevant personal data are maintained according to national or international confidentiality guidelines.
23. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
24. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
25. There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues.
26. If data discrepancies have been uncovered in reports from sub-reporting levels (e.g., districts or regions), the M&E Unit has documented how these inconsistencies have been resolved.
27. The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed.
28. When applicable, the data are reported through a single channel of the national reporting system. (VII)

* The 7 functional areas are: I. M&E Capabilities, Roles and Responsibilities; II. Training (of M&E staff); III. Indicator Definitions; IV. Data Reporting Requirements; V. Data Collection and Reporting Forms and Tools; VI. Data Management Processes and Data Quality Controls; VII. Links with National Reporting System.

PART 2 - Data Verifications

F- Re-aggregate reported numbers from all reporting entities: Reported results from all reporting entities (e.g., Districts or Regions) should be re-aggregated and the total compared to the number contained in the summary report prepared by the M&E Unit. (% or No.; Comments)

4. What aggregated result was contained in the summary report prepared by the M&E Unit? [A]
5. Re-aggregate the numbers from the reports received from all reporting entities. What is the re-aggregated number? [B]
6. Calculate the % difference between recounted and reported numbers. [A/B]
More informationIndependent Validation of the Internal Auditing Self-Assessment
Minnesota State Colleges & Universities Office of Internal Auditing Independent Validation of the Internal Auditing Self-Assessment Final Report March 7, 2007 Reference Number: 2007-03-004 INDEPENDENT
More informationCertificate of Recognition (COR ) COR Program Guidelines. Infrastructure Health & Safety Association (IHSA) 05/17 1
Certificate of Recognition (COR ) COR Program Guidelines Infrastructure Health & Safety Association (IHSA) 05/17 1 Table of Contents Contents Page 1. Program Guideline Overview... 4 1.1 Scope... 4 1.2
More information2017 Archaeology Audit Program Procedure Manual. April 2017
2017 Archaeology Audit Program Procedure Manual April 2017 Table of Contents Contents Table of Contents... 2 1.0 Introduction and Scope... 3 2.0 Audit Objectives... 3 3.0 Audit Procedures... 4 3.1 Audit
More informationTHE GOLD STANDARD PROGRAMME OF ACTIVITIES GUIDANCE DOCUMENT
THE GOLD STANDARD PROGRAMME OF ACTIVITIES GUIDANCE DOCUMENT This document provides detailed instructions and guidance on the steps to be followed for the development of Gold Standard Programme of Activities
More informationCountry Harmonization and Alignment Tool (CHAT)
Country Harmonization and Alignment Tool (CHAT) A tool to address harmonization and alignment challenges by assessing strengths and effectiveness of partnerships in the national AIDS response UNAIDS/07.17E
More informationReport. Quality Assessment of Internal Audit at <Organisation> Draft Report / Final Report
Report Quality Assessment of Internal Audit at Draft Report / Final Report Quality Self-Assessment by Independent Validation by Table of Contents 1.
More informationProvision of Consultancy Services to Conduct trapca Tracer Study and Impact Assessment of Training Programmes. May Terms of Reference.
Provision of Consultancy Services to Conduct trapca Tracer Study and Impact Assessment of Training Programmes. May 2015 Overview Terms of Reference 1 Established in December 2006, the Trade Policy Centre
More informationInternal Audit of Gambia Country Office
Internal Audit of Gambia Country Office April 2016 Office of Internal Audit and Investigations (OIAI) Report 2016/03 Internal Audit of Gambia Country Office (2016/03) 2 Summary The Office of Internal Audit
More informationA Guide to Clinical Coding Audit Best Practice Version 8.0
A Guide to Clinical Coding Audit Best Practice Version 8.0 Copyright 2017 Health and Social Care Information Centre Page 1 of 17 The Health and Social Care Information Centre is a non-departmental body
More informationUNICEF Evaluation Office Terms of Reference: External Assessment of UNICEF s Global Evaluation Reports Oversight System (GEROS)
UNICEF Evaluation Office Terms of Reference: External Assessment of UNICEF s Global Evaluation Reports Oversight System (GEROS) The Evaluation Office is planning to assess the Global Evaluation Reports
More information1 This Market Practice is being issued by the FXC and FMLG to facilitate implementation of the PB
To: Adherents to the ISDA Derivatives/FX PB Business Conduct Allocation Protocol From: Foreign Exchange Committee and Financial Markets Lawyers Group 1 Date: August 19, 2013 Re: Best Practices for Designation
More informationWUNGENING ABORIGINAL CORPORATION
WUNGENING ABORIGINAL CORPORATION Employment Information Pack Date: August 2017 Version: 3 Page 1 of 2 Congratulations on taking the first step towards working for Wungening Aboriginal Corporation (Wungening),
More informationAUDIT UNDP COUNTRY OFFICE SOUTH AFRICA. Report No Issue Date: 22 September 2014 [REDACTED]
UNITED NATIONS DEVELOPMENT PROGRAMME AUDIT OF UNDP COUNTRY OFFICE IN SOUTH AFRICA Report No. 1313 Issue Date: 22 September 2014 [REDACTED] Table of Contents Executive Summary i I. About the Office 1 II.
More informationThe IEG report How Effective Have Poverty and Social Impact Analyses Been? was discussed by CODE on June 3, 2009
Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized Public Disclosure Authorized FAST TRACK BRIEF June 3, 2009 The IEG report How Effective Have Poverty and Social Impact
More informationPRINCE2 - Quality Management Strategy
Created/updated 05/11/17 PRINCE2 - Quality Management Strategy Downloaded from stakeholdermap.com. Visit Prince2 Templates for more Prince2 downloads. Get a Mind Map Quality Management Strategy template
More informationEvaluation Policy for GEF Funded Projects
Evaluation Policy for GEF Funded Projects Context Conservation International helps society adopt the conservation of nature as the foundation of development. We do this to measurably improve and sustain
More informationThe Episcopal Diocese of Kentucky
The Episcopal Diocese of Kentucky Internal Control Questionnaire Manual of Business Methods in Church Affairs (Spring 2012) Chapter II: Internal Controls, Section C The following Internal Control Questionnaire
More informationCountries in the WHO African
Immunization monitoring and vaccine-preventable diseases surveillance data management in the African Region Alain Poy, Balcha Masresha, Keith Shaba, Reggis Katsande, Goitom Weldegebriel, Blanche Anya,
More informationWHO Prequalification of In Vitro Diagnostics Programme
P r e q u a l i f i c a t i o n T e a m - D i a g n o s t i c s Information for Manufacturers on the Manufacturing Site(s) Inspection (Assessment of the Quality Management System) WHO Prequalification
More informationRequest for Proposal (RFP)
Request for Proposal (RFP) RFP # Reference Number 17630-NG-007 Funding Agencies: USAID Activity MEASURE Evaluation: OVC MER Outcome Monitoring in Nigeria Services Required Country Data collection, data
More informationThe National Statistical System in Italy. Coordinating function of Istat and the quality of statistics operations in the system
The National Statistical System in Italy. Coordinating function of Istat and the quality of statistics operations in the system Presentation outline The Italian national statistical system: institutional
More informationClinical Mentorship Manual for Intergrated Services
Clinical Mentorship Manual for Intergrated Services January 2011 Contents Foreword 1 Acknowledgements 3 Acronyms 4 1 Introduction to Clinical Mentorship 6 2 Goal and Objectives of Clinical Mentorship
More informationAssessing the Development Effectiveness of Multilateral Organizations: Guidance on the Methodological Approach
Assessing the Development Effectiveness of Multilateral Organizations: Guidance on the OECD DAC Network on Development Evaluation Revised June 2012 Table of Contents Acronyms... ii 1.0 Introduction...
More informationcertificate of recognition
certificate of recognition Small Business Audit Document / Version 1.0 www.bccsa.ca BC Construction Safety Alliance 2010 BC Construction Safety Alliance (BCCSA) The BC Construction Safety Alliance (BCCSA)
More informationGxP Auditing, Remediation, and Staff Augmentation
GxP Auditing, Remediation, and Staff Augmentation TABLE OF CONTENTS 3 Introduction 4 GxP Auditing 4 GMP Auditing 5 GCP Auditing 5 GLP Auditing 6 Pharmacovigilance Auditing 6 Vendor/Supplier Auditing 7
More informationStudy Files and Filing
Study Files and Filing The current version of all Hillingdon Hospital R&D Guidance Documents and Standard Operating Procedures are available from the R&D Intranet and Internet sites: www.ths.nhs.uk/departments/research/research.htm
More informationGxP Auditing, Remediation, and Quality System Resourcing
GxP Auditing, Remediation, and Quality System Resourcing TABLE OF CONTENTS 3 Introduction 4 GxP Auditing 4 GMP Auditing 5 GCP Auditing 6 GLP Auditing 7 Pharmacovigilance Auditing 7 Vendor/Supplier Auditing
More informationPRACTICAL EXPERIENCE CERTIFICATE FOR INTERNATIONALLY TRAINED CANDIDATES
PRACTICAL EXPERIENCE CERTIFICATE FOR INTERNATIONALLY TRAINED CANDIDATES Please read the following information prior to completing the experience certification form as an applicant applying for admission
More informationREPORT 2016/067 INTERNAL AUDIT DIVISION. Audit of management of national staff recruitment in the United Nations Assistance Mission for Iraq
INTERNAL AUDIT DIVISION REPORT 2016/067 Audit of management of national staff recruitment in the United Nations Assistance Mission for Iraq Overall results relating to the effective management of national
More informationTitle: How to assess a PQS testing laboratory Table of Content
Version: 01.05 Effective date: 08/07/2004 Page: 2 of 22 Table of Content 1.0 Purpose... 3 2.0 Scope... 3 3.0 Responsibility... 3 4.0 Documentation required... 4 5.0 Procedure... 4 5.1 Assessment Register...
More informationInternal Oversight Division. Audit Report. Audit of the Management of WIPO Customer Services
Internal Oversight Division Reference: IA 2015-07 Audit Report Audit of the Management of WIPO Customer Services December 22, 2015 IA 2015-07 2. TABLE OF CONTENTS LIST OF ACRONYMS... 3 EXECUTIVE SUMMARY...
More informationAssessment of the Design Effectiveness of Entity Level Controls. Office of the Chief Audit Executive
Assessment of the Design Effectiveness of Entity Level Controls Office of the Chief Audit Executive February 2017 Cette publication est également disponible en français. This publication is available in
More informationCompliance Monitoring and Enforcement Program Implementation Plan. Version 1.7
Compliance Monitoring and Enforcement Program Table of Contents TABLE OF CONTENTS NERC Compliance Monitoring and Enforcement Program... 1 Introduction... 2 NERC Compliance Monitoring and Enforcement Program
More informationWhether you take in a lot of money. or you collect pennies
Whether you take in a lot of money or you collect pennies ..it is important to maintain good cash handling procedures: Segregation of Duties Security Reconciliation Management Review Documentation It s
More informationAPPLICANT'S CHECKLIST
APPLICANT'S CHECKLIST Research tissue bank (or other research biobank) REC Ref: Title of Tissue Bank: Applicant's Name: Name of Designated Individual: Please complete this checklist and send it with your
More informationGxP Auditing, Remediation, and Staff Augmentation
GxP Auditing, Remediation, and Staff Augmentation TABLE OF CONTENTS 3 Introduction 4 GxP Auditing 4 GMP Auditing 5 GCP Auditing 6 GLP Auditing 7 Pharmacovigilance Auditing 7 Vendor/Supplier Auditing 8
More informationUnited Nations Children s Fund (UNICEF) Phnom Penh, Cambodia
United Nations Children s Fund (UNICEF) Individual Consultant Technical Adviser to support the implementation of Cambodia PROTECT: A Communication Strategy to End Violence against Children and Unnecessary
More informationQuality Management as Knowledge Sharing: Experiences of the Napa County Health and Human Services Agency
Journal of Evidence-Based Social Work ISSN: 1543-3714 (Print) 1543-3722 (Online) Journal homepage: http://www.tandfonline.com/loi/webs20 Quality Management as Knowledge Sharing: Experiences of the Napa
More informationIn-class Exercise BSAD 141 Entity/Activity Table, Context Diagram, Physical DFD exercise
In-class Exercise BSAD 141 Entity/Activity Table, Context Diagram, Physical DFD exercise 1. Using the entity/ activity table, identify information /data processing activities (put a check next to the row
More informationTHE INDEPENDENT PHARMACIST S GUIDE TO THE ON-SITE AUDIT Amanda C. Fields, General Counsel for American Pharmacies (APRx) American Pharmacies 2011
THE INDEPENDENT PHARMACIST S GUIDE TO THE ON-SITE AUDIT Amanda C. Fields, General Counsel for American Pharmacies (APRx) American Pharmacies 2011 Introduction As the Texas pharmacists know, audits by insurance
More information12 Components. A functional national system for monitoring and evaluating the response to HIV - AIDS
Central American Course Monitoring and Evaluation for HIV/AIDS Policy and Program Management 1 2 3 4 Module 2 Unit 3 12 Components A functional national system for monitoring and evaluating the response
More information