FOOD FOR PEACE DATA QUALITY ASSESSMENT WEBINAR HANDOUTS


Contents

A. Data Quality Assessment Requirements and Best Practices
B. Illustrative DQA Process
C. Sample DQA Checklist and Procedures
D. Recommendations for Conducting Data Quality Assessments
E. Potential Pitfalls to Avoid
F. Resources

A. Data Quality Assessment Requirements and Best Practices

Introduction

Per the recently developed Food for Peace (FFP) Policy and Guidance for Monitoring, Evaluation, and Reporting, FFP development food assistance projects are required to conduct a data quality assessment (DQA) of the indicators they report annually to FFP.

1. What is a Data Quality Assessment (DQA)?

A DQA is a systematic and periodic review of the quality of the indicator data that FFP development projects report annually. For the purposes of the DQA, USAID defines data quality according to the following standards, as described in USAID ADS 203:

Validity: Data should clearly and adequately represent the intended result.
Integrity: Data collected should have safeguards to minimize the risk of transcription error or data manipulation.
Precision: Data should have a sufficient level of detail to permit management decision making; e.g., the margin of error is less than the anticipated change.
Reliability: Data should reflect stable and consistent data collection processes and analysis methods over time.
Timeliness: Data should be available at a useful frequency, should be current, and should be timely enough to influence management decision making.

2. What is the purpose of a DQA?

A DQA is one important method of improving data quality, with the ultimate goal of improving accountability and the ability to make well-informed programmatic and policy decisions. The purpose of the DQA is to understand the strengths and weaknesses of the data quality, and to identify factors contributing to lower quality and ways to improve it. A DQA is designed to:

- Verify the quality of the data
- Assess the system that produces that data
- Develop action plans to improve both

3. When and where should the awardee describe its plan for the DQA?

At a minimum, Awardees must provide a description of their plans for a DQA on an annual basis. Awardees must describe the DQA plan in the monitoring and evaluation (M&E) plan for the first 12 months of award implementation, and then in the Pipeline and Resource Estimate Proposal (PREP) for the subsequent fiscal years of program implementation. Note: If Awardees receive additional information over the course of a year, they may revise the DQA plan, speak to the Agreement Officer's Representative (AOR), and submit the revision.

4. What must the description of the plan for the DQA include?

- A list of indicators to be reviewed and justification for the selection
- Timeframe: timing and duration
- Any particular focus of the review
- Who will participate in the DQA: roles and qualifications

5. How should indicators be selected for the DQA?

Awardees should select annual indicators for the DQA based on the importance of the indicator to the theory of change (ToC), identified and perceived data quality risks associated with the indicator, timing and availability of staff, frequency and timing of data collection, and a number of other factors. Per USAID's Office of Food for Peace Policy and Guidance for Monitoring, Evaluation, and Reporting for Development Food Assistance Projects, all annual indicators reported on the basis of an annual beneficiary-based survey must be approved and must meet a variety of requirements pertaining to the survey methodology. As a result, FFP encourages Awardees to focus their DQA efforts on project-specific indicators from non-survey data collection methods. It is not necessary to assess every single project-specific indicator throughout the life of the award, but Awardees should have a plan for how a sample of selected indicators will provide sufficient evidence to assess overall data quality. For example, an awardee could categorize all its indicators by outcome/output or based on how indicators are collected (e.g., split indicators according to similar data flows), and then select a sample of indicators from each category (a minimal sampling sketch appears at the end of this section). Reviewers would reconstruct the flow of data for each selected indicator at selected sites to verify its quality, beginning from the initial point of collection through the highest level of reporting and use. Awardees may contact their AOR and FFP Regional M&E Specialist to request additional guidance on how to select indicators.

6. What aspects of the M&E system may awardees assess?

Awardees may structure their DQA process to examine the following functional areas of their M&E system, using the summary questions listed under each:

I. M&E structures, functions, and capabilities
1. Are key M&E and data-management staff identified with clearly assigned responsibilities?
2. Have the majority of key M&E and data-management staff received the required training?

II. Indicator definitions and reporting guidelines
3. Are there operational indicator definitions meeting relevant ADS standards (validity, integrity, precision, reliability, and timeliness) that are systematically followed by all collection and reporting sites?
4. Has the program/project clearly documented (in writing) what is reported to whom, and how and when reporting is required?

III. Data collection tools and reporting forms
5. Are there standard data-collection tools and reporting forms that are systematically used?
6. Are data recorded with sufficient precision/detail to measure relevant indicators?

IV. Processes of data verification, aggregation, processing, management, storage, and safeguarding
7. Are data maintained in accordance with international or national confidentiality guidelines?
8. Are source documents kept and made available in accordance with a written policy?
9. Does clear documentation of collection, aggregation, manipulation, storage, and safeguarding steps exist?
10. Are data quality challenges identified, and are mechanisms in place for addressing them?
11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
12. Are there clearly defined and followed procedures to periodically verify source data?

V. Data use and dissemination practices
13. Are there clearly defined and followed procedures for data use and dissemination?
14. Are data used and disseminated in time to inform management decisions?

VI. Links with national reporting systems (where relevant)
15. Does the data collection and reporting system of the program/project link to a national reporting system (where relevant)?

7. When and how should DQA results be submitted to FFP?

For each fiscal year, reports of the DQAs completed during the year, including a description of the DQA, the assessment findings, and actions taken in response to the findings, must be uploaded to the FFP Management Information System (FFPMIS) as part of the Annual Results Report (ARR).
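The categorize-and-sample approach described in question 5 can be illustrated with a short script. The following is a minimal sketch in Python, assuming a hypothetical list of project-specific indicators tagged with a data-flow category; the category names, indicator names, and sample size per category are invented for illustration and are not prescribed by FFP.

    import random

    # Hypothetical project-specific indicators, tagged by how their data are collected
    # (the category and indicator names below are illustrative only).
    indicators = [
        {"name": "Farmers trained", "category": "activity_registers"},
        {"name": "Hectares under improved practices", "category": "field_observation"},
        {"name": "Savings groups formed", "category": "activity_registers"},
        {"name": "Children screened for malnutrition", "category": "clinic_records"},
        {"name": "Latrines constructed", "category": "field_observation"},
    ]

    def sample_indicators(indicators, per_category=1, seed=2024):
        """Draw a simple random sample of indicators within each data-flow category."""
        rng = random.Random(seed)
        by_category = {}
        for ind in indicators:
            by_category.setdefault(ind["category"], []).append(ind)
        selected = []
        for category, group in sorted(by_category.items()):
            selected.extend(rng.sample(group, min(per_category, len(group))))
        return selected

    for ind in sample_indicators(indicators):
        print(ind["category"], "->", ind["name"])

Reviewers would then trace each sampled indicator from the initial point of collection through the highest level of reporting at the sampled sites, as described above.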

B. Illustrative DQA Process

FFP is not prescriptive about the process to follow when conducting a DQA; Awardees should have their own DQA processes. Below are some illustrative steps for conducting a DQA. In practice, these steps are often iterative.

Step 1. Develop an overall approach and schedule. This includes developing objectives, the process, etc.

Step 2. Identify the DQA team. Identify one person to lead the DQA process. This person is often an M&E expert. The leader is responsible for setting up the overall process, identifying team members, and establishing roles and responsibilities. If the project is a consortium, the leader should coordinate with sub-awardees to ensure the process is inclusive of all implementing partners.

Step 3. Identify the indicators and sites to be included in the review. This is often done by categorizing indicators (by outcome/output, based on data flows, program area importance, indicators with suspected data quality issues, etc.) and drawing a sample of indicators from each category and a sample of sites at which to conduct the DQA.

Step 4. Develop a budget and a logistics plan. Awardees should set aside a budget for an annual DQA exercise throughout the life of the award.

Step 5. Develop and pilot DQA tools or instruments. Examples may include topical guidance for specific key informants (or types of key informants) and worksheets to record findings for each indicator or type of indicator (e.g., a worksheet for outcome indicators and a different worksheet for output indicators). A minimal illustration of such a worksheet record appears at the end of this section.

Step 6. Train DQA reviewers. If more than one person is conducting the DQA, personnel should be trained to ensure the DQA is conducted consistently across reviewers.

Step 7. Conduct the DQA. Deploy a team to conduct DQA data collection based on an agreed schedule and using the agreed methodology and instruments.

Step 8. Prepare a draft DQA report. The team should record findings on the DQA instruments (worksheets) and include recommendations for action at the conclusion of each worksheet. The worksheets should be used to develop a report which should:

- Outline the overall approach and methodology used
- Highlight key data quality issues that are important for senior management
- Summarize recommendations for improving performance management systems

Step 9. Review the report. Implementing partners should have an opportunity to review the first draft. Any comments or issues can then be incorporated and the DQA finalized.

Step 10. Follow up on actions. Finally, it is important to ensure that there is a process to follow up on recommendations and that responsible parties are identified to carry them out. Some recommendations may be addressed internally by the project team. Other issues may need to be addressed across different project units or across implementing partners (if the project is a consortium).

Step 11. Submit the DQA report to FFP through FFPMIS as part of the ARR. The DQA report should include a description of the DQA, the assessment findings, and actions taken in response to the findings.
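As an illustration of the worksheets mentioned in Steps 5 and 8, the sketch below (in Python) shows one possible structure for recording findings, limitations, and recommendations for a single indicator at a single site. The field names and example values are assumptions for illustration; FFP does not prescribe a particular worksheet format.

    from dataclasses import dataclass, field
    from typing import Dict, List

    # One possible record for capturing DQA findings on a single indicator
    # (field names and values are illustrative, not an FFP requirement).
    @dataclass
    class DQAWorksheet:
        indicator: str
        site: str
        reviewer: str
        findings: Dict[str, str] = field(default_factory=dict)   # keyed by data quality standard
        limitations: List[str] = field(default_factory=list)
        recommendations: List[str] = field(default_factory=list)

    worksheet = DQAWorksheet(
        indicator="Farmers trained",        # hypothetical indicator
        site="District A field office",     # hypothetical site
        reviewer="M&E Officer",
    )
    worksheet.findings["integrity"] = "No double-entry check on paper-to-database transcription."
    worksheet.recommendations.append("Introduce a second-person verification step at data entry.")
    print(worksheet)

Collecting findings in a consistent structure like this makes it easier to aggregate worksheets into the draft report described in Step 8.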

C. Sample DQA Checklist and Procedures

This is a sample Data Quality Assessment (DQA) Checklist. Awardees may adapt this checklist to conduct their own DQAs or may decide to use a different checklist. This checklist is intended to assist in assessing each of the five aspects of data quality and to provide a convenient way to document the DQA findings.

Name of FFP Project/Organization:

Title of Performance Indicator: [The indicator should be copied directly from the Performance Indicator Reference Sheet]

Result this Indicator Measures (i.e., Specify the Project Purpose, Sub-purpose, etc.):

Data Source(s): [Information can be copied directly from the Performance Indicator Reference Sheet]

Partner or Contractor Who Provided the Data (only applicable for projects run by a consortium): [In addition to completing the checklist for its own organization, the Awardee should complete this checklist for each sub-awardee that contributes data to an indicator; it is the prime's responsibility to ensure the data quality of sub-contractors or sub-grantees.]

Period for Which the Data Are Being Reported:

Is this Indicator a project-specific indicator or an FFP/USAID indicator?
- FFP/USAID indicator
- Project-specific indicator (created by the project)

Data Quality Assessment Methodology: [Describe here or attach to this checklist the methods and procedures for assessing the quality of the indicator data, e.g., reviewing data collection procedures and documentation, interviewing those responsible for data analysis, checking a sample of the data for errors, etc.]

Date(s) of Assessment:

Assessment Team Members:

Team Leader Approval: X

For each question below, record Yes or No and add comments as needed.

VALIDITY: Data should clearly and adequately represent the intended result.
1. Would an outsider or an expert in the field agree that the indicator is a valid and logical measure for the stated result (e.g., a valid measure of overall nutrition is healthy variation in diet; age is not a valid measure of overall health)?
2. Does the indicator measure the contribution of the project?
3. Is there reasonable assurance that the data collection methods being used do not produce systematically biased data (e.g., consistently over- or under-counting)?
4. Are the people collecting data qualified and properly supervised?
5. Were known data collection problems appropriately addressed?

RELIABILITY: Data should reflect stable and consistent data collection processes and analysis methods over time.
1. Is a consistent data collection process used across time, locations, and data sources?
2. Are there appropriate written procedures in place for periodic review of data collection, maintenance, and processing?
3. Are the written procedures for periodic review of data collection, maintenance, and processing consistently practiced?
4. Are data collection and analysis methods documented in writing and being used to ensure the same procedures are followed each time?

TIMELINESS: Data should be available at a useful frequency, be current, and be timely enough to influence management decision making.
1. Are data available frequently enough to inform program management decisions?
2. Are data properly stored and readily available?
3. How has the information been used to make management decisions?

PRECISION: Data should have a sufficient level of detail to permit management decision making; e.g., the margin of error is less than the anticipated change.
1. Is the margin of error less than the expected change being measured (e.g., if a change of only 2% is expected and the margin of error in a survey used to collect the data is +/- 5%, then the tool is not precise enough to detect the change)?
2. Has the margin of error been reported along with the data (only applicable to results obtained through statistical samples)?
3. Is the data collection method/tool being used to collect the data fine-tuned or exact enough to register the expected change (e.g., a yardstick may not be a precise enough tool to measure a change of a few millimeters)?

INTEGRITY: Data collected should have safeguards to minimize the risk of transcription error or data manipulation.
1. Are procedures or safeguards in place to minimize data transcription errors?
2. Is there independence in key data collection, management, and assessment procedures?
3. Are mechanisms in place to prevent unauthorized changes to the data?

SUMMARY
Based on the assessment relative to the five standards, what is the overall conclusion regarding the quality of the data?
Significance of limitations (if any):
Actions needed to address limitations:

IF NO DATA ARE AVAILABLE FOR THE INDICATOR
If no recent relevant data are available for this indicator, why not?
What concrete actions are now being taken to collect and report these data as soon as possible?
When will data be reported?

COMMENTS
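The precision questions above turn on comparing a survey's margin of error with the change the project expects to detect. The following is a minimal sketch, in Python, of that comparison for a proportion estimated from a simple random sample; the baseline proportion, sample size, and expected change are invented for illustration, and real beneficiary-based surveys would also need to account for design effects such as clustering.

    import math

    def margin_of_error(p, n, z=1.96):
        """Approximate 95% margin of error for a proportion from a simple random sample."""
        return z * math.sqrt(p * (1 - p) / n)

    # Hypothetical numbers for illustration only.
    baseline_proportion = 0.40   # 40% of households practicing the promoted behavior
    sample_size = 385
    expected_change = 0.02       # the project expects a 2 percentage-point change

    moe = margin_of_error(baseline_proportion, sample_size)
    print(f"Margin of error: +/- {moe:.1%}")
    print("Precise enough to detect the expected change?", moe < expected_change)

With these illustrative numbers the margin of error is roughly +/- 4.9%, which is larger than the 2% expected change, so the tool would not be precise enough, mirroring the example in precision question 1.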

D. Recommendations for Conducting Data Quality Assessments

1. The Data Quality (DQ) assessor should make sure that he or she understands the precise definition of the indicator by checking the Performance Indicator Reference Sheet. Any issues of ambiguity should be addressed before the DQA is conducted.

2. The DQ assessor should have a copy of the methodology for data collection in hand before assessing the indicator. This information should be in the Monitoring and Evaluation (M&E) Plan's Performance Indicator Reference Sheet for each indicator. Each indicator should have a written description of how the data being assessed are supposed to be collected.

3. The prime and each sub-awardee (for projects run by a consortium) should have a copy of the data collection methodology in their files and documented evidence that they are collecting the data according to that methodology.

4. The DQ assessor should record the names and titles of all individuals involved in the assessment.

5. The project staff (and sub-awardees, when applicable) should have documented evidence that they have verified and cross-checked the data that have been reported. Project staff should be able to provide the DQ assessor with documents (the verification process, the person conducting the verification, field visit dates, persons met, activities visited, etc.) that demonstrate they have verified the reported data. Note: Data verification should be an ongoing process. A minimal illustration of such a cross-check appears at the end of this section.

6. The DQ assessor should be able to review the project's files/records against the methodology for data collection laid out in the project's M&E plan. Any data quality concerns should be documented.

7. The DQA report should include a summary of significant limitations found. A plan of action, including timelines and responsibilities, for addressing the limitations should be made.
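As an illustration of the verification and cross-checking described in recommendations 5 and 6, the sketch below (in Python) recomputes a reported indicator value from source records and flags any discrepancy. The record layout, site names, and figures are assumptions for illustration only.

    # Hypothetical source records (e.g., attendance sheets from training sessions).
    source_records = [
        {"site": "Village A", "participants": 28},
        {"site": "Village B", "participants": 31},
        {"site": "Village C", "participants": 24},
    ]

    reported_total = 85   # value reported for the indicator (illustrative)

    recomputed_total = sum(record["participants"] for record in source_records)
    discrepancy = reported_total - recomputed_total

    if discrepancy != 0:
        print(f"Discrepancy: reported {reported_total}, source documents support {recomputed_total}.")
    else:
        print("Reported value matches the source documents.")

Documenting each such check (who performed it, when, and against which source documents) provides the evidence of verification that the DQ assessor will look for.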

E. Potential Pitfalls to Avoid

1. Security issues: Inadequate security and access procedures, such as password protection, for data entry.

2. Lack of an assigned budget and personnel for the annual DQA: A budget is not allocated annually for DQAs, and personnel are not trained and assigned to conduct them.

3. Lack of traceability standards: Data on who collected the information, when, where, etc., are not systematically captured, so it is not possible to trace the data from the point of collection to the highest level of aggregation.

4. Filing system inconsistencies: Inconsistent hardcopy and/or electronic filing systems increase the possibility of error and wrongful manipulation.

5. Incomplete requirements in collection forms: Indicator collection forms lack the demographic information needed to complete the indicators' disaggregation requirements (e.g., date of birth for age disaggregation, sex of individuals, household type).

6. Insufficient training and knowledge refreshers: Significant differences in M&E staff's knowledge and understanding of the requirements for data collection, analysis, and reporting.

7. Inconsistent data collection methodology between the prime and sub-awardees (for projects run by a consortium): Despite efforts to standardize collection forms and procedures, the prime and sub-awardees continue to use different forms and procedures.

8. Insufficient standards to verify and cross-check data: Lack of indicator-specific duplication, verification, and data-entry checklists or processes to ensure that data are accurate, not duplicated, and free of transcription errors. A minimal duplicate-check sketch appears at the end of this section.

9. Insufficient follow-up on actions: Recommendations do not always translate into actions.
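A simple duplicate check of the kind referred to in pitfall 8 is sketched below in Python. It assumes a hypothetical beneficiary register keyed by an ID and date of birth; the field names and records are illustrative only.

    from collections import Counter

    # Hypothetical beneficiary register (illustrative field names and values).
    register = [
        {"beneficiary_id": "HH-001", "date_of_birth": "1984-03-12", "sex": "F"},
        {"beneficiary_id": "HH-002", "date_of_birth": "1979-07-02", "sex": "M"},
        {"beneficiary_id": "HH-001", "date_of_birth": "1984-03-12", "sex": "F"},  # duplicate entry
    ]

    def find_duplicates(records, key_fields=("beneficiary_id", "date_of_birth")):
        """Return keys that appear more than once in the register."""
        counts = Counter(tuple(record[f] for f in key_fields) for record in records)
        return [key for key, n in counts.items() if n > 1]

    print("Duplicate entries:", find_duplicates(register))

Running a check like this before aggregation, together with a second-person review of data entry, addresses both the duplication and transcription-error risks named in this pitfall.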

F. Resources

Section 3.2, "Data Quality Assurance, Management, and Safeguard," in the draft of USAID's Office of Food for Peace Policy and Guidance for Monitoring, Evaluation, and Reporting for Development Food Assistance Projects.

USAID's ADS 203: Assessing and Learning provides guidance on the five standards: validity, integrity, precision, reliability, and timeliness. Available at: http://www.usaid.gov/sites/default/files/documents/1870/203.pdf

MEASURE Evaluation Data Quality Assurance Tools provides a number of guidance documents and tools for conducting data quality assessments. Available at: http://www.cpc.unc.edu/measure/resources/tools/monitoring-evaluation-systems/dataquality-assurance-tools

USAID's 2010 Performance Monitoring & Evaluation TIPS: Conducting Data Quality Assessments. Available at: http://pdf.usaid.gov/pdf_docs/pnadw118.pdf

USAID's 2009 Performance Monitoring & Evaluation TIPS: Data Quality Standards. Available at: http://pdf.usaid.gov/pdf_docs/pnadw112.pdf