Tools for Building Monitoring and Evaluation Systems in Public Works Projects


Tools for Building Monitoring and Evaluation Systems in Public Works Projects
Laura B. Rawlings, Lead Social Protection Specialist, Human Development Network, World Bank
Making Public Works Work: South-South Learning Forum, Arusha, June 2010

Objectives of this session
1. Functions of Monitoring and Evaluation (M&E) Systems
2. Structuring M&E Systems for Programs
3. Monitoring and Evaluation Tools
4. Operationalizing M&E Systems
5. Quick Tips!

Objectives of Monitoring and Evaluation Systems
Why do we need good M&E systems?
- Managing for results: information to improve decision making and steer development interventions towards clearly defined goals.
- Accountability and transparency: information to track performance and report on progress to stakeholders and civil society.
- Knowledge generation on development effectiveness: evidence on the outcomes of development programs.
There is no blueprint for an M&E system; systems should be designed to reflect the needs of program managers and policymakers. Simple approaches are often the best!

Principles of Management for Development Results*
1. Focusing the dialogue on results at all phases of the development process
2. Aligning programming and M&E with results
3. Keeping measurement and reporting simple and cost-effective
4. Managing for, not by, results
5. Using results information for learning and decision making
Results-based M&E: tools for performance measurement to increase the effectiveness of development interventions.
* Roundtable on Managing for Results, 2004

Objectives of this session
1. Functions of Monitoring and Evaluation (M&E) Systems
2. Structuring M&E Systems for Public Works Programs
3. Monitoring and Evaluation Tools
4. Operationalizing M&E Systems
5. Quick Tips!

The Cycle of Project-Level M&E
1. Formulating Objectives
2. Selecting Indicators (be selective)
3. Setting Baselines and Targets
4. Monitoring and Evaluation
5. Reviewing and Reporting on Performance
Throughout the cycle: engage stakeholders, promote transparency, communicate.

1. Formulating Objectives: Using a Results Chain
Results chains are a simple approach to mapping the causal logic/theory of change underpinning a program:
- best used as a participatory tool during project design
- the basis for constructing an M&E approach that will test the validity of the theory of change
A results chain answers three questions:
- What are the intended results of the program?
- How will we achieve the intended results?
- How will we know we have achieved the intended results?

The Results Chain
Inputs → Activities → Outputs → Outcomes → Final Outcomes
- Inputs: financial, human, and other resources mobilized to support activities (budgets, staffing, other available resources).
- Activities: actions taken or work performed to convert inputs into specific outputs; the series of activities undertaken to produce goods and services.
- Outputs: the products of converting inputs into tangible goods and services, produced and delivered under the control of the implementing agency.
- Outcomes: changes resulting from use of outputs by the targeted population; not fully under the control of the implementing agency.
- Final Outcomes: the final objective of the program; long-term goals; changes in outcomes with multiple drivers.
Inputs, activities, and outputs sit on the supply side; outcomes and final outcomes depend on demand as well as supply.
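To make the mapping concrete, the sketch below represents a results chain as a simple data structure, filled with illustrative entries for a hypothetical public works program. The stage names follow the chain above; the example results are assumptions for illustration, not drawn from the presentation.

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    """One link in a results chain: a stage name plus its intended results."""
    name: str
    results: list[str] = field(default_factory=list)

# Illustrative chain for a hypothetical public works program.
results_chain = [
    Stage("Inputs", ["Program budget", "Field staff", "Tools and materials"]),
    Stage("Activities", ["Register beneficiaries", "Run road-repair worksites"]),
    Stage("Outputs", ["Workdays of employment provided", "Kilometers of road repaired"]),
    Stage("Outcomes", ["Increased household income among participants"]),
    Stage("Final Outcomes", ["Reduced poverty in targeted communities"]),
]

# Print the causal logic in chain order, as a results chain reads.
for stage in results_chain:
    print(f"{stage.name}: {'; '.join(stage.results)}")
```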

Critical factors for defining results to be achieved
- Socio-economic context: results sought by the program should reflect current needs and priorities.
- Local capacity: existing skills, leadership, and management capacity will affect what can be implemented to achieve expected results.
- Resources: the level of resources will affect what can realistically be achieved.
- Timetable: the results framework should identify the results (changes) to be achieved within the life of the program.

2. Selecting Key Performance Indicators; 3. Setting Baselines and Targets
- Use the results chain/logframe as a guide.
- Select indicators along the results chain (activities, outputs, outcomes).
- Be clear on the use of indicators.
- Be selective, and work with stakeholders to identify the best indicators.
Indicators should be SMART:
- Specific: measure as closely as possible what you want to know.
- Measurable: be clear about how it will be measured.
- Attributable: logically and closely linked to the program's efforts.
- Realistic: data obtainable at reasonable cost, frequency, and accuracy.
- Targeted: specific to the program's target group.
Collect baselines and set targets.
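As a minimal sketch of how indicator definitions, baselines, and targets might be held together in practice, the hypothetical record below stores one SMART-style indicator and computes progress toward its target. The field names and example values are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A performance indicator with its baseline and target values."""
    name: str
    unit: str
    baseline: float   # value at program start
    target: float     # value to reach by end of program
    current: float    # latest measured value

    def progress(self) -> float:
        """Share of the baseline-to-target distance covered so far (0.0 to 1.0)."""
        span = self.target - self.baseline
        if span == 0:
            return 1.0  # target equals baseline: nothing left to achieve
        return (self.current - self.baseline) / span

# Hypothetical output indicator for a public works program.
workdays = Indicator(
    name="Workdays of employment provided",
    unit="person-days",
    baseline=0,
    target=50_000,
    current=32_500,
)
print(f"{workdays.name}: {workdays.progress():.0%} of target")  # 65% of target
```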

4. Monitoring and Evaluation
Monitoring: a continuous process of collecting and analyzing information
- to compare how well a project, program, or policy is performing against expected results
- to inform implementation and program management
Often based on administrative data; a navigation system tailored to different users.
Evaluation: a selective, systematic, objective assessment of an ongoing or completed project, program, or policy, its design, implementation, and/or results
- to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact, and sustainability
- to generate lessons learned to inform the decision-making process
Tailored to key questions; often based on specialized data and surveys.

Indicator Planning Matrix
- Expected Results (Outcomes & Outputs): obtained from the development plan and results framework.
- Indicators (with Baselines & Indicative Targets): obtained from the results framework; indicators should also capture key priorities such as capacity development and gender.
- Data Source: how are the data to be obtained? For example, through a survey, a review, or a stakeholder meeting.
- Time or Schedule and Frequency: frequency of data availability.
- Responsibilities: who is responsible for organizing the data collection and verifying data quality and source?
- Analysis and Reporting: frequency of analysis, analysis method, and responsibility for reporting.
- Resources: estimate of resources required and committed for carrying out planned M&E activities.
- End Use: who will receive and review this information? What purpose does it serve?
- Risks: what are the risks and assumptions for carrying out the planned M&E activities? How may these affect the planned M&E events and the quality of data?
Source: adapted from Handbook on Planning, Monitoring and Evaluating for Development Results (UNDP).
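One row of such a matrix can be captured as a structured record so it can be stored, checked, and reported alongside the indicators themselves. The sketch below is a minimal illustration; all field values are hypothetical.

```python
# One row of an indicator planning matrix, captured as a plain dictionary.
# All values are illustrative, not drawn from an actual program.
matrix_row = {
    "expected_result": "Increased household income among participants",
    "indicator": "Average monthly income of beneficiary households",
    "baseline": "USD 45/month (2009 survey)",
    "target": "USD 60/month by end of program",
    "data_source": "Household survey",
    "frequency": "Annual",
    "responsibility": "M&E unit, with survey firm",
    "analysis_and_reporting": "Annual report to program steering committee",
    "resources": "Budgeted under M&E component",
    "end_use": "Program management and donor reporting",
    "risks": "Survey delays; low response rates",
}

# A simple completeness check: every cell in the row should be filled in.
missing = [key for key, value in matrix_row.items() if not value]
print("Row complete" if not missing else f"Missing fields: {missing}")
```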

Monitoring and Evaluation Tools: Examples from Public Works Projects
Monitoring:
- Management Information Systems (MIS): costs, coverage, service delivery, compliance, beneficiary profiles, financial data
- Spot check system
- Complaints and appeals systems
Evaluation:
- Financial and operational audits
- Process evaluations
- Participatory evaluations: social audits, beneficiary assessments, community scorecards
- Targeting assessments
- Impact evaluations
- Cost-benefit analysis
- Expenditure tracking studies

Sources of Monitoring Data: Tips!
Administrative databases (MIS)
- Essential, but difficult to ensure that they cover all relevant units (clients, customers, households, etc.) and that all the fields you need are filled in, and filled in accurately.
- Require a good deal of quality control; most likely to be accurate if the line workers and other agency personnel depend on their accuracy for their daily work.
Funding data
- Needed for efficiency and productivity measures, but difficult to know whether they include all real costs for an intervention and exclude funds actually going to other activities.
- Don't take them at face value; probe knowledgeable people about what they contain.
Spot check systems
- A good approach for verifying the accuracy of administrative/MIS data.
- Can involve review of records, or surveys of providers, beneficiaries, and other stakeholders.
- Can involve trained observation; works best on easily observed and quantified measures.
Complaints and appeals systems
- A good source of information for troubleshooting.
- Not representative of projects or beneficiaries as a whole, so be careful drawing conclusions.
Source: adapted from Burt, Greiner.
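As one way to picture a spot check, the sketch below compares MIS records against independently verified field values for a sample of beneficiaries and reports the discrepancy rate. The record layout, matching key, and values are assumptions for illustration.

```python
# Hypothetical spot check: compare MIS records with field-verified values
# for a sample of beneficiary IDs, and report the share that disagree.

mis_records = {  # beneficiary_id -> workdays recorded in the MIS
    "B001": 20, "B002": 18, "B003": 22, "B004": 20, "B005": 15,
}
field_checks = {  # same IDs, values verified during a field visit
    "B001": 20, "B002": 12, "B003": 22, "B004": 19, "B005": 15,
}

mismatches = [
    bid for bid in field_checks
    if mis_records.get(bid) != field_checks[bid]
]
rate = len(mismatches) / len(field_checks)
print(f"Checked {len(field_checks)} records; {rate:.0%} disagree: {mismatches}")
```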

Sources of Monitoring Data: Tips!
You don't always need complex computer systems!
- Data can be generated from hand-processed forms at a local or point-of-service level, then aggregated and computerized at a provincial or higher level.
- The more frequently data are collected or processed, the more computerization and networking are desirable.
Take advantage of routine tasks
- Use routine steps in the project and subproject cycle to gather and use key data, such as entering beneficiary data into the MIS to review targeting and program eligibility (see the sketch below).
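A minimal sketch of that idea, assuming a simple income-based eligibility rule: the same routine that registers a beneficiary in the MIS also flags records that fall outside the program's target group, so targeting problems surface during data entry rather than in a later review. The rule and threshold are hypothetical.

```python
# Hypothetical routine-task check at the point of MIS data entry.

ELIGIBLE_MAX_INCOME = 50  # hypothetical targeting threshold, e.g. USD/month
registry = []

def register_beneficiary(beneficiary_id: str, monthly_income: float) -> bool:
    """Record the beneficiary and return whether they meet the targeting rule."""
    eligible = monthly_income <= ELIGIBLE_MAX_INCOME
    registry.append({"id": beneficiary_id, "income": monthly_income,
                     "eligible": eligible})
    if not eligible:
        print(f"Flag for review: {beneficiary_id} exceeds income threshold")
    return eligible

register_beneficiary("B101", 38)   # within threshold
register_beneficiary("B102", 72)   # flagged at entry
```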

Sources of evaluation data
Methods range from less to more rigor/difficulty (cost, time, skills): key informant interviews, community interviews, focus group interviews, direct observation, review of official records, one-time surveys, censuses, panel surveys, and impact evaluations, with increasing statistical emphasis, validity, reliability, and credibility toward the more rigorous end.
The choice depends on:
1. The issue to be examined
2. The quality of the information needed
3. The time frame in which the information is needed
4. Cost
Source: Kusek, Karthouri.

5. Reviewing and Reporting on Performance
The feedback process: program managers, together with evaluation, planning, and budget units, can
- review M&E results on a periodic basis
- hold "How are we doing?" sessions
- develop action plans and responses to results: Are program changes needed? How will we continue to track progress?
- report results to others
- use the information to inform program management and policy

Getting it done!
Champions:
- Inside the program: managers and key stakeholders
- Outside the program: policymakers; Ministries of Finance, Congress, budget offices; citizens, media
Changing the culture: from threats to tools.