Impact Evaluation for Safety Nets


Impact Evaluation for Safety Nets: Methods and Applications. Patrick Premand (Africa HD), Safety Nets Course, December 2013. This material builds on Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B., and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington DC (www.worldbank.org/ieinpractice). The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

Outline From M&E to Impact Evaluation The Main Concepts of Impact Evaluation Choosing the best IE design for your project Randomization as an operational tool Illustration of IE results Operational aspects

From M&E to Impact Evaluation

The Results Chain in a Typical Program
INPUTS: Financial, human, and other resources mobilized to support activities (e.g. budget, staffing).
ACTIVITIES: Actions taken or work performed to convert inputs into specific outputs (e.g. training, studies, construction).
OUTPUTS: Project deliverables within the control of the implementing agency, the supply side (e.g. training plan completed, cash transfer delivered, road constructed, school built).
OUTCOMES: Use of outputs by beneficiaries and stakeholders, outside the control of the implementing agency, the demand side (e.g. new practices adopted, use of the road, school attendance up, health service use up).
LONGER-TERM OUTCOMES / HIGHER-ORDER GOALS: Changes in outcomes that have multiple drivers (e.g. poverty reduced, income inequality reduced, labor productivity increased).
Traditional M&E focuses mainly on inputs through outputs, while impact evaluation focuses on outcomes and higher-order goals; results-based management spans the full chain.

Monitoring vs. Evaluation
Frequency: monitoring is regular and continuous; evaluation is periodic.
Coverage: monitoring covers all programs; evaluation covers selected programs or aspects.
Data: monitoring uses universal data; evaluation is sample-based.
Depth of information: monitoring tracks implementation and looks at WHAT is happening; evaluation is tailored, often to performance and impact, and asks WHY.
Cost: monitoring costs are spread out; evaluation costs can be high.
Utility: monitoring serves continuous program improvement and management; evaluation informs major program decisions.

Evaluations: a systematic, objective assessment of an on-going or completed project, program, or policy, of its design, implementation and/or results, asking:
- Descriptive questions, which seek to determine what is taking place and describe aspects of a process.
- Normative questions, which compare what is taking place to what should be taking place (PROCESS EVALUATION).
- Cause-and-effect questions, which examine outcomes and assess what difference the intervention makes to outcomes (IMPACT EVALUATION).

Impact Evaluation is not for every project. Evaluate impact selectively, when the project is innovative; replicable, scalable, or implemented at scale; strategically relevant (e.g. a large budget); when the evaluation will fill a knowledge gap; and when substantial policy impact is expected. Impact evaluation can also focus on selected innovations within projects: moving beyond "does my program work?" towards "which design is more effective?"

Public Works (THIMO) in Côte d'Ivoire. Emergency Youth Employment and Skills Development Project (US$45 million), with a public works component and a skills development component (apprenticeships, internships, professional training, entrepreneurship training, ...). Public works: 12,500 youths (aged 18-30) by 2015; daily wage rate of CFA 2,500 for 6 months. Graduation elements in public works: entrepreneurship training to help youth enter into self-employment, and sensitization on wage employment opportunities to help youth transition into wage jobs. Also: payment into bank accounts, basic life skills training.

Key questions for the impact evaluation of Public Works in Côte d'Ivoire. Basic question: What is the impact of participation in the public works program on employment and earnings of youths and their households? Design question (1): Does the provision of basic entrepreneurship training facilitate the creation of household enterprises after graduation? Design question (2): Does the provision of sensitization on wage employment opportunities facilitate insertion into wage jobs after graduation?

The Niger Safety Nets Project (US$70 million), a building block for a national social protection system, with cash transfer and cash-for-work components. Large-scale cash transfer program: 80,000 households by 2017; unconditional cash transfer of US$20/month for 24 months (PMT targeting). Accompanying measures as soft conditionalities: an 18-month parenting training (community assemblies, meetings, household visits) covering holistic early childhood development, which aims to trigger behavioral changes. A similar approach is used in several new projects in Africa.

Key questions for the impact evaluation of cash transfers in Niger. Basic question: Does the cash transfer program reduce poverty, improve food security, and improve children's nutrition? Design question: Do the accompanying measures (parenting training) generate higher impacts on children's nutrition and development?

The Main Concepts of Impact Evaluation

Impact Evaluation needs to be distinguished from other evaluations The objective of impact evaluation is to estimate the causal effect or impact of a program on outcomes of interest.

The Objective Estimate the causal effect (impact) of intervention (P) on outcome (Y). (P) = Program or Treatment (Y) = Outcome Indicator, Measure of Success Example: What is the effect of a cash transfer program (P) on Household Consumption (Y)?

Solution Estimate what would have happened to outcomes (Y) in the absence of the program (P). We call this the Counterfactual.
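In potential-outcomes notation (an addition to the slide, but standard), the impact and the counterfactual can be written as

\[
\text{Impact} \;=\; \underbrace{E[\,Y_1 \mid P=1\,]}_{\text{observed outcome of participants}} \;-\; \underbrace{E[\,Y_0 \mid P=1\,]}_{\text{counterfactual, never observed}}
\]

The second term, the outcome participants would have obtained without the program, can never be observed directly, so it has to be estimated from a credible comparison group.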

Communicating complex concepts in 3 slides. Example: What is the impact of giving Ali additional money (P) on Ali's consumption (Y)?

The Perfect Clone: Ali, who receives the additional money, eats 6 candies; Ali's clone, who does not, eats 4 candies. IMPACT = 6 - 4 = 2 candies.

In reality, we use statistics: compare the average outcome of a treatment group (average Y = 6 candies) with the average outcome of a comparison group (average Y = 4 candies). IMPACT = 6 - 4 = 2 candies.
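As a minimal sketch of this "use statistics" step, with invented simulated data rather than any real program data: under randomized assignment the impact estimate is simply the difference in mean outcomes between the treatment and comparison groups.

import numpy as np

rng = np.random.default_rng(seed=1)  # fixed seed so the example is reproducible

# Invented data: 1,000 households randomly assigned to treatment or comparison.
n = 1_000
treated = rng.integers(0, 2, size=n).astype(bool)   # random assignment
true_effect = 10                                     # effect built into the simulation
consumption = rng.normal(100, 20, size=n) + true_effect * treated

# Impact estimate = difference in means between treatment and comparison groups.
y_t, y_c = consumption[treated], consumption[~treated]
impact = y_t.mean() - y_c.mean()

# Conventional standard error for a difference in means (unequal variances).
se = np.sqrt(y_t.var(ddof=1) / len(y_t) + y_c.var(ddof=1) / len(y_c))
print(f"Estimated impact: {impact:.1f} (SE {se:.1f}); true effect was {true_effect}")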

Choosing the best IE design for your project

Finding good comparison groups. We want to find clones for the Alis in our programs. The treatment and comparison groups should have identical characteristics, except for benefiting from the intervention. In practice, use program eligibility and assignment rules to construct valid estimates of the counterfactual.

Two false counterfactuals to avoid! Before vs After: compare the same individuals before and after they receive P; problem: other things may have happened over time. Enrolled vs Not Enrolled: compare a group of individuals enrolled in a program with a group that chooses not to enroll; problem: selection bias, since we don't know why some are not enrolled. Both comparisons lead to biased estimates of the counterfactual and hence of the impact.
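The two false counterfactuals can be illustrated with a toy simulation (all numbers invented): a common time trend biases the before-vs-after comparison, self-selection of more motivated households biases the enrolled-vs-not-enrolled comparison, and only the randomized comparison recovers the effect built into the data.

import numpy as np

rng = np.random.default_rng(seed=2)
n, true_effect = 5_000, 10.0

motivation = rng.normal(0, 10, size=n)              # unobserved trait driving enrollment and outcomes
y_before = 100 + motivation + rng.normal(0, 15, size=n)
time_trend = 8.0                                     # everything improves over time, program or not

# Self-selected enrollment: more motivated households are more likely to enroll.
enrolled = rng.random(n) < 1 / (1 + np.exp(-motivation / 5))
y_after = y_before + time_trend + true_effect * enrolled + rng.normal(0, 15, size=n)

before_after = (y_after[enrolled] - y_before[enrolled]).mean()          # absorbs the time trend
enrolled_vs_not = y_after[enrolled].mean() - y_after[~enrolled].mean()  # absorbs selection bias

# Benchmark: randomized assignment of the same program.
assigned = rng.random(n) < 0.5
y_after_rand = y_before + time_trend + true_effect * assigned + rng.normal(0, 15, size=n)
randomized = y_after_rand[assigned].mean() - y_after_rand[~assigned].mean()

print(f"True effect:     {true_effect:.1f}")
print(f"Before vs after: {before_after:.1f}  (biased upward by the time trend)")
print(f"Enrolled vs not: {enrolled_vs_not:.1f}  (biased by self-selection)")
print(f"Randomized:      {randomized:.1f}  (close to the true effect)")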

The conversation needs to start early. A retrospective evaluation is necessary when we have to work with a program that has already been rolled out and with existing data; this is rarely feasible (is there baseline data? information on targeting?). In a prospective evaluation, the evaluation is designed in parallel with the program (and targeting decisions). This is the way to go: ensure baseline data are collected and a comparison group exists.

Where do good Comparison Groups come from? The rules of program operation determine the evaluation strategy. We can almost always find a valid comparison group if: the operational rules for selecting beneficiaries are equitable, transparent and accountable; the evaluation is designed prospectively. Evaluation design and program design go hand-in-hand.

5 methods in the IE toolbox, which take different approaches to generate comparison groups and estimate the counterfactual: 1 Randomized Assignment; 2 Randomized Promotion; 3 Regression Discontinuity Design (RDD); 4 Difference-in-Differences (DD); 5 Matching.
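As one concrete illustration of a toolbox method (an addition, not on the slide), the simplest difference-in-differences estimator compares the change in average outcomes over time in the treatment group with the change in the comparison group:

\[
\widehat{\Delta}_{DD} \;=\; \left(\bar{Y}^{\,T}_{\text{after}} - \bar{Y}^{\,T}_{\text{before}}\right) \;-\; \left(\bar{Y}^{\,C}_{\text{after}} - \bar{Y}^{\,C}_{\text{before}}\right)
\]

This differences out any fixed gap between the two groups and any common time trend, under the assumption that both groups would have followed parallel trends in the absence of the program.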

Choosing an IE design for your program. Design the IE prospectively to generate good comparison groups and collect baseline data. Three operational questions determine which method is appropriate for a given program:
- Money: Does the program have sufficient resources to achieve scale and reach full coverage of all eligible beneficiaries?
- Targeting rules: Who is eligible for program benefits? Is the program targeted based on an eligibility cut-off, or is it available to everyone?
- Timing: How are potential beneficiaries enrolled in the program: all at once, or in phases over time?

Choosing your IE method(s), based on money (excess demand or not), targeting, and timing:
Excess demand:
- Phased roll-out, targeted: Randomized assignment; RDD
- Phased roll-out, universal: Randomized assignment; Randomized promotion; DD with matching
- Immediate roll-out, targeted: Randomized assignment; RDD
- Immediate roll-out, universal: Randomized assignment; Randomized promotion; DD with matching
No excess demand:
- Phased roll-out, targeted: Randomized assignment; RDD
- Phased roll-out, universal: Randomized assignment to phases; Randomized promotion to early take-up; DD with matching
- Immediate roll-out, targeted: RDD
- Immediate roll-out, universal: if less than full take-up, Randomized promotion; DD with matching

Choosing the IE method in Niger: the same decision grid applies to the Niger safety nets project, which relied on randomized assignment through a public lottery, as described in the next section.

Randomization as an operational tool

Randomization is not only for the Impact Evaluation. In Niger, geographical targeting only works down to higher administrative units (regions, departments, communes). How to choose among a long list of criteria to select villages within those units? A public lottery was deemed the most transparent and least controversial approach. The project decided to keep using the randomization approach for purely operational reasons, including where it was not needed for the IE.

Randomization can help with transparency. Project staff in Niger: "Now political authorities cannot interfere with the village selection. All the village chiefs were present and signed that they agreed with the procedure before we did the selection. No one can complain to us and try to change the result." Beneficiaries in Nicaragua: "At least this time we know why we were not chosen for the program. Usually decisions are made and we don't know why our village cannot participate."

Randomization can be used only in certain contexts, BUT excess demand happens in most programs: even after applying all existing targeting criteria, not everyone can be served. Randomization is a fair, transparent and ethical way to assign benefits to equally deserving populations: it provides an equal chance of participation among equally deserving units. Randomization is the gold standard: the most robust method, but also the simplest and the cheapest. There are multiple ways to perform randomization.

Randomization to answer basic IE questions. 1. Population: identify the eligible units within the population. 2. Evaluation sample: draw a random sample of eligible units (this underpins external validity). 3. Randomize treatment: randomly assign the evaluation sample to treatment and comparison groups (this underpins internal validity).

Randomization to answer IE design questions. The same three steps apply, except that in step 3 the evaluation sample is randomly assigned across alternative treatment arms and a comparison group, so that program design options can be compared.

Randomized Assignment! With large enough samples, randomized assignment produces two statistically equivalent groups: we have identified the perfect clone, a randomized beneficiary group and a randomized comparison group. It is feasible for prospective evaluations with oversubscription / excess demand, and most pilots and new programs fall into this category. Also consider evaluating the relative effectiveness of alternative program design options.
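A small sketch of the "statistically equivalent groups" claim, using hypothetical baseline data (variable names and distributions are invented): after random assignment, baseline characteristics should be balanced between the randomized beneficiary and comparison groups, which is commonly checked by comparing means, for instance with normalized differences.

import numpy as np

rng = np.random.default_rng(seed=3)
n = 2_000
assigned = rng.random(n) < 0.5                    # randomized assignment

# Hypothetical baseline characteristics (collected before the program starts).
baseline = {
    "household_size": rng.poisson(5, size=n).astype(float),
    "consumption": rng.normal(100, 25, size=n),
    "head_age": rng.normal(40, 12, size=n),
}

# Balance check: normalized difference between treatment and comparison means.
for name, x in baseline.items():
    t, c = x[assigned], x[~assigned]
    norm_diff = (t.mean() - c.mean()) / np.sqrt((t.var(ddof=1) + c.var(ddof=1)) / 2)
    print(f"{name:15s} treat={t.mean():6.1f}  comp={c.mean():6.1f}  norm.diff={norm_diff:+.3f}")
# With genuine randomization, normalized differences should be close to zero.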

Illustration of IE results

Productive safety net pilot in Nicaragua. Key question: can safety nets reduce poverty and protect households from shocks in the short term, while also helping households invest and manage risk in the medium term? The pilot combined a basic CCT with 2 productive interventions. Where: 6 municipalities in rural Nicaragua, with high poverty and dependence on agriculture. 3 groups of households: GROUP 1, basic CCT; GROUP 2, basic CCT + productive investment grant; GROUP 3, basic CCT + vocational training.

Impact Evaluation Design. 1. Within the selected municipalities, a public lottery randomly selected 56 treatment communities and 50 control communities. 2. Within each treatment community, a public lottery assigned households to the 3 packages: 1,000 to the basic CCT, 1,000 to the basic CCT + productive investment grant, and 1,000 to the basic CCT + vocational training.
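A sketch of how such a two-stage lottery could be implemented, with hypothetical community and household identifiers and an arbitrary seed; the counts mirror the slide (56 treatment and 50 control communities, three packages within treatment communities), but nothing here reproduces the actual Nicaragua assignment.

import random

rng = random.Random(2005)                          # fixed seed so the lottery can be audited

# Stage 1: community-level lottery (106 hypothetical communities -> 56 treatment, 50 control).
communities = [f"community_{i:03d}" for i in range(106)]
rng.shuffle(communities)
treatment_communities = set(communities[:56])

# Stage 2: household-level lottery within each treatment community.
packages = ["basic_cct", "cct_plus_grant", "cct_plus_training"]

def assign_households(households):
    """Spread a community's households evenly across the three packages."""
    households = list(households)
    rng.shuffle(households)
    return {hh: packages[i % 3] for i, hh in enumerate(households)}

# Example with one hypothetical treatment community of 54 households.
example = assign_households(f"hh_{i:04d}" for i in range(54))
print(sum(1 for p in example.values() if p == "cct_plus_grant"))  # 18 households in this arm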


The program lasted for one year. What were the results two years after the end of the intervention? (There was a range of short-term impacts; more info: www.worldbank.org/atencionacrisisevaluation.)

The CCT + grant had a lasting impact on welfare two years after the end of the program. [Chart of impact estimates for CCT, CCT + Training, and CCT + Grant; only the CCT + Grant effect is marked as statistically significant (***).]

The CCT + grant had the largest impact on entry into non-agricultural self-employment. [Chart of impact estimates; CCT (*), CCT + Training (*), CCT + Grant (***).]

... and the largest impact on profits in non-agricultural businesses. [Chart of impact estimates; CCT not significant, CCT + Training (*), CCT + Grant (**).]

There was no significant impact on entry into non-agricultural wage employment. [Chart of impact estimates; no statistically significant effects for CCT, CCT + Training, or CCT + Grant.]

But the CCT + training increased wages for those in private sector jobs. [Chart of impact estimates; CCT + Training (**), CCT and CCT + Grant not significant.]

Operational aspects

What to Evaluate? Efficacy studies are carried out in a specific setting to test a model implemented in the best possible way (e.g. pilots for proof of concept). Effectiveness studies provide evidence from interventions taking place under normal circumstances (e.g. scalable national programs).

Who does the Impact Evaluation? It is critical to start discussing the IE early, and to clarify the role of the different types of evaluations: the IE has a large potential value-added, but it is an investment. It is essential to design the evaluation with the operational team, starting with the framing of the evaluation question: program design and IE design go together. Implementing an IE requires close coordination with project implementation, and an IE is best seen as a collaboration between implementers and evaluators. The quality and validity of the design are what make the results legitimate. Consider which components to outsource.

Design and implementation of the IE in Niger. A broad outline of the impact evaluation was included in the project design document. After effectiveness, a 6-month preparation phase (typically longer). Multiple workshops with local stakeholders to finalize the design, and participation of the project team in a regional impact evaluation workshop. Randomization was performed by the project (through the geographical targeting process). Baseline data collection was contracted to the National Statistical Agency (June 2012), with oversight by World Bank and project teams. Next steps: focus on ensuring the MIS collects comprehensive data on project implementation; a qualitative (process) evaluation to gather information on the quality of implementation as well as beneficiaries' perceptions; follow-up data collection (December 2014).

Benefits of Impact Evaluation. IE is the only way to know whether a program is effective: clear value-added, but it needs to be used selectively. Doing IEs can improve project implementation, and the process of doing an IE can change the way we work ("Science of Delivery"). Learning through the process in Niger: the quality control learned during the baseline helped with the PMT survey; randomization helped with transparency; a solid baseline helps with profiling of beneficiaries and allows analysis of targeting efficiency; the complementary qualitative evaluation helps improve project implementation.

Impact Evaluation, MIS, and process evaluation are complementary. Monitoring and process evaluation data are key for the evaluation: MIS data (lists of beneficiaries, distribution of benefits), targeting data, and the process evaluation itself. Common issues: for the IE design, can you do targeting (e.g. PMT) in the comparison group? How do you set up unique identifiers across all sources of data? Data on project implementation is essential to interpret IE results: does the project work or not work because of the intervention model, or because of how the intervention model was implemented?
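One recurring practical point above is setting up unique identifiers across data sources. A minimal sketch with pandas, using invented records and column names: the MIS beneficiary list, the baseline survey, and the follow-up survey are merged on a single household ID, so that implementation data (benefits actually received) can be used when interpreting the IE results.

import pandas as pd

# Hypothetical extracts; in practice these would be read from MIS and survey files.
mis = pd.DataFrame({"hh_id": [101, 102, 104],
                    "transfers_received": [24, 20, 0]})   # months of transfers actually paid
baseline = pd.DataFrame({"hh_id": [101, 102, 103, 104],
                         "arm": ["treatment", "treatment", "comparison", "treatment"],
                         "consumption_2012": [95.0, 80.0, 88.0, 70.0]})
followup = pd.DataFrame({"hh_id": [101, 102, 103, 104],
                         "consumption_2014": [120.0, 105.0, 92.0, 75.0]})

# Merge everything on the unique household identifier; keep all baseline households.
panel = (baseline
         .merge(followup, on="hh_id", how="left")
         .merge(mis, on="hh_id", how="left"))
panel["transfers_received"] = panel["transfers_received"].fillna(0)

# Implementation data helps interpret results: household 104 was assigned to treatment,
# but the MIS shows it received no transfers.
print(panel[["hh_id", "arm", "transfers_received"]])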

Remember The objective of impact evaluation is to estimate the causal effect or impact of a program on outcomes of interest.

Remember: to estimate impact, we need to estimate the counterfactual, i.e. what would have happened in the absence of the program, and for that we use comparison or control groups.

Remember We have a toolbox with 5 methods to identify good comparison groups.

Remember Choose the best evaluation method that is feasible in the program s operational context.

Thank you! The reference is also available in Spanish, French, and Portuguese (soon): www.worldbank.org/ieinpractice