Using administrative data to make development policy more effective


Using administrative data to make development policy more effective
Javier Baez and Julieta Trias, The World Bank

What is impact evaluation?

Remember the defining characteristics of IE:
- Attribution: a method to identify the impact attributable to an intervention, with emphasis on causal effects
- Counterfactual: outcomes are compared with a counterfactual situation, i.e. what would have happened without the intervention

IEs are data intensive

IEs often rely on primary data collection:
- Rich baseline and follow-up information
- Suitable to answer the what, why and how questions of program impacts
But this brings challenges:
- Timing: designing and collecting surveys takes a lot of time
- Costs: around two thirds of the evaluation budget (300K-400K)

But sometimes IEs can be carried out with administrative data

Administrative data (AD) is an alternative to address key policy questions of attribution:
- Program effectiveness: will the nutritional program reduce malnutrition?
- Program design: what is the marginal value added of different program alternatives (e.g. nutrition supplements vs. nutrition supplements + training on good nutrition practices)?
And often at a much lower cost and more quickly.

What is administrative data?

Data originally collected for three main purposes:
- Monitoring of government programs and interventions
- Targeting government interventions
- Enabling regulation and auditing
Derived from an administrative source, usually a government unit (sectoral ministries, program implementation and administration units, etc.). Often of high frequency and with large coverage of the target group (e.g. children enrolled in school, migration records, vital records, social security records, etc.).

Could be very rich in information

For instance, administrative data in the health sector could include:
- Supply side: health centers, type and quantity of services provided, location, staff, etc.
- Demand side: number of users, patient profile, clinical history, location, use of services, health status of patients, health insurance, out-of-pocket expenditures, etc.
- Costs: budget, cost data and payments

Can be complemented with secondary data to broaden the scope of the analysis

Several sources of micro-data are available:
- Population-based surveys (LSMS, DHS, MICS)
- Group-specific surveys (labor force surveys, student performance surveys)
- Censuses: population, housing, agriculture
- Geo-referenced data
- Weather data: rainfall, temperature, etc.
- Enterprise surveys

Developing administrative data for IE purposes (1)

The ability to link datasets accurately is critical:
- Based on personal information: national ID, name, date of birth, gender, location, phone number, etc. Unique identifiers have to be carefully recorded across administrative datasets, and different data-merging algorithms should be employed to assess the robustness of the findings
- Based on age cohorts: date and place of birth
- Based on geographic information: locality, detailed geo-referenced data
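The linking strategy above can be sketched in code. This is a minimal, hypothetical example (the datasets, column names and values are invented for illustration): an exact merge on the unique identifier first, then a fallback merge on name plus date of birth for records where the identifier is missing.

```python
import pandas as pd

# Hypothetical example: link a program registry to school records.
# Primary key: national ID; fallback: name + date of birth.
registry = pd.DataFrame({
    "national_id": ["A1", "A2", None],
    "name": ["Ana Diaz", "Luis Rojas", "Marta Gil"],
    "dob": ["2001-03-02", "2000-11-15", "2002-07-09"],
})
school = pd.DataFrame({
    "national_id": ["A1", None, "A9"],
    "name": ["Ana Diaz", "Marta Gil", "Pedro Sosa"],
    "dob": ["2001-03-02", "2002-07-09", "2001-01-01"],
    "enrolled": [True, True, False],
})

# Pass 1: exact merge on the unique identifier.
exact = registry.dropna(subset=["national_id"]).merge(
    school.dropna(subset=["national_id"]), on="national_id", suffixes=("", "_s")
)

# Pass 2: for records without an ID, fall back to name + date of birth.
no_id = registry[registry["national_id"].isna()]
fallback = no_id.merge(school, on=["name", "dob"], suffixes=("", "_s"))

linked = pd.concat([exact, fallback], ignore_index=True)
print(len(linked))  # 2 of the 3 registry records are linked
```

In practice one would run several such passes with different matching rules and compare the resulting estimates, which is the robustness check the slide refers to.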

Developing administrative data for IE purposes (2)

AD have to be relevant for the IE:
- Need to capture information on treatment and control groups (e.g. the program allocation mechanism)
- Need to include information for a period of time that is relevant for the program/intervention evaluated and the research questions asked
- Include information on response and control variables
- Very useful if they include data on program processes, implementation and operation (often built into the M&E systems developed to manage programs)

Developing administrative data for IE purposes (3)

Ensuring that the data are of high quality is challenging. AD are usually developed and managed by different agencies with varied:
- Protocols to gather and enter the data
- Data management systems
- Quality control checks
- Meta-data and documentation
- Technical capacity

Developing administrative data for IE purposes (4)

Data availability:
- Most times AD are not publicly available
- If available, access to some parts of the data is often restricted (e.g. personal information, earnings, etc.)
- Confidentiality: rigorous protocols to protect the confidentiality of the data are key to developing trust between agencies and policy analysts/researchers
- Meta-data are not always available or fully documented
- Information is often recorded in different data platforms and not organized for statistical analysis

Using AD to assess and enhance the effectiveness of programs (1)

Assessing the effectiveness of program alternatives to improve design:
- Testing the marginal benefit of program alternatives goes beyond learning whether a program works or not (remember Impact Evaluation 2.0)
- Measuring the cost-effectiveness of program alternatives

Case 1: Which treatment is more effective?

Objective: address short-term poverty and increase households' productive potential.
- The MIS holds data on all people enrolled in the CCT program
- MIS data identify participants assigned to each treatment arm (1/3, 1/3, 1/3): CCT; CCT + scholarship for an occupational training; CCT + grant for productive investments
- MIS data identify participants assigned to each cash treatment arm (1/6, 1/6): $X; $X + $Y
For instance, a program manager may be interested in learning whether the scholarship has an additional impact, E(Y_CCT+S) - E(Y_CCT), or whether more cash ($Y) has extra benefits, E(Y_CCT(X+Y)) - E(Y_CCT(X)).
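With randomized arms recorded in the MIS, the comparison E(Y_CCT+S) - E(Y_CCT) is just a difference in mean outcomes between arms. A minimal sketch, using simulated data in place of a real MIS extract (arm labels, outcome variable and sample size are all invented for illustration):

```python
import random
random.seed(0)

# Hypothetical MIS extract: one record per participant, with the
# assigned treatment arm and an outcome Y. Values are simulated.
arms = ["CCT", "CCT+scholarship", "CCT+grant"]
data = [{"arm": random.choice(arms), "y": random.gauss(100, 10)}
        for _ in range(3000)]

def mean_outcome(arm):
    """Average outcome among participants assigned to a given arm."""
    ys = [r["y"] for r in data if r["arm"] == arm]
    return sum(ys) / len(ys)

# Marginal impact of the scholarship on top of the basic CCT:
# E(Y | CCT + scholarship) - E(Y | CCT)
delta = mean_outcome("CCT+scholarship") - mean_outcome("CCT")
print(round(delta, 2))
```

Because the data here are simulated with no true treatment effect, `delta` should be close to zero; with real MIS data the same two-line comparison answers the program manager's question (with standard errors added before drawing conclusions).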

Example: Philadelphia Low-Intensity Community Supervision Experiment

Background: a program for criminal offenders on probation or parole identified as being at low risk of committing a serious crime.
Objective: seeking ways to reduce the cost of supervision to Philadelphia County. Low- vs. high-intensity supervision by a probation officer (2.4 visits with the officer per year vs. 4.5 visits per year).
Evaluation: randomly assigned 1,559 offenders on parole (from a short sentence in county jail) or probation in 2007-2008 to either treatment, with follow-up one year after (Barnes et al 2010).

Key lessons for program design and effectiveness,

No differences in crime between a low and a high dosage of probation supervision.
[Figure: prevalence of offending and incarceration for one year after RCT start. Source: Barnes et al (2010)]

that led to changes in the operation and cost-effectiveness of the program

- The county adopted the low-intensity approach for all low-risk offenders
- The changes tested were found to be a viable way to reduce costs in the criminal justice system
- A low-cost evaluation: less than $100,000, achieved by using administrative data (e.g. arrest records) that the county already collects for other purposes

Using AD to assess and enhance the effectiveness of programs (2)

Examining different aspects of program impacts:
- Average treatment effects (TOT, ITT)
- Intensity of treatment effects
- Short-, medium- and long-term effects
- Distribution of treatment effects (sub-group analysis)
- Mechanisms that explain program effects, or the absence of them
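The distinction between ITT and TOT on the slide can be made concrete with a small simulation (all numbers below are invented for illustration): when only some of those assigned to treatment actually take it up, the intent-to-treat effect is the raw difference by assignment, and the treatment-on-the-treated effect can be recovered by rescaling the ITT by the difference in take-up rates (the Wald estimator).

```python
import random
random.seed(1)

# Simulated trial with imperfect compliance:
# Z = random assignment, D = take-up, Y = outcome.
# The true effect on those who take up treatment is +2.
n = 20000
rows = []
for _ in range(n):
    z = random.random() < 0.5           # randomly assigned to treatment
    d = z and (random.random() < 0.6)   # only 60% of assignees take up
    y = random.gauss(0, 1) + (2 if d else 0)
    rows.append((z, d, y))

def mean(values):
    values = list(values)
    return sum(values) / len(values)

# ITT: difference in mean outcomes by assignment.
itt = mean(y for z, d, y in rows if z) - mean(y for z, d, y in rows if not z)

# TOT via the Wald estimator: ITT divided by the difference in take-up.
takeup = mean(d for z, d, y in rows if z) - mean(d for z, d, y in rows if not z)
tot = itt / takeup

print(round(itt, 2), round(tot, 2))
```

With 60% take-up, the ITT is diluted to roughly 0.6 times the true effect, while the Wald ratio recovers an effect close to +2 for those actually treated.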

Case 2: Do conditional cash transfers raise human capital in the long term?

Background:
- 2001: the Government of Colombia (GoC) implemented a standard CCT program (FA) in response to a major economic crisis
- 2003-04: a short-term IE showed that FA raised school participation among participants vs. non-participants
- 2009: policy makers started questioning whether the increase in school participation actually translated into more school attainment

How to answer the question in the absence of primary data?

Challenges:
- No formal plan in place to systematically track participants and non-participants over time
- Very little time, narrow budget
Opportunities:
- Rich and linkable administrative data available to (1) identify participants and comparable non-participants and (2) assess the effects on indicators of school attainment and learning
- Strong support, interest and technical capacity from key local stakeholders

Rich administrative data available

Three different sources of AD:
1. A census of the poor collected between 1994 and 2003 to design the targeting system [SISBEN]
2. Administrative records from the M&E system of FA, a rich longitudinal census of all program beneficiaries [SIFA]
3. Administrative records on registration and results for Icfes, a standardized national test administered prior to graduation from high school [ICFES]

Mapping the data landscape

[Timeline, 2002-2009, of the available datasets: Icfes tests for students in grade 11 (T & C) [ICFES]; first round of panel survey (T & C) [Base]; program M&E system, a census of participants, only T [SIFA]; Sisben I, 1994-2003 (T & C) [SISBEN]; Sisben II, 2003-07 (T & C) [SISBEN]]

Our research strategy: RDD

Exploits the discontinuity arising from the program eligibility rules (using SISBEN + SIFA).
[Figure: program participation probability plotted against distance to the eligibility threshold (means and quartic fit). Observations = 630,795. Source: Baez et al (2011)]
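The core of a sharp RDD can be sketched with simulated data. This is a deliberately simplified stand-in for the quartic fit used in the paper: a hypothetical poverty score with a cutoff at 0 (households below it are eligible), a simulated jump of 5 percentage points in the outcome at the cutoff, and a simple difference in means within a narrow bandwidth around the threshold.

```python
import random
random.seed(2)

# Simulated sharp RDD: score < 0 means eligible for the program.
# The outcome probability jumps by +0.05 at the cutoff (by construction).
n = 50000
obs = []
for _ in range(n):
    score = random.uniform(-20, 20)      # distance to eligibility threshold
    treated = score < 0
    p = 0.20 + 0.002 * score + (0.05 if treated else 0.0)
    y = random.random() < p              # e.g., finished high school
    obs.append((score, treated, y))

# Local difference in means within a narrow bandwidth around the cutoff,
# a crude substitute for a local polynomial fit.
h = 2.0
eligible = [y for s, t, y in obs if -h <= s < 0]
ineligible = [y for s, t, y in obs if 0 <= s <= h]
effect = sum(eligible) / len(eligible) - sum(ineligible) / len(ineligible)
print(round(effect, 3))
```

The estimate should land near the true jump of 0.05; in practice one would vary the bandwidth and the polynomial order, and check (as the slides do) that predetermined characteristics are smooth through the threshold.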

Allowed us to answer the questions of the Government,

Participant children are between 2.5 and 5 percentage points more likely to finish high school.
[Figure: school completion plotted against distance to the eligibility threshold (means and quartic fit). Observations = 624,028. Source: Baez et al (2011)]

in a credible way,

[Figure: four panels plotting household characteristics against distance to the eligibility threshold (means and quartic fit): household head education (obs. = 630,021), partner education level (obs. = 498,138), married household (obs. = 630,795), social security (obs. = 630,795). Source: Baez et al (2011)]

and in a timely manner and at a very low cost

- Solid preliminary results were available 6 months after the team got access to all four datasets
- Total costs were around one fifth of those of traditional IEs that rely on primary data collection
Results helped:
- Inform the public debate about the future of the program
- Inform program modifications
- Continue an analytical agenda on the effects on postsecondary outcomes, building on the same data

Administrative data can be very useful, but be aware of the trade-offs

Pros:
- Large sample sizes
- High frequency
- Fewer problems with attrition, non-response and measurement error
- Lower cost
Cons:
- Fewer variables, and therefore more limited scope for the analysis
- Less control to ensure high and consistent quality
- It is often difficult to accurately link AD
- Access is still very limited

Final remarks

- Enhance the quality of AD to create opportunities for policy/program-relevant research at a low cost
- Use AD for impact evaluation: test, learn, adapt
- Use AD to assess the overall effect of the program
- Use AD to learn about the effectiveness of program alternatives and improve design