Adaptive learning in trade & competitiveness. Arianna Legovini Development Impact Evaluation (DIME)


2 Adaptive learning in trade & competitiveness Arianna Legovini Development Impact Evaluation (DIME)

3 Why are you here today?

4 We are here to get better at what we do

5 A simple idea: identify problems, test alternatives, adopt solutions

6 Operational research to inform a process of adaptive policy making

7 Iterative learning cycle: policy problems → hypotheses → IE testing → data → actionable answers → evidence/solutions → policy adoption

8 Objective: to improve policy choices over time and increase policy effectiveness

9 Two types of inputs into decisions: data and evidence


12 What is a sufficient basis for a decision? Call for action: lack of a sufficient basis for deciding to do something (no data, no system of accountability, perverse incentives). Diagnosis: high-quality data to understand the problem and the underlying population; use it to debunk priors and formulate a policy hypothesis. Test the hypothesis and select a policy modality: causal evidence to sort out the complex supply & demand response to policy; use it to select the most effective policy alternative.

13 Kenya's health inspections: patient safety. CALL FOR ACTION: unclear inspection rules, inadequate capacity to inspect and monitor health facilities, and lack of incentives to improve patient safety (no sanctions and enforcement).

14 Kenya's health inspections: patient safety. CALL FOR ACTION: unclear inspection rules, inadequate capacity to inspect and monitor health facilities, and lack of incentives to improve patient safety (no sanctions and enforcement). DIAGNOSIS: develop and validate an instrument for collecting high-quality facility data on patient safety. Use the results to change regulation and inform the policy prior (close unsafe facilities). Develop hypotheses on top-down and bottom-up incentives for patient safety.

15 Kenya's health inspections: patient safety. CALL FOR ACTION: unclear inspection rules, inadequate capacity to inspect and monitor health facilities, and lack of incentives to improve patient safety (no sanctions and enforcement). DIAGNOSIS: develop and validate an instrument for collecting high-quality facility data on patient safety. Use the results to change regulation and inform the policy prior (close unsafe facilities). Develop hypotheses on top-down and bottom-up incentives for patient safety. TESTING: conduct a three-arm randomized controlled trial (RCT) on alternative inspection modalities. Use the RCT results to select the inspection modality that is most effective in improving patient safety.
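The mechanics of a three-arm assignment can be sketched in a few lines. This is a hypothetical illustration only: the arm labels below are placeholders, since the slides do not detail the actual inspection modalities tested in Kenya.

```python
import random
from collections import Counter

random.seed(42)

# Hypothetical sketch of a three-arm randomized assignment, in the spirit
# of the Kenya inspection RCT. Arm labels are illustrative placeholders.
arms = ["control", "modality A", "modality B"]
facilities = [f"facility-{i:03d}" for i in range(300)]

random.shuffle(facilities)  # random order, so list position carries no information
assignment = {f: arms[i % len(arms)] for i, f in enumerate(facilities)}

counts = Counter(assignment.values())
print(counts)  # equal-sized arms by construction
```

Shuffling first and then assigning arms round-robin guarantees balanced arm sizes, which a purely independent coin flip per facility would not.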

16 So what is causal inference?

17 Policy objective: invest in what gets us results; invest in the cause to get an effect.

18 Evaluation objective: identify cause and effect to enable good policy decisions.

19 Why is understanding causality so important? "When I use my umbrella, my shoes get wet. I think I will stop using umbrellas." It is easy to confuse correlation with causation.
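The umbrella joke is exactly a confounding story: rain causes both umbrella use and wet shoes, so the two are correlated even though neither causes the other. A minimal simulation (hypothetical probabilities, chosen only for illustration) shows the spurious correlation appearing and then vanishing once the confounder is held fixed:

```python
import random

random.seed(0)

# Hypothetical simulation: rain (a confounder) causes both umbrella use
# and wet shoes; umbrellas do not cause wet shoes.
n = 10_000
rain = [random.random() < 0.3 for _ in range(n)]
umbrella = [r and random.random() < 0.9 for r in rain]   # umbrellas come out when it rains
wet_shoes = [r and random.random() < 0.8 for r in rain]  # shoes get wet when it rains

def share_wet(group):
    group = list(group)
    return sum(group) / len(group)

# Naive comparison: umbrella users get wet far more often...
wet_with = share_wet(w for w, u in zip(wet_shoes, umbrella) if u)
wet_without = share_wet(w for w, u in zip(wet_shoes, umbrella) if not u)
print(f"P(wet | umbrella)    = {wet_with:.2f}")
print(f"P(wet | no umbrella) = {wet_without:.2f}")

# ...but holding the confounder (rain) fixed, the "effect" vanishes:
wet_rain_umb = share_wet(w for w, u, r in zip(wet_shoes, umbrella, rain) if r and u)
wet_rain_no = share_wet(w for w, u, r in zip(wet_shoes, umbrella, rain) if r and not u)
print(f"P(wet | rain, umbrella)    = {wet_rain_umb:.2f}")
print(f"P(wet | rain, no umbrella) = {wet_rain_no:.2f}")
```

Unconditionally, umbrella carriers are far more likely to have wet shoes; among rainy days only, the gap disappears, because rain was doing all the work.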

20 Do training programs increase employment?

21 Invest in the cause to get results: charismatic mentor + behavioral therapy → attitudes and values → more motivated youth; vocational training → EMPLOYMENT. Mentorship interacts with the quality of the mentor to change attitudes and values, enabling youth to act upon their life-skills training and change their behavior (Bushway and Reuter, 2002, 2007).

22 Monitoring: trends and correlations, not causality. Monitoring tracks indicators over time (but only among participants). It is descriptive before-after analysis. It tells us whether things are moving in the right direction. It does not tell us why things happen (causality).

23 Evaluation compares WHAT HAPPENED to WHAT WOULD HAVE HAPPENED.

24 What is counterfactual analysis? Comparing the same individual with & without the intervention at the same point in time: missing data. Instead, compare statistically identical groups of individuals with & without the intervention at the same point in time: comparable data.

25 Counterfactual criteria: treated & control groups have identical initial average characteristics (observed and unobserved), so that the only difference between them is the treatment. Therefore the only reason for the difference in outcomes is the treatment.
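These criteria are what randomization delivers. A minimal sketch with hypothetical numbers: an unobserved characteristic (call it "ability") drives outcomes, but because treatment is assigned at random, treated and control groups have the same average ability, and a simple difference in mean outcomes recovers the true effect.

```python
import random

random.seed(1)

# Minimal sketch (hypothetical numbers): randomization balances observed
# and unobserved characteristics, so the treated-control difference in
# mean outcomes estimates the causal effect of the program.
n = 20_000
true_effect = 5.0

ability = [random.gauss(50, 10) for _ in range(n)]   # unobserved characteristic
treated = [random.random() < 0.5 for _ in range(n)]  # random assignment

# Outcome depends on ability plus the treatment effect (plus noise)
outcome = [a + (true_effect if t else 0.0) + random.gauss(0, 2)
           for a, t in zip(ability, treated)]

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

# Treated and control have (nearly) identical average ability...
bal = mean(a for a, t in zip(ability, treated) if t) - \
      mean(a for a, t in zip(ability, treated) if not t)

# ...so the simple difference in mean outcomes recovers the effect.
est = mean(y for y, t in zip(outcome, treated) if t) - \
      mean(y for y, t in zip(outcome, treated) if not t)

print(f"ability balance:  {bal:+.2f}")
print(f"estimated effect: {est:.2f} (true = {true_effect})")
```

If treatment were instead taken up disproportionately by high-ability individuals, the same difference in means would conflate ability with the program: that is exactly the bias the counterfactual criteria rule out.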

26 Two things will be fundamental: the quality of the thinking, and the quality of the engagement.

27 Strength of collaboration. However dire the situation at the outset, concerted action was lacking: the research team created the dynamics for stakeholder collaboration and the creation of a task force, and for dialogue and the development of a viable theory of change. Policy proposals were not likely to succeed, and the data triggered discussion: non-compliant health facilities cannot simply be closed, since they are 97% of the total. Stakeholder discussion was needed to narrow down testable policy options based on an understanding of budget constraints, scalability relative to local capacity, and the feasibility of alternatives relative to local practice, culture, etc.

28 The DIME idea: empower policy people to exert control over their local environment; invest in impact evaluation research; capitalize on the World Bank/IFIs' billion-dollar investments to pilot ideas and exert influence; generate data & evidence and motivate change.

29 DIME objectives: develop institutional capacity for data- and evidence-based policy; generate useful knowledge that solves development problems.

30 How we work with the rest of the Bank. Bank-wide governance structure: Bank-wide Advisory Council; Global Practice-level Working Groups; Technical Committee. GP IE program cycle: agreement with the Senior Director; appointment of the DIME team and GP IE team; establishment of the Working Group; knowledge priorities; strategic case selection; operationalization of results into the portfolio; launching workshop (DIME, GP, and client governments); training; evidence sharing; IE program development; call for proposals; technical case selection; implementation; data; diagnosis; tests; knowledge sharing; production and dissemination of data and evidence.

31 Systematic use of evidence: how we work with projects. IE design: inform policy design. IE implementation: guide midcourse corrections. IE dissemination: inform adoption and scale-up. Capacity building throughout: train & apply, learn by doing, apply knowledge through IE products.

32 13 workshops with 147 project teams (900 Bank and Government officials): AFR 41%, LAC 31%, SAR 12%, ECA 6%, MENA 5%, EAP 5%. To put this in perspective: the Bank approved around 400 projects in FY14.

33 Global outreach: 5 regions, 60 countries, 173 IEs, 300 partners.

34 Across all sectors: Governance 29 IEs (18.1%); Agriculture 25 IEs (15.6%); Trade and Competitiveness 20 IEs (12.5%); Social Protection and Labor 17 IEs (10.6%); Social, Urban, Rural, and Resilience 14 IEs (8.8%); Health, Nutrition and Population 14 IEs (8.8%); Transport and ICT 12 IEs (7.5%); Environment and Natural Resources 7 IEs (4.4%); Water 6 IEs (3.8%); Finance and Markets 5 IEs (3.1%); Energy and Extractives 5 IEs (3.1%); Education 5 IEs (3.1%); Poverty 1 IE (0.6%).

35 USD 12 billion in underlying WB lending, spread across: Governance; Agriculture; Transport and ICTs; Water; Energy and Extractives; Social Protection and Labor; Social, Urban, Rural, and Resilience; Environment and Natural Resources; Trade and Competitiveness; Health, Nutrition and Population; Education; Poverty; Finance and Markets.

36 T&C impact evaluation timeline. Firms' capabilities: skills and capital & financial literacy. Getting operations to work: take-up, targeting, spillovers. Matching grant & SME services projects. Investment climate: business registration/formalization/inspection. High-growth firms, firms' linkages, regulatory efficiency. Evaluate what we do; evaluate how we can do better.

37 Dakar 2010. Matching grant & SME services projects. We learned that: 1. Firms don't demand matching grants ("Learning from Evaluations that Never Happened," 2014). 2. How NOT to do things! The findings changed our priors: free money is not actually free (basic paperwork can be cumbersome!); firms do not necessarily know what they don't know; local markets for services may be underdeveloped; we need to focus on the quality and intensity of implementation (McKenzie, Assaf, Cusolito 2015). Now we are asking: what demand and supply constraints do firms face? Do they know what they don't know? Are services available in local markets?

38 Rio 2011. Firms' capabilities: skills and capital; financial literacy. We learned that: financial literacy can change attitudes and behaviors; training programs don't seem to work as delivered; program quality is what matters; and how NOT to do things! The findings changed our priors: financial literacy has the potential to change behavior; training should be neither too light-touch nor too intensive; increasing focus on take-up and targeting. Now we are asking: for what age groups? What kind of training, and how to increase adoption? E.g.: 1. On-site management trainings work (Bloom et al., 2013). 2. Marketing training is more effective than financial training (Anderson, Chandy, and Zia 2016).

39 Paris 2012. Investment climate: business registration/formalization/inspection. We learned that: 1. Firms don't formalize. 2. Those that do don't perform better. 3. The inspection function is weak. 4. Better regulation increases the speed of justice (Kondylis et al.). And how NOT to do things! The findings changed our priors: formalization is not a necessary condition for better performance. Now we are asking how to make informal firms more productive. Now we are investing in: 1. rules of the game; 2. measurement; 3. technical support & enforcement; and in understanding whether firms perform better with a more efficient delivery of justice.

40 Istanbul 2015. Getting operations to work: take-up, targeting, spillovers.

41 Mexico City 2017. High-growth firms, firms' linkages, regulatory efficiency, trade. Opportunities: understand that some firms benefit disproportionately from an intervention (how to identify and target these firms?); understand whether local markets for SME services are underdeveloped (how to develop these local markets?); develop a more comprehensive framework of analysis covering supply constraints (firm capabilities), demand constraints (low demand in local markets), and market structure. Attention to intervention design issues.

42 Some fundamental ideas (not always understood). Data must be embedded in a rigorous analytical framework to clarify the process through which the data are generated and how the data can be interpreted. A strong team with operational research skills is needed to impose analytical structure on design, implementation, the measurement system, and rigorous testing; this is critical for the adaptive process to take place and to ensure greater effectiveness. A skilled research team needs the promise of analytical rewards.

43 What we learned doing this for the last 12 years. It takes time to change the way projects operate. It takes time to build the capacity, relationships, and shared understanding. It requires changing the skill composition of teams. Even when the skills are there, individuals may not have the bandwidth; teams need members dedicated to the task. It's a journey, not a destination.
