Setting the Stage: Workshop Framing and Crosscutting Issues


1 Setting the Stage: Workshop Framing and Crosscutting Issues. Simon Hearn, ODI. Evaluation Methods for Large-Scale, Complex, Multi-National Global Health Initiatives, Institute of Medicine, 7-8 January 2014, London, UK

2 What do we mean by complex interventions? The nature of the intervention: 1. Focus of objectives 2. Governance 3. Consistency of implementation. How it works: 4. Necessariness 5. Sufficiency 6. Change trajectory. Photo: Les Chatfield

3 What are the challenges of evaluating complex interventions? Describing what is being implemented; getting data about impacts; attributing impacts to a particular programme. Photo: Les Chatfield

4 Why a framework is needed Wikipedia: Evaluation Methods

5 Why a framework is needed Image: Simon Kneebone.

6 The Rainbow Framework

7 DEFINE what is to be evaluated

8 Why do we need to start with a clear definition? Photo: Hobbies on a Budget / Flickr

9 1. Develop initial description 2. Develop program theory or logic model 3. Identify potential unintended results

10 Options for representing logic models: pipeline / results chain; logical framework; outcomes hierarchy / theory of change; realist matrix
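
To make the first of these options concrete, here is a minimal sketch of a pipeline / results chain expressed as an ordered data structure; the stage entries are illustrative examples, not content from the workshop.

```python
# Minimal sketch (illustrative entries): a pipeline / results chain logic model
# represented as an ordered list of stages, each with example elements.
results_chain = [
    ("inputs", ["funding", "trained community health workers"]),
    ("activities", ["outreach sessions on immunisation"]),
    ("outputs", ["caregivers reached with immunisation messages"]),
    ("outcomes", ["increased demand for and uptake of immunisation"]),
    ("impacts", ["reduced child morbidity and mortality"]),
]

for stage, examples in results_chain:
    print(f"{stage:>10}: {', '.join(examples)}")
```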

11 FRAME what is to be evaluated

12 Framing a decision comes before making the decision; in the same way, framing an evaluation comes before designing the evaluation. Source: Hobbies on a Budget / Flickr

13 1. Identify primary intended users 2. Decide purpose(s) 3. Specify key evaluation questions 4. Determine what success looks like

14 DESCRIBE what happened

15 1. Sample 2. Use measures, indicators or metrics 3. Collect and/or retrieve data 4. Manage data 5. Combine qualitative and quantitative data 6. Analyze data 7. Visualize data

16 Combine qualitative and quantitative data. Purposes: enrich, examine, explain, triangulate. Ways of combining: parallel or sequential; component or integrated. (A sketch of one way to combine the two kinds of data follows below.)
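
A minimal sketch of what combining the two kinds of data can look like in practice, using pandas and entirely hypothetical site names, coverage figures, and interview themes: a quantitative indicator and coded qualitative themes are merged site by site so they can be examined together and triangulated.

```python
import pandas as pd

# Minimal sketch (hypothetical data): combining a quantitative indicator with
# qualitative interview themes for the same sites.

coverage = pd.DataFrame({
    "site": ["District A", "District B", "District C"],
    "immunisation_coverage_pct": [82, 61, 74],   # quantitative survey indicator
})

themes = pd.DataFrame({
    "site": ["District A", "District B", "District C"],
    "dominant_interview_theme": [
        "strong community health workers",
        "stock-outs and staff shortages",
        "long travel distances",
    ],  # coded from key informant interviews
})

# Merge the two sources on site so each row pairs the number with the narrative.
combined = coverage.merge(themes, on="site")
print(combined)

# A simple triangulation check: do low-coverage sites coincide with particular themes?
low_coverage = combined[combined["immunisation_coverage_pct"] < 70]
print(low_coverage[["site", "dominant_interview_theme"]])
```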

17 UNDERSTAND CAUSES of outcomes and impacts

18 Outcomes Impacts

19 As a profession, we often either oversimplify causation or we overcomplicate it!

20 "In my opinion, measuring attribution is critical, and we can't do that unless we use control groups to compare them to." Comment in an expert discussion on The Guardian online, May 2013

21 1. Check that the results support causal attribution 2. Compare results to the counterfactual 3. Investigate possible alternative explanations
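
As a worked illustration of step 2, the sketch below uses entirely hypothetical numbers and a simple difference-in-differences calculation: the comparison group supplies the counterfactual trend, and the estimated impact is the change observed in the programme group minus that trend.

```python
# Minimal sketch (hypothetical data): estimating impact against a counterfactual
# using a simple difference-in-differences calculation.

# Mean outcome values (e.g. % of children fully immunised) - illustrative only.
programme = {"baseline": 52.0, "endline": 71.0}
comparison = {"baseline": 50.0, "endline": 58.0}

# Observed change in the programme group.
programme_change = programme["endline"] - programme["baseline"]

# Change in the comparison group, used as the counterfactual trend.
counterfactual_change = comparison["endline"] - comparison["baseline"]

# Difference-in-differences: the part of the change attributed to the programme,
# assuming both groups would otherwise have followed the same trend.
estimated_impact = programme_change - counterfactual_change

print(f"Observed change in programme group: {programme_change:.1f} points")
print(f"Counterfactual change (comparison group): {counterfactual_change:.1f} points")
print(f"Estimated impact: {estimated_impact:.1f} percentage points")
```

Step 3 then asks whether the assumption behind this subtraction, that both groups would have followed the same trend without the programme, stands up against plausible alternative explanations.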

22 SYNTHESIZE data from one or more evaluations

23 Was it good? Did it work? Was it effective? For whom did it work? In what ways did it work? Was it value for money? Was it cost-effective? Did it succeed in terms of the Triple Bottom Line?

24 How do we synthesize diverse evidence about performance? All intended impacts achieved, some intended impacts achieved, no negative impacts: what is the overall synthesis, GOOD or BAD? (One way of making the synthesis rule explicit is sketched below.)
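
One way to make that judgement less arbitrary is an explicit rubric; the sketch below is illustrative only (the criteria, ratings, and decision rule are assumptions, not workshop content).

```python
# Minimal sketch (illustrative rubric): synthesizing ratings on several criteria
# into an overall judgement with an explicit, agreed decision rule.

ratings = {
    "intended impacts achieved": "partly",   # some, but not all, impacts achieved
    "negative impacts avoided": "yes",
    "value for money": "yes",
}

def overall_judgement(ratings):
    """Return an overall rating: 'good' only if nothing failed and most criteria are met."""
    values = list(ratings.values())
    if "no" in values:
        return "bad"
    if values.count("yes") >= len(values) - 1:
        return "good"
    return "mixed"

print(overall_judgement(ratings))  # -> "good" under this illustrative rule
```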

25 1. Synthesize data from a single evaluation 2. Synthesize data across evaluations 3. Generalize findings

26 REPORT and SUPPORT USE of findings

27 "I can honestly say that not a day goes by when we don't use those evaluations in one way or another."

28 1. Identify reporting requirements 2. Develop reporting media 3. Ensure accessibility 4. Develop recommendations 5. Support use

30 MANAGE your evaluation

31 1. Understand and engage with stakeholders 2. Establish decision making processes 3. Decide who will conduct the evaluation 4. Determine and secure resources 5. Define ethical and quality evaluation standards 6. Document management processes and agreements 7. Develop evaluation plan or framework 8. Review evaluation 9. Develop evaluation capacity

33 Making decisions: look at the type of question.
Descriptive questions (DESCRIBE): Was the policy implemented as planned?
Causal questions (UNDERSTAND CAUSES): Did the policy change contribute to improved health outcomes?
Synthesis questions (SYNTHESIZE): Was the policy overall a success?
Action questions (REPORT & SUPPORT USE): What should we do?

34 Making decisions: compare pros and cons

35 Making decisions: create an evaluation matrix mapping key evaluation questions (rows) to data collection methods (columns: participant questionnaire, key informant interviews, project records, observation of program implementation).
KEQ1 What was the quality of implementation?
KEQ2 To what extent were the program objectives met?
KEQ3 What other impacts did the program have?
KEQ4 How could the program be improved?
(A sketch of such a matrix as a simple data structure follows below.)
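
A minimal sketch of the evaluation matrix as a data structure; which methods are ticked for which question here is purely illustrative, not the matrix from the workshop.

```python
# Minimal sketch (hypothetical mapping): an evaluation matrix recording which
# data collection methods address each key evaluation question (KEQ).
methods = [
    "Participant questionnaire",
    "Key informant interviews",
    "Project records",
    "Observation of program implementation",
]

matrix = {
    "KEQ1 What was the quality of implementation?": {
        "Key informant interviews", "Project records",
        "Observation of program implementation"},
    "KEQ2 To what extent were the program objectives met?": {
        "Participant questionnaire", "Project records"},
    "KEQ3 What other impacts did the program have?": {
        "Participant questionnaire", "Key informant interviews"},
    "KEQ4 How could the program be improved?": {
        "Participant questionnaire", "Key informant interviews",
        "Observation of program implementation"},
}

# Print the matrix as a simple grid of ticks.
header = " | ".join(m[:20].center(20) for m in methods)
print(f"{'':55} | {header}")
for keq, sources in matrix.items():
    row = " | ".join(("x" if m in sources else " ").center(20) for m in methods)
    print(f"{keq:55} | {row}")
```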

37 Documenting; sharing; R&D; events. Descriptions, comments, tools, guides, examples.

38 Founding Partners Financial Supporters

39 For more information: