Dr Ian Goldman CLEAR AA (Wits) (South Africa)


1 Dr Ian Goldman, CLEAR AA (Wits) (South Africa) #Evidence2018

2 Evaluation of South Africa's National Evaluation System, Evidence 2018, September 2018. Steering Committee: Dr Ian Goldman, Carol Nuga Deliwe, Stephen Taylor, Zeenat Ishmail, Dr Laila Smith, Thokozile Masangu, Christopher Adams, Gillian Wilson, Dugan Fraser, Annette Griessel, Cara Waller, Siphesihle Dumisa. Evaluators: Alyna Wyatt and Jamie Robertsen of Genesis Analytics.

3 What is a National Evaluation System? "One in which evaluation is a regular part of the life cycle of public policies and programmes, is conducted in a methodologically rigorous and systematic manner, in which its results are used by political decision-makers and managers, and those results are also made available to the public. Evaluation systems are a function of values, practices and institutions." (Lazaro, 2015, p. 16)
Characteristics of a NES, the building blocks: presence of evaluation in political, administrative and social discourse; consensus on what evaluation is, what type of knowledge is produced, and how evaluations should be conducted; organisational responsibility; and permanency, spanning individuals, institutions and environments. This evaluation is significant in that it is double-loop learning: an evaluation of a system that itself promotes evaluations.

4 EVALUATION QUESTIONS
RELEVANCE, EFFECTIVENESS, EFFICIENCY
- How is the system working as a whole, who is involved and what are the implications of this?
- How are the specific components of the system (e.g. procurement, quality assurance process, steering committees, training, guidelines, quality assessment system, communication) working nationally and provincially, and how can they be strengthened?
- Are there other evaluative mechanisms which need to be included to maximise the benefits for government (e.g. rapid methodologies, promotion of evaluative thinking)?
- What appears to be the cost-benefit or value-for-money of establishing an evaluation system?
IMPACT
- Is there initial evidence of symbolic, conceptual or instrumental outcomes from evaluations? If evaluation findings are not being used, why not?
- What evidence is there of evaluations contributing to planning, budgeting, improved accountability, decision-making and knowledge?
- Is there evidence of other unintended outcomes or benefits from the evaluation system, e.g. in raising the importance of evidence within departments?
SUSTAINABILITY AND UPSCALING
- How should the internally-initiated (demand-driven and voluntary) approach used evolve in future to strengthen its impact on government priorities (NDP, gender etc.)?
- How should the balance of internal versus outsourced work, and the use of a government department (DPME) as custodian of the system, be managed going forward, in terms of independence, learning and the credibility of evaluations and the system?
- What are the implications of expanding the system, e.g. to all departments, metros and public entities, and how should intergovernmental links around evaluation be strengthened?

5 Summary of KIIs
Method used: key informant interviews (KIIs).
When conducted: 12 April - 10 June 2017, mainly in Gauteng, with travel to provincial sites (Western Cape, Eastern Cape and Limpopo) as needed.
Interview tracking: interview communications and set-ups were tracked internally in real time; the team received continuous feedback on progress, which enabled a focused approach.
Volume of contact effort: all interviewees were contacted a minimum of three times; the maximum number of contacts was 10.
Data management: all interview notes were housed in a central repository, coded in Atlas.ti, and used to inform an analytic framework drawing on the Theory of Change and the Analysis Plan.

                              No. Completed   No. Contacted
Departmental case studies
  DBE                               5               5
  DPE                               0               1
  Justice                           3               4
  DTI                               5               8
  DSD                               5               9
  DHS                               4               9
Provincial case studies
  Eastern Cape                      8              12
  Western Cape                      –               –
  Limpopo                           –               –
  Gauteng                           4               7
General interviews
  DPME                              8              14
  DAFF                              5               6
  DST                               2               4
  DHET                              2               4
  International                     2               5
  SAMEA                             3               3
  Other                             9              18

Benchmarking: Benin, Uganda, Mexico and Colombia; 9 national and provincial case studies. Total: 112 interviews completed, or 69% of those contacted.
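As an illustration of the completion arithmetic in the table above, the sketch below tallies the legible completed/contacted counts and computes a response rate. The dictionary and variable names are my own, and the rows without legible figures are omitted, so this partial rate will not exactly reproduce the slide's overall 69%.

```python
# Hypothetical sketch: tally KII completion rates from the case-study table.
# Only the counts recoverable from the slide are included; the Western Cape
# and Limpopo rows (and sub-totals) are missing and therefore left out.
kii_counts = {
    # Departmental case studies
    "DBE": (5, 5), "DPE": (0, 1), "Justice": (3, 4),
    "DTI": (5, 8), "DSD": (5, 9), "DHS": (4, 9),
    # Provincial case studies
    "Eastern Cape": (8, 12), "Gauteng": (4, 7),
    # General interviews
    "DPME": (8, 14), "DAFF": (5, 6), "DST": (2, 4), "DHET": (2, 4),
    "International": (2, 5), "SAMEA": (3, 3), "Other": (9, 18),
}

completed = sum(c for c, _ in kii_counts.values())
contacted = sum(n for _, n in kii_counts.values())
print(f"Completed {completed} of {contacted} tracked interviews "
      f"({completed / contacted:.0%})")
# The slide's overall figure (112 completed, 69%) includes the rows
# missing here, so this partial tally is lower.
```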

6 Case Study Selection
[Figure: innovation curve plotting number of adopters over time, spanning Innovators, Early Adopters, Early Majority, Late Majority and Laggards; the provinces placed on the curve are WC, GP, EC and Limpopo, and the departments are DBE, dti, DHS, DJCD and DSD.]

7 How is the NES Working? Coverage of NEPs, DEPs and PEPs
NEPs, PEPs and DEPs are used to identify strategic evaluations, based on the MTSF and importance. There are 8 provincial PEPs and 68 provincial and departmental DEPs, with over 100 provincial evaluations and 300+ departmental evaluations planned. Great strides have been made in incorporating all but one province and many departments. Having an evaluation plan in place is a good indication of the breadth of the NES, but not of the depth or quality of the system; more needs to be done to track the implementation and progress of the plans.
[Figure: evaluations in the NEP by stage, updated figures as at 11 June 2018 - prep stage (concept), TORs (design), research underway, approved reports, served at Cabinet, improvement plans, active evaluations.] Over R140 billion of government expenditure is covered by the NEP evaluations, so the system operates at some scale.

8 How is the NES Working? Time spent on evaluations
[Figure: overview of time spent on evaluations, 2012/13 to 2015/16, broken down by NEPF activity (pre-design and design; implementation; peer review and validation process; recommendations and management response; communication of results; follow-up; Cabinet approval) and by evaluator activity (inception; design; fieldwork; report writing).] Only around 60% of time is spent on the evaluation itself; the rest goes on preparation and follow-up.

9 How is the NES Working? Capacity development
DPME's capacity building plan was broad, and included: guidelines (18) and templates (9); promoting and facilitating learning networks and forums; short courses; and developing the MPAT system. Overall, guidelines and templates were seen as useful, particularly when adapted to the context and capabilities of the department or province. Participants undertook training between 2012/13 and 2016/17, and respondents from the KIIs and from the survey found significant progress arising from the training, deepening the knowledge of many public officials. However, capacity remains a priority area of development in the NES, to continue building on the momentum achieved.

10 Conclusions - Institutionalisation
Good progress in the first 5 years of a 20-year project, but still work in progress. Important areas in strengthening the institutionalisation and expansion of the system include:
Senior-level buy-in: use of internal evaluation to help develop an evaluative culture; promotion of evaluative thinking; and drive from individual champions.
Financial allocations: challenges in budgets for evaluations. Stakeholders supported evaluation budgets being prescribed by Treasury. A positive example is the Western Cape, which has allocated R10m.
Accountability: doing evaluation should be part of the performance agreement of every DG and HOD. In the Eastern Cape this has been incorporated in management contracts to encourage uptake, as at KZN COGTA, where KPIs reflect the % of recommendations implemented.
Promotion of an evaluation culture: respondents noted that the evaluation culture has been strengthened by the NES, e.g. evaluations have improved, procurement processes have been simplified, and departments have been provided technical support. A potential inhibitor is fear of evaluation findings, where evaluation is seen as punitive: some departments do not submit evaluations to the NEP because they do not wish the findings to be made public. This is being addressed through DG/DDG courses which show the purpose and use of evaluations to improve programmes and policies.

11 Provinces - Use
Western Cape: respondents indicated that results of evaluations are used frequently; programme managers in some departments meet to discuss implementation of evaluation findings; the OTP suggested the best way to encourage use is to evaluate a programme that was conducted successfully.
Gauteng: some evidence of use beginning to emerge in provincial departments; the Dept of Education has begun tracking progress of its improvement plans; the Dept of Economic Development and the Dept of Infrastructure Development have begun using evaluations at various points as a programme develops in maturity.
Eastern Cape: some evidence that evaluations are beginning to be used; an improvement plan was tracked in 2017; respondents highlighted the importance of evaluative thinking.
Limpopo: none of the evaluations have been completed as yet.

12 Conclusions: Impact - Use of results in the case studies
Departments or provinces reporting instances of evaluation use (total case studies = 9):
  Process use: 5
  Symbolic use: 1
  Conceptual use: 1
  Instrumental use: 7

13 Conclusions - Impact of the evaluation system
Preliminary evidence for use of evaluations is encouraging. Departments and provinces appear to understand the value of evaluations and attempt to use them to inform decisions, despite the many challenges they face. The improvement plan is seen as a key element in enhancing use, and as one of the key benefits the NES has brought about; a better mechanism is needed to track evaluation improvement plans. Looking at budgetary considerations arising from evaluations, those evaluations concerned with the broader economy more actively considered the budgetary implications; in the majority of cases, however, there was little conscious consideration of budget implications. A key challenge raised by respondents was the capacity needed to use evaluations: capacity (in terms of number of people, time and level of skill) is a challenge when implementing evaluation recommendations. While this view was also held in national departments, the challenge is particularly acute at the provincial level.

14 Conclusions - Cost-benefit of evaluations
Cost-benefit ratios were calculated for 3 sample evaluations, ranging from 1:7 (Agricultural Learnerships in the Cape) to 1:10 (Funza Lushaka) to 1:13 (Business Process Services, dti). Tracking of the costs and benefits of the system needs to be done more systematically.

Cost-benefit for Funza Lushaka:
  Costs
    Labour: Assistant Director, 50 days
    Labour: Director, 24 days
    Labour: Administration, 36 days
    Catering
    Direct evaluation costs
    Quality assessment
  Benefits
    Management Information Systems (MIS) funding
    Additional funding received for recommendations
    Grant application
  Cost-benefit ratio = total benefits / total costs
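The ratio in the table is straightforward arithmetic: total benefits divided by total costs. A minimal sketch of that calculation follows, using placeholder rand amounts, since the actual Funza Lushaka figures are not preserved above.

```python
# Hypothetical sketch of the cost-benefit arithmetic for an evaluation.
# The rand amounts below are placeholders standing in for the Funza
# Lushaka figures, which were lost in transcription.
costs = {
    "Labour: Assistant Director (50 days)": 100_000,
    "Labour: Director (24 days)": 80_000,
    "Labour: Administration (36 days)": 40_000,
    "Catering": 10_000,
    "Direct evaluation costs": 1_500_000,
    "Quality assessment": 30_000,
}
benefits = {
    "MIS funding": 5_000_000,
    "Additional funding received for recommendations": 10_000_000,
    "Grant application": 2_600_000,
}

total_costs = sum(costs.values())
total_benefits = sum(benefits.values())
ratio = total_benefits / total_costs
print(f"Cost-benefit ratio: 1:{ratio:.0f}")
# With these placeholder values the result is 1:10, matching the order
# of magnitude the slide reports for Funza Lushaka.
```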

15 Recommendations

16 Recommendations - Evaluation Mandate
R1: Evaluation should be embedded in legislation as a mandatory component of public management and improvement, with DPME as the custodian, and the roles of Offices of the Premier and departments defined.
R2: Planning and budgeting must systematically draw on the results of monitoring and evaluation. The findings from evaluations and the implementation of improvement plans should be codified in departmental strategic plans, APPs, and annual and quarterly reports. DPME should incorporate this as it updates the Framework for Strategic Plans and APPs. In addition, these should be part of senior managers' performance agreements.
R3: New phases of programmes should not be funded until an evaluation of the previous phase is completed.
R4: The role of impact evaluations needs to be strengthened, particularly for large policies or programmes, programmes that have already had an implementation evaluation, and new programmes where there is an opportunity to design for impact evaluation from the beginning.
R5: The role of key stakeholders in the evaluation ecosystem, including DPSA, National Treasury, SAMEA and civil society (notably think tanks), needs to be clarified.

17 Recommendations - Budgeting for Evaluation Processes
R6: DPME should initiate and develop guidelines for rapid evaluative exercises which can be conducted when budgets are limited or time is short, and which can potentially be conducted by M&E units. This should include guidance on when internally conducted evaluations are appropriate.
R7: Programmes must be required to budget a percentage of the programme budget for evaluation or M&E, typically in the range 0.5-5% depending on the size of the programme (see the sketch after this list). Programme plans must include an evaluation cycle, as in the DPME Guideline for Planning.
R8: DPME and national departments should promote the sharing of evaluation plans across spheres of government so that evaluation resources can be pooled across departments for evaluations that examine similar programmes, or for cross-departmental evaluations.
R9: DPSA, with technical input from DPME, should develop clear requirements for specific evaluation staff, with competences, job descriptions and posts in standard M&E units. M&E units should have at least one evaluation specialist.
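A minimal sketch of the R7 budgeting rule, assuming an illustrative sliding scale in which smaller programmes reserve a larger share. The breakpoints below are my own assumptions; the recommendation itself only specifies the 0.5-5% range.

```python
# Hypothetical sketch of R7: reserve 0.5-5% of a programme's budget for
# evaluation/M&E, with smaller programmes setting aside a larger share.
# The size thresholds are illustrative, not part of the recommendation.
def evaluation_budget(programme_budget_rand: float) -> float:
    """Return an evaluation/M&E allocation within R7's 0.5-5% range."""
    if programme_budget_rand < 10_000_000:        # small programme
        share = 0.05                              # 5%
    elif programme_budget_rand < 500_000_000:     # medium programme
        share = 0.02                              # 2%
    else:                                         # large programme
        share = 0.005                             # 0.5%
    return programme_budget_rand * share

for budget in (5_000_000, 100_000_000, 2_000_000_000):
    print(f"R{budget:,} programme -> R{evaluation_budget(budget):,.0f} "
          f"for evaluation")
```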

18 Recommendations - Capacity Development
R10: DPME must strengthen its investment in capacity development, including working with Treasury and PSETA to ensure that budget is available for courses and learnerships, and with additional dedicated staff time to focus on capacity development.
R11: DPME to work with NSG, DPSA and SAMEA to ensure that suitable postgraduate courses and continuous professional development opportunities are available for evaluation professionals within the public sector (and the extended evaluation system).
R12: DPME to work with stakeholders to establish a Community of Practice for learning and sharing around evaluation for government. The CoP must focus on technical and experiential learning, not administrative or bureaucratic functions.
R13: The national Evaluation Technical Working Group should suggest how internal evaluations should be encouraged to promote learning, bearing in mind the need for independence in major evaluations.
R14: Build specific skills-transfer elements into Service Level Agreements with evaluation service providers.
R15: DPME needs to use both capacity development and procurement tools to ensure that emerging evaluators are brought into the system, and to encourage a broader variety of universities to participate in the system.

19 Recommendations - Managing & Tracking Evaluations
R16: DPME to work to strengthen the quality of foundational documents, including TORs. This requires expanding the training, refining the guideline, and applying the guideline more consistently.
R17: The management information system is the backbone of the NES and needs to be strengthened and used across all evaluations in government, not only the NEP. This will allow transparent monitoring of the state of the system, as well as extraction of data for reporting (a possible record structure is sketched after this list).
R18: DPME must use the results of this tracking to ensure that departments are following up on improvement plans and reporting to Cabinet, and to name and shame departments that are not doing so.
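One way to picture the tracking that R17 and R18 call for is a per-evaluation record carrying its NEP pipeline stage and improvement-plan progress. A minimal sketch follows, with hypothetical field names; the actual MIS schema is not described in this presentation, and the stage labels are taken from the NEP coverage slide above.

```python
# Hypothetical sketch of an MIS record for tracking an evaluation through
# the NES pipeline. Field names are illustrative assumptions, not the
# real MIS schema.
from dataclasses import dataclass
from enum import Enum

class Stage(Enum):
    CONCEPT = "Prep stage (concept)"
    TORS = "TORs (design)"
    RESEARCH = "Research underway"
    REPORT_APPROVED = "Approved report"
    CABINET = "Served at Cabinet"
    IMPROVEMENT_PLAN = "Improvement plan"

@dataclass
class EvaluationRecord:
    title: str
    department: str
    stage: Stage
    improvement_actions_total: int = 0
    improvement_actions_done: int = 0

    def improvement_progress(self) -> float:
        """Share of improvement-plan actions completed (R18-style reporting)."""
        if self.improvement_actions_total == 0:
            return 0.0
        return self.improvement_actions_done / self.improvement_actions_total

rec = EvaluationRecord("Example evaluation", "DBE",
                       Stage.IMPROVEMENT_PLAN, 12, 9)
print(f"{rec.title}: {rec.improvement_progress():.0%} of "
      f"improvement actions done")
```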

20 Recommendations - Strengthening Use through Communication and Improvement Plans
R19: DPME, provinces and departments need to allocate significant resources, both financial and human, for evaluation communication. This will ensure full value is obtained from the investment currently being made, and that stakeholders are aware of the findings. It will also help to build trust in government.
R20: DPME should hold some resources to be used during the improvement plan stage of NEP evaluations, to enable funding of exercises such as costing. The same would be beneficial for OTPs for provincial evaluations.
R21: DPME should develop mechanisms for tracking changes from evaluations beyond the current two years of the improvement plan. This would include later evaluations of programmes which have been revised following evaluations.

21 Updated Theory of Change
INPUTS: an evaluation unit is established within the DPME; funding is made available for departmental, national and provincial evaluations; DPME co-funds evaluations in the NEP; human resources (evaluation service providers and government officials); a capacity building plan is developed to enhance demand for evaluation, deepen understanding of evaluation and promote use; funding is available for capacity building.
ACTIVITIES: learning, sharing and advocacy to build stakeholder awareness, capacity and commitment; quality evaluations designed, implemented and supported; provide evaluation technical advice; facilitate the peer review process; provide post hoc quality assessment; develop guidelines, policies and standards; conduct evaluations; manage and resource the evaluation unit collaboratively; implement capacity building; develop DEPs, PEPs and policies.
OUTPUTS: the NEP and system are developed; effective systems for managing the evaluation of government programmes are developed and operational; the evaluation quality assurance system operates effectively; improvement plans are developed; findings in reports are compared to departmental plans and service delivery plans; evaluation and research evidence is shared; enhanced capacity of stakeholders to manage and use evaluation and research evidence; the impact of evaluations is monitored.
OUTCOMES: evaluation recommendations are funded; improvement plans are implemented; evaluation and research evidence informs changes to government interventions; affected stakeholders are involved extensively and consistently; evidence-based decision-making around resources is facilitated; success is identified and replicated.
IMPACTS: public services become more effective and poverty is eradicated; government and public accountability is improved.

22 Improvement Plan
Improvement Objective 1: the PM&E Bill incorporates evaluations as a mandatory component of the public administration system, to enable institutionalisation of evaluations in the public sector and SOEs, including linking evaluations with the planning and budgeting cycle. The National Evaluation Policy Framework is revised in line with this.
Improvement Objective 2: improved quality and range of evaluations through consistent application of strengthened processes, guidelines and tools to support evaluations across spheres of government and SOEs.
Improvement Objective 3: improved capacity in government to manage and undertake evaluations, with a wider pool and diversity of evaluation service providers.
Improvement Objective 4: evaluation reports are used as reliable sources of evidence and communication to inform planning and decision-making in and outside of government. Evaluation improvement plans are implemented and tracked.

23 Let's see what happens in the current policy environment. Dr Ian Goldman, CLEAR AA. Steering Committee: Dr Ian Goldman (DPME), Carol Nuga Deliwe (DBE), Stephen Taylor (DBE), Zeenat Ishmail (W Cape), Dr Laila Smith (CLEAR AA), Thokozile Masangu (DRDLR), Christopher Adams (Treasury), Gillian Wilson (Treasury), Dugan Fraser (ex-SAMEA), Annette Griessel (DOEW), Cara Waller (Twende Mbele), Siphesihle Dumisa (DPME). Evaluators: Alyna Wyatt and Jamie Robertsen of Genesis Analytics.