EU Investment Evaluation Capacity Building in Lithuania: Achievements and Guidelines

The publication was prepared by Visionary Analytics JSC together with the Ministry of Finance of the Republic of Lithuania.
Editor: Ina Bachova
Layout designer: Ieva Sargevičienė
Printed by ST Impress JSC
The publication was financed by the European Social Fund under the Operational Programme for EU Structural Funds Investments for 2014–2020.

Evaluations: why do we need them and how many have been carried out?

The aim of evaluation of European Union (EU) investment is to improve the quality of EU investment and to boost the implementation and impact of operational programmes. In total, 63 evaluations provided for by the annual evaluation plans have been carried out. Lithuania carries out more evaluations than the other Baltic States, but fewer than Poland or the Czech Republic, both of which are leaders in Central and Eastern Europe (CEE) in terms of the number of evaluations.

Evaluation aims to improve the implementation and impact of public policy interventions. There are three types of evaluation: ex-ante, ongoing and ex-post. Ex-ante evaluations assess the relevance of programmes before implementation. Ongoing evaluations focus on monitoring the implementation of programmes and assessing whether, and what kind of, changes they need. Ex-post evaluations measure the impact of programmes. Like other similar EU-12 countries, Lithuania has mostly carried out evaluations of the ongoing type.

[Figure: Number of evaluations per 1m inhabitants and per €1b of EU investment in the Czech Republic, Poland, Lithuania, Estonia, Slovenia, Latvia and Romania. Source: Visionary Analytics, Evaluation Capacity Building Study and Action Plan.]

[Figure: Shares of ongoing, ex-ante and ex-post evaluations in Lithuania and in CEE countries (%). Source: Visionary Analytics, Evaluation Capacity Building Study and Action Plan.]
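To make the cross-country comparison concrete, the sketch below shows how the two normalised indicators in the first figure are derived. Apart from Lithuania's 63 evaluations, which are mentioned in the text, the country figures here are illustrative placeholders, not the study's data.

```python
# Normalisation behind the cross-country chart: evaluations per 1 million
# inhabitants and per EUR 1 billion of EU investment.
# Population and investment figures are placeholders, not the study's data.

countries = {
    # name: (number of evaluations, population, EU investment in EUR)
    "Lithuania": (63, 2_900_000, 6_800_000_000),
    "Poland": (450, 38_000_000, 77_000_000_000),
}

for name, (evaluations, population, investment) in countries.items():
    per_million_inhabitants = evaluations / (population / 1_000_000)
    per_billion_invested = evaluations / (investment / 1_000_000_000)
    print(f"{name}: {per_million_inhabitants:.1f} evaluations per 1m inhab., "
          f"{per_billion_invested:.1f} per EUR 1b invested")
```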

Capacity building: why do we need it?

Evaluation capacity building (ECB) is a precondition for delivering high-quality and useful evaluations. ECB activities contribute to evidence-based governance and encourage the use of evaluation results in decision-making. In order to achieve systemic change, capacity building activities are implemented at four levels: personal, institutional, inter-institutional and societal.

[Figure: ECB projects target four levels, from the narrowest to the broadest target group: civil servants (knowledge, capacities and attitudes of civil servants); institutions (organisational structures, procedures and rules of evaluation); inter-institutional structures (structures and procedures of institutional cooperation, i.e. the Evaluation Coordination Group); and society (attitudes of the general public and stakeholders to evaluation; structures and procedures encouraging them to participate in evaluation processes and use evaluation results). Together these lead to sustainable evaluation practices, high-quality evaluations and evidence-based governance.]

Evaluation capacity has been systematically built in Lithuania since 2005. For this purpose, the Lithuanian Ministry of Finance has been organising projects aimed at strengthening evaluation capacity at all four levels. Such a systemic approach to ECB in Lithuania is innovative compared to the more sporadic implementation of ECB measures in other CEE countries. Between 2005 and 2008, the first ECB projects were oriented towards creating an evaluation system and facilitating its institutionalisation. This publication presents the ECB measures implemented from 2008 onwards; in this period, ECB measures were aimed at further strengthening the evaluation system and implementing related innovation. The need for continuous ECB also stems from the high turnover of civil servants responsible for the coordination of evaluations.

Evaluation practice, quality and impact: what is the evidence of improvement and what are future challenges?

Since 2007, substantial progress has been achieved in terms of the quality of evaluation. ECB measures have proved helpful in responding to (or at least mitigating) the challenges of the evaluation system that were identified by the ECB feasibility study carried out in 2009.¹ However, to achieve a superior quality of evaluation in the future, more ambitious goals and challenges should be set.

Sustainability of evaluation practices

Lithuania already had an appropriate legal framework and an institutionalised evaluation system in 2009, but the implementation of evaluations was not sufficiently effective. There were many practical challenges: limited budget-setting skills, little knowledge of how to identify the scope and questions of evaluation, and poor evaluation quality assurance. Civil servants had little experience with evaluation. Furthermore, coordinating evaluations was one of multiple functions for most civil servants responsible for evaluation, while at the same time a high turnover of civil servants obstructed the accumulation of evaluation experience. Studies from 2013 and 2015 show that evaluation practices were eventually institutionalised and improved: the quality of terms of reference improved, evaluations were increasingly in line with the expectations of contracting authorities, and the quality of evaluations matched their financial value.²
ECB measures significantly contributed to the improvement of evaluation practices: methodological guidelines facilitated the introduction of unified evaluation procedures; through training, civil servants gained the knowledge necessary for a smooth implementation of the evaluation project cycle; and discussions, conferences and publications promoted an evaluation culture and shared values among institutions.

Ensuring the sustainability of civil servants' evaluation capacity will remain a challenge in the future. About half of the members of the Evaluation Coordination Group changed jobs between 2007 and 2016, although the majority of them continued working in the public sector. Therefore, strengthening and retaining the evaluation capacity of civil servants responsible for national policy evaluation should be a priority in order to manage the negative outcomes of high staff turnover in the civil service. Not only would this help to strengthen and maintain the existing evaluation capacity, it would also contribute to national policy evaluation capacity building and to the promotion of evaluation culture in general. This will be especially relevant during the new EU programming period after 2020.

¹ ESTEP, VPVI, Feasibility Study on EU Structural Fund Evaluation Capacity Building, 2009.
² VPVI, ESTEP, Evaluation of the Quality of EU Structural Fund Evaluations, 2013; Visionary Analytics, Evaluation of the Use of Evaluation Results, 2015.

Quality of evaluations

The 2009 ECB feasibility study concluded that the quality of evaluations could be improved. Inadequate selection criteria for the technical proposals submitted by service providers, limited competition between service providers and the use of evaluations for formal accountability purposes³ all had a negative impact on the quality of evaluation.

A study carried out in 2013 found that the overall quality of evaluations had improved. The general quality index of evaluation reports, calculated by meta-analysis, was 0.81 (the maximum score being 1), while contracting authorities rated almost 90% of all evaluation projects as being of high or superior quality.⁴ However, defining a proper scope and questions of evaluation, as well as ensuring the validity of conclusions and recommendations, still remained a challenge. These challenges were addressed by ECB measures: drafting methodological guidelines, organising training for civil servants, establishing evaluation standards, etc.
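The publication does not spell out how the quality index is aggregated; the sketch below assumes a simple average of per-report criterion scores normalised to the 0–1 range. The criterion names and scores are hypothetical, not the actual rubric of the 2013 meta-analysis.

```python
# Hypothetical quality index: each evaluation report is scored against a set
# of criteria on a 0-1 scale; the general index is the mean score across all
# reports. Criterion names and scores below are illustrative only.

reports = [
    {"scope": 0.9, "methods": 0.8, "validity_of_conclusions": 0.7},
    {"scope": 1.0, "methods": 0.9, "validity_of_conclusions": 0.8},
]

def report_score(scores: dict[str, float]) -> float:
    """Average criterion score for a single report (0 = worst, 1 = best)."""
    return sum(scores.values()) / len(scores)

general_index = sum(report_score(r) for r in reports) / len(reports)
print(f"General quality index: {general_index:.2f}")  # maximum is 1.00
```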
A 2016 survey of contracting authorities⁵ indicates that the quality of evaluations has improved further: significant progress has been achieved in the areas that were considered problematic in 2013. However, inter-institutional differences in evaluation capacity and the quality of evaluation remain. This issue is difficult to tackle systematically, as the majority of institutions carry out only a limited number of evaluations over a programming period, which does not encourage learning by doing. Furthermore, targeted ECB measures should be planned according to foreseen ECB needs, e.g. in line with the ongoing evaluation cycle and its specificities. The spread of an evidence-based culture in public discussions and decision-making would promote further qualitative development of the evaluation system and create enabling conditions for capacity building for civil servants from all ministries.

[Figure: Evaluations planned and implemented by ministries (according to evaluation plans): Ministry of Finance, Ministry of Interior, Ministry of Health, Ministry of Economy, Information Society Development Committee, Ministry of Transport and Communications, Ministry of Environment, Ministry of Education and Science, Ministry of Social Security and Labour. Source: Visionary Analytics, Evaluation Capacity Building Study and Action Plan.]

Impact of evaluations

The 2009 ECB feasibility study⁶ pointed out that a lot of evaluations were being carried out for formal accountability purposes. Decision-makers gave limited attention to these evaluations, as their results seldom provided strong evidence to support problem-solving at that time. These challenges were addressed by ECB measures: training, international evaluation conferences, and meta-analyses of the implementation of recommendations and their impact.

A 2015 analysis of the use of evaluation results⁷ showed that the majority of recommendations provided in evaluation reports were implemented. This helps to improve measures, the monitoring system and reporting. A slightly smaller share of evaluations contributes to the review of political priorities and policy measures. Going forward, it is important to ensure that evaluations not only encourage better administration and implementation, but also lead towards better policies: an increasing number of evaluations should enable problem-solving in the respective policy areas.

Stakeholder participation in evaluation and communication to the public

The participation of stakeholders in evaluation and evidence-based discussions involving the general public were, and still remain, a significant challenge. In response, evaluation reports are made public and stakeholders are increasingly involved in evaluation monitoring groups. A number of ECB measures address this challenge, including publications, online and printed articles, conference presentations and interviews with evaluation experts. In addition, civil servants have been trained to communicate evaluation results to the general public. A more active application of participatory evaluation, or at least of its elements, will be relevant in the future. A paradigm shift in communication with the general public is also important: a transition from the mere publicity of evaluation results to communication and dialogue.

³ ESTEP, VPVI, Feasibility Study on EU Structural Fund Evaluation Capacity Building, 2009.
⁴ Ibid.
⁵ Visionary Analytics, Evaluation Capacity Building Study and Action Plan.
⁶ ESTEP, VPVI, Feasibility Study on EU Structural Fund Evaluation Capacity Building, 2009.
⁷ Visionary Analytics, Evaluation of the Use of Evaluation Results, 2015.

Capacity building measures: what works, what doesn't and why?

Studies and meta-evaluations

Measures implemented:
- Feasibility Study on EU Structural Fund Evaluation Capacity Building (2009)
- Overview of EU Structural Fund Evaluations: Recommendations for the Use of EU Structural Funds (2010)
- Overview of SPD Evaluations (2010)
- Evaluation of Recommendations of EU Structural Fund Evaluations
- Evaluation of the Quality of EU Structural Fund Evaluations
- Evaluation of the Results of EU Structural Fund Evaluations
- Evaluation of the Use of Evaluation Results

Aims: improving ECB measures, accumulating knowledge, and providing recommendations on how to increase the quality and impact of evaluation.

Benefits: Such studies enable monitoring the quality of evaluation and the extent of the implementation of recommendations, as well as identifying and systematically improving weak spots within the evaluation system. They also enable strengthening the implementation of recommendations. Studies and research provide codified knowledge that addresses the big questions: is the quality of evaluation increasing, how are evaluation recommendations being implemented, etc.

Challenges: Efforts to assess the quality and impact of evaluation encounter inevitable methodological constraints: is there an objective criterion for the quality of evaluation, how can the impact of evaluation be monitored, and what are the benefits of this information?

Future directions: Other tools, such as consultations, discussions and the analysis of monitoring data, have the potential to contribute to more effective and more targeted improvements in the quality of the evaluation system. It is also important to ensure that evaluations contribute to cumulative knowledge building, i.e. that evaluations build on information from previous evaluations and expand the horizon of knowledge on what works, what does not work and under what conditions. This requires a continuous assessment of accumulated knowledge, remaining knowledge gaps and lessons learnt: which recommendations from previous evaluations have been successfully implemented and which have failed, which evaluation methods have produced good results, etc.

Methodological documents

Aims: evaluation standards, methodological knowledge and the smooth implementation of evaluation projects.

Benefits: Methodological guidelines are an important source of knowledge for civil servants, especially for those for whom evaluation is not a routine task. Methodological documents are often used for planning evaluations and ensuring the proper application of suitable methods. They also support the institutionalisation of evaluation practices by establishing evaluation standards and procedures common to all institutions.

Challenges: Knowledge management is an important challenge: not all civil servants use methodological guidelines, and some do not know where to find relevant information. The revision of methodological documents faces a dilemma: either information from previous guidelines is duplicated, or the documents become fragmented. Civil servants are in constant need of practical tools and blueprints that can be directly applied in their daily tasks.

Future directions: After the decision to no longer publish separate methodological documents, it seems relevant to create an easy-to-use and comprehensive knowledge bank storing all methodological guidelines, practical tools, examples, etc. A unified and up-to-date e-platform would allow civil servants to find all necessary information quickly and efficiently. Due to the high turnover of civil servants responsible for evaluation, practical tools such as blueprints, forms or questionnaires are becoming increasingly relevant. These tools would allow even those who have little or no experience with evaluation to achieve positive results.
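As a sketch of the knowledge-bank idea, the record structure below shows one way entries in such an e-platform could be organised so that guidelines, blueprints and examples can be found quickly. The field names, tag vocabulary and lookup function are assumptions for illustration, not a specification from the publication.

```python
# Hypothetical knowledge-bank record for the proposed e-platform.
# Field names and tags are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class KnowledgeBankEntry:
    title: str
    kind: str                 # e.g. "guideline", "blueprint", "form", "example"
    stage: str                # evaluation cycle stage the entry supports
    tags: list[str] = field(default_factory=list)
    url: str = ""             # link to the full document or template

entries = [
    KnowledgeBankEntry(
        title="Drafting the terms of reference",
        kind="blueprint",
        stage="preparatory",
        tags=["terms of reference", "planning"],
    ),
]

def find(entries: list[KnowledgeBankEntry], tag: str) -> list[KnowledgeBankEntry]:
    """Simple tag-based lookup, so even civil servants with little evaluation
    experience can quickly find a ready-to-use template for their task."""
    return [e for e in entries if tag in e.tags]

print([e.title for e in find(entries, "planning")])
```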

Training

Training organised:
- Preparing and implementing EU structural support evaluation projects: focus on quality (2009)
- Budget-setting for EU structural support evaluation projects (2009)
- Types of public procurement procedures and specifics of their practical implementation (2009)
- Planning EU structural support evaluations and drafting the terms of reference (2010)
- Competence management as an ECB measure (2009)
- Planning and implementing internal evaluations (2009)
- EU structural support evaluation methods and their application (2009)
- Evaluation networks and networking as ECB measures (2010)
- Deepening knowledge on EU structural support evaluation methods: Parts 1 and 2
- Theory-based evaluation and contribution analysis (2012)
- Counterfactual and experimental evaluation methods (2013)
- Possibilities of applying quantitative methods in evaluation (2013)
- Seminar-discussion on methodological guidelines for the evaluation of EU investment (2015)
- Increasing the use of EU Structural Funds evaluation results (2009)
- Using evaluation as a tool for justifying changes in operational programmes (2010)
- Using EU Structural Funds evaluation as a tool for public policy improvement (2010)
- Three study visits to Portugal, Belgium and Ireland

Aims: high-quality planning of evaluations, better use of evaluation results, comprehensive methodological knowledge and application capacities, and learning from foreign good practices.

Benefits: Training is one of the main ECB measures for civil servants. It is a quick and effective way for civil servants to gain the knowledge necessary for introducing evaluation innovation or addressing weaknesses of the evaluation system.

Challenges: Knowledge gained through training is quickly lost if it is not put to use, i.e. if it is not applied in initiating and coordinating evaluations or in promoting the use of evaluation results. This is especially relevant for civil servants in ministries that commission very few evaluations. Adjusting training programmes to suit different levels of civil servants' knowledge and experience is also challenging: each civil servant has different needs, which are difficult to meet in training standardised for all.

Future directions: Applying various training methods (e.g. mentorship, targeted or systemic training courses for beginners) that are better suited to the particular needs of individual civil servants. Furthermore, it is necessary to ensure that all new civil servants who start working with evaluation are systematically introduced to evaluation standards and practices, while more experienced civil servants keep abreast of evaluation-related innovation. If systematically organised, training can help to reduce the stark differences between institutions in terms of the number, quality and impact of evaluations.

Conferences

Conferences organised:
- Evaluation of EU Structural Funds: Reinforcing Quality and Utilisation (March 2009, Vilnius)
- What's New and What Works in the EU Cohesion Policy 2007–2013: Discoveries and Lessons for 2014–2020 (March 2011, Vilnius)
- Cohesion Policy 2014–2020: Towards Evidence-Based Programming and Evaluation (4–5 July 2013, Vilnius)
- Evaluation Results for Decision-Making: Use, Challenges and Examples (May 2015, Vilnius)

Aims: better use of evaluation results and evaluation innovation.

Benefits: Conferences encourage the spread of evaluation culture and establish links with foreign evaluation societies, which strengthens the Lithuanian evaluation community and helps to develop its attitudes towards evaluation. Conferences are also a source of methodological and organisational innovation in evaluation.

Challenges: Conferences could be used more actively to communicate the benefits of evaluation to decision-makers, as they are usually focused on discussing evaluation results. A prestigious international conference in Vilnius also faces a lot of competition from other countries, where similar events are organised by well-established evaluation societies.

Future directions: The concept of the evaluation conferences should be clarified. On the one hand, the target group could be expanded within Lithuania by adjusting the conference programme to meet the needs of decision-makers and by inviting more civil servants, responsible not only for EU investment evaluation but also for budget programme evaluation. On the other hand, it is also possible to focus on a professional audience of evaluation experts by organising international conferences; in this case, it is worth considering joint conferences with Poland or the other Baltic States, since multiple conferences organised in CEE countries compete for the same speakers and guests, which benefits no one.

Evaluation capacity building in the future: what are the priorities?

Priority areas for evaluation capacity building until 2020 are planned based on the need to improve the evaluation system. Priority topics are outlined for each stage of the evaluation project cycle:

Preparatory stage:
- Establishing the need for evaluation and drafting the terms of reference: how to anticipate the need for a (particular type of) evaluation, and how to set appropriate evaluation questions?
- Public procurement: how to evaluate the quality of the technical proposals received?

Implementation stage:
- General and specific methodological skills: how to ensure that appropriate methods are chosen and used?
- Project management skills: how to ensure efficient and effective cooperation between evaluators and contracting authorities in quality assurance?

Knowledge application and dissemination stage:
- Communication to the public: how to communicate evaluation results in an attractive way, and how to foster a culture of evidence-based decision-making?
- Application of knowledge in the decision-making process: how to ensure the ownership of knowledge generated through evaluation, and how to distribute and apply this knowledge in institutions and among decision-makers?

Communicating and discussing evaluation results with the general public remains a priority. It is not enough to publish evaluation reports online and to use publicity measures to inform the public about the benefits of evaluation. Successful communication requires capable civil servants and a clear communication system. This would contribute to a better use of results in solving key issues in the respective areas.

Evaluation publicity tools

Tools used: leaflets, press releases, speeches at conferences, articles, publications, interviews and discussions. Topics covered: presentation of the evaluation system, evaluation value and the use of results, presentation of evaluation system developments, and the future of evaluation.

Benefits: Publicity introduces the need for, and benefits of, evaluation to the general public. It presents Lithuanian good practices and facilitates their dissemination within international evaluation societies.

Challenges: Dialogue with the public about evaluation results remains a key challenge. The current evaluation publicity tools are too narrow and fragmented to respond to it. Instead of merely promoting the benefits of evaluation, it is important to create a system that would allow the results of each evaluation to be communicated in a way that speaks to the general public.

The following four principles should be applied when implementing ECB measures in the future:
1. Strengthening knowledge management: developing measures that facilitate a change of attitudes towards evaluation (know-why), help to find relevant information efficiently (know-who, know-where), and support civil servants in applying the knowledge acquired through training and methodological guidelines;
2. Actively introducing principles of self-learning, whereby learners themselves determine what they want to learn, use various sources of information (not limited to the lecturer), and learn in different contexts (e.g. on-the-job) and locations (not limited to training seminars);
3. Systemising the acquisition of knowledge: applying different forms of learning (an evaluation academy and mentoring for beginners; themed and targeted training, discussions and exchanges of good practice for experienced civil servants); selecting participants appropriate to the type of training; and organising refresher courses to discuss the practical use of acquired skills;
4. Increasing the accessibility and applicability of methodological knowledge: creating and developing an e-platform that provides up-to-date information, practical examples, links to evaluation reports, etc.
