
INTERNATIONAL LABOUR OFFICE
GB.285/PFA/10
285th Session

Governing Body
Geneva, November 2002

Programme, Financial and Administrative Committee (PFA)

TENTH ITEM ON THE AGENDA

ILO evaluation framework: Evaluation within a strategic budgeting context

Contents

I. Introduction
II. Evaluation in a strategic budgeting context
   Independence and credibility
   Commitment to organizational learning
   Accountability and transparency
   Flexible focus and methodology
   Coordination of evaluation under all sources of funds
   Relevance, effectiveness and efficiency
III. Evaluation types and methods
   InFocus and other core programmes: Strategy design, implementation and impact
   Focused and thematic evaluation
   Technical cooperation project evaluation
   Decent work country programmes
IV. Capacities needed to implement the ILO's proposed evaluation framework
   Evaluation reporting, dissemination and systems development
   Resourcing

I. Introduction

1. In accordance with views expressed by the Governing Body, [1] the Office has moved forward on its framework for evaluation, emphasizing the importance of evaluation for decision-making, planning, and the design and implementation of programmes and projects.

[1] GB.279/PFA/8; GB.283/14; GB.283/9/1; Report of the Chairperson of the Governing Body to the Conference for the year 2001-02, Provisional Record No. 3, International Labour Conference, 90th Session, Geneva, 2002.

2. This paper sets out the evaluation systems that the ILO proposes to put in place within the strategic budgeting framework. The goal is to institute a comprehensive, coherent and transparent evaluation framework that contributes to more effective programme and project planning, monitoring and reporting. Based on the guidance the Governing Body provides, the Director-General plans to instruct the Office to develop the new systems, introduce measures to enhance the capacity of the Office to apply them, improve information sharing, and consolidate this with Office-wide circulars. This work would be carried out during the next three years.

3. The approach to evaluation proposed in this paper reflects the norms and standards recommended within the United Nations system, as set forth by the Secretary-General. [2] In line with these, the objective of evaluation in the ILO would be:

(a) to determine as systematically and objectively as possible the relevance, efficiency, effectiveness and impact of the ILO's activities in relation to their objectives;

(b) to enable the Office, member States and tripartite constituents to engage in systematic reflection, with a view to increasing the effectiveness of the main programmes by altering their content and, if necessary, their objectives.

[2] United Nations: Regulations and rules governing programme planning, the programme aspects of the budget, the monitoring of implementation and the methods of evaluation (19 Apr. 2000, doc. ST/SGB/2000/8).

II. Evaluation in a strategic budgeting context

4. The introduction of strategic budgeting has required a reconsideration of the role of evaluation within a results-based management approach. The core emphasis of evaluation in this context is on assessing organizational performance by monitoring progress made against intended outcomes, in line with a medium-term planning process (the strategic policy framework), a biennial strategic budget process, and annual implementation reporting.

5. Within the ILO's strategic budgeting framework, programming involves an integrated system consisting of four processes:

(a) Strategic policy framework. This is a medium-term perspective considering goals, priorities and the modalities for programme implementation.

(b) Strategic budgeting. This is a results-based budgeting process that is built upon measurable performance.

(c) Work planning and monitoring. This involves the systematic collection and reporting of information to track resources used, work produced and progress towards intended outcomes.

(d) Evaluation. This is done by analysing the determinants of performance and assessing how best to make improvements.

Figure 1. The ILO strategic budget programming cycle (stages: strategic framework, programme, work plan, implement, monitor, evaluate, report)

6. Strategic budgeting in the ILO is based on an iterative process through which insights gained are fed into the planning process at different organizational levels, with the primary goal of improving performance and strengthening the ILO as an institution. Its success depends on encouraging individuals within the ILO to think and act strategically at different management levels. Each unit is responsible for integrating this approach into its planning and for reporting key information on how well strategies are working and whether performance milestones are being achieved. Collectively, this transparency provides an important means of coordination across programmes, countries and regions, to develop coherence within the Decent Work Agenda.

7. A dynamic linkage between evaluation and other stages of programming is at the core of a results-based evaluation framework. As strategies are identified, measures of performance, monitoring methods and evaluation plans are integrated into the planning process to ensure that feedback will be transparent and well functioning.

8. Evaluation is one of several management tools that help to determine the success of a strategic approach. Other processes include external and internal audits, management audits and reviews of topics and programmes by Committees of the Governing Body.

9. All evaluations will share the goal of improving performance through the dissemination of sound practices and lessons learnt from experience. The Office proposes following a set of guiding principles to help achieve this aim.

Independence and credibility

10. The ILO is committed to ensuring the credibility and independence of evaluation. Independence and impartiality are maintained throughout evaluation planning, the development of terms of reference, the selection of evaluators and the actual execution of the evaluation. The separation of evaluation responsibility from line management functions for programme and project delivery, the transparency of the process, and the integrity and expertise of evaluators all contribute to ensuring the credibility of evaluation findings and recommendations. These practices are reinforced by the Director-General's commitment to promoting accountability and transparency, and to protecting evaluation processes from pressure and influence.

11. The Office will seek to ensure that evaluations are objective, independent and transparent, and that they are conducted by professionals with expertise in the subject matter. Credible evaluation also requires the involvement of stakeholders (direct and indirect clients or beneficiaries, tripartite constituents, staff, other parts of the United Nations system, and relevant partner groups) under the coordination of an independent evaluation team. These groups can participate in the evaluation design, support data gathering and analysis, and assist in the interpretation of findings, conclusions and recommendations. Stakeholders can also advise on effective dissemination and follow-up action.

Commitment to organizational learning

12. Evaluations systematically review what does or does not work well and identify improvements. They should be closely integrated with tools and technical support to promote learning and forward thinking. If well designed and implemented, evaluations will be highly valued for their usefulness in shaping decisions on strategies and the allocation of resources. A well-functioning feedback mechanism is critical to stimulating interest in and use of evaluation.

13. Evaluation reports will focus on findings, conclusions and recommendations for action. In addition, lessons learnt, both positive and negative, that can guide decision-making elsewhere will be disseminated widely. This can benefit policy-makers, programme managers, tripartite constituents, the larger United Nations and international development community, as well as researchers and the general public. The dissemination of results needs to be timely to support decision-making processes.

Accountability and transparency

14. Evaluation results should lead to action. Systems of accountability and reporting are needed to track whether there is effective follow-up of recommendations. The quality and relevance of the evaluation, and the degree to which key stakeholders have a sense of ownership of the assessment outcome, will influence the effectiveness of the follow-up. A key output of every evaluation should be a plan for follow-up action, prepared jointly by the evaluators and relevant decision-makers. Results and progress towards commitments should be reported and recognized.

Flexible focus and methodology

15. Evaluations are structured to answer specific decision-making needs. A variety of methodologies may be adopted, including both qualitative and quantitative methods. Qualitative methods provide descriptive and in-depth information gathered from stakeholders with different viewpoints. Quantitative information, where it is adequate and reliable or is deemed critical for meeting the objectives of the evaluation, can support summarization, comparisons and generalizations.

Coordination of evaluation under all sources of funds

16. Technical cooperation projects provide an important complement to the development of programmes funded under the ILO's regular budget. The integration of regular budget activities with those funded by extra-budgetary resources is a critical factor for ensuring coherence in the ILO's substantive programmes and is essential for advancing the Decent Work Agenda. This can be supported by linking the evaluation and reporting of outcomes at the project, programme and country levels. Programme evaluation will build on the performance outcomes and lessons learnt that are documented in project-level evaluations.

Relevance, effectiveness and efficiency

17. Evaluation will provide opportunities to reassess programmes in the light of new policies and priorities. It is one of several tools for assessing the continued relevance, effectiveness and efficiency of programmes and determining whether to downscale or phase out outdated or ineffective programmes or programme components. There are ongoing initiatives to establish an integrated approach to ILO policy-making and programming, to decentralize ILO programmes and to embed cross-cutting objectives in the ILO's work. The ways in which different units in the Organization respond to these initiatives should be captured through the evaluation process.

18. The cost of evaluations should constitute a minor portion of the budget of the subject being evaluated, and their benefit should outweigh their cost.

III. Evaluation types and methods

19. As appropriate, evaluation responsibility will be decentralized to increase both the efficiency and the effectiveness of evaluation processes and outcomes. A centralized capacity will remain within the Bureau of Programming and Management to provide Office-wide guidance on evaluation issues, the elaboration of guidelines and procedures, and the coordination of evaluation planning. This will be complemented with outside expertise and in-house experience.

20. Evaluations will continue to take a variety of forms, each of which addresses decision-making needs at various organizational levels. Taken together, these evaluations ensure that all ILO programmes, projects and activities will be subject to evaluation. Major evaluations to be submitted to the Governing Body will be announced in each programme and budget. A summary of proposed evaluations by type is given in table 1 below.

Table 1. Typology and characteristics of evaluation

Project evaluation
- Focus: relevance, effectiveness, sustainability and efficiency (actual performance against planned); findings, recommendations and lessons learnt.
- Primary responsibility: managers responsible for the successful implementation of projects (progress reporting); monitoring and independent evaluations are the responsibility of the unit to which project managers report.
- Form: independent evaluation and results-based reporting.
- Timing: annual, or as set in the project's monitoring and evaluation plan.

Programme/thematic evaluation
- Focus: coherence, effectiveness and strategic focus of ILO programmes (actual performance against planned); findings and recommendations to improve programming.
- Primary responsibility: for InFocus Programmes, PROGRAM; for themes and special programme issues, the implementing unit(s) or senior management, with oversight from PROGRAM, or from CODEV for technical cooperation issues.
- Form: independent and results-based reporting at programme level.
- Timing: annual, coordinated with the programme and budget and the implementation report; InFocus Programmes evaluated independently every four years.

Country evaluation
- Focus: integration of the ILO Decent Work Agenda with country-level priorities; coherence of country programming with the ILO strategic framework.
- Primary responsibility: country directors or decent work teams for independent evaluations; PROGRAM for coordination and oversight.
- Form: internal review, subject to independent evaluations.
- Timing: ongoing, coordinated with the programme and budget process.

PROGRAM: Bureau of Programming and Management. CODEV: Development Cooperation Department.

InFocus and other core programmes: Strategy design, implementation and impact

21. Beginning in November 2002, the Office will submit annually to the Governing Body two evaluations of InFocus Programmes, so that all eight will be subject to independent evaluation over a four-year period.

22. The programme evaluation methodology will include analysis of the focus and approaches used by programmes and selected units, together with their reported outcomes, to determine where performance is on track and where potential for improvement exists. Programme evaluations complement the ongoing performance-monitoring process by developing a more in-depth understanding of how well programmes achieve intended outcomes.

23. More specifically, programme evaluation components will include aspects of the following:

(i) how well programme managers and staff institutionalize processes to improve their capacity to develop and implement effective strategies;

(ii) how the strategies achieve designated outcomes, with particular emphasis on the efficiency, effectiveness and sustainability of results, as determined by the ILO and other key constituents, partners and beneficiaries of the programme;

(iii) how programme direction and approaches can be improved to achieve higher levels of performance.

24. Specific areas of assessment. The assessment of how effectively a programme is achieving its intended results can be approached from several directions.

25. Appropriateness of programme design. Sound design is a critical determinant of favourable performance. Assessment will focus on how operational objectives have been translated into specific strategies, whether strategies have incorporated analysis of external environments and internal capacities, how well programme design defines and integrates stakeholder needs and partner links, and the validity and effectiveness of strategies in achieving intended outcomes.

26. How well management makes use of performance information. Programme management monitors and evaluates its performance on an ongoing basis. Feedback, primarily in the form of indicators and qualitative reports, provides the means for regularly assessing how actual outcomes compare with what was targeted. Programme management is responsible for seeing that this performance information is analysed and used to make improvements. Evaluation can provide an objective assessment of how well the feedback process is performing.

27. Programme managers use performance measurement as a tool to monitor progress towards meeting their objectives. As programmes evolve, the indicators used for measurement may become inappropriate or their interpretation more complex. Programme evaluators will work with programme management to assess the appropriateness and relevance of the performance measurement system, and of specific indicators within the programme monitoring system. Special efforts may be required to assess the extent to which particular outcomes can be attributed to programme influences.

28. Causal links and impact. Programme evaluations can also review evidence to assess the plausibility of assumed causal links between programme outputs and intended outcomes. Where the links are found to be weak, programme evaluations can help to identify alternative means of measurement.

29. Gender focus and other cross-cutting objectives. The ILO is committed to mainstreaming gender equality. Evaluation will focus on the substantive nature of programme management performance, gender analysis and planning within technical programming, and human resource staffing and policies in terms of gender equality and mainstreaming.

30. Evaluations will also assess the extent to which the ILO's commitment to cross-cutting issues such as poverty reduction and social inclusion is integrated and promoted within programmes. They will be used to promote policy coherence within the Decent Work Agenda.

Focused and thematic evaluation

31. On a periodic basis, strategic and thematic evaluations review policy issues that arise as a result of the ILO's evolving policy agenda. These may be organized in a variety of ways to cover, for example, clustered projects, programmes or geographical areas. Strategic and thematic evaluations require a high level of specialized expertise as well as close coordination with senior staff seeking evaluation feedback. For a number of years the Governing Body Committee on Technical Cooperation has reviewed annually, at its March session, thematic evaluations of selected substantive areas. Appropriate arrangements will therefore be made for the continued preparation of thematic evaluations related to technical cooperation.

Technical cooperation project evaluation

32. Building on existing ILO evaluation methodologies and capacities established for technical cooperation funding, the Office will apply standardized ILO methodologies for the design, monitoring and evaluation of technical cooperation projects throughout the Office. This will be facilitated by the further development of tools and guidelines, training, the selection of outside evaluators, and the establishment of systems for tracking various elements in the process.

33. The criteria for evaluation will continue to include the relevance, efficiency, effectiveness, impact and sustainability of a technical cooperation project. Projects will be assessed from the perspective of country frameworks, particularly in terms of support for integrated programmes at the country level and responsiveness to the needs and priorities of ILO constituents. Projects will also be assessed to see how they fit into an integrated strategic budget.

34. The Office will also analyse technical cooperation evaluation reports for lessons learnt. These will contribute to Office-wide databases and dissemination systems for evaluation results, best practices and reports. An important task will be to introduce relevant elements from evaluation results into the ongoing implementation process and into the design of future programmes and projects. The Office will report regularly to the Governing Body on technical cooperation evaluation activities, in collaboration with technical and field units, and will continue to propose thematic evaluations on technical cooperation issues.

35. All technical cooperation projects will be subject to evaluation which, depending on the project and the evaluation plan established for it, will take the form of self-evaluation, independent internal evaluation, external evaluation, or a combination of these forms. To assess the longer-term effectiveness, impact and sustainability of major programmes and projects, ex post evaluations should be carried out on a selective basis. However, to date the budgets of projects financed from extra-budgetary resources have, with a few exceptions, not included provisions for ex post evaluations. New approaches will be made to the donor community to address this issue within the larger context of resourcing for technical cooperation, discussed later in this paper.

36. Rules will continue to be established on the timing and nature of project evaluations. At present these rules provide that projects of under 18 months' duration have a final evaluation upon completion; projects with a duration of 18 to 30 months have a mid-term evaluation and a final evaluation upon completion; and projects of over 30 months' duration have annual reviews and a final evaluation upon completion. All technical cooperation programmes or projects with a budget of over US$350,000 are subject to annual self-evaluations, and evaluations are required before starting a new phase, should there be any. An independent evaluation will be carried out at least once during the programme or project cycle.
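The timing rules in paragraph 36 can also be read as a simple decision procedure. The following Python sketch is purely illustrative and is not part of the ILO framework or any ILO system; the function and parameter names (required_evaluations, duration_months, budget_usd, has_new_phase) are assumptions introduced only for this example.

# Illustrative sketch only: restates the paragraph 36 timing rules for
# technical cooperation project evaluations. All names are hypothetical.
def required_evaluations(duration_months, budget_usd, has_new_phase=False):
    evaluations = []

    # Duration-based rules.
    if duration_months < 18:
        evaluations.append("final evaluation upon completion")
    elif duration_months <= 30:
        evaluations += ["mid-term evaluation", "final evaluation upon completion"]
    else:  # over 30 months
        evaluations += ["annual reviews", "final evaluation upon completion"]

    # Budget rule: annual self-evaluations above US$350,000.
    if budget_usd > 350_000:
        evaluations.append("annual self-evaluations")

    # An evaluation is required before starting any new phase.
    if has_new_phase:
        evaluations.append("evaluation before the new phase")

    # At least one independent evaluation during the programme or project cycle.
    evaluations.append("independent evaluation (at least once per cycle)")
    return evaluations

# Example: a 24-month project with a US$500,000 budget.
print(required_evaluations(24, 500_000))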

Decent work country programmes

37. Work has been under way over the past few years to enhance coordination between headquarters and the field. This was launched by the common programming framework outlined by the International Labour Conference in June 1999. [3] Joint programming exercises have been used to plan joint activities, and these have been reinforced by the establishment of decent work teams. Moreover, new initiatives are under way to ensure that the development of overall objectives, strategies and programmes is coordinated across projects, countries and programmes, in particular in the preparation of the Programme and Budget proposals for 2004-05.

[3] ILO: Decent work, Report of the Director-General, International Labour Conference, 87th Session, Geneva, 1999.

38. In future, the ILO will strengthen the country-level perspective in its strategic planning through the wider establishment of decent work country programmes. Country programme evaluations can assess the appropriateness of country-specific programming and guide understanding of the processes needed to support better-integrated country-level goals and strategies.

IV. Capacities needed to implement the ILO's proposed evaluation framework

Evaluation reporting, dissemination and systems development

39. Evaluation reports are a primary means of supporting decision-making and the dissemination of lessons learnt. Database systems will support the comparison of planned performance with actual outcomes over a period of years, especially once the Enterprise Resource Planning (ERP) system is operational. These comparisons can prompt timely decisions to modify programmes during implementation, or to explore issues on a deeper level.

40. The implementation of the ILO's evaluation strategy requires the development of key capacities, including an evaluation monitoring database and a website or newsletter for disseminating evaluation findings and lessons learnt. Performance-monitoring and evaluation outcomes need to be submitted through a well-defined and well-maintained feedback process.

41. Feedback systems will be institutionalized and systematized, and dissemination approaches developed for key target audiences. Those evaluated will be held accountable for acting on findings, conclusions and recommendations.

Resourcing

42. The evaluation framework described in this paper requires a substantial strengthening of the ILO's capacity to organize evaluations and to ensure that the lessons of those evaluations are taken into account. This will necessarily be an Office-wide effort.

43. Part of the work that is required is the development of guidance and related training throughout the Office. The paper on the use of the 2000-01 surplus [4] outlines plans for a one-time investment in the establishment of evaluation systems and the training of staff in the technical programmes and regions. This can also be reinforced through new management training initiatives, for example. Governments and employers' and workers' organizations will be invited to make available their expertise to help with the further development of evaluation methodologies, and to provide independent evaluators for specific evaluations.

[4] GB.285/PFA/9.

44. One area in which additional resources are needed relates to the evaluation of technical cooperation. This is linked to wider needs in terms of the improved design of technical cooperation, including quality control and an appraisal system. The 2000-01 surplus can help to cover one-time development costs. However, it will not be sufficient for ongoing training and feedback. It is therefore proposed to negotiate with donors a procedure under which a part of the evaluation resources now built into project budgets would be reserved for a central project evaluation capacity.

45. The core resources for evaluation in the ILO do not permit the creation of a large central unit. This may be an advantage to the extent that technical programmes and regions take ownership of evaluation responsibilities. A small central oversight capacity can be efficient and effective in ensuring the independence of evaluations and encouraging the spread of lessons throughout the Office.

46. The Committee, in the light of its discussion, may wish to recommend to the Governing Body that it request the Director-General to apply the evaluation framework set out in this paper in the ILO's future work.

Geneva, 1 October 2002.

Point for decision: Paragraph 46.