IAS EVALUATION POLICY May 2011

Table of contents

Background
Purpose of the evaluation policy
Evaluation definition
Evaluation objectives
Evaluation scope
Guiding principles for the evaluation function
Evaluation management

Background

Founded in 1988, the International AIDS Society (IAS) is the world's leading independent association of HIV professionals, with over 16,000 members from more than 196 countries working at all levels of the global response to AIDS. IAS members include researchers from all disciplines, clinicians, public health and community practitioners on the frontlines of the epidemic, as well as policy and programme planners. The IAS is the custodian of the biennial International AIDS Conference and lead organizer of the IAS Conference on HIV Pathogenesis, Treatment and Prevention; the two conferences alternate, each taking place every two years. In addition, the IAS has initiated several projects to support the professional development of key stakeholders from resource-limited countries and to leverage the knowledge, experience and influence of its members to advocate for the policy changes and political commitments necessary to end the AIDS epidemic. The IAS is also committed to providing assistance to regional AIDS societies/networks and conferences.

To successfully achieve this mission, evaluation has become an integral part of the IAS strategy. Since 2004, all IAS-convened conferences have been systematically evaluated, and since May 2008 the IAS secretariat has had a full-time staff member devoted to evaluation. Given the growing internal and external demand for evaluation and the widening scope of the evaluation function at the IAS [1], it is important for the IAS to have its own evaluation policy.

Purpose of the evaluation policy

The purpose of the evaluation policy is to ensure that the IAS has timely, strategically focused and objective information on the performance and impact of its conferences, projects, initiatives and strategies, so that it can better achieve its goals. The policy aims to foster a common institutional understanding of the evaluation function at the IAS, and to further strengthen evidence-based decision-making and advocacy, transparency, accountability and effectiveness.

Evaluation definition

According to the UNEG Norms for Evaluation [2], an evaluation is an assessment, as systematic and impartial as possible, of an activity, project, programme, strategy, policy, topic, theme, sector, operational area or institutional performance. It focuses on expected and achieved accomplishments, examining the results chain, processes, contextual factors and causality, in order to understand achievements or the lack thereof. It aims at determining the relevance, impact, effectiveness, efficiency and sustainability of the IAS's interventions and contributions. An evaluation should provide evidence-based information that is credible, reliable and useful, enabling the timely incorporation of findings, recommendations and lessons into decision-making processes.

[1] The evaluation function is no longer restricted to conferences: it also covers other IAS projects, initiatives and strategies.
[2] Norms for Evaluation in the UN System, endorsed by the UNEG in April 2005.

Evaluation is distinct from financial and compliance audit. It also differs from monitoring, which forms part of management's accountability for self-assessment and reporting. However, it must be recognized that evaluation findings both draw from and inform the products of monitoring.

Evaluation objectives

All evaluations share the same objectives of organizational learning and accountability.

1. Evaluation is essential for learning and for supporting decision-making, so as to improve the design of future activities conducted by the IAS. This requires a commitment from IAS managers to follow up and act upon lessons learnt.

2. Evaluation provides the basis for a system of accountability to IAS members, partners, sponsors, donors and, ultimately, the IAS Governing Council. It allows results to be assessed and the extent to which expected results were achieved to be determined. Evaluation also plays a critical role in promoting the work carried out by the IAS.

Evaluation scope

What to evaluate?

The following categories are considered for evaluation:

- Conferences convened by the IAS and its regional partners.
- Meetings, summits and other events convened by the IAS.
- Membership benefits and resources not restricted to IAS members.
- IAS projects and initiatives, such as workshops, professional development programmes, fellowship programmes, prizes and awards, the Industry Liaison Forum, awareness campaigns, etc.
- IAS strategies (i.e. the IAS strategic plan and departmental strategies such as the partnership strategy).

Thematic evaluations will also be considered, especially themes addressed by IAS policy/advocacy activities.

For the purpose of this policy, any of the above categories will be referred to as the evaluand, i.e. the object to be evaluated/subject of the evaluation.

When to evaluate?

Most evaluations are post-evaluations, meaning the object to be evaluated is considered completed. However, given the need in selected cases to learn from experience earlier, an evaluation can also be conducted during the life cycle of the evaluand. This applies to certain projects, services, policies and strategies, and is usually carried out through a mid-term review.

What are the evaluation criteria?

The IAS applies the following DAC criteria [3], as laid out in the DAC Principles for Evaluation of Development Assistance:

- Relevance: measures the extent to which the objectives and design of the evaluand are suited to the priorities of the target stakeholders and remain valid. It also refers to the extent to which the objectives and design of the evaluand are aligned with the IAS's mission, strategy and specific priorities. Relevance can be understood as "are we doing the right thing?". This includes the question "are we the best-placed organization to do it?" (in other words, do the IAS's comparative advantages/added value justify its role?).

- Effectiveness: assesses whether the evaluand has achieved, or is achieving, progress towards its expected results. It also refers to the major factors influencing the achievement or non-achievement of the results.

- Efficiency: measures the outputs in relation to the inputs. It examines the extent to which the approved outputs have been achieved within the agreed budget, timeframe and specifications. It is an economic term which signifies that the evaluand uses the least costly resources possible to achieve the desired results. This generally requires comparing alternative approaches to achieving the same outputs, to see whether the most efficient process has been adopted.

- Impact: assesses the positive or negative, intended or unintended effects produced by the evaluand and the extent to which these effects can be attributed to the intervention (i.e. the evaluand). An impact evaluation usually takes place after the intervention has evolved to a steady state.

- Sustainability: measures the extent to which changes generated by the IAS's intervention are maintained over a longer period and identifies the major factors which influenced the achievement or non-achievement of sustainability of the intervention. There are different aspects of sustainability, including institutional, capacity, technological and financial sustainability, all of which have to be assessed when looking at the sustainability of an intervention.

[3] Sources: The DAC Principles for the Evaluation of Development Assistance, OECD (1991); Glossary of Terms Used in Evaluation, in "Methods and Procedures in Aid Evaluation", OECD (1986); and the Glossary of Evaluation and Results Based Management (RBM) Terms, OECD (2000).

Given the wide range of potential evaluands at the IAS, not all criteria can be systematically considered.

Guiding principles for the evaluation function

All evaluations follow the same guiding principles, based on the UNEG Norms and Standards and Code of Conduct for Evaluation:

- Independence/Impartiality: evaluation must be conducted in an independent and impartial manner.
- Feasibility: evaluation must be feasible. To this end, evaluation concerns must be addressed at the design stage of the evaluand, with adequate resources set aside [4].
- Credibility: evaluation must be credible, meeting professional quality standards and rigour [5].
- Inclusiveness: whenever possible, evaluations must be planned and undertaken in close collaboration with key stakeholders.
- Transparency: the evaluation methodology, findings, recommendations and lessons must be made public and disseminated to all stakeholders concerned through a range of channels.
- Utilisation: evaluation must be duly considered, with management responses through action plans and progress reports. The use of evaluation must be an integral part of the IAS's planning and implementation.

Whenever possible, data must be disaggregated by gender, age and other key demographics or variables. Evaluation should also include trend analysis whenever possible (see the illustrative sketch below).

[4] An amount totaling 3 to 5 per cent of programme/project expenditures should be dedicated to evaluation.
[5] No method is inherently superior to others. Evaluation methods must be chosen that are appropriate for the evaluand and the evaluation to be performed, and should include both qualitative and quantitative data.
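The disaggregation and trend-analysis requirement can be illustrated with a small, hypothetical example. The sketch below is not part of the policy itself: it assumes made-up delegate survey data with invented column names (conference_year, gender, age_group, satisfaction) and simply shows one way such data could be disaggregated by demographic variables and tracked across conference editions.

```python
# Illustrative sketch only: disaggregating hypothetical post-conference survey
# data by gender and age group, and comparing satisfaction across editions as a
# simple trend analysis. All data and column names are assumptions.
import pandas as pd

# Hypothetical delegate survey responses
responses = pd.DataFrame({
    "conference_year": [2008, 2008, 2010, 2010, 2010, 2012],
    "gender":          ["F", "M", "F", "M", "F", "M"],
    "age_group":       ["25-34", "35-44", "25-34", "45-54", "35-44", "25-34"],
    "satisfaction":    [4, 5, 3, 4, 5, 4],   # 1 (low) to 5 (high)
})

# Disaggregation: mean satisfaction and respondent count per gender/age group
disaggregated = (
    responses
    .groupby(["gender", "age_group"])["satisfaction"]
    .agg(["mean", "count"])
)
print(disaggregated)

# Trend analysis: mean satisfaction per conference edition
trend = responses.groupby("conference_year")["satisfaction"].mean()
print(trend)
```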

Evaluation management

Because independence and objectivity are vital for the credibility of evaluation work, evaluation is conducted by an independent department, the Planning, Monitoring and Evaluation (PME) department. Although this department is integrated into the organizational structure of the IAS secretariat, its head is directly and solely responsible to, and takes his/her instructions only from, the IAS Executive Director.

The PME department is responsible for the following tasks:

- Designing evaluations [6] and data collection instruments, in collaboration with key stakeholders.
- Recruiting and supervising external consultants, interns and volunteers to support the evaluation function.
- Conducting evaluations, including data collection.
- Performing statistical analysis (with SPSS) and qualitative data analysis.
- Drafting evaluation reports and finalizing them based on consultations with key stakeholders [7].
- Ensuring the wide dissemination of evaluation findings and recommendations.
- Coordinating the evaluation follow-up process (see details below).
- Ensuring quality assurance for evaluation by reviewing and approving survey forms, evaluation plans and evaluation reports produced by staff and consultants.

Evaluation follow-up process

The PME department collates all recommendations from the final evaluation report and those included in internal reports relevant to the evaluand, clusters them by area and sub-area, and assigns a responsible person to each of them. All this information is recorded in the Management Response Sheet and shared in due time with all staff. Staff responsible for implementing recommendations are also responsible for reporting progress on follow-up actions and for providing justifications for any failure to implement, fully or partially, the recommendation(s) in question (this is done directly in the management response sheet saved on SharePoint). The PME department periodically monitors the information in the management response sheet and shares it with the Executive Director. A minimal sketch of one possible structure for such a sheet is given after the notes below.

[6] Evaluation Terms of Reference (ToRs) or evaluation plans.
[7] The evaluation report is logically structured; it contains an executive summary, a detailed description of the evaluation methodology used, evidence-based findings, conclusions and recommendations, as well as acknowledgements, automatic tables of contents and figures, a list of acronyms and relevant appendixes. Findings are presented in a way that makes the information accessible and comprehensible. The PME department is responsible for drafting the evaluation report, obtaining and incorporating feedback from key stakeholders, editing the report and obtaining the final approval of senior manager(s) and/or director(s) for publishing and disseminating the report.
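The following sketch shows one plausible way to represent a Management Response Sheet entry as a simple record. It is an illustration only, under the assumption that each recommendation is tracked with an area, sub-area, responsible person and follow-up status; the field names and status values are invented, not the IAS's actual template.

```python
# Illustrative sketch only: a hypothetical structure for one Management Response
# Sheet entry, reflecting the follow-up process described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ManagementResponseEntry:
    recommendation: str            # recommendation text from the evaluation report
    area: str                      # thematic area the recommendation is clustered under
    sub_area: str                  # finer-grained grouping within the area
    responsible_person: str        # staff member assigned to follow up
    status: str = "not started"    # e.g. "not started", "in progress", "completed"
    progress_notes: Optional[str] = None              # reported progress on follow-up actions
    justification_if_not_done: Optional[str] = None   # required when not (fully) implemented

# Hypothetical example entry
entry = ManagementResponseEntry(
    recommendation="Provide abstract-writing mentorship for first-time submitters.",
    area="Conference programme",
    sub_area="Abstracts",
    responsible_person="Programme Officer",
    status="in progress",
    progress_notes="Mentorship scheme piloted for the next conference.",
)
```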

With regard to capacity building and knowledge sharing, the PME department builds knowledge of good evaluation practices with a view to increasing staff capacity in evaluation and promoting an evaluation culture in the IAS. It is also committed to building and strengthening the M&E capacities of IAS members and partners, and to sharing knowledge with external evaluators and conference managers through:

- Organization of workshops.
- Provision of technical assistance.
- Dissemination of evaluation products, guidelines and other resources (through the IAS website, a Google group dedicated to conference evaluation, and other websites and blogs).
- Participation in meetings, committees and online forums.

The PME department has no responsibility for project implementation, except at the planning stage, where it is responsible for:

- Ensuring the project objectives are measurable.
- Developing or reviewing the M&E plan.
- Checking the overall project logic, using the logical framework approach (a minimal illustration follows this list).
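As a minimal illustration of the logical framework approach referred to above, the sketch below represents two logframe levels as simple records, each with indicators, means of verification and assumptions. The content and field names are hypothetical and are not drawn from any IAS project.

```python
# Illustrative sketch only: a minimal logical framework ("logframe") structure,
# used to check that project logic links deliverables to higher-level results
# and that each level has measurable indicators. Content is hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class LogframeLevel:
    description: str                    # what the project intends to deliver or achieve
    indicators: List[str]               # how achievement will be measured
    means_of_verification: List[str]    # where the evidence comes from
    assumptions: List[str]              # external conditions that must hold

goal = LogframeLevel(
    description="Strengthened regional capacity to respond to the HIV epidemic.",
    indicators=["Number of trained professionals active in regional networks"],
    means_of_verification=["Post-programme follow-up survey"],
    assumptions=["Trained professionals remain active in the region"],
)

output = LogframeLevel(
    description="Professional development workshops delivered.",
    indicators=["Number of workshops held", "Participant satisfaction score"],
    means_of_verification=["Attendance records", "Workshop evaluation forms"],
    assumptions=["Sufficient applications from resource-limited settings"],
)

# A planning-stage review would check that each level logically contributes to
# the one above it and that every indicator is actually measurable.
logframe = [goal, output]
```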