Indicators, Targets and Data Sources


At the end of the session, participants will be able to:
- Construct SMART indicators
- Establish targets
- Describe good-quality data sources
- Agree on next steps

Results Management
"The art of results management is defining and achieving outcomes that are meaningful to both provider and client/consumer, are measured in a credible way, and are used in decision-making." - Susan Stout

Why Monitor, Report and Assess?
- Alerts managers to risks, problems and impediments to achieving results
- Forms the basis for accountability within and across offices, and for corporate reporting
- Encourages integration of lessons learnt into management decisions
- Provides inputs for validating technical and managerial policies and strategies
The benefits? Improved programme delivery and achievement of ORs.

Designing a Good Monitoring System: Key Questions
- Objectives: what are we trying to achieve?
- Indicators: what are we going to measure?
- Measures: how are we going to measure it?
- Targets: what is the result that we want?
- Results: what have we actually achieved?

Terms
- Baseline: the situation prior to a development intervention, against which progress can be assessed or comparisons made.
- Indicator: a quantitative and/or qualitative variable that allows the verification of changes produced by a development intervention, relative to what was planned.
- Target: a specific level of performance that an intervention is projected to accomplish in a given time period.

Baseline, Targets and Achievement
[Diagram: the baseline is the current level of achievement, the target is the commitment, and the achievement is the performance actually delivered]

Baselines
- A clear picture of the pre-existing situation is needed: without knowing the baseline, how can one assess a 25% improvement in crop production?
- Baseline studies are needed before targets can be set, and generally before approval can be given.
- In some circumstances an Inception Phase, including some baseline data collection, may be appropriate.
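The baseline arithmetic above can be sketched in a few lines of Python (an illustrative helper, not part of the handbook; the 400 t / 500 t figures are invented for the crop-production example):

```python
def percent_change(baseline: float, current: float) -> float:
    """Percentage change of `current` relative to `baseline`."""
    if baseline == 0:
        raise ValueError("Cannot compute % change from a zero baseline")
    return (current - baseline) / baseline * 100

# Without a baseline of 400 t, a harvest of 500 t is just a number;
# against the baseline it is a verifiable 25% improvement.
print(percent_change(400, 500))  # → 25.0
```

The point is simply that "a 25% improvement" is undefined until the baseline is pinned down.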

Indicators, Targets and Milestones
- Indicators are means: e.g. the proportion of girls achieving Grade 4.
- Targets are ends: e.g. a 15% increase in girls achieving Grade 4 by month 36.

Types of Indicators
- Qualitative and quantitative (or a combination of both)
- Binary (yes/no)
- Direct and indirect (proxy)
- Product and process

Indicators
Indicators = "How do we know?" They are essential elements of the overall planning and monitoring system.

Indicators
- Describe how the intended results will be measured (accountability)
- Are objectively verifiable, repeatable measures of a particular condition
- Force clarification of what is meant by the result: the fine print!
- Must be accompanied by baselines and targets

Logical Flow of Indicators
Intervention → Process and output → Intermediate outcomes → Final outcome (impact)

Performance Logic
Inputs → Process → Outputs → Outcomes → Impact
- Inputs: funding, planning & policies, harmonization & efficiency
- Process: training & capacity building, procurement and supply
- Outputs: quality, behavioural interventions, & knowledge
- Outcomes: service utilization and intervention coverage, behaviour change
- Impact: mortality, morbidity, nutrition
So where are the results?

Indicators: Be Strategic!
- Be careful not to tie the hands of managers: allow discretion and flexibility, as performance-based incentives encourage innovation from the bottom up.
- Golden rule: the objectives of your programme should guide the choice of indicators.
- Good indicators are outcomes closely correlated with impact, and outputs closely correlated with outcomes.
- Consider process measures initially, to establish critical systems (information, management, financial, etc.).

Indicators, Baseline, Target and Source of Data

Outcome: By 2007, more girls in Belem Province enjoy a quality, basic education
- Indicators: net enrolment ratio (M; F); improvement in school test scores; improvement of school satisfaction ratings
- Baseline: F: 45% | Target: F: 75%
- Sources of data: annual school test scores report; school satisfaction surveys

Output: 800 teachers in Belem Province can deliver the new curriculum effectively
- Indicators: number of teachers with new certification; teacher proficiency reports
- Baseline: 0 | Target: 800
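One way to keep an indicator, its baseline, target and data source together is a small record type. A minimal sketch (the field names and the `progress` helper are illustrative, not a prescribed schema), using the Belem Province outcome indicator above:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float
    target: float
    data_source: str

    def progress(self, current: float) -> float:
        """Share of the baseline-to-target distance covered so far (0..1)."""
        return (current - self.baseline) / (self.target - self.baseline)

# Outcome indicator from the Belem Province example
enrolment = Indicator(
    name="Net enrolment ratio (F)",
    baseline=45.0,   # F: 45%
    target=75.0,     # F: 75%
    data_source="Annual school test scores report",
)
print(enrolment.progress(current=60.0))  # → 0.5, i.e. halfway to target
```

Bundling the four elements in one record makes the slide's point concrete: an indicator without its baseline, target and source is not yet usable for monitoring.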

Indicators and Targets: What's the Difference?
- Indicators: a set of key measures that help you define and track progress toward your objective.
- Targets: commitments, expressed in quantitative terms, of what you plan to achieve; a specific statement of the amount of improvement to be achieved, and the date by which it is to be achieved.
Always set targets!

Targets: Guiding Principles
- Use targets to encourage improved performance and to motivate people when there is a probability of success.
- Targets should be realistic but challenging: too easy breeds complacency; too difficult breeds discouragement and demotivation.
- Ensure targets at different levels are linked, and define who is responsible for achieving them.

Tips for Targets
- A larger increase is possible when starting from a low baseline.
- Make targets attainable within a defined period.
- Link targets to performance.

Performance Indicator Selection Criteria
- Validity: does it measure the result?
- Reliability: is it a consistent measure over time and, if supplied externally, will it continue to be available?
- Sensitivity: when a change occurs, will it be sensitive to those changes?
- Simplicity: will it be easy to collect and analyse the information?
- Utility: will the information be useful for decision-making and learning?
- Affordability: do we have the resources to collect the information?
Go beyond process to include outcome!
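The six criteria above can double as a quick screening checklist when comparing candidate indicators. A hedged sketch (the 0-2 rating scale and the example ratings are invented for illustration, not part of the selection criteria themselves):

```python
CRITERIA = ["validity", "reliability", "sensitivity",
            "simplicity", "utility", "affordability"]

def score(candidate: dict) -> int:
    """Sum of 0-2 ratings across the six selection criteria."""
    return sum(candidate[c] for c in CRITERIA)

# Hypothetical ratings for one candidate (0 = poor, 1 = fair, 2 = good)
net_enrolment = dict(validity=2, reliability=2, sensitivity=1,
                     simplicity=1, utility=2, affordability=1)
print(score(net_enrolment))  # → 9 out of a possible 12
```

A raw sum treats the criteria as equally important; in practice a team might weight validity and reliability more heavily before comparing candidates.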

What is meant by "..enhanced policy..", "..fully adopted by..", "..capacity strengthened..", "..FAO's role is recognized..", "..countries implementing..", "..countries that have or are developing efficient and uniform services..", etc.? What does each clause, adjective and adverb mean in practice? In reality, few result indicators are fully SMART, and they rarely stand alone.

Typical Pitfalls
- Wordy (and no change language): "To promote equitable economic development and democratic governance in accordance with international norms by strengthening national capacities at all levels and empowering citizens and increasing their participation in decision-making processes"
- Too ambitious: "Strengthened rule of law, equal access to justice and the promotion of rights"
- Containing multiple results: "The state improves its delivery of services and its protection of rights, with the involvement of civil society and in compliance with its international commitments"

Typical Pitfalls (cont.)
- Wishy-washy (i.e. "support provided to improve.."): "Support to institutional capacity building for improved governance"
- So general they could mean anything: "To promote sustainable development and increase capacity at municipal level"
- Overlapping with national goals / MDGs (impacts): "Substantially reduce the level of poverty and income inequality in accordance with the MDGs and PRSP"
- Confusing means and ends: "Strengthen the protection of natural resources through the creation of an enabling environment that promotes sound resources management"

Clarity Improves Reliability and Consistency
- Partners independently assessing the same change must come to the same conclusion.
- No set of indicators is perfect; their limitations and weaknesses need to be acknowledged and allowed for.
- Team members change; incoming staff must be able to interpret indicators in the way intended by their formulators.
- Stakeholders may want clarification of, and justification for, the means of measuring and the conclusions drawn.

The Need to Clarify Indicators
Indicators need clarification, e.g. with a footnote, a performance measurement matrix, or an explanatory note. To take a well-known example: the MDG Indicator Handbook.

MDG 5 (in FAO's remit). By way of clarification, the Handbook provides for each indicator:
- Definition
- Rationale
- Method of computation
- Data collection and source
- Periodicity of measurement
- Gender issues
- Disaggregation issues
- Comments and limitations
- Person or agency responsible

The Need to Clarify Indicators (cont.)
To take another well-known example: the Paris Declaration, Indicator 11.

Tips for Indicators
- Have them; align them to standards.
- Test them for validity and reliability; be honest and unambiguous.
- Discuss them and review them: are they really key? Don't overestimate their importance!
- Benchmark them.
- Do something with them: KPIs should lead to change.
- Keep them simple, but not too simple!
How many indicators? There is no perfect answer. Too many can be counter-productive: they are difficult to communicate and manage, and can induce indicator overload and fatigue. Too few may focus attention and new resources on selected indicators at the expense of other important areas.

Getting Feedback
- Dialogue and data collection need prioritizing and resourcing. Allocate appropriate costs and time for getting feedback, usually from the programme budget rather than from other sources.
- Agree on means to open and maintain two-way channels, to survey users, to learn lessons, and to record success stories.

Data Sources
- What evidence do we need, and how do we get it?
- Is it available from existing sources, or is special data gathering required?
- Who will pay for data collection?
- How much data gathering is worthwhile?
- Who will collect and document the evidence, and where will it be located?

Examples of Data Sources
- Records, e.g. of a secretariat: minutes, attendance lists, resolutions, budgets, accounts, etc.
- Stakeholder feedback; results of focus groups
- Documents, film, audio
- Surveys and commissioned research reports
- Annual reviews; harmonized partner reviews
- External evaluation reports
- Local, national and global statistics and data

Humanize the Process
- Address staff concerns and ideas about performance management.
- Address concerns about the uncertainty of the data.
- Regardless of the topic, keep performance review processes fair and transparent.
- Place performance review within a context of learning.
- Enable a two-way dialogue about performance.
- Remember to highlight good news.

Donors

Q&A