Results Based Management: Theory and Application

1 Results Based Management: Theory and Application JPO Training Programme October-November 2011 Marielza Oliveira Operations Support Group/Executive Office

2 Key Message 1: We get what we focus on - to get high-level results (national impact), this is what we must focus on.
Key Message 2: RBM is about having a clear articulation of what we want to change and how to make it happen.

3 The results we seek - DEFINITIONS
Impact: change in the lives of people. Actual or intended changes in human development as measured by people's well-being. Example: Reduced infant and maternal mortality by 2013.
Outcome: institutional and behavioural change (Atlas: award). Actual or intended changes in development conditions that an intervention (or set of interventions) seeks to support. The contribution of several partners is usually required to achieve an outcome. Example: Improved provision of public sanitary services to rural communities by 2013.
Outputs: operational change - capacity development (Atlas: project ID). Tangible products or services of an intervention that are directly attributable to the initiative. Outputs relate to the completion (rather than the conduct) of activities and are the type of results over which managers have most influence. Example: National Public Works Agency has the management systems, equipment, and skills to provide sanitation services to rural communities.

4 The results we seek - sanitation example
Impact level - signs that people's lives have improved: 1. # of cases of water-borne diseases; 2. under-5 infant mortality rate in x region.
Outcome level - signs of change in institutional capacity/provision of services: 1. % of public satisfied with delivery of services; 2. amount of resources allocated to solid waste management; 3. % of public solid waste collected by private sector contractors; 4. # of new households being served; 5. % of on-time pick-up of solid waste.
Output level - what needs to be produced for outcomes to be achieved? New policy drafted to facilitate private sector participation in delivery of public sanitation services; public education campaign implemented in rural communities; vehicles and other critical equipment in place and training provided to staff.

5 Influence the results chain [diagram]: Inputs → Activities → Outputs → Outcomes → Impacts. Planning shapes the chain and implementation delivers it; efficiency relates inputs and activities to outputs, while effectiveness relates outputs to outcomes and impacts. Performance indicators track progress along the chain, and outputs, outcomes and impacts together constitute the RESULTS.

6 From Inputs to Impacts: Theory of Change
Certain resources/inputs are needed to operate your programme. IF you can gain access to them, THEN you can use them to accomplish your planned activities. IF you accomplish your planned activities, THEN you will deliver the amount of product or service that you intended. IF you produce these outputs, THEN certain changes in systems, behaviours, etc. take place, benefiting participants. IF these benefits to participants are achieved, THEN your stakeholders benefit from changes in their life conditions.

7 Results at country level
Project document / Annual Work Plan (AWP): Public Works Agency has the systems, equipment, and skills to provide sanitation services to rural communities.
CPD (Country Programme Document) / CPAP (Country Programme Action Plan)
UNDAF (UN Development Assistance Framework): Improved access of rural population to basic health and sanitation services.
National Development Plans: Reduced infant and maternal mortality.

8 Writing good Outcomes
Outcomes are actual or intended changes in development conditions that interventions are seeking to support. Some tips: 1. Avoid action verbs ("strengthening", "enhancing", etc.). 2. Avoid statements of intention ("to assist the government..."). 3. Use completed verbs: "reduced", "improved", "have greater access to", etc. 4. The outcome must signal that something has changed for someone. 5. The something which has changed must be important to the country/region/community, not just to UNDP. 6. Avoid UN-speak ("gender mainstreamed").

9 Outcome examples 1. Legal and regulatory framework reformed to provide people with better access to information and communication technologies. 2. The poor in x region have better access to capital and other financial services. 3. Reduction in the level of domestic violence against women. 4. Increased regional and sub-regional trade. 5. Higher and more sustainable employment and income for urban slum dwellers.

10 Typical pitfalls
Wordy (and no change language): "To promote equitable economic development and democratic governance in accordance with international norms by strengthening national capacities at all levels and empowering citizens and increasing their participation in decision-making processes."
Too ambitious: "Strengthened rule of law, equal access to justice and the promotion of rights."
Containing multiple results: "The state improves its delivery of services and its protection of rights with the involvement of civil society and in compliance with its international commitments."
Beyond comprehension: "The poverty/environment nexus is enhanced."

11 Typical pitfalls
Wishy-washy (i.e. "support provided to improve..."): "Support to institutional capacity building for improved governance."
So general they could mean anything: "To promote sustainable development and increase capacity at municipal level."
Overlapping with national goals/MDGs (impacts): "Substantially reduce the level of poverty and income inequality in accordance with the MDGs and PRSP."
Mixing means and ends: "Strengthen the protection of natural resources through the creation of an enabling environment that promotes sound resources management."

12 Writing good Outputs
Outputs are tangible products and services, or improvements in skills and abilities. They relate to the completion (rather than the conduct) of activities. Some tips: 1. It must be clear what is being delivered. 2. Outputs must be achievable within the project period. 3. Managers have a high degree of control: if the result is mostly beyond the control of the programme or project, it cannot be an output. Failure to deliver the outputs usually means failure of the project.

13 Output examples a. National electoral body has adequate personnel, equipment and skills to administer free and fair national and local level elections. b. Study of environment-poverty linkages completed. c. Police forces and judiciary trained in understanding gender violence. d. National, participatory forum convened to discuss draft national anti-poverty strategy. e. National human development report produced. f. Revised electoral dispute resolution mechanism established. g. Business processes reengineered. h. Compliance mechanisms established.

14 Exercise: Review the UNDAF/CPD
Select 2 CPD outcomes. Review the quality of the outcomes: Is it clear where the country should be at the end of the cycle? Are CPD outcomes/resources the same as in the UNDAF? Is it clear who is supposed to benefit, and how? How can they be improved?
Review the related outputs (use the ERBM platform): Are they worded as products, services or improvements in abilities? Are they the product of completed activities? Are they the right things to deliver to make the desired change happen? (Are they the right set? Are they under our control?) How can they be improved?

15 Performance measurement is essential. During the programme: to improve it. After the programme: to assess whether it succeeded in achieving results, to learn, and to help communicate with stakeholders. INDICATORS answer the question "Are we making progress?" - they are reliable signals that inform us about real changes.

16 Measuring performance: readings on the same indicator at different points in time. The baseline is the value before the intervention; the target is the promise to deliver (the value expected after); the measurement at the end of each period shows actual delivery - performance - against that promise (the real value after).
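
A minimal sketch (not part of the training deck) of how a baseline, a target and end-of-period readings of the same indicator can be compared; the sanitation figures below are hypothetical.

```python
# Illustrative sketch only (not from the training deck): comparing readings
# on the same indicator against its baseline and target.

def progress_toward_target(baseline: float, target: float, actual: float) -> float:
    """Share of the planned change achieved so far (0.0 = none, 1.0 = target met)."""
    planned_change = target - baseline
    if planned_change == 0:
        return 1.0  # nothing was planned to change
    return (actual - baseline) / planned_change

# Hypothetical figures: % of rural households receiving on-time waste pick-up.
baseline, target = 20.0, 60.0            # value before, and value expected after
readings = {2011: 28.0, 2012: 41.0}      # measurement at the end of each period

for year, actual in readings.items():
    share = progress_toward_target(baseline, target, actual)
    print(year, f"{share:.0%} of the planned change achieved")
```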

17 Good indicators
Technical aspects: Valid - measures what it is intended to measure. Reliable - consistent and comparable over time. Sensitive - able to detect the extent and direction of change during the cycle.
Practical aspects (SMART): Specific - identifies the nature of the expected change, the target groups, the target area, etc. Measurable - a reliable and clear measure of results. Attainable - realistic given the resources likely to be available. Relevant - helps us to correct course during, and to learn after. Time-bound - data collection is feasible and not overly burdensome within the timeframe.

18 Performance measurement
Using indicators requires a systematic approach, including a clear definition of how measurements will be done: Information required - what is the nature of the information required for each indicator? Sources of information - from where can the information be obtained? Methods of collection - how will the data be collected? Frequency and responsibilities - align data collection to reporting cycles; who is accountable for measuring and analysing?

19 Performance measurement matrix - columns: Result | Indicator | Data description | Data sources | Collection methods | Frequency | Responsibility
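
As an illustration only (not from the deck), one row of such a matrix could be captured as a simple record; the indicator, sources and responsible party below are hypothetical, reusing the sanitation outcome from earlier slides.

```python
# Hypothetical example only: one row of a performance measurement matrix.
# Keys mirror the matrix columns; the values are illustrative, not from the deck.
matrix_row = {
    "result": "Improved provision of public sanitary services to rural communities",
    "indicator": "% of rural households with on-time solid waste pick-up",
    "data_description": "Share of sampled households reporting collection on schedule",
    "data_sources": ["Household survey", "Municipal service logs"],
    "collection_methods": "Quarterly sample survey cross-checked against service logs",
    "frequency": "Quarterly, aligned with the reporting cycle",
    "responsibility": "M&E officer of the implementing agency",
}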

20 Quantitative indicators - measures of quantity, including statistical statements. Number: # of people below the extreme poverty line. Percentage: % of government budget allocated to social protection. Ratio: # of doctors per 1,000 people. Terminology needs to be clear to ensure the validity and reliability of what is being counted.
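
A quick hypothetical calculation (not from the deck) showing the three quantitative forms side by side, using made-up figures:

```python
# Made-up figures, for illustration only: the three quantitative forms
# named on the slide (number, percentage, ratio).
people_below_extreme_poverty = 1_250_000              # number: # of people
social_protection_budget, total_budget = 180.0, 3_600.0
pct_to_social_protection = 100 * social_protection_budget / total_budget  # percentage
doctors, population = 4_200, 3_000_000
doctors_per_1000 = 1_000 * doctors / population       # ratio: doctors per 1,000 people
print(people_below_extreme_poverty,
      f"{pct_to_social_protection:.1f}%",
      f"{doctors_per_1000:.1f} doctors per 1,000 people")
```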

21 Qualitative indicators - judgements and perceptions, derived mostly from objective analysis, e.g. extent of commitment of the government to respect human rights; quality of sanitation services received by rural dwellers; level of client satisfaction with public services. Terminology needs to be clear to ensure the validity and reliability of what is being measured.

22 Proxy indicators - not a direct measure of the stated result, but an indirect reflection of the situation. Used when more direct measures are not available due to lack of information or the complexity of the situation. Based on a sound assumption about the behaviour of certain phenomena relating to the stated result, and therefore context specific. Can be quantitative or qualitative. Examples: stock market index as a gauge of the economy; electricity consumption as a view of income; number of women in management positions as an assessment of social justice.

23 OUTCOME INDICATORS - focus on what is critical to see happen: "Number of audit recommendations implemented" versus "Number of critical audit recommendations implemented"; "# of persons provided with information on HIV/AIDS" versus "# of persons who say they have changed behaviour based on information received on HIV/AIDS".

24 EXERCISE: INDICATOR CONSTRUCTION
Using the same 2 CPD outcomes as before, review the quality of the outcome indicators: Are they good signals of the change we are trying to achieve? Do they focus on what is critical? Are they disaggregated? Are they SMART? Is it possible to collect data during the cycle? Are the baselines and targets using the same unit of measurement? Has monitoring taken place using them? How can they be improved?
Review the related output indicators: Do they indicate completion? Do they focus on what is critical? Are they clear and measurable? How can they be improved?

25 KEY MESSAGE 3 You must structure your organization so that people work effectively together to achieve the results that matter.

26 Results chain through the life of the programme
Purpose of implementing a performance information system: to improve management practices. Reasons for implementing MfDR (managing for development results): enhancing institutional learning; strengthening staff accountability; improving decision-making.

27 Results chain through the life of the programme
CLARIFY PROGRAMME THEORY: describe the problem to solve; identify desired results (long- and short-term); list strategies to seek results; examine risks and assumptions.
IMPLEMENT AND MONITOR YOUR PROGRAMME: describe the impact you expect to have in the community in 10 years' time; identify desired results; identify the outputs you expect to produce/deliver to achieve results; describe the process for checking whether outputs are being delivered and progress is being made.
DETERMINE AND COMMUNICATE RESULTS: identify the key audience for each component of the programme; identify the questions they may have about your programme; describe the information you need to collect and analyse to indicate programme achievements and lessons; structure the information to the audience's needs.

28 Challenge: to KNOW whether results are being achieved. [2x2 matrix: monitoring quality (high/low) against progress toward results (low/high)]
High monitoring quality, low progress: monitoring information will show the important elements - the big picture, challenges/issues, opportunities.
High monitoring quality, high progress: virtuous circle - achieving results transparently and knowing what's going on (big picture and small picture); evidence-based, quality decisions.
Low monitoring quality, low progress: vicious circle - low performance, and not knowing what's going on; opinion-based decisions.
Low monitoring quality, high progress: small picture good, but M&E information does not bring out the big-picture results; we do not really know what's going on and are unable to see the overall change.

29 EVALUATION FINDINGS The programme addresses clear and critical national priorities. However, it has spread itself too thin across many activities within 11 outcomes that have not been adequately specified, nor do they present indicators for measuring progress. The thematic areas are neither linked nor do they reinforce each other. Furthermore, the thrust is on outputs oriented to short-run or medium-run results without a long-run view. With this approach, the Agency neglected the opportunity to position itself to engage in advocacy and policy dialogue on critical issues for the transformation of society and the process of democratization in xxx country.

30 Communicating results - basic principles: 1. Accuracy and verifiability. 2. Coherence and clarity. 3. Relevance and value. 4. Remembering the audience.

31 ROAR 10-Point Checklist 1. Does the report capture the main development changes in the country? 2. Does the report highlight UNDP's main contributions to these changes? 3. Does the report have a strong focus on high-level change rather than outputs? 4. Could the report be read and understood by someone from outside the country? 5. Could the report be read and understood by someone from outside UNDP (excluding the management section)? 6. Does the report present a compelling picture of UNDP's development work? 7. Does the report include any significant advocacy, communications or partnership-building work conducted successfully during the year? 8. Does the report read as a single coherent effort rather than a series of separate pieces stitched together? 9. Can my office verify the statements made in the report? 10. Has the report been discussed at different points in its formulation (particularly at the start and end)?

32 Some examples of pitfalls (1) Focusing on programming processes instead of the progress of the programme. EXAMPLE: A number of quality annual work plans based on broad stakeholder consultation and buy-in were finalised, approved (through LPAC meetings) and signed. This followed an extensive consultation for the development of the new programming system based on the United Nations Development Assistance Framework (UNDAF), Country Programme Document (CPD) and Country Programme Action Plan (CPAP). More important is the fact that the process of developing the AWPs was led and owned by the national institutions. And over 20 of UNDP's implementing partners were micro-assessed in line with the harmonised approach to cash transfers (HACT).

33 Some examples of pitfalls (2) Listing outputs and activities, often cut and pasted from elsewhere. EXAMPLE: 2008 was the first year of CPAP implementation. Activities were therefore implemented and results achieved in each of the key areas below: 1. Democratic governance and human rights: a. draft of the law on prevention and fight against corruption and related practices technically validated; b. draft of the presidential decree reorganizing the National Anti-corruption Commission technically validated; c. advocacy in favour of the implementation of Election [X]; d. study of the electoral code; e. teachers' manual on Human Rights elaborated; f. national plan of action for human rights promotion elaborated; g. general population sensitized on human rights issues; h. 40 focal points trained on the Human Rights Based Approach.

34 Some examples of pitfalls (3) Focusing on inputs and delivery. EXAMPLE: Programme and financial targets have all been met, although several large initiatives have been unduly delayed due to the financial crisis, government budgetary reductions and MDGF delays. Overall this concerns some US$8.5M of projects which should have already started. In spite of this, all delivery and efficiency indicators have improved compared to 2007, with other programmes picking up the slack. Financial sustainability of the CO has also improved, and government co-financing has surpassed US$4M with an additional US$11M in parallel financing attracted.

35 Some examples of pitfalls (4) Focusing on low level results: EXAMPLE: 60 rural women in select pilot areas have enhanced their opportunities in income generation through business development training, networking and distribution of grants in that way testing approaches and creating enabling environment for the introduction of micro-financing opportunities. The capacity needs assessment and follow up strategy has been developed to continue promotion of self-employment and business opportunities for rural women.

36 Exercise: Report Using one of the 2 CPD outcomes used before, report/communicate on results (using the ERBM platform, the UNDP Annual Report of the Administrator, and the website): Have the relevant outputs been monitored? Do we know how close we are to achieving the outcome? What have we reported to HQ on these outcomes? What has UNDP reported to the EB? What, if anything, have we communicated to stakeholders?

37 Communicating for Results Toolkit. Location: Intranet Home > Toolkits > Communications Toolkit (3rd header on the right-hand side). External access: available to interested partners.

38 Communicating for Results Toolkit. Each of the five sections: Core Concepts, Tools, Best Practices.

39 YouTube: UNDP FlipCam Video Project

40 UNDP Success Stories Website - we have the tool; the challenge now is to populate it.

41 UNDG Strategic Communication Resources

42 SUMMARY 1. Understand your office's strengths, weaknesses and comparative advantages. 2. Prioritise: look for opportunities to make an impact. 3. You get what you focus on, so focus on what you really want. 4. Be clear on what you want to achieve, and know how to measure whether you are achieving it. 5. Think big picture and strategy: results (outputs, outcomes, impacts), not just activities. 6. Communicate results in simple, meaningful ways.

43 Plenary: Back to the Office What can I do differently when I go back? In Planning? In Monitoring? In Reporting/Communicating on results?