
1 Quality of Non-Lending Technical Assistance
A QAG Assessment
October 25, 2005
Quality Assurance Group

2 Table of Contents
I. Executive Summary 3
II. Introduction 13
III. Findings 21
IV. Conclusions 34
V. Recommendations 35

3 I. EXECUTIVE SUMMARY

4 Summary of Findings
- Assessment results show that NLTA has been a highly effective knowledge transfer instrument
- But improper coding of tasks under NLTA is high and calls for remedial actions

5 WHY ASSESS NLTA?
- Management concern over the rapidly growing share of NLTA in the AAA program, almost doubling in dollar terms (from $21m to $42m) and in number (from 175 tasks to 303 tasks) between FY01 and FY04
- NLTA quality never systematically reviewed by QAG

6 WHAT IS NLTA?
The definition of free-standing NLTA applicable to the FY04 review period was human and technological inputs to the development of advisory services and instructional activities. NLTA often took the following forms:
- Workshops and conferences: Vehicles to share international best practice, exchange knowledge, or build consensus
- Technical advice: Practical assistance to support implementation of policies, programs or projects
- Technical notes: Written practical guides to assist the client in implementation of policies, programs or projects
The distinction between ESW and TA was whether or not the activity aims to produce original diagnostic/analytic content for the purpose of influencing the client's policies

7 THE SAMPLE
- The FY04 NLTA universe includes 303 tasks for a total cost of $41 million 1/
- The NLTA sample includes 75 2/ randomly selected tasks for a total cost of $19 million, representing 25% of all tasks by number and 46% by cost 3/
- Tasks costing less than US$20,000 were excluded from the sample
- The sample yields robust results at the Bank-wide level only
1/ At the time of sampling. This amount has since increased to $42.4 million
2/ The original sample included 85 NLTA tasks delivered to the client during FY04, of which 10 were Reimbursable TA tasks. It was subsequently decided that Reimbursable TA should be subjected to a separate QAG review. The sample size was reduced accordingly
3/ The sample includes 15 tasks (20%) funded under the Public-Private Infrastructure Advisory Facility (PPIAF) for a total cost of $4.3 million (23%)

8 MAIN FINDINGS
- 48% of tasks sampled (36 4/ out of 75), accounting for 53% of the sample cost ($10.1 million), were improperly coded as NLTA. Improper coding is primarily caused by a weak overall governance framework (unclear product definition, low SAP user friendliness, insufficient guidance/support to TTLs, unclear accountability)
- Only 52% of tasks sampled (39 tasks 5/ out of 75, with a $ amount of $8.9 million or 47% of the total sample cost) were assessed and rated, as they were appropriately coded as NLTA 6/
4/ Including 5 PPIAF funded tasks or 7% of the number of tasks sampled, accounting for 6% of the sample cost and representing 30% of the number of PPIAF tasks sampled and 26% of their cost
5/ Including 10 PPIAF funded tasks or 13% of the number of tasks sampled, accounting for 17% of the sample cost and representing 70% of the number of PPIAF tasks sampled and 74% of their cost
6/ The balance of 36 tasks could not be assessed and rated, using the NLTA guidance questionnaire, as they were not NLTA

9 MAIN FINDINGS Continued
- At 99% overall Moderately Satisfactory or better (SAT) for both the weighted number of tasks and their weighted value, the quality of FY04 NLTA is strong. This compares well with the 94% overall Satisfactory or better rating for the FY04 NLTA tasks reviewed under the country AAA assessment
- A high share (29%) of tasks was rated Highly Satisfactory (HS) overall
- Strategic Relevance and Timeliness, Internal Quality, and Dialogue and Dissemination were found particularly impressive by the panels, as these dimensions were rated 98% SAT overall with only 4%, 5% and 8% of tasks rated Moderately Satisfactory, respectively

10 MAIN FINDINGS Continued
- Close client involvement and solid consultant and Bank inputs resulted in both high quality content generation and significant knowledge transfer, as indicated by the 99% and 96% SAT overall ratings for Internal Quality and Likely Impact, respectively
- Managerial attention to the implementation of PPIAF funded tasks has often been limited, resulting in missed opportunities for better dialogue and dissemination and greater impact
- Although resources were found to have been used very effectively, panels noted a number of instances where task costs had not been fully recorded, thus understating the cost of about a quarter of the tasks assessed

11 KEY RECOMMENDATIONS
OPCS should:
- Simplify the presentation of the ESW and TA Decision Tree (eliminate overlapping definitions; clearly define the terms client, training and internal order; and clarify the trigger for task creation)
- Prepare and launch an NLTA portal, similar to the ESW portal currently under development, to provide better guidance and support to TTLs
- Given SAP's low user friendliness and the extent of improper coding, allow all NLTA related entries (including the original task coding) into SAP to be made through the NLTA portal
- Explore, jointly with SFR and TFO, ways to ensure full accounting of task costs (including TFs and other external contributions)

12 KEY RECOMMENDATIONS Continued
The Regions and Sector Anchors should monitor the use of the TA product code by their staff to reduce the incidence of improper coding

13 II. INTRODUCTION

14 NLTA DEFINITION 1/
The definition of free-standing NLTA applicable to the FY04 review period was human and technological inputs to the development of advisory services and instructional activities. NLTA often took the following forms:
- Workshops and conferences: Vehicles to share international best practice, exchange knowledge, or build consensus
- Technical advice: Practical assistance to support implementation of policies, programs or projects
- Technical notes: Written practical guides to assist the client in implementation of policies, programs or projects
1/ The key distinction between ESW and NLTA was that ESW aims at generating original analytic content while NLTA aims at disseminating/applying existing knowledge

15 THE NEW NLTA DEFINITION
The new (July 04) NLTA definition 2/ includes activities 3/ that are:
- Aimed at enabling an external client to implement reforms or strengthen its institutions;
- Free standing; and
- Linked to a Bank unit accountable for the services provided (excludes client-executed grant-funded activities)
2/ The key distinction between ESW and NLTA remains that ESW aims at generating original analytic content while NLTA aims at disseminating/applying existing knowledge
3/ With the following SAP output types: (a) institutional development plan; (b) how-to guidance; (c) model/survey; (d) client document review; and (e) knowledge-sharing forum
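To make the boundary between product lines concrete, here is a minimal sketch of how the criteria stated above could separate NLTA from neighboring categories. The field names, the order of checks, and the return labels are illustrative assumptions; the actual coding decision follows the OPCS ESW and TA Decision Tree, which has more branches than shown here.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    for_external_client: bool          # aimed at an external client, not the Bank itself
    generates_original_analysis: bool  # original diagnostic/analytic content (ESW territory)
    free_standing: bool                # not part of lending preparation or supervision
    accountable_bank_unit: bool        # a Bank unit is accountable for the services provided

def classify(a: Activity) -> str:
    """Rough reading of the stated criteria; actual coding follows the OPCS Decision Tree."""
    if not a.for_external_client:
        return "not NLTA (activity conducted for the Bank's own benefit)"
    if a.generates_original_analysis:
        return "ESW (aims to generate original analytic content)"
    if not a.free_standing:
        return "not NLTA (part of lending work, e.g. project preparation or supervision)"
    if not a.accountable_bank_unit:
        return "not NLTA (e.g. client-executed grant-funded activity)"
    return "NLTA (disseminates/applies existing knowledge for an external client)"

print(classify(Activity(True, False, True, True)))  # -> NLTA
```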

16 ASSESSMENT OBJECTIVES
- Promote accountability for quality by providing indicators on the quality of NLTA work on a Bank-wide basis
- Encourage systemic change through:
  - Improved understanding of key determinants of NLTA quality
  - Dissemination of assessment findings to appropriate Bank units

17 METHODOLOGY
- The methodology and Guidance Questionnaire were developed by reviewing the NLTA governance framework and by drawing on lessons learned from a recent assessment of the quality of OESW
- Tasks costing between US$20,000 and $50,000 were reviewed by a triage panel and rated using a simplified version of the Guidance Questionnaire 4/. All other tasks were rated by a customized panel
- The overall quality of each task was assessed on five quality dimensions (Strategic Relevance and Timeliness, Internal Quality, Dialogue and Dissemination, Likely Impact, and Bank Inputs and Processes)
- 110 panelists conducted the assessment; 22 Bank staff participated in the assessments as observers
4/ The simplified questionnaire provides an overall rating and a summary rating for each quality dimension but does not include ratings for individual questions

18 METHODOLOGY Continued
Panelists used QAG's new six point scale:
- Highly Satisfactory (1): Best practice in several respects and no significant deficiencies
- Satisfactory (2): Satisfactory or better on all aspects
- Moderately Satisfactory (3): Satisfactory on all key aspects but significant missed opportunities
- Moderately Unsatisfactory (4): Significant deficiencies in a few key aspects
- Unsatisfactory (5): Significant deficiencies in several key aspects
- Highly Unsatisfactory (6): A broad pattern of deficiencies

19 SAMPLE
- The formal trigger for entry into the universe was the standard SAP milestone Delivered to the Client
- The FY04 task universe included 303 tasks for a total cost of $41 million 5/
- All tasks with a cost of less than $20,000 were excluded from the universe
- The sample was stratified by cost ($1m+, $400-$999k, $200-$399k, $50-$199k and $20-$50k) with oversampling of large tasks (an illustrative sketch of this design follows below)
5/ At the time of sampling. This amount has since increased to $42.4 million
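The sketch below illustrates the kind of stratified, oversampled design described above: costlier strata are sampled at higher rates, and each sampled task later carries a design weight of N_h/n_h (universe count over sample count in its stratum). The per-stratum counts and sampling fractions are assumptions for illustration only; the assessment gives only the five cost bands, the 303-task universe, and the fact that large tasks were oversampled.

```python
import random

# Illustrative universe of 303 tasks split across the five cost strata named above;
# the actual per-stratum counts are not published in the assessment.
stratum_sizes = {"$1m+": 10, "$400-$999k": 30, "$200-$399k": 60,
                 "$50-$199k": 100, "$20-$50k": 103}

# Oversampling of large tasks: higher sampling fractions for costlier strata (illustrative).
sampling_fraction = {"$1m+": 1.00, "$400-$999k": 0.60, "$200-$399k": 0.35,
                     "$50-$199k": 0.20, "$20-$50k": 0.10}

random.seed(0)
for stratum, size in stratum_sizes.items():
    tasks = [f"{stratum}-task-{i}" for i in range(size)]
    n_h = len(random.sample(tasks, round(size * sampling_fraction[stratum])))
    # Design weight = N_h / n_h: over-sampled strata carry weights near 1,
    # under-sampled strata carry larger weights in Bank-wide percentages.
    print(f"{stratum}: {n_h} of {size} sampled, weight = {size / n_h:.2f}")
```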

20 SAMPLE ROBUSTNESS
- The sample includes 75 6/ randomly selected tasks with a total cost of $19 million, representing 25% of the universe and 46% of its cost 7/
- Statistical robustness: the sample is only representative at the Bank-wide level (95% confidence level with a 9% sampling margin of error) 8/ (a rough margin-of-error check follows below)
- Results in this report are presented on a weighted basis to adjust for over/under representation in the sample because of stratification and sample design
6/ The original sample included 85 tasks, of which 10 were Reimbursable TA (RTA) tasks. It was subsequently decided that RTA should be subjected to a separate QAG review. The sample size was reduced accordingly
7/ The sample includes 15 tasks (20%) funded under the Public-Private Infrastructure Advisory Facility (PPIAF) for a total cost of $4.3 million (23%)
8/ Results are less robust for Regional and Network cohorts given their small size
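As a rough check on the quoted precision, assuming a simple random sample of 75 tasks from a universe of 303 and a proportion near 50% (the most conservative case), the 95% margin of error with a finite population correction comes out close to the roughly 9% cited; the exact published figure would also reflect the stratified design.

```python
import math

# 95% margin of error for a Bank-wide proportion, with finite population correction:
# n = 75 tasks sampled out of a universe of N = 303.
N, n, z, p = 303, 75, 1.96, 0.5           # p = 0.5 is the most conservative case
fpc = math.sqrt((N - n) / (N - 1))        # finite population correction
moe = z * math.sqrt(p * (1 - p) / n) * fpc
print(f"approximate 95% margin of error: {moe:.1%}")  # about 9.8%
```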

21 III. FINDINGS

22 CODING
- 48% of tasks sampled were miscoded (36 9/ tasks out of 75 sampled), representing 53% of the $ amount ($10.2 million out of $19.1 million)
- Miscoding was found across all Regions and Networks
- There is no significant correlation between task cost and miscoding. The average cost of miscoded tasks is $283,000 while the average cost of all tasks is $255,000
9/ Including 5 PPIAF funded tasks or 7% of the number of tasks sampled, accounting for 6% of the sample cost and representing 30% of the number of PPIAF tasks sampled and 26% of their cost

23 CODING Continued
- SAP is not sufficiently interactive and user friendly, as it does not provide adequate guidance to users to prevent task miscoding
- Preparation of the ESW and TA decision tree was a good initiative, but:
  - Several TTLs found it overly complex and confusing, and in need of simplification
  - Some of its definitions overlap and could lead to miscoding
  - Some basic definitions are missing (who is a client, what is a legitimate trigger for task creation, what is training, what is an internal order, etc.)
- A number of panelists as well as TTLs wondered why training was not considered part of TA for coding purposes

24 CODING Continued
Of the 36 miscoded tasks:
- 8 were Partnership activities (PT) 10/
- 7 should not have been coded as NLTA since they were conducted for the Bank's benefit
- 5 should have been coded as ESW given their original analytical content
- 4 should have been coded as project preparation/appraisal, and 4 were not yet delivered to the client as work was under way
- 3 were not separate tasks (created simply to account separately for a funding source)
- 2 were a combination of several activities (ESW, preparation, IO, etc.)
- 1 was external training, 1 was project supervision, and 1 should have been cancelled as it never started implementation
10/ A clear definition and processing guidance for this product line have not yet been issued

25 RATINGS
- 52% of tasks sampled (39 tasks 11/ out of 75, with a $ amount of $8.9 million or 47% of the total sample $ amount) were assessed and rated, as they were appropriately coded as NLTA 12/
- The overall quality of FY04 NLTA is strong (99% SAT 13/ by number of tasks and by value) and above the 91% satisfactory or better rating for NLTA tasks assessed under the country AAA assessment 14/
- The most frequently stated development objective of NLTA is Institutional Development/Capacity Building
11/ Including 10 PPIAF funded tasks or 13% of the number of tasks sampled, accounting for 17% of the sample cost and representing 70% of the number of PPIAF tasks sampled and 74% of their cost
12/ The balance of 36 tasks could not be assessed and rated as they were not NLTA
13/ SAT = Moderately Satisfactory or better
14/ The country AAA assessment reviewed a total of 64 NLTA tasks (of which 1 was delivered in FY00, 9 in FY01, 20 in FY02, 23 in FY03, 10 in FY04 and 1 in FY05). The FY04 NLTA cohort was rated 94% satisfactory or better

26 RATINGS Continued
Distribution of ratings table: number of tasks, percentage of tasks, and percentage of $ amount for each rating category (Highly Satisfactory (HS), Satisfactory (S), Moderately Satisfactory (MS), Moderately Unsatisfactory, Unsatisfactory, Highly Unsatisfactory) and the total

27 OPERATIONS RATED HS
Country | Task Title | Sector Board
Africa | PEFA Workshops | Fin. Management
ECA | Enhancing Poverty Anal. & Monitoring | Poverty Reduction
LCR | Indigenous Peoples & Sustainable Dev. | Social Development
Moldova | Social Protection Policy Dialogue | Social Protection
Nepal | Living Standards Survey | Poverty Reduction
Peru | Local Governance & Delivery of Services | Urban Development
Russian Federation | Education Advisory Services | Education
Slovenia | Pension Model | Social Protection
Sri Lanka | Primary Dealers Capital Adequacy Framework | Financial
Swaziland | Railways Legal and Regulatory | Private Sector Dev.
World | Global Labor Toolkit | Private Sector Dev.

28 OPERATIONS RATED HS
- 11 tasks, or 29% of both the weighted number of tasks assessed and their total $ amount ($2.3 million), were rated Highly Satisfactory (HS)
- The percentage of HS NLTA tasks (29%) is above the OESW and FY02 ESW HS levels of 18% and 20%, respectively, and comparable to the 29% HS rating for NLTA tasks assessed under the Quality of Country AAA assessment
- The 11 HS tasks were distributed among seven Networks (PREM, HDN, PSD, INF, ESSD, OPCS and FIN) and four Regions (ECA, SAR, LCR and AFR)

29 OPERATIONS RATED HS Continued
Outstanding aspects of HS tasks include (in order of importance):
- Strategic relevance, particularly the consistency of the task objectives with the CAS
- Internal quality, resulting from either high quality direct task team contributions or the preparation of thorough consultant TORs, a well managed consultant selection process leading to the hiring of highly qualified consultants, and high quality supervision of their work by the Bank
- Strong client involvement and participation and high quality interaction with the client

30 RESULTS BY QUALITY DIMENSION
Table: percent of tasks in each rating category (HS, S, MS and SAT) for the Overall Assessment and for each quality dimension (Strategic Relevance & Timeliness, Internal Quality, Dialogue and Dissemination, Likely Impact, and Bank Inputs and Processes)

31 STRENGTHS
- Assessed tasks were implemented within a well defined strategic framework
- Task objectives were clearly stated
- For workshops/conferences, the consultative process and high degree of client involvement were usually very instrumental in defining content, structure, venue and participants. A high degree of client involvement was noted for other tasks as well
- Extensive and relevant knowledge from both inside and outside the Bank was drawn upon to generate high quality content

32 STRENGTHS Continued
- Both the quality of partnership arrangements with other donors and the dissemination arrangements were very appropriate
- For a number of tasks, client impact was evident very shortly after delivery (shift in policy positions, setting up of a regulatory body, introduction of a competitive bidding process, modeling of pension options, etc.)
- Solid task team contributions, often with help from highly qualified consultants well supervised by the Bank
- Many panels noted that task implementation had resulted in significant knowledge transfer

33 WEAKNESSES
- Panels noted that for PPIAF funded tasks, beyond the selection of qualified TTLs, managerial attention had often been limited during task implementation, resulting in missed opportunities for better dialogue and dissemination and greater impact
- Costs had not been properly recorded for about a quarter of tasks assessed:
  - Tasks were often cross subsidized from other budgets/task codes
  - TF contributions were sometimes not accounted for
  - Other contributions (from other donors, NGOs, etc.) were not accounted for
- With about 54% of tasks delivered to the client in June, the Bank's internal processes, more than client needs, seem to drive the timing of delivery

34 IV. CONCLUSIONS
- Results show that NLTA is a valuable knowledge transfer tool that has been used very effectively by the Bank to engage clients in a timely manner and to deal with strategically important issues with appropriate expertise
- But improper recording of tasks under NLTA is high and calls for remedial actions

35 V. RECOMMENDATIONS

36 OPCS should:
- Simplify the presentation of the ESW and TA Decision Tree (avoid overlapping definitions; clearly define the terms client, training and internal order; and clarify the trigger for task creation)
- Finalize and launch an NLTA portal, similar to the ESW portal currently under development, to provide better guidance and support to TTLs
- Given SAP's low user friendliness, allow all NLTA entries (coding, milestones, AIS, etc.) into SAP to be made through the NLTA portal
- Explore, jointly with SFR and TFO, ways to ensure full accounting of all task costs (including TFs and other external contributions)
The Regions and Sector Anchors should monitor the use of the TA product code by their staff to reduce the incidence of miscoding