Performance Reports vs. Evaluation Reports: Two Case Examples from the Social Sciences and Humanities Research Council of Canada (SSHRC)


1 Performance Reports vs. Evaluation Reports: Two Case Examples from the Social Sciences and Humanities Research Council of Canada (SSHRC)
Robert Lalande & Courtney Amo, SSHRC
Natalie Kishchuk, Research and Evaluation Inc.
Joint CES/AEA Conference, Toronto, October 28, 2005

2 Presentation Outline
> Purpose of the Presentation
> Context
  Canadian Context
  SSHRC Context
> Overview of Case Examples
> Overview of Studies
> Overview of the Elements of a Performance Report
> Performance Report vs. Evaluation Report: Differences, Similarities
> How Performance Reports have been received at SSHRC
> Questions for Discussion

3 Purpose of the Presentation
> To discuss the features of Performance Reports
> To differentiate them from Evaluation Reports in terms of purpose and process
> To compare Performance Reports to Evaluation Reports in terms of impact and use
> Overall Aims:
  To suggest that rigorous, systematic, and methodologically sound studies linked specifically to a program's RMAF may serve both the requirement for timeliness and the need to generate sound, objective and reliable evidence for decision-making
  To promote the use of performance reports as a viable alternative to evaluation under certain circumstances, and as a more rigorous alternative to reviews
  To help stimulate discussion in this area, and to generate further research on the variety of ways in which evaluators can contribute to various requirements for evaluative information

4 Canadian Context: Performance Measurement and Evaluation
> Treasury Board of Canada's Policy on Transfer Payments (2000)
  Results-based Management and Accountability Frameworks (RMAFs)
  Risk-based Audit Frameworks (RBAFs)
> Treasury Board of Canada's Evaluation Policy (2001)
  Risk-based Performance and Evaluation Plans
> House of Commons' Expenditure Review Committee (2004)
  Program Activity Architecture (PAA)
> Report on the effectiveness of evaluation practices, commissioned by Treasury Board's Centre of Excellence for Evaluation (CEE)

5 SSHRC Context: What is SSHRC?
> Canada's national funding agency in the social sciences and humanities
> Types of support provided:
  Graduate fellowships
  Small- to large-scale disciplinary and multidisciplinary research projects
  Strategic research in key areas of importance
  Knowledge mobilization (dynamic dissemination)
> All support provided through peer-reviewed competitive processes

6 SSHRC Responses to Treasury Board Requirements
> Creation of the Corporate Performance, Evaluation and Audit Division
> Evaluation Plan
  Priorities identified based on:
    Formal Requirements
    Risk (Corporate Risk Profile)
    Usefulness (for decision-making)
    Timeliness
    Data Availability
    Cost-effectiveness
> Agency-level RMAF & Risk Profile
> Program Activity Architecture (PAA)
> Studies tailored to needs (evaluations, performance reports, special studies, etc.)

7 SSHRC Responses to Treasury Board Requirements: Performance Reports
> To date, used in cases where the intrinsic value of the program is not in question
> Two cases for this presentation:
  Community-University Research Alliances (CURA) Program: the innovative aspect of the program necessitated closer monitoring
  Major Collaborative Research Initiatives (MCRI) Program: the program was generally perceived to work well and had potential for learning (best practices)

8 Overview of Case Examples: Case 1, CURA Program
> Launched in 1999
> Alliances between community groups and universities to generate new knowledge aimed at developing Canadian communities
> First time SSHRC opened the door to funding research outside of universities
> Focus on:
  Ongoing collaboration and mutual learning
  Innovative research
  Student training and employability
  Community decision-making and problem-solving
> CURA grants are currently valued at up to $200,000 per year for up to 5 years

9 Overview of Case Examples: Case 2, MCRI Program
> Long-standing, well-established SSHRC program (1993)
> Focus on:
  Large-scale, collaborative, multi-institutional, multi-disciplinary research teams and partnerships directed by leading Canadian scholars
  Addressing broad and critical issues of intellectual, social, economic and cultural significance
  Integrating diverse research activities and results nationally and internationally
  Training students and postdoctoral fellows in a collaborative, interdisciplinary research environment
  Broader dissemination to both traditional and new audiences for greater impact
  Involving postsecondary institutions in long-term commitments to the development of unique, large-scale inter-university research initiatives
> Awards grants of up to $500,000 per year for up to 5 years

10 Overview of Studies
> Both of these projects began with the development of a Results-based Management and Accountability Framework (RMAF), identifying the underlying program logic and defining performance measures for program outputs and outcomes
> The performance of both programs was assessed against performance frameworks based on the RMAFs, taking into account changes in performance expectations over time
> These "Performance Reports" have thus far served many purposes, including management decision-making, program refinement, accountability and program promotion

11 Overview of the Elements of a Performance Report
> Performance Assessment Framework (based on the program RMAF)
> Study design (e.g., case studies)
> Methods (e.g., file reviews; original data collection through interviews, surveys, etc.)
> Results achieved: findings by program outputs and outcomes as per the program logic model
> Specific questions (e.g., best practices, impact of certain aspects of program design, risk assessment)
> Conclusions

12 Performance Assessment Framework
CURA
> Based on RMAF, developed through a consultative process
  Internal stakeholder consultations
  File and document review
> Performance dimensions and indicators identified based on logic model and performance measurement strategy
MCRI
> Based on RMAF, developed through a consultative process
  Internal stakeholder consultations
  File and document review
> Performance dimensions and indicators identified based on logic model and performance measurement strategy

13 Study design and methods
CURA
> Preparation of a Performance Profile for 21 CURAs funded in and who had applied for completion grants in 2002
  Systematic review, using a structured template, of original and completion applications against the performance dimensions in the framework
> Interviews with a few internal and external stakeholders to develop a risk assessment
MCRI
> Case studies of 11 MCRIs funded between 1995 and 2000
  Case selection based on content analysis of mid-term review committee reports, for variability on performance dimensions
  Data sources: interviews with project directors, Canadian and foreign investigators, students, staff and project partners; file review
> Secondary data analysis
> Interviews with internal stakeholders

14 Results achieved
CURA
> Findings by program outputs and outcomes as per program logic model
  Training and development
  Research
  Community and university capacity
  Knowledge mobilization
MCRI
> Findings by program outputs and outcomes as per program logic model
  Research
  Collaboration and partnerships
  Training and mentoring
  Dissemination
> Characteristics of successful projects and best practices

15 Specific questions
> Impact of a specific aspect of program design
  MCRI: impact of the inclusion (funding) of foreign researchers
> MCRI: best practices in terms of collaborative research, training and mentoring, and dissemination
> CURA: risk assessment

16 Performance Report vs. Evaluation Report: Differences, Similarities

Purpose/Aim
  Performance Report: Assess and report on performance of the program; relevance not in question
  Evaluation Report: Evaluate relevance, cost-effectiveness and success of the program

Focus
  Performance Report: Focused on performance as per the RMAF; focused questions as appropriate
  Evaluation Report: Standard categories of evaluation questions; focused questions as appropriate (e.g., risk assessment questions)

Timing
  Performance Report: Early in the lifecycle of a program, or at a moment in time where it is important to take stock of the program's performance or draw lessons from what has been experienced thus far; link to risk (RMAF/RBAF)
  Evaluation Report: Half-way through the lifecycle of a program (formative) or towards the end (summative); post risk assessment

Clients
  Performance Report: Council, senior management and program management
  Evaluation Report: Council and senior management

Stakeholder Involvement
  Performance Report: Involved in development of the RMAF; early buy-in; involved in Advisory Committee
  Evaluation Report: Involved in Advisory Committee; buy-in happens during the evaluation process

Study Design
  Performance Report: Retrospective, descriptive, can be analytical; focused on the program
  Evaluation Report: Retrospective and prospective, descriptive and analytical, can be comparative

17 Performance Report vs. Evaluation Report: Differences, Similarities

Data Sources and Methods
  Both: Administrative data, file reviews, document reviews, interviews, surveys, case studies, etc.; the only difference is that the Performance Report uses these methods to get at dimensions of performance

Time to complete study
  Performance Report: 6-8 months
  Evaluation Report: 8-12 months or more, depending on the program

Report structure
  Performance Report: Organized by dimensions of performance; shorter, focused report
  Evaluation Report: Organized by evaluation questions; longer, more detailed report

Recommendations
  Performance Report: Not required
  Evaluation Report: Expected and often required

Follow-up
  Performance Report: Management response to issues raised in the Performance Report
  Evaluation Report: Management response to recommendations

Appeal/Response
  Performance Report: Easier to sell than an evaluation; positive reaction to process and report; early buy-in
  Evaluation Report: More difficult to sell; fear/apprehension; reaction based on previous evaluation experiences (positive or negative); buy-in over time

Cost
  Performance Report: Approx. 2/3 the cost of an evaluation of the same program
  Evaluation Report: Varies by program and complexity of the evaluation

18 How Performance Reports have been received at SSHRC
> Positive effects:
  Highly positive response
  Eagerness to share reports with the community
  Use of reports on a regular basis by program officers and management
  Positive example of the different types of evaluative studies that can be conducted
> Negative or unanticipated effects:
  Expectation that evaluations will be more like performance reports in terms of:
    Time to complete the study
    Cost of the study
    Content and structure of the report
  Place/role of traditional evaluations questioned
  Need to educate clients and stakeholders as to the difference between evaluations and performance reports

19 Questions for Discussion
> Have you had experience with performance reports or similar studies?
> Do you agree with our analysis of differences and similarities?
> What do you see as the key differences between evaluation and performance reports?
> Does participation in RMAF development and Performance Reporting increase stakeholders' future receptivity to evaluation?
> Do you see this type of study as a threat to formal evaluations?
> Other?

20 THANK YOU FOR YOUR TIME!
To contact the presenters: