Putting Outcomes First: A Roadmap for Evidence-Based Decision Making in the OPS


1 Putting Outcomes First: A Roadmap for Evidence-Based Decision Making in the OPS
Treasury Board Secretariat, Centre of Excellence for Evidence-Based Decision Making
June 2016

2 Purpose
01 Introduce the Centre of Excellence for Evidence-Based Decision Making: meet the team, find out what we're up to, and learn what keeps us up at night.
02 Share the new framework for evidence-based decision making: learn how the framework was developed, what the standards are, and how it will be implemented across government.
03 Ask you for your ideas and feedback: tell us your initial thoughts on the framework and what areas you need support with.

3 PART 1: ABOUT US

4 Who We Are

CoE's HISTORY
Announced in the 2015 Budget. Created to build capacity within the OPS to assess how programs are performing, using evidence to inform choices and lead change in critical public services.

CoE's VISION
An OPS where decision-making processes are based on evidence and:
- Focus on outcomes and continuous improvement;
- Enable transparency and better accountability; and,
- Meet high standards for quality of evidence and analytical rigour.

CoE's MANDATE
- Develop a framework for evidence-based decision making
- Develop supporting tools and build capacity
- Support the decision-making process
- Work with internal and external partners
- Support public reporting

5 Meet the Team
Didem Proulx (Director)
Shannon Fenton (Manager)
Lee Coplan (Manager)
Jennifer Baker (Sr Advisor)
Honey Dacanay (Sr Advisor)
Karen Prokopec (Sr Advisor)
Janine Hope (Sr Evaluation Consultant)
Laurie McNelles (Sr Evaluation Consultant)
Lisa Peacock (Sr Evaluation Consultant)
Rebecca Herbert (Economist)
Ben Shannon (Sr Economist)
Haris Khan (Policy Co-op)
Emily MacKay (Policy Co-op)

6 Evidence-Based Decision Making: Why Are We Here?

Are we doing the right things?
- Does the framework help the OPS meet the needs of clients/Ontarians?
- Are we using evidence to support decisions for current policies and programs?

Are we doing things the right way?
- Will we know if policies/programs are achieving intended outcomes and impacts?
- Are we continuously improving in terms of program options, program design and service delivery?

Could we be doing better?
- Will we know if we should continue to do the things we are doing?
- Does the framework help in reflecting on-the-ground experience?
- Should resources be allocated differently to support the most effective programs?

7 PART 2: EVIDENCE-BASED DECISION MAKING AND THE FRAMEWORK

8 Why Have an OPS Framework?

To set out consistent expectations around:
- What constitutes quality evidence?
- What culture do we need to foster evidence use?
- How do we share evidence across the organization?
- How can we design, implement and evaluate programs differently, and use data, analytics and evidence to support continuous improvement and better outcomes?

The Framework introduces:
- A COMMON LANGUAGE AROUND CORE ELEMENTS: What does evidence-based decision making mean in various contexts?
- A SET OF STANDARDS: How can we add rigour to the way we design, measure and evaluate policies and programs?
- A ROBUST SET OF TOOLS: What resources can help ministries apply evidence-based decision making in their day-to-day work?

9 Core Elements of Evidence-Based Decision Making

- Focus on Outcomes: Explicit definition of desired outcomes and how they will be measured.
- Flexible Policy & Program Design: Evidence-informed and client/outcome focused; design incorporates rapid feedback loops for continuous improvement.
- Program Assessments: Review evidence of effectiveness of existing/base programs and policies, and act on the results.
- Implementation Oversight & Continuous Improvement: Active monitoring and evaluation of performance to ensure program effectiveness and continuous improvement.
- Targeted Impact Evaluation: Rigorous and regular evaluation to determine impact.
- Decision Making: Evidence incorporated into policy and funding/budget decisions.

ENABLERS / COE MANDATE: Data; Tools and Capacity; Knowledge Mobilization; Culture of Evaluation; Active Use/Demand for Evidence.

10 The OPS Framework

The OPS Framework for Evidence-Based Decision Making is a practical roadmap that embeds the core elements of evidence-based decision making into the work of government. Data, analysis and evidence have a key role to play in every step of the policy and program development, delivery and evaluation process.

11 Problem Definition

Standard
The problem is clearly defined and scoped, and considers the client/user/citizen perspective. There is a strong rationale for government intervention, with linkages made to government priorities and ministry KPIs. Implications for the existing public service system/suite of existing programs have been analyzed.

Meeting the Standard Requires:
- Quantitative/systematic evaluation of the problem, including indicators of the scope and scale of the issue.
- Assessment against strategic priorities for alignment and relevance.
- A map of the existing suite of provincially funded/supported programs in this area (within the ministry and across government).
- Demonstration of the identified problem within the suite of current program offerings (gap, duplication, variation in service levels/outcomes, etc.) that incorporates the client perspective, allows evaluation of horizontal implications, and identifies opportunities to leverage.
- Evidence of the effectiveness of government intervention in addressing this problem (evaluation results from other jurisdictions, analysis of the relevant academic/grey literature, etc.).

12 QUESTION #1: How would you rate your own and your branch's capacity to meet the framework's standards for Problem Definition? (Low capacity to High capacity)

13 Outcomes Logic

Standard
Clearly defined outcome statements and definitions of success and benefits that provide a structure for decision-making, performance targets and monitoring.

Meeting the Standard Requires:
- A theory of change, including all assumptions about causation and/or correlation and the basis for these assumptions (e.g., research evidence).
- A logic map or model of the theory of change (i.e., inputs-process-outputs-outcomes).
- Operational definitions and a measurement plan for the components of the logic map (measures, data, data sources).
- Mechanisms identified in the logic map to refine the logic and/or implementation based on performance data.
- A framework for evaluation that establishes a baseline and/or counterfactual (randomized design, control group, pre-implementation assessments, etc.).

14 QUESTION #2: How would you rate your own and your branch's capacity to meet the framework's standards for Outcomes Logic? (Low capacity to High capacity)

15 Options Assessment

Standard
A range of viable options is developed with consistent rigour and assessed against a defined set of criteria (outcomes, value for money). Key assumptions and risks are stated and substantiated, and sources of evidence are cited. Potential risks and impacts are comprehensively modelled (financial, stakeholder, etc.).

Meeting the Standard Requires:
- Critical analysis of the information available on possible options and their effectiveness (new programs: evaluation results from other jurisdictions, academic/grey literature, administrative data; existing programs: evidence using multiple lines of inquiry, including analysis of administrative data).
- Options that embed flexible/prospective design to account for risk/uncertainty and to allow continuous improvement based on on-the-ground experience.
- Economic and/or financial analysis of all options (costs, benefits, value for money, return on investment).
- Transparent and thorough assessment of each option against key criteria, including relevance to strategic priorities.

16 QUESTION #3: How would you rate your own and your branch's capacity to meet the framework's standards for Options Assessment? (Low capacity to High capacity)

17 Risk and Performance Management

Standard
All material risks have been identified, assessed and prioritized, along with mitigation strategies and plans. A plan is in place for tracking outcomes and performance (pre-implementation), including early feedback loops to allow for timely course correction (post-implementation).

Meeting the Standard Requires:
- Outcome monitoring and performance measurement frameworks developed and included in the submission.
- Flexible/prospective program design that uses administrative data for quick and regular assessment of implementation fidelity/quality and service/program impacts.
- Continuous improvement plans (prospective or retrospective) that allow for regular modifications to program design based on performance and administrative data (e.g., rapid-cycle improvement, predictive analytics).
- Assessment of risks, with risk monitoring part of implementation/continuous improvement plans.

18 QUESTION #4: How would you rate your own and your branch's capacity to meet the framework's standards for Risk and Performance Management? (Low capacity to High capacity)

19 Implementation

Standard
All key milestones have been mapped out. Rigorous outcome and performance monitoring and defined feedback mechanisms are actively utilized for timely course correction and continuous improvement (as identified in the previous step). Active oversight ensures the policy/program is implemented as intended.

Meeting the Standard Requires:
- Detailed descriptions of key steps, roles and responsibilities (including governance and accountability), and milestones.
- Assessment of costs and resource requirements, and plans for securing necessary resources over time to ensure sustainability.
- Data collection tools and processes to collect performance measures for continuous improvement, risk monitoring and evaluation, as well as progress reporting.
- An explicit action plan for risk monitoring, reporting and mitigation.

20 QUESTION #5: How would you rate your own and your branch's capacity to meet the framework's standards for Implementation? (Low capacity to High capacity)

21 Evaluation

Standard
Evaluation is based on regular collection of performance data, from the outset. Baseline data is collected at or prior to implementation. The evaluation framework incorporates a counterfactual and assessment of outcomes vis-à-vis the baseline and the counterfactual. Outcomes are considered from the client/user/citizen perspective.

Meeting the Standard Requires:
- The regular collection of process and outcome data, beginning at or prior to implementation.
- Data sources that capture client experiences in a meaningful way.
- A systematic evaluation framework that draws on administrative data and performance measures to confirm the theory of change.
- An evaluation approach that applies a randomized design wherever possible, and includes baseline and/or control group comparisons to assess impact.
- Assessment of costs and benefits, including value for money and/or return on investment.

22 QUESTION #6: How would you rate your own and your branch's capacity to meet the framework's standards for Evaluation? (Low capacity to High capacity)

23 Tools on the Website

24 Tools on the Website

25 Implementing the Framework

The Framework applies to both new and existing/base policies and programs, but there may be emphasis on a specific phase, standard, or set of tools depending on the type of decision being made.

1. As new policies and programs come forward for policy and resource requirements, the application of the framework should set higher expectations for the role of evidence, and require robust outcome and evaluation frameworks. Priority files and horizontal initiatives could be early adopters.

2. The framework should guide review and reconsideration of existing policies and programs, raising expectations for performance measurement and continuous improvement. KPIs will highlight priority areas for review/action. PRRT and the PRRT Sub-Committee will drive implementation.

26 Embedding Performance Measurement and Continuous Improvement

Key Performance Indicators (KPIs) focus on outcomes and strategic priorities and serve to:
1. Demonstrate progress where visible; and,
2. Highlight areas where progress is limited, so that the programs can be evaluated and redesigned using the Evidence-Based Decision-Making Framework to support continuous improvement.

Last fall, as part of the PRRT process, the CoE worked with ministries to develop KPIs that will form the basis for government-wide accountability measures and demonstrate the impact of investments in the province. Ministry KPIs should be closely tied to transformational initiatives and key priorities in order to provide meaningful and relevant data. Target setting should encourage movement towards improved outcomes or enhanced efficiency.

27 Embedding the Framework

The Framework will be embedded systematically in decision-making processes:
- TB/MBC and Cabinet approval processes
- PRRT Process and PRRT Sub-Committee
- Relevant OPS Directives (Transfer Payment, Realty, etc.)
- Incorporating Outcomes & Measures in Public-Facing Documents

28 Assessing and Building Capacity

The Centre of Excellence for Evidence-Based Decision Making will assess OPS capacity and resources to meet the standards set out by the Framework and fill identified gaps:
- Develop new, or leverage existing, training opportunities as appropriate
- Facilitate improved access to data, evidence and external partners
- Support and champion priority projects as appropriate

29 PART 3: TELL US WHAT YOU THINK

30 QUESTION #7: What other training and tools do you anticipate you'll need to help you?

31 QUESTION #8: In which part of the framework do you anticipate the most challenges?

32 QUESTION #9: What advice can you give us to help embed the framework across the OPS?

33 Desired End-State

The Framework is intended to strengthen accountability, support the expansion of new innovative programs, and improve outcomes and increase value for money of existing interventions. As the Framework's evidence standards become embedded in policy and program processes, changes should be evident at all levels.

People of Ontario: Benefit from better policy, programs and value for money from government, with more transparency about how and why decisions are made. Clients have a voice and are a source of evidence for decision makers.

Ministers and Political Leaders: Receive a higher standard of evidence and analysis in making decisions, and are better able to communicate the rationale for decisions to citizens and stakeholders.

OPS Leaders: Can better monitor and manage the performance of their programs, based on higher-quality analysis from ministry staff. The evaluation of horizontal implications enables a broader systems perspective.

Policy and Program Analysts: Provide robust, outcomes-focused analysis and evaluation with the support of the right tools, training, and skills development.

34 How We'll Measure Success

Meaningful measures of the impact of the Framework on decision making are being developed, including assessment of the:
- Rigour of evidence and analysis presented in TB/MBC and Cabinet submissions
- Documented capacity of ministries to analyze data and assess evidence
- Availability of ministry- and program-level performance measures that are reliable, relevant and outcomes-focused
- Extent to which evidence and analytics, including information and performance measures, play a role in decision making

The success of the CoE's efforts will depend on sustained effort, strategic and critical partnerships in central agencies and line ministries, and proactive change management.

35 Thank You! We'd Love to Hear from You

Tell us what you thought of the session through the PIL Survey.
Visit us on the Intranet.
Email us at: Putting_outcomes_first@ontario.ca