Experimentation: A toolkit to improve evidence-based decision-making

1 Experimentation: A toolkit to improve evidence-based decision-making
Presentation at the Professional Development Week, Financial Management Institute of Canada, November 2018

2 Why Experiment?
Our Programs, Services & Operations: In 2017/18, the Government of Canada spent over $270 billion delivering programs and services to Canadians, employing over 500,000 public servants across the country.
Delivering Public Value: Canadians expect their governments to be efficient, effective and responsive, making decisions based on sound evidence to ensure good value for money. Experimentation is a key tool for establishing what works, ensuring that the activities of government can adapt to changing circumstances and deliver strong public value.
The Commitment to What Works: Becoming more experimental as a government is a cross-cutting commitment. It's about getting better at getting better. To this end, the President of the Treasury Board was also tasked in his mandate letter with supporting more experimentation across the Government.

3 Experimentation Can:
- Enable innovation, by de-risking the unknown and allowing government to try new things based on evidence of what can work.
- Allow government to continuously improve through data-enabled testing as part of ongoing service delivery.
- Enhance the government's ongoing commitment to evaluation, deepening the expectations for impact analysis: using data and measurement in increasingly sophisticated ways, and adopting new technologies, to know what is working and what is not.

4 Providing direction and guidance: the 2016 Experimentation Direction to Deputy Heads
- Conduct experiments to generate evidence about what works, to inform policies and programs
- Gather baseline data with experimental rigour and build capacity to measure results and impact
- Mainstream experimentation as part of program, policy and service design across government

5 What do we mean by Experimentation?
There are many instances where the Government of Canada generates good program and policy evidence to improve programs. Immigration, Refugees and Citizenship Canada uses comparative analysis (trials) to improve client experiences. In 2017, the department tested new call centre protocols and messaging to decrease anxiety in the spousal sponsorship process, leading to a 30% reduction in same-day repeat calls. Implemented across all business lines, this has resulted in a more than twofold increase in unsolicited positive client feedback.
When we add rigour to our decisions, we constantly get better at what we do. How do we make this the norm, and not just an exception?
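To make concrete what a headline figure like a "30% reduction in same-day repeat calls" measures, the sketch below computes a relative reduction from control and trial rates. The call counts are hypothetical, invented for illustration; they are not IRCC's actual data.

```python
def relative_reduction(baseline_rate: float, trial_rate: float) -> float:
    """Relative reduction of a rate (e.g. same-day repeat calls per call handled)."""
    return (baseline_rate - trial_rate) / baseline_rate

# Hypothetical counts: same-day repeat calls out of total calls handled,
# before and after the new protocols and messaging.
baseline = 400 / 2000   # 20% of callers called back the same day
trial = 280 / 2000      # 14% under the new messaging

print(f"{relative_reduction(baseline, trial):.0%} reduction in same-day repeat calls")
```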

6 Experimentation examples, cont'd
Name-Blind Recruitment (NBR) was a 2017 pilot project run by the Public Service Commission (PSC) in collaboration with TBS. With a goal of improving diversity in public service hiring, the experiment set out to determine whether concealing candidates' personal information affected reviewers' decisions to choose candidates, compared with the current method. The pilot used an experimental methodology to test this hypothesis, focusing on the results of 53 reviewers assessing 2,226 candidates (randomized between NBR and the standard process). The initial results showed no significant effect on whether visible minorities were chosen. A new variation of the experiment is currently being designed, to probe other currently held assumptions about what could work.
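One standard way to analyse a randomized screening pilot like the one above is a two-proportion z-test: did the screen-in rate differ between the name-blind and traditional arms? The sketch below uses only the Python standard library; the screen-in counts are hypothetical (the pilot's published report, not this sketch, is the source of record), and they only loosely split the 2,226 candidates into two arms.

```python
from math import sqrt, erf

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Two-sided two-proportion z-test: do groups A and B screen in at different rates?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical screen-in counts: name-blind arm vs. traditional arm.
z, p = two_proportion_z(success_a=310, n_a=1113, success_b=322, n_b=1113)
print(f"z = {z:.2f}, p = {p:.3f}")  # a large p-value means no detectable effect
```

A p-value well above 0.05 here would match the pilot's reported finding of "no significant effect"; with samples this size, even modest true effects would usually be detectable.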

7 Experimentation examples, cont'd
CRA recently designed a policy it thought would incentivize more people to pay their taxes. Instead of rolling it out across the department, the CRA tested it using a rigorously designed trial, which cost roughly $20,000. The trial showed the intervention did not meet the expected outcomes, so the CRA did not pursue it. This valuable information prevented a much costlier roll-out that would have been ineffective.
When we add rigour and testing to our decisions, we can decrease the risk of new or untested ideas, creating large potential cost savings for Canadians.
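The economics of the CRA example can be sketched as a simple expected-value calculation: a small trial catches a failing intervention before the full roll-out is paid for. Only the ~$20,000 trial cost comes from the slide; the roll-out cost and failure probability below are hypothetical.

```python
def expected_cost(trial_cost: float, rollout_cost: float, p_failure: float,
                  run_trial: bool) -> float:
    """Expected spend on an intervention that fails with probability p_failure.

    With a trial, a failing intervention is caught before roll-out;
    without one, the full roll-out cost is spent either way.
    """
    if run_trial:
        # Always pay for the trial; pay for roll-out only if the idea works.
        return trial_cost + (1 - p_failure) * rollout_cost
    return rollout_cost

# Trial cost from the CRA example; roll-out cost and failure odds are invented.
with_trial = expected_cost(20_000, 2_000_000, p_failure=0.5, run_trial=True)
without = expected_cost(20_000, 2_000_000, p_failure=0.5, run_trial=False)
print(f"Expected saving from testing first: ${without - with_trial:,.0f}")
```

The trial pays for itself whenever its cost is smaller than the failure probability times the roll-out cost, which is why cheap, rigorous trials scale so well.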

8 TBS has been working to build the foundation

9 Our Unique Approach: Learning by Doing
To better understand the challenges we face in embedding experimentation, we launched Experimentation Works:
- Practical cohort model
- Open-by-default access
- Fully scalable
- Builds network & momentum
The EW cohort should help build skills and resources for experimentation across the GoC by running experiments in pre-identified areas of opportunity and need.

10 Experimentation Works (EW): How It Works
EW features small-scale experiments in the GoC through a unique learning-by-doing model:
- Build public servants' capacity in experimentation by showcasing 3-5 small-scale experiments
- Document and share experimentation development, deployment, results, and initial impacts (if any)
- Focus on a variety of methodologies and policy areas (policy design, delivery, and back-office experimentation)
The players:
- Partner departments conduct experiments in their policy and program areas (Natural Resources, Heritage, Health).
- Expert departments lend experimentation experts to help support the project (Employment and Social Development, Revenue, Natural Resources, Immigration).
- Treasury Board Secretariat's role is to help enable the experiments by removing barriers, convening partners, and documenting and publishing the process for others to learn from, hopefully leading them to scale what works.
Timeline: Prepare (Fall/Winter 2017), Launch (Spring 2018), Experiment* (Spring-Fall 2018), Report (Winter 2018), Impact (Spring 2019).
*Depending on methodology, duration may vary

11 Capacity: Experimentation Portal
To support capacity in departments, TBS has built a portal that curates experimentation methods, FAQs, and case studies. The portal acts as a one-stop shop for GoC employees to learn about and start practicing experimentation.

12 Capacity: Experimentation Maturity Model
To help departments better understand where and how to support experimentation, TBS co-created a maturity model with PwC and seven departments.

13 Systems Change: Departmental Plans and MAF
To better track and incentivise experimentation, TBS has started to incorporate the concept into existing measurement tools (e.g., the 2018/2019 Departmental Plan analysis). These tools are helping us establish a baseline and identify areas of opportunity and need.

14 Connecting the Community: Building Networks
Government of Canada: ADM Committee on Experimentation and DG Working Group; Open Office Hours; Experimentation Works (EW) learning events and cohort; engaging with various functional communities (evaluators, auditors, financial managers, regulators).
Other governments. Nationally: Government of Ontario (Centre of Excellence for Evidence-Based Decision-Making), Government of Saskatchewan (Ministry of Corrections and Policing). Internationally: Colombia (National Planning Department), Denmark (National Center for Public Service Innovation), Finland (Prime Minister's Office), France (Le Fonds d'Expérimentation pour la Jeunesse), UAE (Centre for Government Innovation), UK (What Works Unit, Cabinet Office), USA (Office of Management and Budget).
Academia. Nationally: Université Laval, University of Toronto. Internationally: USA (Stanford University's METRICS Centre).
NGOs / Foundations. Nationally: Canadian Science Policy Centre, Canadian Foundation for Healthcare Improvement, Institute on Governance, McConnell Foundation, Mowat Centre, Social Research & Demonstration Corporation. Internationally: UK (NESTA, Institute for Government).

15 Insights from Departments
To better understand how the experimentation direction was being operationalized in departments, the DG Experimentation Working Group commissioned several focus groups and a survey, which uncovered a number of barriers:
- Leadership gaps. A number of barriers were attributed to competing expectations by and of senior managers, linked to accountabilities and structural requirements in a culture of decision-making that is often not well aligned with promoting experimentation.
- Limited time, capacity, and resources. Several focus groups flagged a lack of skills in departments to design and execute experiments. The survey also strongly highlighted the inhibiting effects of workload pressures and a lack of time and resources for this work.
- Limited understanding. A range of definitions persists for what counts as an experiment. Senior leaders often lack the basic understanding of experimentation that would allow them to prioritize where it might deliver value.
- Process burdens. Those trying to experiment and innovate report repeatedly encountering counter-pressures, rooted in an unwillingness throughout the organization to challenge or re-interpret rules and procedures, or to entertain new processes. Some of these derive from central policies and directives.
- Fear of failure. Finding out what works is risky because it also means finding out what doesn't. Public servants continue to voice concerns that evidence of programmatic failures will be seen as individual failings by management.
- Structural limits to experimentation. Several respondents felt there are areas of departmental work in which experimentation will be difficult or simply impossible to implement.
These barriers point to the need for a larger-scale culture change.

16 Experimentation and Financial Management
Experimentation is relevant to the financial management community in two central ways:
- Experimenters use or rely on financial instruments to test new ideas (e.g., pay-for-performance contracts, micro-grants, prizes)
- Financial managers themselves experiment with new ways of working (e.g., AI, robotic process automation)

17 Experimentation and Financial Management: Examples
1. Experimenters are using new financial tools to test new ways of addressing policy problems.
a) For example, one new funding model that experimenters are testing is pay-for-outcomes or pay-for-performance funding. In this type of arrangement, non-government entities (often NGOs) are not paid when they perform an activity (e.g., running a youth employment workshop) but only when they achieve a certain outcome (e.g., the youth they are working with actually get jobs).
b) The system is also changing to accommodate these new approaches: Treasury Board has put new Generic Terms and Conditions in place to allow for more of this kind of experimentation. The Generics allow for pay-for-performance agreements, as well as policy prizes, challenges, and micro-grants.
2. Corporate sectors are also experimenting with new ways of providing oversight and improving financial administration.
a) Examples from other sectors include: lean procurement approaches, post-contract micro-surveys, robotic process automation, embedding technical experts into policy teams, using social media for Gs&Cs reporting, and sentiment and social-media analysis in evaluation.
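The mechanical difference between activity-based and pay-for-outcomes funding can be shown in a few lines. This is a minimal sketch, not any department's actual payment logic; the fee levels and the `WorkshopResult` structure are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class WorkshopResult:
    workshops_run: int          # activity: workshops actually delivered
    participants_employed: int  # outcome: youth who actually found jobs

def activity_payment(result: WorkshopResult, fee_per_workshop: float) -> float:
    """Traditional funding: the provider is paid per activity delivered."""
    return result.workshops_run * fee_per_workshop

def outcome_payment(result: WorkshopResult, fee_per_job: float) -> float:
    """Pay-for-outcomes: the provider is paid only for the outcome achieved."""
    return result.participants_employed * fee_per_job

# Hypothetical year: 10 workshops run, 12 participants employed afterwards.
r = WorkshopResult(workshops_run=10, participants_employed=12)
print(activity_payment(r, 5_000))   # paid regardless of results
print(outcome_payment(r, 4_000))    # paid only when youth get jobs
```

Note the shift in incentive: under `outcome_payment`, a provider that runs many workshops but places no one is paid nothing, which is exactly the risk transfer these funding agreements are designed to create.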

18 Contact
Myra Latendresse-Drapeau
Natalie Frank
Generic Experimentation GCPedia Page:
Experimentation Works Blogs:

19 THANK YOU!