Results Based Accountability


Results Based Accountability: Measuring and Communicating Impact

Learning Objectives
- A practical understanding of the principles of RBA.
- How to develop common language around results, indicators, baselines, and performance measures.
- How to use the RBA framework to measure results and communicate impact.
- How funders and grantees can use language discipline and data to work more effectively.

Introductions
- Name
- Affiliation
- Group or flying solo?
- What is your priority result? (Don't panic.)

The RBA Framework Aims to:
- Define the results we are seeking in our community, for the population served, and for our organization.
- Determine and get buy-in for indicators of success: how would you know if the result was achieved?
- Identify what works: best practices and promising practices, and prioritize them.
- Identify all those who could potentially play a role in contributing to results: who are our partners?
- Move from talk to action.

The RBA Framework also:
- Moves us from doing good things to doing effective things that can have a population-level impact.
- Allows you to clearly communicate the need and the progress being made.
- Creates a table for everyone to contribute to achieving the result.
- Guides investment of time, energy, and money.
- And did we mention... move from talk to action.

SIMPLE, COMMON SENSE, PLAIN LANGUAGE, MINIMUM PAPER, USEFUL

Results Accountability includes two parts:
- Population Accountability: about the well-being of whole populations, for communities, cities, counties, states, and nations.
- Performance Accountability: about the well-being of client populations, for programs, agencies, and service systems.

Results Accountability: COMMON LANGUAGE, COMMON SENSE, COMMON GROUND

COMMON LANGUAGE

Language confusion: Outcome, Goal, Target, Result, Objective, Measure, Indicator, Benchmark.

THE LANGUAGE TRAP: Too many terms. Too few definitions. Too little discipline.
- Terms: Outcome, Benchmark, Result, Indicator, Goal, Measure, Target, Objective.
- Modifiers: Measurable, Core, Urgent, Qualitative, Priority, Programmatic, Targeted, Performance, Incremental, Strategic, Systemic.

DEFINITIONS
- RESULT or OUTCOME (population): a condition of well-being for children, adults, families, or communities. Examples: children born healthy, children ready for school, safe communities, clean environment, prosperous economy.
- INDICATOR or BENCHMARK (population): a measure which helps quantify the achievement of a result. Examples: rate of low-birthweight babies, percent ready at kindergarten entry, crime rate, air quality index, unemployment rate.
- PERFORMANCE MEASURE (performance): a measure of how well a program, agency, or service system is working. Three types: 1. How much did we do? 2. How well did we do it? 3. Is anyone better off? (= customer results)

From Ends to Means
- Population: results/outcomes and indicators/benchmarks are ENDS.
- Performance: performance measures cover both; the customer result is an end, service delivery is a means.

Definitions
- Result/Outcome: a condition of well-being for children, adults, families, or communities (ENDS).
- Indicator/Benchmark: a measure which helps quantify the achievement of a result (ENDS).
- Performance Measure: a measurement of how well a program, agency, or service system is working (MEANS).

Is it a result, an indicator, or a performance measure?
1. Safe community (RESULT)
2. Crime rate (INDICATOR)
3. Average police department response time (PERFORMANCE MEASURE)
4. An educated workforce (RESULT)
5. Adult literacy rate (INDICATOR)
6. People have living wage jobs and income (RESULT)
7. % of people with living wage jobs and income (INDICATOR)
8. % of participants in job training who get living wage jobs (PERFORMANCE MEASURE)

Population Accountability
- Population Accountability: about the well-being of whole populations, for communities, cities, counties, states, and nations.
- Performance Accountability: about the well-being of client populations, for programs, agencies, and service systems.

Results for Children, Families and Communities: A Working List
- Healthy births
- Healthy children and adults
- Children ready for school
- Children succeeding in school
- Young people staying out of trouble
- Stable families
- Families with adequate income
- Safe and supportive communities

Results for Children: SAFE, HEALTHY, AT HOME, IN SCHOOL, OUT OF TROUBLE

COMMON SENSE

Leaking Roof: You come home after a long day of rain and find that water is dripping from the ceiling of the top floor of your house. What do you do?

Leaking Roof (results thinking in everyday life)
- Experience: inches of water; not OK now, fixed later. Measure: ? (turning the curve)
- Story behind the baseline (causes)
- Partners
- What works
- Action plan

The 7 Population Accountability Questions
1. What are the quality-of-life conditions we want for the children, adults, and families who live in the community?
2. What would these conditions look like if we could see them?
3. How can we measure these conditions?
4. How are we doing on the most important of these measures?
5. Who are the partners that have a role to play in doing better?
6. What works to do better, including low-cost and no-cost ideas?
7. What do we propose to do?

Indicators
- Provide the evidence to suggest that we are seeing a change in conditions.
- Quality over quantity.

Criteria for choosing indicators as primary vs. secondary measures: Communication Power, Proxy Power, Data Power.

Criteria for Choosing Indicators as Primary vs. Secondary Measures
- Communication Power: Does the indicator communicate to a broad range of audiences?
- Proxy Power: Does the indicator say something of central importance about the result? Does the indicator bring along the data herd?
- Data Power: Is quality data available on a timely basis?

Choosing Indicators Worksheet
- Outcome or result: Safe Community.
- Candidate indicators (Measure 1 through Measure 8) are each rated High, Medium, or Low on Communication Power, Proxy Power, and Data Power, as sketched below; strong candidates that lack good data go on the Data Development Agenda.

Three-Part Indicator List for Each Result
- Part 1: Primary indicators. 3 to 5 headline indicators; what this result means to the community; meets the public square test.
- Part 2: Secondary indicators. Everything else that's any good (nothing is wasted); used later in the story behind the curve.
- Part 3: Data Development Agenda. New data, and data in need of repair (quality, timeliness, etc.).
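A minimal sketch of how this sorting might be applied in code; the candidate indicators and their H/M/L ratings below are hypothetical, not taken from the worksheet:

```python
# Hypothetical sketch: sorting candidate indicators into headline,
# secondary, and data development lists using H/M/L ratings on the
# three criteria. Indicators and ratings are invented for illustration.

candidates = {
    # indicator: (communication power, proxy power, data power)
    "Crime rate": ("H", "H", "H"),
    "Residents' perception of safety": ("H", "M", "L"),
    "Average 911 response time": ("M", "L", "H"),
}

headline, secondary, data_development = [], [], []

for name, (comm, proxy, data) in candidates.items():
    if data == "L":
        # Promising indicators without timely, quality data are not
        # discarded; they go on the Data Development Agenda.
        data_development.append(name)
    elif comm == "H" and proxy == "H":
        headline.append(name)      # Part 1: primary / headline indicators
    else:
        secondary.append(name)     # Part 2: everything else that's any good

print("Headline:", headline)
print("Secondary:", secondary)
print("Data Development Agenda:", data_development)
```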

The Matter of Baselines
- Baselines have two parts: history and forecast.
- Reading the baseline chart: plot the history, extend it point to point as the forecast, ask "OK or not OK?", and then work to turn the curve away from the forecast.
- Examples: MADD; the Marion County Reentry Coalition result that all ex-offenders are successfully reintegrated into the community.
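A baseline of this kind can be expressed in a few lines of code: fit a simple trend to the history, extend it as the forecast, and compare later actuals against the forecast to see whether the curve has turned. The values below are invented for illustration.

```python
# Minimal baseline sketch: history + forecast, then a "turning the curve" check.
# All values are illustrative, not real data.

history = {2015: 31.0, 2016: 30.2, 2017: 30.8, 2018: 31.5}   # e.g. % returning to prison

years = sorted(history)
values = [history[y] for y in years]

# Simple point-to-point trend: average year-over-year change across the history.
slope = (values[-1] - values[0]) / (years[-1] - years[0])

def forecast(year):
    """Extend the historical trend forward as the baseline forecast."""
    return values[-1] + slope * (year - years[-1])

# Later actuals: is the observed value better (lower) than the forecast?
actuals = {2019: 29.4, 2020: 27.9}
for year, actual in actuals.items():
    predicted = forecast(year)
    turned = actual < predicted
    print(f"{year}: forecast {predicted:.1f}%, actual {actual:.1f}%, "
          f"{'curve turned' if turned else 'not yet turned'}")
```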

[Baseline charts: "% of Offenders Released to Marion County from IDOC Who Returned to IDOC within 6 or 12 Months," with separate curves for 6-month and 12-month returns; data labels on the charts include 30.8%, 15.8%, 21.7%, 10.9%, 27.9%, and 21.1%, together with the story behind the data.]

Enlisting Partners
- Who else cares about this issue?
- Who else benefits from achieving the result?
- What is their role in achieving the result?
- Are there non-traditional partners?
- Do they have a sphere of influence?
- Are they willing to move from talk to action?

"If I include you, you will be my partner. If I exclude you, you will be my judge." (Rosell)

What Works
- What does the research say?
- Are there evidence-based practices?
- Are there promising practices?
- What are some low-cost/no-cost strategies?
- What do we know about what contributes to the RESULT? What factors push the curve down, and what factors push it up?

Prioritizing Strategies
- Specificity: Is it actionable (who, what, when, where, how)?
- Leverage: Will it make a difference (turn the curve)?
- Values: Is it consistent with personal/community values?
- Reach: Can it be done this year? Is it affordable/feasible?

Prioritizing Strategies
- Each candidate strategy from the What Works list (Strategy A, Strategy B, Strategy C, ...) is rated High, Medium, or Low on Specificity, Leverage, Values, and Reach.
- Overcoming "We tried that before!"

Proposal-Based Decision-Making (PBDM), or the Rule of Thumb

Consensus is:
- A way of reaching a decision as a group.
- Finding a proposal acceptable enough that everyone can support it and no one opposes it.

Consensus is not:
- A unanimous vote.
- A majority vote.
- Everyone in total agreement.

Proposal-Based Decision-Making
- Step 1: Proposal development.
- Step 2: Finding the decision everyone can go along with.
- Step 3: Making the decision.

Population Accountability vs. Performance Accountability
- Population Accountability: about the well-being of whole populations, for communities, cities, counties, states, and nations.
- Performance Accountability: about the well-being of client populations, for programs, agencies, and service systems.

All performance measures that have ever existed for any program in the history of the universe involve answering two sets of interlocking questions.

Performance Measures
- Quantity vs. quality: How much did we do? (#) How well did we do it? (%)
- Effort vs. effect: How hard did we try? Is anyone better off?

Performance Measures: the four quadrants
- Quantity of effort: How much service did we deliver? (output/input)
- Quality of effort: How well did we deliver it?
- Quantity of effect: How much change/effect did we produce?
- Quality of effect: What quality of change/effect did we produce?

Education example
- How much did we do? Number of students.
- How well did we do it? Student-teacher ratio.
- Is anyone better off? Number of high school graduates; percent of high school graduates.

Education example (stronger measures)
- How much did we do? Number of students.
- How well did we do it? Student-teacher ratio.
- Is anyone better off? Number and percent of 9th graders who graduate on time and enter college or employment after graduation.

Drug/Alcohol Treatment Program
- How much did we do? Number of persons treated.
- How well did we do it? Percent of staff with training/certification.
- Is anyone better off? Number and percent of clients off of alcohol and drugs, at exit and 12 months after exit.

Not All Performance Measures Are Created Equal
- How much did we do? Least important.
- How well did we do it? Also very important.
- Is anyone better off? Most important.
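As a minimal sketch, patterned on the treatment example above and using invented client and staff records, the four quadrants fall out of basic counts and percentages:

```python
# Minimal sketch: deriving the performance-measure quadrants from
# program records. The records below are invented for illustration.

clients = [
    {"id": "A", "sober_at_exit": True,  "sober_12_months": True},
    {"id": "B", "sober_at_exit": True,  "sober_12_months": False},
    {"id": "C", "sober_at_exit": False, "sober_12_months": False},
]
staff = [
    {"id": "S1", "certified": True},
    {"id": "S2", "certified": False},
]

# How much did we do? (quantity of effort)
clients_treated = len(clients)

# How well did we do it? (quality of effort)
pct_staff_certified = 100 * sum(s["certified"] for s in staff) / len(staff)

# Is anyone better off? (effect, reported as both # and %)
sober_at_exit = sum(c["sober_at_exit"] for c in clients)
pct_sober_at_exit = 100 * sober_at_exit / clients_treated

print(f"How much: {clients_treated} clients treated")
print(f"How well: {pct_staff_certified:.0f}% of staff certified")
print(f"Better off: {sober_at_exit} clients ({pct_sober_at_exit:.0f}%) off alcohol/drugs at exit")
```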

RBA Categories Account for All Performance Measures (in the history of the universe)
- Familiar terms all fit within the four quadrants: process, input, cost, product, output, impact, benefit, value; TQM, effectiveness, value added, productivity; efficiency, administrative overhead, unit cost; staffing ratios, staff turnover; staff morale, access, waiting time, waiting lists, worker safety; customer satisfaction (quality of service delivery and customer benefit); cost/benefit ratio, return on investment; client results or client outcomes.

The world's simplest complete customer satisfaction survey:
1. Did we treat you well?
2. Did we help you with your problems?

The Matter of Control
- How much did we do? How well did we do it? Most control.
- Is anyone better off? Least control; this is where PARTNERSHIPS come in.

The Matter of Use
1. The first purpose of performance measurement is to improve performance.
2. Avoid the "performance measurement equals punishment" trap: create a healthy organizational environment, start small, and build bottom-up and top-down simultaneously.

Comparing Performance
1. To ourselves: Can we do better than our own history?
2. To others: when it is a fair apples-to-apples comparison.
3. To standards: when we know what good performance is.

Comparing Performance
1. To ourselves first: Can we do better than our own history? Use a baseline, the chart on the wall.
2. To others: when it is a fair apples-to-apples comparison. (Reward? Punish?)
3. To standards: when we know what good performance is.

The Matter of Standards
1. Quality-of-effort standards are sometimes WELL ESTABLISHED: child care staffing ratios, application processing time, handicap accessibility, child abuse response time.
2. But quality-of-effect standards are almost always EXPERIMENTAL: hospital recovery rates, employment placement and retention rates, recidivism rates.
3. Both require a LEVEL PLAYING FIELD and an ESTABLISHED RECORD of what good performance is.

Advanced Baseline Display
- Create targets only when they are FAIR and USEFUL; the display shows the goal line, the target or standard, your baseline, and a comparison baseline.
- Avoid publicly declaring targets by year if possible. Instead, count anything better than baseline as progress.

Program Performance Measures
- How much did we do? (#)
- How well did we do it? (%)
- Is anyone better off? (# and %)

All Data Have Two Incarnations: a lay definition and a technical definition
- Example, high school graduation rate: "% enrolled June 1 who graduate June 15" vs. "% enrolled September 30 who graduate June 15" vs. "% enrolled in 9th grade who graduate in 12th grade."

Separating the Wheat from the Chaff: types of measures found in each quadrant
- How much did we do? (#) Clients/customers served; activities, by type of activity.
- How well did we do it? (%) Common measures (e.g., client-staff ratio, workload ratio, staff turnover rate, staff morale, % staff fully trained, % clients seen in their own language, worker safety, unit cost) and activity-specific measures (e.g., % timely, % clients completing the activity, % correct and complete, % meeting standard).
- Is anyone better off? (# and %) Skills/knowledge (e.g., parenting skills); attitude/opinion (e.g., toward drugs); behavior (e.g., school attendance); circumstance (e.g., working, in stable housing); measured point in time or as point-to-point improvement.

Choosing Headline Measures and the Data Development Agenda
- [Worksheet: candidate # and % measures are listed in each quadrant (How much did we do? How well did we do it? Is anyone better off?) and sorted into headline measures and the data development agenda.]

RBA and Grantmaking

Why Results Based Accountability?
- Move from funding projects to funding impact.
- Build collaboration between agencies.
- Return on investment.

RBA and Roles for Funders (programs, services, interventions)
- Support infrastructure for collaborative work (investment boards, results-based decision-making systems).
- Engage community members.
- Sponsor tables to turn curves.
- Support new tools (report cards, data management software).
- Support analysis, research, and education.
- Support convening of partners, conferences, and leadership development.
- Support pilot programs, research, evaluation, and dissemination of what works.
- Support innovative and gap-filling services as part of a larger strategy.
- Support and use performance accountability.

Basic Steps of RBA Grantmaking
- Identify: the result to be achieved, the target population, the indicators of success, and what works.
- Provide support/technical assistance to: the grant applicants, the review committee, and the grant recipients.

Potential Grantees Must:
- Identify the target population they will impact.
- Identify the desired program result.
- Identify their strategies and make the case for their effectiveness.
- Identify their partners and their roles.
- Identify the number of people they will impact.
- Identify how they will track the data.

RFP Performance Measure Questions
- How much service will you provide? (How much?)
- How well will you do it? (How well?)
- How will you measure who will be better off? (Better off, as a number and a percent?)

Technical Assistance Opportunities
- RBA training for applicants and the review committee.
- Provide information on evidence-based practices.
- Provide technical assistance to grantees as they determine how they are going to collect the data.
- Provide a standardized data collection tool to all grantees.
- At the end of the grant cycle, produce a public data report: how much was done, how well it was done, how many were better off, and the story behind the data.
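If grantees report through a standardized collection tool, the end-of-cycle public report can be assembled by straightforward aggregation. A sketch under that assumption; the record format, grantee names, and values are made up for illustration:

```python
# Sketch of aggregating standardized grantee data into the end-of-cycle
# public report (how much, how well, how many better off).
# Record format and values are assumptions for illustration only.

records = [
    {"grantee": "Transitional Jobs", "in_target_pop": True,  "employed_at_exit": True},
    {"grantee": "Transitional Jobs", "in_target_pop": True,  "employed_at_exit": False},
    {"grantee": "Rx for Hope",       "in_target_pop": False, "employed_at_exit": True},
]

# How much was done
clients_served = len(records)
grantees = {r["grantee"] for r in records}

# How well it was done
pct_in_target = 100 * sum(r["in_target_pop"] for r in records) / clients_served

# How many were better off
better_off = sum(r["employed_at_exit"] for r in records)
pct_better_off = 100 * better_off / clients_served

print(f"How much: {len(grantees)} grantees, {clients_served} clients served")
print(f"How well: {pct_in_target:.0f}% of clients in the target population")
print(f"Better off: {better_off} clients ({pct_better_off:.0f}%) employed at exit")
```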

Communicating Impact
1. Data
   a. Population indicators: actual turned curves, movement for the better away from the baseline.
   b. Program performance measures, showing customer progress and better service: How much did we do? How well did we do it? Is anyone better off?
2. Accomplishments: positive activities not included above.
3. Stories behind the statistics that show how individuals are better off.

Understanding RBA and Grantmaking: Community Crime Prevention Grant Program

Step One: RESULT or OUTCOME or GOAL, a condition of well-being for a population (clients, neighborhoods, counties).
- What result do you want to achieve? What should be different for the population served?

Community Crime Prevention Grant Program
- RESULT: All neighborhoods in our community will be safe and free from crime.

Step Two: Who is the population that will be impacted? Who do you want to achieve this result for?
- TARGET POPULATION(S):
  - Universal: all residents in the community.
  - Targeted: residents of high-crime zip codes, reentrants, at-risk youth.

Step Three: If the result is achieved, what will be different? What will you see in your community/agency? What won't you see anymore?

Community Crime Prevention Grant Program: How would you know? What would you experience?
- People would not be victimized (murder, rape, theft, assault).
- Children would be playing in yards and at parks.
- Police and community residents would be friends/partners.
- Older residents would go for walks in the evenings.
- People would have jobs to go to and pay taxes.
- No more drug dealers on the corner.

Step Four: What key indicators would change? What do those key indicators look like now (what is the trend line)? What is the story behind the data?

Community Crime Prevention Grant Program: Indicators
- Headline indicators: crime rate, recidivism.
- Secondary indicators: employment, substance abuse, community engagement.

Step Five: Who else cares about this issue? Who else benefits from achieving the result? What is their role in achieving the result? Are there non-traditional partners? Do they have a sphere of influence? Are they willing to move from talk to action?

Community Crime Prevention Grant Program: Partners
- Law enforcement
- Social/human services
- Youth services
- Neighborhood associations
- Community development corporations
- Schools
- Religious communities
- Hospitals

Step Six: What does the research say? Are there evidence-based practices? Are there promising practices? What are some low-cost/no-cost strategies?

Community Crime Prevention Grant Program: What Works
- Cognitive behavioral therapy
- Employment and housing services/supports
- Mental health/addictions treatment
- Positive youth development
- CPTED (Crime Prevention Through Environmental Design)
- Hospital-based violence/gang programs

Step Seven: What are you going to do? Are the actions aligned? Do they leverage each other? Do they contribute to the target population being better off?

Community Crime Prevention Grant Program: Strategies
- Transitional jobs program
- Inpatient drug treatment for ex-offender youth
- Rx for Hope (gang intervention program)
- CPTED project
- Human services for ex-offenders

Community Crime Prevention Grant Program: Performance Measures (program, demographic, and outcomes information)
- How much: # grantees/programs; # clients served; # clients served by race, gender, age, education, and zip code; # agencies attending training sessions.
- How well: #/% of grantees meeting grant obligations; cost per client; #/% of clients in the target population; geographic coverage of programs.
- Better off (# and %): recidivism; employment; substance abuse treatment; maintained sobriety; connection to positive support; volunteering.

Group Activity: RBA and Grantmaking
Applying RBA to your grantmaking approach. Choose a scenario:
1. Foundations and nonprofit organizations in leadership roles in community-wide, results-oriented initiatives.
2. Organizations with specific grant programs designed to achieve a targeted result.
3. Foundations and nonprofit organizations that would like to strengthen their ability to measure and communicate their community impact.

Turn the Curve Exercise: RBA and Grantmaking
- 5 min, starting points: choose a timekeeper and a reporter; wear two hats (yours plus your partner's).
- 5 min, baseline: pick a result and a curve to turn; is the forecast OK or not OK?
- 10 min, story behind the baseline: causes/forces at work; information and research agenda, part 1 (causes).
- 10 min, what works (what would it take?): what could work to do better; each partner's contribution; no-cost/low-cost ideas; information and research agenda, part 2 (what works).
- 5 min, report: convert notes to one page; two pointers to action.
- 15 min, performance measures: How much? How well? Better off?

ONE-PAGE Turn the Curve Report
- Population result:
- Indicator (lay definition) and indicator baseline:
- Story behind the baseline (list as many as needed):
- Partners (list as many as needed):
- Three best ideas for what works, including no-cost/low-cost and off-the-wall ideas:
- Performance measures:

What's Next? A Basic Action Plan for Results Accountability
- TRACK 1: POPULATION ACCOUNTABILITY. Establish results; establish indicators, baselines, and charts on the wall; create an indicators report card; set tables (action groups) to turn curves.
- TRACK 2: PERFORMANCE ACCOUNTABILITY. Organize the grant program around results, indicators, what works, performance measures, and accountability; use the 7 questions supervisor by supervisor and program by program in management, budgeting, and strategic planning.

QUESTIONS?
Lisa Osterman, MA, losterman@communitysolutionsinc.net
