Performance Management City of Columbus Case Study. Mark Freeman Chief Data Scientist IBM North America



2 About City of Columbus: 15th largest city in the US, largest in Ohio; Mayor-Council form of government. Performance Management History: Mayor Coleman's Columbus Covenant (2000); department-level performance measures (2002); program-level performance measures (2004); Office of Performance Management established (2005); Columbus*Stat implemented (2006)

3 Performance Management is an Investment. Costs: to define measures; to gather data; to report data; to train staff; to change processes. Benefits: more effective services; more efficient services; more value for the taxpayer's dollar; more public support of government.

4 Performance Measurement Can Be Wasteful
If measures are not relevant... we waste time collecting data that is not used.
If performance targets are not set... it is difficult to motivate service improvement.
If data is inaccurate or infrequently reported... managers cannot use it to inform decisions.
If managers do not know how to use it... performance data will not be used.

5 Performance Management Works When... Performance measures are relevant... Reasonable performance targets are set... Accurate and timely data is reported... Managers communicate about performance... Performance data informs decisions...

6 Performance measures are relevant... 1. Relevant to customers: What do our customers expect from our services? How will we know when they get what they need? 2. Relevant to managers: Who is in charge of the performance? What does their team do to influence performance? 3. Relevant to executives and elected officials: What is important to government leaders? Do measures help them track progress on their priorities? 4. Relevant to taxpayers: Are we delivering value for the money spent? Are we getting the most from limited resources?

7 Relevant to Customers What do our customers expect? Less Relevant # code enforcement inspections completed More Relevant % code enforcement violations resolved within 90 days # fires extinguished % fires contained to room of origin # seniors served % seniors served who avoid nursing home placement

8 Relevant to Managers Is this something we can influence? Less Relevant # code enforcement inspections requested More Relevant % code enforcement inspections provided within 10 days of request # fire incidents dispatched % fire incident responses provided within 5 minutes of dispatch # seniors requesting service % seniors enrolled within 30 days of service request

9 Relevant to Leaders. Do measures relate to leaders' priorities? Example priorities and relevant measures: Get Green Columbus: # gallons of fuel consumed in City fleet; # pounds recycled in City offices. Neighborhood Revitalization: # code violations per 1,000 parcels; average property value in target areas. Crime Reduction: # violent crimes per 1,000 residents; # property crimes per 1,000 residents.

10 Relevant to Taxpayers Are we measuring efficiency and productivity? Less Relevant # code enforcement inspectors More Relevant # inspections completed per code enforcement inspector $ spent on fire services $ property value saved per dollar spent on fire services $ spent on senior home care services $ spent per senior served (compared to nursing home cost)

11 Columbus Measurement Framework
EFFECTIVENESS: Tells us when customers are benefiting from our services. % fires contained to room of origin; % code violations resolved within 90 days of first inspection.
EFFICIENCY: Tells us when our spending is in line with workload. $ fire service expenditure per capita; $ expenditure per code enforcement inspection completed.
PRODUCTIVITY: Tells us when our staffing is in line with workload. # fire emergency responses per firefighter; # code enforcement inspections completed per inspector.
LEVEL OF SERVICE: Tells us when we are meeting the demand for our services. % fire responses provided within 8 minutes; % code violation complaints investigated within 10 days.

12 Performance Measure Review Collaborative process What do other cities measure? Identify streamlined list of key indicators Direct input from executives and managers

13 Sample Results: Fire Emergency Services
BEFORE (279 different numbers): 5 fire containment rates (object, room, floor, etc.); # fire and EMS runs (by area); # fire and EMS responses (by area); % runs within 8 minutes (by area); average response time (by area).
AFTER (10 different numbers): % fires contained to room of origin; % fire responses within 6.5 minutes from dispatch; % EMS responses within 6.5 minutes from dispatch; % service areas within 8-minute total response time 80 percent of the time; # runs per emergency service FTE (combined fire and EMS).

14 Reasonable performance targets are set... A target is what we are shooting for: 60% of job trainees will get and keep a job 80% of seniors served will avoid nursing home placement Targets are needed to manage performance Some Guidelines: Targets should be achievable with the resources available to us Targets should push us to be better (or at least maintain excellence) Targets are often based on historical performance and/or industry benchmarks

15 Example: Job Training Targets

Measure                                                FY2005 Actual  FY2006 Actual  FY2007 Estimate  FY2008 Target
$ total program expenditure                            $200,000       $300,000       $330,000         $250,000
% of job trainees obtaining and retaining employment   50%            60%            65%              60%
$ expenditure per job trainee trained                  $1,000         $1,200         $1,245           $1,190
% of job trainees enrolled within 30 days of referral  50%            70%            75%              55%

16 Accurate and timely data is reported... (Diagram: performance measure definitions feed the Performance Management Information System through both manual input and automated input.)

17 Planning for Data Collection
Indicators and Measures:
1. Average wait time per call in seconds
   Description: How long, on average, a caller waits before being connected with a live agent. Excessive wait times can create frustration for callers. Over time, we try to keep the average wait time as low as possible.
   Component measures: 1.1 Total seconds of wait time for all answered calls; 1.2 # calls answered
   Formula: # seconds of wait time / # calls answered
Data Collection:
   Who? Rich Eichorn, rjeichorn@columbus.gov
   When? Monthly
   What? Number
   Source: 311 System
The plan documents: who is responsible; when (how often) data needs to be entered; what the data reflects (i.e., unit of measure); where the data comes from (source); and why and how the data is used (description and formula).
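The slide's formula (measure 1 = component 1.1 / component 1.2) can be sketched in a few lines. This is an illustration only; the monthly figures below are hypothetical, not actual 311 System data.

```python
# Sketch of the slide's measure formula: average wait time per call
# = total seconds of wait time for all answered calls / # calls answered.
# The figures below are hypothetical, not actual 311 System data.

monthly_data = {
    "JAN": {"wait_seconds": 151_200, "calls_answered": 5_600},
    "FEB": {"wait_seconds": 128_700, "calls_answered": 5_200},
}

def average_wait_time(wait_seconds: int, calls_answered: int) -> float:
    """Measure 1 = component 1.1 / component 1.2 (seconds per answered call)."""
    if calls_answered == 0:
        return 0.0  # avoid division by zero in a month with no answered calls
    return wait_seconds / calls_answered

for month, d in monthly_data.items():
    avg = average_wait_time(d["wait_seconds"], d["calls_answered"])
    print(f"{month}: {avg:.1f} seconds average wait per call")
```

Separating the two component measures, as the City's plan does, lets the formula be recomputed and audited rather than entered as a pre-baked average.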

18 Performance Dashboards: The Columbus PMIS. What are they? Built in Microsoft Excel, Performance Dashboards are the City's tool for monitoring performance. Like an automobile dashboard, a performance dashboard organizes data on key performance indicators (KPIs) in a simple-to-use display that helps managers determine when they are efficiently and effectively serving their customers. Why are we using them? Peak Performance: a high-performing City government. Mayor's priority: implement a performance information system. They help us communicate about performance.

19 Dashboards and Management

Citywide Dashboard:
CITY OF COLUMBUS PERFORMANCE DASHBOARD
Mission: To be the best city in the nation in which to live, work, and raise a family.
Chief of Staff: Mike Reese (mdreese@columbus.gov). Report Date: Thursday, January 10, 2008
Departments: Civil Service Commission; Community Relations Commission; Development; Equal Business Opportunity; Education; Finance and Management; Human Resources; Public Health; Public Safety; Public Service; Recreation & Parks; Technology; Utilities
% Performance Measures On: Red 7%, Yellow 17%, Green 76%

Department Dashboard (example):
PUBLIC SERVICE DEPARTMENT PERFORMANCE DASHBOARD
Mission: To provide high standards of excellence in the delivery of improved transportation, infrastructure and basic city services; promote partnerships to resolve neighborhood concerns; and ensure accountability to the public.
Director: Mark Kelsey (mkelsey@columbus.gov). Report Date: Thursday, January 10, 2008
Programs: 311 Service Center (4 measures); Parking Violations Bureau (4); Refuse Division (21); Transportation Division (35)
% Performance Measures On: Red 15%, Yellow 24%, Green 61%

Program Dashboard (example):
311 Call Center Program Performance Dashboard
Mission: To provide better service and convenience to residents and customers through a single point of contact, the 311 number.
Program Manager: 311 Manager (311manager@columbus.gov). Report Date: Tuesday, July 15, 2008
Links: Go to Department Dashboard; Go to Data Entry; Go to Measure Definitions
Key Performance Indicators (YTD Actual / Year End Estimate / Annual Target / Status):
1. Average wait time per call in seconds: ...
2. % of calls answered within 30 seconds: 71% / 66% / 80% / -
3. % calls abandoned: 9.0% / 11.2% / 6.0% / -
4. % of agent availability per month: 86% / 90% / 80% / +
5. Average calls handled per call taker: 1,002 / ...
Performance Narrative: Through June of this year, we are doing better than our performance targets for wait time, agent availability and average calls handled. High availability and productivity (measures 4 & 5) are driven by lower than budgeted staffing to handle an increasing call volume. So while we are a very productive call center, the quality of our service is suffering. Calls answered within 30 seconds (measure 2) is around 70%, 10% below target. And our abandonment rate (measure 3) is around 9%, 3% higher than target. To improve performance, we will need more staffing.

20 Red, Yellow, Green? Example target: 80% of seniors served will avoid nursing home placement.
Light / What does it mean? / On when:
Green (+): actual performance is better than target (> 80%)
Yellow (~): actual performance is close to target (70-80%)
Red (-): actual performance is worse than target (< 70%)
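The status logic above is easy to express in code. A minimal sketch follows, assuming a "higher is better" measure; the 10-point yellow band mirrors the slide's example (target 80%, yellow 70-80%), but the band width would differ per measure.

```python
# Hedged sketch of the red/yellow/green status logic for a "higher is
# better" measure, using percentage points. The 10-point yellow band is
# an assumption taken from the slide's seniors example (70-80% = yellow).

def traffic_light(actual: float, target: float, yellow_band: float = 10.0) -> str:
    """Return '+', '~', or '-' as the dashboards display status."""
    if actual > target:
        return "+"  # green: better than target
    if actual >= target - yellow_band:
        return "~"  # yellow: close to target
    return "-"      # red: worse than target

print(traffic_light(85, 80))  # green
print(traffic_light(75, 80))  # yellow
print(traffic_light(65, 80))  # red
```

For a "lower is better" measure such as average wait time, the comparisons would simply be reversed.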

21 The City Dashboard
CITY OF COLUMBUS PERFORMANCE DASHBOARD
Mission: To be the best city in the nation in which to live, work, and raise a family.
Chief of Staff: Mike Reese (mdreese@columbus.gov). Report Date: Friday, June 13, 2008
Departments (links to Department Dashboards): Civil Service Commission; Community Relations Commission; Development; Equal Business Opportunity; Education; Finance and Management; Human Resources; Public Health; Public Safety; Public Service; Recreation & Parks; Technology; Utilities
Performance Measures On (count and chart of measures by status): Red 12%, Yellow 42%, Green 46%

22 Department Dashboards
PUBLIC SERVICE DEPARTMENT PERFORMANCE DASHBOARD
Mission: To provide high standards of excellence in the delivery of improved transportation, infrastructure and basic city services; promote partnerships to resolve neighborhood concerns; and ensure accountability to the public.
Director: Mark Kelsey. Report Date: Friday, June 13, 2008
Programs (links to Division/Program Dashboards): 311 Service Center (4 measures); Parking Violations Bureau (4); Refuse Division (21); Transportation Division (35)
Performance Measures On (count and chart of measures by status): Red 12%, Yellow 35%, Green 53%

23 Program Dashboards
311 Service Center Program Performance Dashboard
Mission: To provide better service and convenience to residents and customers through a single point of contact, the 311 number.
Program Manager: Lois Bruce (lfbruce@columbus.gov). Report Date: Friday, June 13, 2008
Links: Go to Department Dashboard; Go to Data Entry; Go to Measure Definitions
Key Performance Indicators (YTD Actual / Year End Estimate / Annual Target / Status):
1. Average wait time per call: ... / ... / 30 / ~
2. % of calls answered within 30 seconds: 79% / 80% / 80% / ~
3. % of calls abandoned after 20 seconds of wait time: 2.3% / 3.9% / 5.0% / +
4. % of agent availability per month: 81% / 79% / 77% / +
Performance Narrative: In March, the Call Center experienced a 28% increase in the number of calls answered from the previous month. This is largely due to the record snow event that the City received, causing over 2,000 weather-related incoming calls. As can be expected, the percentage of calls answered within 30 seconds fell to 76%, and the average time to answer rose to 37 seconds, missing the monthly performance standard of 30 seconds. We expect this to be a one-time dip in performance.
Links to Detailed Information; Data, Targets, and Status; Narrative Description of Performance

24 Detailed Information
Trend Data: 1. Average wait time per call, plotted by month (JAN-DEC) for 2007 and 2008 against the target line (lower is better).
Indicator Description: Average Wait Time: How long, on average, a caller waits before being connected with a live agent. Excessive wait times can create frustration for callers. Over time, we try to keep the average wait time as low as possible.
Links to More Information: Trend Chart; Indicator Description

25 311 System and Performance Data
5. % of 311 pothole requests closed within 72 hours (monthly; higher is better; source: 311 Days to Close Report):
2007 (by month): 63% 56% 57% 61% 71% 55% 68% 70% 62% 70% 84%
2008 (Jan-Mar): 72% 64% 49%
Target: 80% each month
More Information: Indicator Description; Component Measures; Analysis Quarterly by Outpost; 311 Condition of Alley; 311 Condition of Street
How the data flows: citizens submit service requests through the 311 Call Center and the 311 website (311.columbus.gov); performance is measured as time to resolution.
Indicator Description: Critical to maintaining the streets is providing timely repair of potholes. This percent shows how often we are able to respond to citizen pothole reports in a timely manner. Over time, we want this percent to be as high as possible.

26 Drilling Into the Data
Quarterly by Outpost: 5. % of 311 pothole requests closed within 72 hours.

Outpost       2006Q1 2006Q2 2006Q3 2006Q4 2007Q1 2007Q2 2007Q3 2007Q4 2008Q1 | 2006  2007  2008 YTD | # SRs per year
25th Avenue   50%    34%    49%    46%    54%    59%    46%    58%    57%    | 43%   55%   57%      | 757
Central       84%    67%    66%    75%    79%    69%    78%    88%    72%    | 73%   76%   72%      | 661
Marion Rd     63%    45%    65%    43%    34%    50%    68%    73%    67%    | 54%   42%   67%      | 626
North         74%    96%    93%    91%    74%    55%    59%    80%    57%    | 88%   69%   57%      | 653
Roberts Rd    77%    71%    76%    65%    68%    67%    70%    80%    97%    | 73%   69%   97%      | ...

Companion charts: # of 311 pothole requests resolved within 72 hours by quarter and outpost; # of 311 pothole requests by quarter and outpost; % of 311 pothole requests closed within 72 hours by outpost.
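The drill-down above rolls individual service requests up into a closed-within-72-hours rate per outpost. A minimal sketch of that roll-up, using hypothetical sample records rather than actual 311 data:

```python
# Sketch of the drill-down roll-up: individual 311 pothole service
# requests aggregated into a closed-within-72-hours rate per outpost.
# The sample records are hypothetical, not actual 311 data.
from collections import defaultdict

requests = [
    # (outpost, hours_to_close)
    ("25th Avenue", 48), ("25th Avenue", 96), ("Central", 24),
    ("Central", 60), ("Central", 80), ("North", 70),
]

totals = defaultdict(int)   # service requests per outpost
on_time = defaultdict(int)  # requests closed within 72 hours

for outpost, hours in requests:
    totals[outpost] += 1
    if hours <= 72:
        on_time[outpost] += 1

for outpost in sorted(totals):
    rate = 100 * on_time[outpost] / totals[outpost]
    print(f"{outpost}: {rate:.0f}% closed within 72 hours ({totals[outpost]} SRs)")
```

Grouping by a second key (the quarter) would reproduce the outpost-by-quarter table shown on the slide.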

27 Managers communicate about performance... REPORT and REVIEW: program dashboards (for example, the 311 Call Center Program Performance Dashboard and its performance narrative shown earlier) are reported out and then reviewed in Staff Meetings, C*Stat Briefs, and Columbus*Stat.

28 Columbus*Stat Modeled after Baltimore Citi*Stat... but a kinder and gentler *Stat OPM uses dashboards to prepare briefs and presentations... in collaboration with departments Some departments conduct their own *Stat meetings (Health*Stat, Crime*Stat)

29 Performance Management Training. Half-day workshops for the Mayor's Office, department directors, and program managers. Training content: what makes a good performance indicator; how to use your dashboard; how to interpret performance information; how to use your data for service improvement; how to use your data to tell your story; case studies to illustrate concepts through hands-on exercises.

30 Performance Narratives
(Example: the 311 Call Center KPI table and performance narrative from the program dashboard shown earlier.)
Tables and graphs by themselves do not communicate effectively. A narrative explanation is needed to tell the story of successes and challenges, and it provides an opportunity to explain the numbers and avoid misinterpretation.

31 Performance data informs decisions... EVALUATE and IMPROVE
Example: 2. % of calls answered within 30 seconds (monthly; higher is better; target 80% each month; source: 311 System)
2007: 88% 91% 77% 70% 71% 76% 67% 66% 73% 79% 82%
2008: 65% 68% 74% 71% 69%
Indicator Description: This percentage shows how often we are able to answer calls within 30 seconds. To avoid excessive wait times and abandoned calls, we want the percent to be as high as possible.
Links to More Information: Staffing Needs to Meet Target; Performance by Hour
When green (+): Reward success. Spread success. Reset the target?
When yellow (~): Are we close to red/green? Can we do better? Should we do better?
When red (-): Corrective actions. Process improvements. Reset the target?

32 Putting it All Together Use Data To Improve Talk About Performance Track Performance Set Performance Targets Measure What Matters

33 Measuring What Matters Group Exercise

34 How to Find Out What Matters 1. What is our program s mission or purpose? 2. What measures will help us know if we are efficiently and effectively accomplishing our mission?

35 What is our mission? Your program mission should identify... Your customers The services you provide to your customers The benefits your customers get by using your services... in one clear and concise sentence.

36 Example: Job Training Program To provide job training and support services to individuals with disabilities so they can obtain and retain employment. Services Customers Benefits

37 Example: Senior Services Program To provide home and community-based care to eligible seniors so they can live safely in the community and avoid nursing home placement. Services Customers Benefits

38 Mission Statement: Tips Use clear and concise language (no jargon) Do not confuse services with benefits Be specific about your customer the intended beneficiary of your services Be specific about the benefits customers should experience from your services Benefits are measurable and significantly influenced (but not fully controlled) by your services

39 4 Steps to a Clear and Concise Mission 1. Identify your services Brainstorm a list of phrases Narrow the list to a single phrase 2. Identify your customers Brainstorm a list of phrases Narrow the list to a single phrase 3. Identify the benefits Brainstorm a list of phrases Narrow the list to a single phrase 4. Put the phrases together in a mission statement To provide <insert phrase #1> to <insert phrase #2> so they can <insert phrase #3>.

40 What indicators will help us... To know if our customers are benefiting from our services? To know if we are efficiently delivering services? To know if we are meeting the demand for our services?

41 Four Types of Indicators
EFFECTIVENESS: Tells us when customers are benefiting from our services. % job trainees who get and keep a job for at least 180 days; % seniors served who avoid nursing home placement.
EFFICIENCY: Tells us when our spending is in line with workload. $ expenditure per job trainee trained; $ expenditure per senior served.
PRODUCTIVITY: Tells us when our staffing is in line with workload. # of trainees trained per job trainer; # seniors served per home care case manager.
LEVEL OF SERVICE: Tells us when we are meeting the demand for our services. % trainees enrolled in job training within 30 days of referral; % seniors enrolled in home care within 1 month of request.
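The four indicator types are all simple ratios over a program's raw counts. A sketch for the job training example follows; every figure is a hypothetical illustration, not City data.

```python
# Sketch computing the deck's four indicator types for the job training
# example. All figures are hypothetical illustrations, not City data.

program = {
    "trainees_trained": 500,
    "trainees_employed_180_days": 350,
    "expenditure": 500_000,
    "trainers": 10,
    "referrals": 600,
    "enrolled_within_30_days": 420,
}

# Effectiveness: are customers benefiting from our services?
effectiveness = program["trainees_employed_180_days"] / program["trainees_trained"]
# Efficiency: is spending in line with workload?
efficiency = program["expenditure"] / program["trainees_trained"]
# Productivity: is staffing in line with workload?
productivity = program["trainees_trained"] / program["trainers"]
# Level of service: are we meeting demand?
level_of_service = program["enrolled_within_30_days"] / program["referrals"]

print(f"Effectiveness:    {effectiveness:.0%} got and kept a job")
print(f"Efficiency:       ${efficiency:,.0f} per trainee trained")
print(f"Productivity:     {productivity:.0f} trainees per trainer")
print(f"Level of service: {level_of_service:.0%} enrolled within 30 days")
```

Note that each indicator needs only two raw counts, which is why the deck stresses collecting the component measures rather than the ratios themselves.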

42 Why Ratios?
BY THE NUMBERS: Job Training Last Year: 500 trained; 350 got a job; spent $500,000. Job Training This Year: 1,000 trained; 500 got a job; spent $1,000,000.
BY THE RATIOS: Last Year: 70% got a job; $1,000 per trainee. This Year: 50% got a job; $1,000 per trainee.
QUESTIONS: Did the program get larger or smaller? Did the program get more effective? Did the program get more efficient?
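The slide's figures answer its own questions once converted to ratios: the program doubled in size, but effectiveness fell and efficiency stayed flat. A short sketch, using the numbers from the slide:

```python
# Sketch of the slide's "why ratios" point: raw counts grew while the
# effectiveness ratio fell and the efficiency ratio held steady.
# Figures are taken from the slide itself.

years = {
    "last_year": {"trained": 500, "employed": 350, "spent": 500_000},
    "this_year": {"trained": 1_000, "employed": 500, "spent": 1_000_000},
}

for label, y in years.items():
    effectiveness = y["employed"] / y["trained"]  # % who got a job
    cost_per_trainee = y["spent"] / y["trained"]  # efficiency
    print(f"{label}: {effectiveness:.0%} got a job, ${cost_per_trainee:,.0f} per trainee")

# Larger or smaller? Larger (500 -> 1,000 trained).
# More effective? No (70% -> 50% got a job).
# More efficient? No change ($1,000 per trainee both years).
```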

43 Mission & Measures: Job Training Program
MISSION: To provide job training and support services to individuals with disabilities so they can obtain and retain employment.
MEASURES:
EFFECTIVENESS (customers benefiting from our services): % job trainees who get and keep a job for at least 180 days
EFFICIENCY (spending in line with workload): $ expenditure per job trainee trained
PRODUCTIVITY (staffing in line with workload): # of trainees trained per job trainer
LEVEL OF SERVICE (meeting the demand for our services): % trainees enrolled in job training within 30 days of referral

44 Mission & Measures: Senior Services Program
MISSION: To provide home and community-based care to eligible seniors so they can live safely in the community and avoid nursing home placement.
MEASURES:
EFFECTIVENESS (customers benefiting from our services): % seniors served who avoid nursing home placement
EFFICIENCY (spending in line with workload): $ expenditure per senior served
PRODUCTIVITY (staffing in line with workload): # seniors served per home care case manager
LEVEL OF SERVICE (meeting the demand for our services): % seniors enrolled in home care within 1 month of request

45 Performance Indicators: Tips. Don't measure everything; focus on the critical few. Align your indicators with your mission. Make sure your services influence your indicators. Don't limit yourself to indicators you fully control. Value of information > cost of measurement. Ask: Will this information help us make better decisions? Will this information help us improve our services? Will this information help us tell our story?

46 Exercise: Mission and Measures 1. Review Assigned Program Description 2. Evaluate Mission Is it a clear and concise sentence describing services, customers and benefits? If yes, go to 3. If no, revise as needed. 3. Develop Measures (no more than 10) Effectiveness indicators first, aligned with mission Efficiency and/or Productivity indicators next Level of Service (as needed)