What You Measure is What Gets Done: Using Data to Enhance Operations & Track the Progress of Your Strategic Plan


1 What You Measure is What Gets Done: Using Data to Enhance Operations & Track the Progress of Your Strategic Plan Joseph Naughton-Travers, Ed.M., Senior Associate, OPEN MINDS 2011 Planning and Innovation Institute June 15, 2011

2 Outline I. How to use management metrics in supervision and program management II. Steps in designing a management metrics system: assessing and deploying metrics in your current operations III. Case studies in metrics-based management: balanced scorecards, key performance indicators, status reports, and more 2

3 Today's Reality When used correctly, measurement data can help an executive team better lead its organization and can help direct the organization toward positive change. Most organizations have the data they need to support metrics-based management in their current information systems, but they don't know how to organize the information. 3

4 There are four cornerstones of data that management needs in order to drive innovation, enhance care quality, and improve operational performance. 4

5 Four Cornerstones of Metrics-Based Management: #1 Routine Operational & Management Reports; #2 Key Performance Indicators; #3 Benchmarking & Performance Targets; #4 Management Dashboards & Alerts 5

6 #1 Routine Operational & Management Reports These ensure that all staff have access to the accurate, up-to-date information that they need to do their jobs on a daily basis. Since roughly ninety percent of an organization's reporting needs are planned and predictable, this simply means producing and using meaningful reports from the management information system and making certain that staff know how to use them. 6

7 #2 Key Performance Indicators With basic operational and management reports in place, the next building block for developing a measurement culture is to make certain that the executive team and operational managers know whether or not the organization's strategic and operational objectives are being achieved. An effective key performance measurement system uses financial and non-financial measures and is driven by structured data from the information system, based upon the organization's strategic objectives. 7

8 #3 Benchmarking & Performance Targets The third cornerstone is using benchmarks and performance targets to challenge and drive continuous improvement in service quality and operations. Benchmarking performance metrics allows your organization to compare itself with other firms in your industry; develop cross-industry comparisons; develop points of reference or standards of practice; make best-in-class determinations; and develop best practices. 8

9 #4 Management Dashboards and Alerts The last phase in implementing metrics-based management is to make certain that your managers and staff cannot miss key performance information by using some form of management dashboard. This is the in-your-face daily data that managers receive to keep them abreast of key issues and routine operations, delivered via computer-screen dashboards, widgets, or smartphone alerts. 9

10 #4 Management Dashboards and Alerts (cont.) Typically, management dashboards or alerts include discrete performance metrics that measure critical operational components of your organization. Examples include: daily census versus budget for inpatient, residential, and day program units; billable staff productivity month-to-date versus budget; and hospitalizations and critical incidents in the last 24 hours. 10

11 I. How to Use Management Metrics in Supervision & Program Management 11

12 Building the Foundation of Measurement: Start your metrics management approach by ensuring that staff have access to data via routine reporting to manage day-to-day operations and supervise staff 12

13 Routine Report Needs Analysis Internal or External: Internal reports are those identified for use by agency staff to manage operations. External reports are those required periodically by outside parties (such as state agencies). This list was developed for a large county-based mental health provider organization rolling out new reporting and software across its program lines. 13

14 Routine Report Needs Analysis (cont.) Priority Level Additionally, reports are given a preliminary priority level by the management team. The priority levels are as follows: Priority One - An essential report that must be operational on the day the organization begins to use its new software application. Priority Two - A non-essential report, but highly desired as soon as possible after the new software implementation. Priority Three - A non-essential report, but anticipated as part of a longer-term expansion of management reporting and enhanced operations. 14

15 Sample Management & Operational Report List Caseload Report - This is a consumer caseload listing for individual clinicians. Staff Productivity Report - The purpose of this report is to manage the time spent on behalf of consumers (direct and indirect client care) as well as billable hours. Thus, the report should summarize, by clinician, the total number of direct, indirect, billable, and non-billable hours for a given time frame, as well as the accompanying gross and net revenues for the billable time (a sketch of how such a summary might be computed appears after this report list). Care Access Measures Reports - This report would detail the average number of days between key events in the care access process (e.g., the number of days between initial call and diagnostic evaluation). 15

16 Sample Management & Operational Report List (cont.) Waiting List Report - This report provides detailed and summary information for consumers on waitlists for specific programs, services, funding sources, and/or individual clinicians. Closed Case Notification Report - This report details the consumers who have not had a service within a specific time frame to aid clinicians in identifying cases that may need to be closed. Care Provider and Program Admissions Report by Client - This report is a listing of all internal and external care providers for a consumer, as well as the programs into which the consumer is admitted. 16

17 Sample Management & Operational Report List (cont.) External Referral Report This report details and summarizes the referrals received by the organization in a given time period. It should include basic consumer information, referral source, referral reason, and disposition information. Internal Referral Report - This report is like the previous one, except used for referrals from one program to another within the organization. No-Show and Cancellation Report This report details the number of cancellations and no-shows for a given date range of services. 17

18 Sample Management & Operational Report List (cont.) Discharge Report - This report details and summarizes the consumer discharges from the organization in a given time period. It should include basic consumer information, discharge reason, and disposition information. Consumer Demographics Reports This report should be available in detail and summary format to identify consumers with a broad range of specific demographics or combinations thereof (e.g., age group, gender, race, zip code, diagnosis, medication, etc.) Mandated Treatment Monitoring Reports These reports should be available in detail and summary formats to support compliance with various mandated treatment requirements. 18

19 Sample Management & Operational Report List (cont.) Medication Profile & Report - This report details the number and percentage of consumers who are receiving specific medications. Daily Census This report should detail the names and basic information for consumers admitted to a per diem program on a specific date. Tickler Report - This report details all required, but not completed, medical records events and other requirements by consumer and care provider. High Risk / Priority Consumer Report This report should detail and summarize the consumers identified as high-risk and/or priority cases by program and overall. 19

20 Sample Management & Operational Report List (cont.) Crisis Call Tracking - This report details and summarizes the crisis calls received by the organization in a given time period. It should include basic consumer information, referral source, referral reason, and disposition information. Consumer Service History - This report details all of the services rendered to an individual consumer within a specific time period. Substance Use Report - This report details the number and percentage of consumers who are using specific non-prescribed medications or substances. 20

21 Sample Management & Operational Report List (cont.) Credentialing Warning Report - This report warns users of clinician credentialing needs and other required events (such as continuing education requirements, contract renewals, etc.) Aged Trial Balance Report - This report details and summarizes the current accounts receivable, including all billing, payment, contractual expense, and write-off data on an individual claim basis. General Ledger Posting Report - These reports detail the credit/debit entries by G/L account number for posting month end financial data into an accounting software application. 21

22 Sample Management & Operational Report List (cont.) Pre-Billing Edit Report - These reports detail all services entered into the system that encountered a billing edit (e.g., missing diagnosis, missing authorization, etc.), allowing users to correct problems that may prevent payment prior to billing services. Deposit Reconciliation Report - This report details the bank deposits, including payers and check amounts, for a given date range so that users can reconcile deposits entered into the system with bank statements. Collection Follow-Up Report - This report is an expanded Aged Trial Balance report, with the addition of payer contact information and collection/follow-up notes. 22

23 Sample Management & Operational Report List (cont.) Bad Debt Write-Off Report - This report details and summarizes the bad debt write-offs to the accounts receivable for a given date range of transactions (billing and service dates), including consumer, payer, amount, and write-off reason. Missing Demographics Report - This report details consumers who are missing required demographic data elements (for billing, state reporting, etc.). Missing Service Authorizations Report - This report details all services entered into the system that are missing required service authorizations, including primary clinician, consumer, date of service, and service. 23

24 Sample Management & Operational Report List (cont.) Service Authorization Warning Report - This report is designed to warn clinical staff of service authorizations that are pending expiration or exhaustion, based on the expiration date or the number of authorized services remaining. Appointment Schedule - This report details the scheduled appointments for service. Outcome Reports - These reports would detail and summarize consumer outcome information for selected metrics (such as level of functioning measures, hospitalizations, days in sobriety, maintained employment, etc.). 24

25 Sample Management & Operational Report List (cont.) Average Length of Stay Report - This report should summarize the average length of stay, by days or visits, by clinical program or overall. Average Number of Services Report - This report should summarize the average number of services per consumer per organizational or clinical program admission. Dismissal Report - This report should detail consumers who were referred to a program but not admitted, whether due to non-acceptance or non-compliance. Service Plan Variance Reporting - This report should detail, by consumer, all services rendered that were not part of the service plan. 25
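To make one of these reports concrete, the sketch below shows how the Staff Productivity Report summary mentioned above might be assembled from service-activity records. The column names, sample data, and pandas approach are illustrative assumptions, not a prescription for any particular system.

    # Minimal sketch of a Staff Productivity Report: total direct, indirect,
    # billable, and non-billable hours per clinician, plus gross and net
    # revenue for billable time. Columns and records are hypothetical.
    import pandas as pd

    activity = pd.DataFrame([
        {"clinician": "Smith", "care_type": "direct",   "billable": True,  "hours": 5.0, "gross": 600.0, "net": 480.0},
        {"clinician": "Smith", "care_type": "indirect", "billable": False, "hours": 2.0, "gross": 0.0,   "net": 0.0},
        {"clinician": "Jones", "care_type": "direct",   "billable": True,  "hours": 6.5, "gross": 780.0, "net": 650.0},
        {"clinician": "Jones", "care_type": "direct",   "billable": False, "hours": 1.0, "gross": 0.0,   "net": 0.0},
    ])

    # Split hours into the four requested buckets, then total by clinician.
    summary = activity.assign(
        direct_hours=activity.hours.where(activity.care_type == "direct", 0.0),
        indirect_hours=activity.hours.where(activity.care_type == "indirect", 0.0),
        billable_hours=activity.hours.where(activity.billable, 0.0),
        non_billable_hours=activity.hours.where(~activity.billable, 0.0),
    ).groupby("clinician")[
        ["direct_hours", "indirect_hours", "billable_hours", "non_billable_hours", "gross", "net"]
    ].sum()

    print(summary)

In practice the same grouping would run against the billing or scheduling tables of the management information system rather than an in-memory sample.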

26 But Routine Reporting Is Not Enough! Organizations Need Key Operational & Strategic Performance Metrics 26

27 Key Performance Indicators With basic operational and management reports in place, the next building block for developing a measurement culture is to make certain that the executive team and operational managers know whether or not the organization's strategic and operational objectives are being achieved. An effective key performance measurement system uses financial and non-financial measures and is driven by structured data from the information system, based upon the organization's strategic objectives. 27

28 Key Performance Indicators (cont.) A performance indicator or key performance indicator (KPI) is a measure of performance. Such measures are commonly used to help an organization define and evaluate how successful it is in terms of making progress towards its long-term organizational goals. 28

29 Key Performance Indicators KPIs can be specified by answering the question, "What is really important to different stakeholders?" KPIs are typically tied to an organization's strategy using concepts or techniques such as the Balanced Scorecard (more on this soon!). 29

30 An Ideal Performance Indicator... An ideal KPI is a key part of a measurable objective, which is made up of a direction, KPI, benchmark, target, and time frame. For example: "Increase average revenue per customer from $30 to $50 by 2008." In this case, 'Average revenue per customer' is the KPI. 30
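As a hedged sketch of that structure, the slide's revenue example can be written down as a small record with its direction, benchmark, target, and time frame, plus a simple progress check; the class and field names are illustrative only.

    # Sketch of a measurable objective: direction + KPI + benchmark + target + time frame.
    # Uses the slide's example: raise average revenue per customer from $30 to $50 by 2008.
    from dataclasses import dataclass

    @dataclass
    class MeasurableObjective:
        kpi: str            # the measure itself
        direction: str      # "increase" or "decrease"
        benchmark: float    # starting point
        target: float       # desired value
        time_frame: str     # deadline or period

        def progress(self, current: float) -> float:
            """Fraction of the way from benchmark to target (may exceed 1.0)."""
            span = self.target - self.benchmark
            return (current - self.benchmark) / span if span else 1.0

    objective = MeasurableObjective(
        kpi="Average revenue per customer",
        direction="increase", benchmark=30.0, target=50.0, time_frame="by 2008",
    )
    print(f"{objective.progress(42.0):.0%} of the way to target")  # 60%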

31 SMART Performance Indicators A KPI can follow the SMART criteria. This means: The measure has a Specific purpose for the business. It is Measurable, so that an actual value of the KPI can be obtained. The defined norms have to be Achievable. The KPI has to be Relevant to measure (and thereby to manage). It must be Time-phased, which means the value or outcomes are shown for a predefined and relevant period. 31

32 Example of a Key Performance Indicator For instance, the measure "Training person-days" under the objective "Provide adequate employee training" in the Learning and Growth perspective is not very meaningful, since fulfilling a target in person-days alone does not indicate the usefulness of the training. Hence a more suitable objective, "Maximize Training Effectiveness," may be added, and a measurable KPI defined under this new objective. This new KPI might, for instance, be called "Percentage of training programs put to practical use within three months after training." 32
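A minimal sketch of how that KPI might be calculated from training records follows; the record layout, the 90-day approximation of "three months," and the sample dates are all assumptions made for illustration.

    # Sketch: percentage of training programs put to practical use within
    # three months after training. Records and dates are hypothetical.
    from datetime import date

    trainings = [
        # (training date, date first applied on the job, or None if not yet applied)
        (date(2011, 1, 10), date(2011, 2, 20)),
        (date(2011, 2, 1),  date(2011, 6, 15)),
        (date(2011, 3, 5),  None),
    ]

    WINDOW_DAYS = 90  # "within three months", approximated as 90 days

    applied_in_window = sum(
        1 for trained, applied in trainings
        if applied is not None and (applied - trained).days <= WINDOW_DAYS
    )
    kpi = applied_in_window / len(trainings)
    print(f"Training effectiveness KPI: {kpi:.0%}")  # 33%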

33 The Balanced Scorecard Concept 33

34 History of the Balanced Scorecard Concept The roots of the balanced scorecard concept include: The work of General Electric on performance measurement reporting in the 1950s The work of French process engineers (who created the Tableau de Bord, literally a "dashboard" of performance measures) in the early part of the 20th century 34

35 History of the Balanced Scorecard Concept (cont.) The first Balanced Scorecard was created by Art Schneiderman (an independent consultant on the management of processes) in 1987 at Analog Devices, a mid-sized semiconductor company. He participated in an unrelated research study in 1990 led by Dr. Robert S. Kaplan in conjunction with US management consultancy Nolan-Norton, and during this study described his work on the Balanced Scorecard. 35

36 History of the Balanced Scorecard Concept (cont.) Kaplan and David P. Norton included anonymous details of this use of the Balanced Scorecard in their 1992 article "The Balanced Scorecard - Measures that Drive Performance" (Harvard Business Review). In 1996, they published the book The Balanced Scorecard: Translating Strategy into Action (Harvard Business School Press). Due to these works, Kaplan & Norton are often seen as the creators of the Balanced Scorecard concept. 36

37 Balanced Scorecard Basics Design of a Balanced Scorecard ultimately is about the identification of a small number of financial and non-financial measures and attaching targets to them, so that when they are reviewed it is possible to determine whether current performance 'meets expectations.' 37

38 Balanced Scorecard Basics (cont.) The idea behind this is that by alerting managers to areas where performance deviates from expectations, they can be encouraged to focus their attention on these areas, and hopefully as a result trigger improved performance within the part of the organization they lead. 38

39 Kaplan & Norton's Notion of the Balanced Scorecard Design Process Translating the vision into operational goals; communicating the vision and linking it to individual performance; business planning and index setting; feedback and learning, and adjusting the strategy accordingly 39

40 The Four Perspectives The first-generation design method proposed by Kaplan & Norton was based on the use of three non-financial topic areas as prompts to aid the identification of non-financial measures, in addition to one financial measure. 40

41 The Four Perspectives (cont.) Based on the assumption that focusing on financials alone is a path to failure, a balanced set of KPIs reflects four key management perspectives: financial performance, customer performance, organizational innovation, and internal operations (clinical and administrative) 41

42 Financial Perspective Encourages the identification of a few relevant high-level financial measures. Choose measures that help inform the answer to the question "How do we look to shareholders?" 42

43 Financial Perspective (cont.) Timely and accurate funding data will always be a priority. But the point is that the current emphasis on financials leads to an "unbalanced" situation with regard to the other perspectives 43

44 Customer Perspective Encourages the identification of measures that answer the question "How do customers see us?" Multiple customers: consumers and families; payers and funders; the communities we provide services in 44

45 Customer Perspective (cont.) An increasing realization of the importance of customer focus and customer satisfaction in any business. These are leading indicators: if customers are not satisfied, they will eventually find other suppliers that will meet their needs. Typically these are satisfaction and outcome measures 45

46 Internal Business Perspective Encourages the identification of measures that answer the question "What must we excel at?" 46

47 Internal Business Perspective (cont.) This perspective refers to internal business processes and operations. Metrics based on this perspective allow the managers to know how well their business is running, and whether its products and services conform to customer requirements (the mission). Typically, these include metrics measuring the health of our various administrative departments and other key operations. 47

48 Learning & Growth Perspective Encourages the identification of measures that answer the question "Can we continue to improve and create value?" 48

49 Learning & Growth Perspective This perspective includes employee training and corporate cultural attitudes related to both individual and corporate self-improvement. Kaplan and Norton emphasize that 'learning' is more than 'training'; it also includes things like mentors and tutors within the organization, as well as that ease of communication among workers that allows them to readily get help on a problem when it is needed. 49

51 Why Implement a Balanced Scorecard? Increase focus on strategy and results; improve organizational performance by measuring what matters; align organization strategy with the work people do on a day-to-day basis; focus on the drivers of future performance; improve communication of the organization's vision and strategy; prioritize projects / initiatives 51

52 Kaplan and Norton on the Balanced Scorecard "The balanced scorecard retains traditional financial measures. But financial measures tell the story of past events, an adequate story for industrial age companies for which investments in long-term capabilities and customer relationships were not critical for success. These financial measures are inadequate, however, for guiding and evaluating the journey that information age companies must make to create future value through investment in customers, suppliers, employees, processes, technology, and innovation." 52

53 Good Performance Measures: Provide a way to see if our strategy is working; focus employees' attention on what matters most to success; allow measurement of accomplishments, not just of the work that is performed; provide a common language for communication; are explicitly defined in terms of owner, unit of measure, collection frequency, data quality, expected value (targets), and thresholds; are valid, to ensure measurement of the right things; and are verifiable, to ensure data collection accuracy. 53
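To illustrate what "explicitly defined" can look like in practice, the sketch below captures one KPI as a record with an owner, unit, collection frequency, target, and thresholds. Every field name and figure here is an assumption for illustration, and the status logic assumes higher values are better.

    # Sketch of an explicit KPI definition: owner, unit of measure, collection
    # frequency, expected value (target), and thresholds. Names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class KpiDefinition:
        name: str
        owner: str
        unit: str
        frequency: str           # e.g. "monthly"
        target: float            # expected value
        warning_threshold: float
        critical_threshold: float

        def status(self, value: float) -> str:
            # Assumes higher values are better for this KPI.
            if value <= self.critical_threshold:
                return "critical"
            if value <= self.warning_threshold:
                return "warning"
            return "on track"

    collection_rate = KpiDefinition(
        name="Collection percentage", owner="CFO", unit="%", frequency="monthly",
        target=95.0, warning_threshold=92.0, critical_threshold=88.0,
    )
    print(collection_rate.status(90.5))  # "warning"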

54 Government or Not-for-Profit Metrics Criteria or metrics for government or nonprofit agency performance can thus be broken down into the following three categories: strategic need (current and in the foreseeable future); mission-specific effectiveness metrics (uniqueness and viability); generic efficiency metrics 54

58 How the Executive Team Uses KPIs Monitor representative metrics of the organization's performance; proactively address problems (the leading indicator concept); use as a guide to selecting issues for further exploration; stretch to better performance! 58

59 Benchmarking & Performance Targets: Moving Strategic Performance to the Next Level 59

60 Benchmarking & Performance Targets The third cornerstone is using benchmarks and performance targets to challenge and drive continuous improvement in service quality and operations. Benchmarking performance metrics allows your organization to compare itself with other firms in your industry; develop cross-industry comparisons; develop points of reference or standards of practice; make best-in-class determinations; and develop best practices. 60

61 Why Benchmark? To compare with other firms in your industry To develop cross-industry comparisons To develop points of reference or standards of practice To make best-in-class determinations To develop best practices 61

62 Benchmarking Process: A Collaborative Approach Develop a group of organizations to initiate the benchmarking process; reach agreement among the organizations about the process, goals, and measures to benchmark; collect data and analyze; create a data collection instrument; reality check and refinement; publish benchmarks 62

63 Benchmarking Another option is to purchase or acquire benchmark data and compare it to your organization's measures. 63
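One hedged sketch of how acquired benchmark data might be used: place your own measure within the benchmark distribution and report where it falls. The metric, peer values, and figure below are hypothetical.

    # Sketch: comparing one of your measures against acquired benchmark data.
    # Peer values and the metric are hypothetical.
    from bisect import bisect_right

    peer_no_show_rates = sorted([0.08, 0.10, 0.12, 0.15, 0.18, 0.22, 0.25])  # benchmark data
    our_no_show_rate = 0.14

    # Fraction of benchmark peers whose no-show rate is at or below ours.
    share_at_or_below = bisect_right(peer_no_show_rates, our_no_show_rate) / len(peer_no_show_rates)
    print(f"Share of benchmark peers with a no-show rate at or below ours: {share_at_or_below:.0%}")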

64 Management Dashboards: Up-to-the-Minute Performance Data Your Staff Can't Miss! 64

65 Management Dashboards or Alerts The last phase in implementing metrics-based management is to make certain that your managers and staff cannot miss key performance information by using some form of management dashboard. This is the in-your-face daily data that managers receive to keep them abreast of key issues and routine operations, delivered via computer-screen dashboards, widgets, or smartphone alerts. 65

66 Management Dashboards or Alerts (cont.) Typically, management dashboards or alerts include discrete performance metrics that measure critical operational components of your organization. Examples include: Daily census versus budget for inpatient, residential, and day program units; Billable staff productivity month-to-date versus budget; Hospitalizations and critical incidents in the last 24 hours. 66

67 Management Dashboards A dashboard is an information system user interface that (similar to an automobile's dashboard) is designed to be easy to read. Dashboards are part of a common suite of business intelligence tools. 67

68 Management Dashboards (cont.) These are discrete performance metrics that measure critical operational components of your organization: Daily census versus budget; Productivity or units of service month-to-date versus budget; Hospitalizations, critical incidents, etc., in the last 24 hours 68

69 Management Dashboards (cont.) Select these must-have data elements and work with your technology team and partners to make them unmissable: computer-screen dashboards or widgets, smartphone alerts. 69
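A minimal sketch of the alert logic that might sit behind such a dashboard: compare each must-have metric against its budget or limit and flag exceptions. The metric names, budgets, limits, and the 5% tolerance are hypothetical choices, not recommendations.

    # Sketch of dashboard alert logic: flag any must-have metric that falls
    # outside its budgeted or allowed range. Names and figures are hypothetical.
    metrics = {
        # name: (actual, budget_or_limit, higher_is_better)
        "Daily census":                      (47,   48,   True),
        "Billable productivity MTD (hours)": (1180, 1250, True),
        "Critical incidents, last 24 hours": (3,    1,    False),
    }

    TOLERANCE = 0.05  # allow a 5% shortfall before alerting on "higher is better" metrics

    for name, (actual, budget, higher_is_better) in metrics.items():
        if higher_is_better:
            alert = actual < budget * (1 - TOLERANCE)
        else:
            alert = actual > budget
        if alert:
            print(f"ALERT: {name} = {actual} (budget/limit {budget})")

In a real deployment the loop would read the day's figures from the information system and push the flagged items to the dashboard, widget, or smartphone alert channel chosen above.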

73 Management Dashboards: Steps to Success Step One: Select metrics for the dashboards. Use select KPIs or other performance measures for the management team dashboard; select job-specific metrics for managers and other operational staff. 73

74 Management Dashboards: Steps to Success (cont.) Step Two: Select a reporting services tool to create your dashboards. Warning: You'll encounter terms such as BI (business intelligence), OLAP (online analytical processing), data mining, DSSs (decision support systems), EISs (executive information systems), digital dashboards, enterprise portals, and enterprise data buses. The purpose of these tools is simply to "get out" what was "put into" your data systems. 74

76 Management Dashboards: Steps to Success (cont.) Step Three: Test drive the dashboards and train staff how to use them. 76

77 The Must-Have Management Reports Five Categories of Management Reporting You Can't Live Without! 77

78 The Must-Have Management Reports #1 Access to Care Reports: # of referrals / % accepted for admission; # of admission denials by reason; waitlist information; time metrics for access to care. #2 Revenue / Productivity Reports: census to budget; unit-fee productivity reports. 78

79 The Must-Have Management Reports (cont.) #3 Financial Reports: average days in A/R; collection %; bad debt percentage and dollars, by reason; late charges; income statement, budget to actual (a sketch of the first two measures appears after this list). 79

80 The Must-Have Management Reports (cont.) #4 Outcome & Satisfaction Reports: outcomes; successful discharges. #5 Compliance Reports: records compliance; critical incidents. 80
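As a hedged worked example of the first two financial measures above, average days in A/R and collection percentage can be computed from period totals. The figures and the specific formulations below are illustrative assumptions; organizations define these ratios in slightly different ways.

    # Sketch: average days in accounts receivable and collection percentage.
    # All dollar figures are hypothetical.
    period_days = 90
    charges = 1_200_000.0      # gross charges billed over the period
    collections = 1_020_000.0  # cash collected over the period
    ending_ar = 310_000.0      # accounts receivable balance at period end

    # Days in A/R: receivables outstanding expressed in days of average daily charges.
    avg_daily_charges = charges / period_days
    days_in_ar = ending_ar / avg_daily_charges

    # Collection percentage: cash collected as a share of charges billed.
    collection_pct = collections / charges

    print(f"Average days in A/R: {days_in_ar:.1f}")       # ~23.3 days
    print(f"Collection percentage: {collection_pct:.1%}")  # 85.0%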

81 II. Steps in Designing a Management Metrics System 81

82 Steps for Implementing the Four Cornerstones of Metrics-Based Management #1 Routine Management & Operational Reports: Analyze business needs and find or develop required reports, focusing on the must-have reports first! Ensure easy access to reports and train staff in their use. #2 Key Performance Indicators: As part of your next strategic planning or budgeting process, select a balanced scorecard of KPI metrics. Implement, train, and modify as needed. 82

83 Steps for Implementing the Four Cornerstones of Metrics-Based Management #3 Benchmarking & Performance Targets: For select KPIs, select performance targets and benchmark with other organizations if possible. #4 Management Dashboards & Alerts: For key staff positions, focus management efforts on performance targets by displaying dashboard data as frequently as possible! 83

84 Steps for Developing KPIs Start with strategy and the strategic plan; create performance measures from the strategic plan (quantifiable objectives, timelines, assignments, and budgets); design the metrics of success (the KPIs) in terms of those measures; create reporting formats for these metrics; monitor on an ongoing basis and manage proactively based on the KPIs 84

85 III. Case Studies in Metrics-Based Management 85

86 Case Study #1: A Typical Balanced Scorecard Examples from a large rural behavioral health provider in the southwest United States 86

87 Financial Metrics Revenues by line of business Expenses by line of business Profit/loss by line of business Unit cost by service by site 87

88 Customer Metrics Number of referrals by referral category Discharges by reason category Number of complaints Number of critical incidents Return rate after intake by site 88

89 Internal Operational Metrics Number of new hires Number of terminations Average number of days positions vacant by site by category Percent claims denied Number of crisis responses, after hours Number of crisis responses, daytime Percent of crisis responses resulting in hospitalization 89

90 Internal Operational Metrics (cont.) No-show rate for intake No-show rate for MDs Percent of consumers served by out-of-region contract providers by line of business Percent of staff who met clinical productivity requirements by site Rolling turnover rate (3-month) by site by category Number of new positions Number of vacancies 90

91 Innovation Metrics Number of linkages and new resources in community Number of community education hours Number of community education participants Number of PR events Number of staff development hours Number of staff development participants Number of positive media mentions 91

92 Case Study #2 A $20 million Catholic family service agency chose to measure performance from three perspectives: #1 Management team #2 Board of directors and key funding organization #3 Community report card 92

93 Management Team & Board KPI Two types of metrics: organization-wide indicators and program specific indicators Organization-wide indicators Revenues Expenses Net profit (loss) Staff vacancies by department/program Staff hires and terminations 93

94 Management Team & Board KPI (cont.) Program-specific indicators Number of new cases opened Number of cases closed Unsuccessful discharges by outcome Client satisfaction indicators 94

95 Building a Community Report Card Strengthening families and our community Number of clients served by program Number/percent of church clients served by program Number of community trainings and education programs Number of participants in community training and education programs Number/percent of clients receiving service subsidy Number/percent of church clients receiving service subsidy 95

96 Building a Community Report Card (cont.) Achieving results for the community Client satisfaction indicators Successful discharges, by program 96

97 Case Study #3 Residential and Community-Based Children and Family Service Organization in the Midwest 97

98 Interactive Key Performance Indicator Spreadsheet

99 More Examples of KPI Templates & Dashboards 99

100 Strategic Intent: The leader in translating research into effective practice solutions. Strategic themes: Operational Excellence; Innovation; Partnerships.
Stakeholder: S1. HFA's system of care produces positive outcomes. S2. Local communities experience a sense of ownership of Hillside.
Financial: F1. Diversify revenue streams. F2. Develop a stronger system of care through growth.
Customer: C1. EB practices achieve effective outcomes. C2. Create customer solutions. C3. Preferred provider of services.
Internal Work Processes: W1. Efficient standard work processes drive productivity. W2. User-friendly systems provide accurate data. W3. EB practices are delivered according to design. W4. Hillside has research partnerships. W5. Service models are based on research. W6. We infuse innovation across our value chain. W7. Service growth includes research grants. W8. Communication exceeds customer expectations. W9. Strong parent and youth advocacy. W10. We understand Strategic Partners' needs.
Learning and Growth: L1. IT systems add value and ease complexity. L2. Organized system shares knowledge and learning. L3. Research infrastructure supports HFA system of care. L4. Hillside is a destination for diverse talent. L5. Behaviors consistent with COI values. (HFA BSC FY0809, rev)

102 Strategy Map Strategic Themes: Deliver Our Mission; Excellence & Innovation; Provider, Employer, Charity of Choice.
Finance: Grow Non-operating Revenue; Increase Margins & High Margin Business.
Customer: Strong Brand; Significant Player in Operating Markets.
Internal Processes: Program Quality; Lean Corporate Infrastructure; Build A/R Capabilities; Know Key Customer Drivers; Innovative Service.
Core Competency: Innovate & Grow; Positively Aligned & Accountable; Retain, Recruit, Train the Best Staff; Access to Strategic Info. 102

116 Questions & Discussion 116

117 Upcoming Educational Events Human Services Leadership Academy Seminar: Financial Risk Management Best Practices for Executive Teams & Boards, July 18, 2011, Ewing, New Jersey Human Services Leadership Academy Seminar: Technology in Behavioral Health & Social Services: What Your Organization Will Need & How Much It Should Cost, July 26, 2011, Ewing, New Jersey The 2011 OPEN MINDS Executive Leadership Institute: Discover the Lessons of Great Leaders in Times of Crisis, October 5-7, 2011, Gettysburg, Pennsylvania The 2011 OPEN MINDS Institute for Behavioral Health Informatics: The 7th Annual Conference on the Future of Technology in Behavioral Health, October 19-21, 2011, Baltimore, Maryland The 2012 OPEN MINDS Best Management Practices Institute: Finance and Technology to Maximize Performance, February 16-17, 2012, Clearwater Beach, Florida 117

118 Bringing The Management of Behavioral Health & Social Services Into Focus York Street, Gettysburg PA