The Bad KPI Habits Causing Your Performance Measurement Struggles

To end your KPI struggles, you have to do things differently from how you've been doing them.

Contents

BAD HABIT 1: Using measures to judge people.
BAD HABIT 2: Using weasel words to articulate your goals.
BAD HABIT 3: Brainstorming to find KPIs or performance measures.
BAD HABIT 4: Getting people to sign off on their acceptance of the measures.
BAD HABIT 5: Rushing too quickly to data and dashboards.
BAD HABIT 6: Using performance reports to cover your bum.
BAD HABIT 7: Comparing this month's performance to last month, to the same month last year, and to target.
BAD HABIT 8: Using educating, resourcing and funding as improvement initiatives.

Overview

One of the things Albert Einstein was famous for defining, over and above E=mc², was insanity: doing the same thing over and over again and expecting different results. Clearly, if you want to stop struggling to find meaningful performance measures that align to strategy and engage people in improving performance, then you have to change what you're doing.

A few very fruitful changes you can make are to unlearn some limiting habits that you may not even realise are at the root of most performance measurement struggles.

BAD HABIT 1: Using measures to judge people.

Bad Habit #1 is to measure people, or to use measures in a way that judges or assesses how people are performing. This also includes rewarding or punishing people based on information from performance measures.

This is a bad habit because it causes people to protect themselves. And that usually means making excuses, fudging the figures and sweeping problems under the rug. It does not encourage true performance improvement. For example:

- In the railways, to keep the On Time Running measure looking okay, people would leave cancelled trains out of their calculation. And they would also cancel train services that were running late enough to affect their On Time Running measure!
- In call centres, Average Call Handling Time is a common measure. What do you think a call centre operator is going to focus on, when they are trying to help you with your problem, if they have a target to keep average call durations under 3 minutes?

BAD HABIT 2: Using weasel words to articulate your goals.

The problem with strategy in most organisations and companies is that, in its sanitised and word-smithed published form, it's not measurable. Look at any strategic plan and the chances are astronomically high that you'll see a glut of words like effective, efficient, productive, responsive, sustainable, engaged, quality, flexible, adaptable, well-being, reliable, key, capability, leverage, robust, accountable. They are empty words that sound important and fail to say anything at all, or at least anything that can be verified in the real world, or measured.

It's no wonder people with goals or objectives like the following keep asking "how do you measure that?":

- Provide efficient, unique, unbiased and responsive, high quality support
- Strengthen student engagement and learning outcomes by enhancing student support and intervention services
- Support general practice to facilitate and encourage optimal and culturally appropriate health care of indigenous persons

The new habit to replace the weasel-word habit with is to write goals in clear and simple language that evokes in people's minds an accurate picture of successful achievement of the goal. If you can't see it in your mind, you won't be able to measure it.

BAD HABIT 3: Brainstorming to find KPIs or performance measures

For the most part, people are not that conscious or aware of the approach they take to find or choose performance measures.

Brainstorming is the most common approach to selecting measures, but it's about as useful as a chocolate teapot. People try to select performance measures for a particular critical success factor, key result area, objective or goal, or for their function or process, by asking a simple question: "So, what measures could we use?" Everyone sits around and randomly suggests potential performance measures. They might produce a list that looks something like this if they were brainstorming measures for staff engagement:

- Turnover
- Sick days
- Retention Rate
- Introduction of talent management
- Overtime
- Staff Survey Engagement Index
- Staff satisfaction with their job
- Leadership development
- Performance management

Brainstorming can generate lots of ideas for measures quite rapidly, it's easy to do, no special knowledge or skill is required, it's familiar so it won't be distracting, and it can be very collaborative and engage people to be part of the measure selection process. But brainstorming rarely produces good measures.

The truth is you're not really finished after the brainstorming is over, because you still have to work out how to get a final selection of measures from that long list. And in all honesty, voting or ranking the ideas and skimming the few that rise to the top is not the answer. Ideas for potential measures need to be vetted or tested, to weed out the ideas on the list that are not measures at all ("introduction of talent management"), that aren't really relevant to the goal ("overtime"), and that are not feasible to implement.

Instead of habitually brainstorming to find performance measures, think about listing potential measures that are the best observable evidence of the successful achievement of

your goal, and then choose the measures that are most relevant, balanced with being feasible to implement.

BAD HABIT 4: Getting people to sign off on their acceptance of the measures.

Performance measurement has such a terrible stigma. Many people associate it with the inane drudgery of data collection, with pointing fingers, and with big sticks that come beating down on them when things go wrong. They associate it with the embarrassment of being compared with whoever is performing best this month. The emotions people typically feel about performance measurement are frustration, cynicism, defensiveness, anxiety, stress and fear.

What we really want is for performance measurement to be seen as a natural and essential part of work. We want people to associate it with learning more about what works and what doesn't, with valuable feedback that keeps us on the right track, with continuous improvement of business success. We want people to feel curiosity, pride, confidence, anticipation and excitement through using performance measures of performance results that matter.

This is buy-in, not sign-off. Buy-in is a natural product of showing people that measurement is about feedback, not judgement; of giving them tools that make measurement easy and fun; of allowing them to decide the measures most useful for their goals.

Of course you need to stay sensitive to the fact that measurement of performance is an organisation-wide system, and each team is only a part of that system. But the trade-off should be biased more toward their buy-in than toward sophistication of the measures. You can improve the sophistication of measures on a foundation of buy-in more easily than you can get buy-in to a suite of sophisticated measures.

BAD HABIT 5: Rushing too quickly to data and dashboards.

In general, a lot of effort is wasted in bringing performance measures to life. The waste is in the time spent selecting measures that are never brought to life, or in the time spent bringing measures to life in the wrong way. Thus the labour of bringing many measures into the world is far more excessive and painful than it needs to be. This, as you no doubt have experienced, breeds cynicism, overwhelm, and disdain for and disengagement from the process of measuring anything, let alone measuring what matters.

- People argue about data or measure validity instead of making decisions about how to improve performance.
- Measures misinform and mislead decisions because the wrong calculation or analysis was used.
- Too many conflicting versions of the same measure result from duplication and lack of discipline in performance reporting processes, causing confusion and cynicism about the value of measures.

This was the case for Martin, a manager in a freight company, who was receiving 12 different versions of a measure of the cycle time of coal trains from a range of business analysts throughout his department. And because no two of these 12 different measures matched, he had no idea which one was the true and accurate measure of cycle time. Martin had 12 measures and no information.

Defining a performance measure means fleshing out the specifics of its calculation, presentation, interpretation and ownership. And this is the new habit to learn, to avoid making assumptions that result in waste and misinformation in performance measurement.
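Those four dimensions of a measure definition can be captured in a simple template before anyone touches a dashboard tool. Here is a minimal sketch in Python; the field names, the cycle-time formula and the example values are illustrative assumptions, not an official PuMP format:

```python
from dataclasses import dataclass

# Illustrative template for a fully defined performance measure.
# Field names are assumptions for this sketch, not a prescribed standard.
@dataclass
class MeasureDefinition:
    name: str            # what the measure is called, everywhere, by everyone
    calculation: str     # the one agreed formula, data sources and exclusions
    presentation: str    # how it will be charted or reported
    interpretation: str  # what a change means, and who acts on it
    owner: str           # a single accountable owner, not 12 analysts

def cycle_time_hours(depart_times, arrive_times):
    """The one agreed calculation: mean depot-to-depot time, in hours."""
    durations = [arrive - depart for depart, arrive in zip(depart_times, arrive_times)]
    return sum(durations) / len(durations)

# Had Martin's department shared one definition like this, the 12 conflicting
# versions of coal train cycle time could not have arisen.
coal_cycle = MeasureDefinition(
    name="Coal Train Cycle Time",
    calculation="Mean hours from depot departure to depot return, all trains, weekly",
    presentation="Line chart of weekly values with historical context",
    interpretation="Investigate only genuine changes in the pattern, not routine wiggle",
    owner="Freight Operations Manager",
)

# Two trains: departed at hours 0 and 10, returned at hours 8 and 22.
print(cycle_time_hours([0, 10], [8, 22]))  # → 10.0
```

The point of the sketch is that calculation, presentation, interpretation and ownership are written down once, so every analyst computes and reads the measure the same way.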

BAD HABIT 6: Using performance reports to cover your bum.

If your performance reports are stacking up in a pile, unread and unused, then they're obviously not "stacking up" well as sources of invaluable insight to guide performance improvement.

It's an emotional thing, performance reporting. Executives give up the precious little time they have for their families and 9 holes of golf to instead paw through piles of strategic reports, often more than an inch thick. Or they leave the pile of reports on their desk and make decisions from their gut instead. Managers earnestly trawl through operational reports to check if anything needs a bit more positive light thrown on it. Supervisors and teams cynically scoff about the volumes of time and effort they waste reporting tables of statistics that track their daily activities, to audiences they never see or hear from.

Performance reports need to provide the content that truly matters most, and provide that content so it is fast and easy for managers to digest. But most performance reports are just the opposite:

- They are thrown together in an ad hoc way, making it very hard to navigate to the information that is most relevant or urgent.
- They are cluttered and cumbersome, with too much detail that drowns out the important signals with trivial and unactionable distractions.
- Their information is displayed poorly, using indigestible tables and silly graphs that are designed with entertainment in mind and unwittingly result in dangerous misinterpretation of the information.
- The layout is messy and unprofessional, wasting visual real estate, detracting from the report's importance and disengaging users before they find the insights they can use.

The habit of using performance reports to justify our existence has to stop. Rather, we ought to get comfortable with making performance reports answer only three simple questions: What's performance doing? Why is it doing that? And what response, if any, should we take?

BAD HABIT 7: Comparing this month's performance to last month, to the same month last year, and to target.

Managers at one of the major saw mills in a timber company thought their performance dashboard was the duck's guts. It tracked a multitude of performance measures about how the saw mill operations were going, and the data update for many of these measures was almost live, so the dashboard could update very regularly. Traffic lights (red, green and yellow visual flags that summarise whether performance is bad, good or heading toward unacceptable) were also updated each time the data feed refreshed.

The managers and supervisors would react to these traffic lights with unnecessarily large interventions, like changing the settings on timber processing equipment or altering work procedures. These traffic lights were changing based on data often drawn from a small sample, like a single day. Long-term trends and natural variability in the data were ignored. Rather than looking for signals in their data, they were reacting to the noise: reacting to any variation in the data at all. Everyone was so busy reacting to data that the key elements which did need changing were left unattended, and overall performance worsened.

Performance will always vary up and down over time; you can safely assume that comparing any two points of performance data will always reveal a difference of some magnitude. Drawing a conclusion about whether performance has changed by comparing this month to last month is tantamount to making things up as you go along. The insightful conclusions, which will lead you to act when you should and not act when you shouldn't, come from the patterns in your performance data, not from comparisons between individual points. Unlearn that bad habit of making point-to-point comparisons and performance will improve.
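One well-established way to tell a genuine signal from routine noise is the XmR control chart, which computes "natural process limits" from the average moving range of the data and treats only points outside those limits as worth reacting to. The paper does not name this technique, so treat the sketch below as one illustrative approach; the on-time-running figures are invented for the example:

```python
def xmr_limits(values):
    """Natural process limits for an XmR (individuals) chart.

    Points inside [lower, upper] are routine variation (noise);
    points outside are likely signals worth investigating.
    """
    mean = sum(values) / len(values)
    # Average moving range: mean absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR chart constant (3 / 1.128).
    return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

def signals(values):
    """Return only the points that fall outside the natural process limits."""
    lower, _, upper = xmr_limits(values)
    return [v for v in values if v < lower or v > upper]

# Ten months of on-time-running percentages (invented data): every month
# differs from the last, yet none of the wiggle is a real change.
history = [91, 93, 90, 92, 94, 91, 93, 92, 90, 93]
print(signals(history))  # → [] : no signal, despite month-to-month differences
```

Point-to-point comparison would flag a "drop" from 94 to 91 or a "rise" from 90 to 93; the limits show both are noise, while a genuinely unusual month (say, 70) would fall outside them and deserve investigation.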

BAD HABIT 8: Using educating, resourcing and funding as improvement initiatives.

Performance measurement is a process that weaves through your existing management processes. It doesn't stand alone and apart from them. The steps of selecting performance measures for goals or objectives need to be performed as part of the strategic and operational planning processes, or you end up with goals and objectives that are vague and immeasurable. Reporting processes, including the business intelligence systems, data warehouses and information dashboards that support them, need to quickly refocus on the data and information for new performance measures and their cause analysis. Strategy execution needs to be informed by the current performance measures and their targets, and the strategies themselves need to be changed when the measures show they aren't working.

Performance measurement isn't something you do after your strategic plan is cast in stone, just in time for the annual review. It's not something we do for bureaucratic reasons. We do it because it provides the feedback about how well we're achieving the endeavours we chose to pursue. If those endeavours are important enough to pursue, they are too important not to measure, and measure well.

About the author

Stacey Barr is a globally recognised specialist in organisational performance measurement. She discovered that the struggles with measuring business performance are, surprisingly, universal. The biggest include hard-to-measure goals, trivial or meaningless measures, and no buy-in from people to measure and improve what matters. The root cause is a set of bad habits that have become common practice. Stacey created PuMP, a deliberate performance measurement methodology to replace the bad habits with techniques that make measuring performance faster, easier, more engaging, and meaningful.

Stacey is author of Practical Performance Measurement and Prove It!, publisher of the Measure Up blog, and her content appears on Harvard Business Review's website and in their acclaimed ManageMentor Program. Discover more about Stacey and practical performance measurement at

Copyright. Feel welcome to print or share this white paper with anyone you like, so long as you make no changes whatsoever to the content or layout.

Disclaimer. This article is provided for educational purposes only and does not constitute specialist advice. Be responsible and seek specialist advice before implementing the ideas in this white paper. Stacey Barr Pty Ltd accepts no responsibility for the subsequent use or misuse of this information.