You Didn't Use Brainstorming to Select Your KPIs, Did You?

The secrets to designing performance measures and KPIs that are far better than anything brainstorming could ever produce.

Contents

- There are 5 common ways people select performance measures.
- But these common ways are limited.
- Measure design needs a better methodology.

Overview

When Alex Osborn invented the creativity technique called brainstorming, I wonder if he had any idea just how extensively business would apply it. Almost every meeting employs some kind of brainstorming, but there's one meeting that really should leave it off the agenda: the performance measure selection meeting.

There are 5 common ways people select performance measures.

The selection of performance measures has rarely been treated as anything more than a trivial, and often pesky, decision brought around by the annual business planning workshop. Usually people will take the fastest route to finalising a list of performance indicators in the KPI column of their business plan, and depending on your organisation, the fastest routes are usually some combination of the following:

- brainstorming, where participants simply list as many potential measures as they can think of and then do some kind of short-listing
- benchmarking, or some other form of adopting (copying) measures from other organisations
- using existing data or measures, to save the cost of measuring something new and having to collect the data
- measuring what stakeholders tell us to measure
- listening to what the experts in our industry say we should measure

Each of these methods certainly has some conveniences, but we often forget to examine the drawbacks. And the drawbacks far outweigh the conveniences.

But these common ways are limited.

If you're using one of the following methods to select your measures and KPIs, then it's likely you don't have the measures you need. You need a different way of thinking about measure design.

Brainstorming seems quick, but is really very hit-and-miss.

Probably the most common approach taken to decide what to measure, brainstorming is the easy way out of an activity many people dread. Quality in equals quality out: a process that was designed for creativity, not measure design, will not produce useful and usable measures.

Pros:
- Seems quick: lots of ideas for measures can be generated rapidly.
- Collaborative: two heads are better than one.
- Easy to do: no special knowledge or skill is required.
- Engages people to be part of the measure selection process.
- A known and accepted approach, so the process doesn't get in the way of doing the activity.
- All ideas are considered and accepted, which helps people willingly participate.

Cons:
- Not really finished after the brainstorming is over: how to get to a final selection of measures is vague.
- There is more to measurement than just selecting measures; thought about how to bring the measures to life is also needed.
- Too much information is produced, so too many measures often result.
- Ideas are not vetted or tested, and our thinking is not challenged.
- We are often brainstorming against different understandings of the same objective or goal we want to measure.
- The bigger picture is not taken into account, e.g. unintended consequences, relationships to other objectives or goals, silo thinking.
- Often what is brainstormed is not really a measure at all; instead it is an action, a milestone, a piece of data, or a vague, fluffy concept.

Measure design needs to produce a few measures that have been thoroughly tested for their relevance or strength in tracking the goal or result they are selected for, and that are supported by the people they will affect.

Benchmarking is convenient, but ignores strategic uniqueness.

Benchmarking is about finding out what another organisation is doing, and this almost always involves or is based on some comparison of performance measures. If organisations share the same measures, then benchmarking is certainly easier to do, but there are consequences of adopting a bolt-on set of performance measures.

Pros:
- We feel safe and secure because others have gone before us.
- Others have (we assume) already put a lot of thought into those measures, so why reinvent the wheel?
- We can compare our performance with the performance of other similar organisations.
- We get a feeling of how good (or better) we are compared to others.
- It's easy: we just have to look and ask.
- It's easier to justify to others why we are measuring what we are measuring.
- It's a widely accepted approach.

Cons:
- There is more to measurement than just selecting measures.
- Not always collaborative, so there is little buy-in by the people who will produce and use the measures.
- Not always like for like (apples with apples); in fact, it probably never is to the extent we assume.
- Isn't driven by the decisions we need to make and the information we need for those decisions.
- Doesn't challenge our thinking.
- It makes us bring some other organisation's strategy to life, not what is right for us (aren't we unique?).
- The goal posts change more frequently than the benchmarking process occurs.
- Our bigger picture is not taken into account, such as how this area of performance affects others in our organisation.
- We end up selecting measures against different understandings of the outcome to measure.

Measure design needs to produce measures that encourage learning and sharing of knowledge, but not at the expense of discovering and focusing on the unique business strategy that best suits the organisation.

Data availability makes it cheap, but focuses on yesteryear's strategy.

What data do we have? What have we measured in the past? What are we already measuring? These questions are symptomatic of an organisation that is not open to challenging whether the data it is collecting really is capable of telling it what it needs to know about where it is going. Measure what you have always measured, get what you have always gotten. What's strategic about that?

Pros:
- Very easy, very quick.
- Known data sources mean low cost of data collection and capture; systems to support it already exist.
- People are more likely to already share a common understanding of the measures.
- Consistency in information over time, allowing valid comparisons over time.
- Historic data is available for trend analysis.

Cons:
- We only bring yesterday's (or yesteryear's) strategy to life.
- We rarely challenge the measure itself, so no better measures are explored (and therefore no better data will ever be collected to manage emerging strategic risks and opportunities).
- Not collaborative, because it comes from previous thinkers, not today's doers.
- The bigger picture is not taken into account.
- Parts of our strategy that are new will go unmeasured.

Measure design needs to produce measures that are cost-effective and accumulate some historic data before too long, but it must produce measures that will take the organisation toward its vision, not drag it back into its past.

Stakeholders need information, but that's not the same as performance measurement.

What's imposed on an organisation by regulators, shareholders, government, industry bodies and other stakeholders is often considered a constraint on the measures it can use to manage its performance. Organisations struggle to retrofit the stakeholder-chosen measures to their strategy, or to renegotiate the stakeholder-chosen measures. But these aren't the only two options. This method of measure selection is not really measure selection at all.

Pros:
- We get told what to measure, and don't have to do the hard work ourselves.
- We give stakeholders what they want, and thus we won't get into trouble.
- Resourcing can often be negotiated with the government group or stakeholder imposing the measures.
- Higher management commitment to the need (and therefore to data collection and reporting) means it will get done.
- Our governance requirements are more likely to be met (assuming we report these measures properly).

Cons:
- There is more to measurement than just selecting measures.
- Encourages an autocratic or patriarchal management style.
- The imposers don't understand our strategic direction, or don't trust that we do.
- Isn't driven by the decisions we need to make and the information we need for those decisions.
- We lack ownership of those measures (and the results they track).
- The bigger picture is not taken into account.
- The focus may not be the right focus, or the only focus that matters.
- A parent-child (instead of partner) relationship with stakeholders could become the norm.
- Assumes that the stakeholders have robust methods of designing meaningful measures.

Measure design needs to produce measures that are relevant to the organisation's strategic direction, a direction that is understood and supported by its key stakeholders. But the measure design process can also be used to design reports to stakeholders that are largely separate from its organisational performance measures.

Experts have experience, but can be locked into one-size-fits-all.

Industry experts, consultants, people with years of experience, and self-nominated experts all carry a mystique of knowledge and wisdom that can make their ideas about what to measure sound more like truth than suggestion.

Pros:
- We get told, and don't have to do the hard work.
- A focus is quickly clear.
- Experts can bring new ideas and experience we may not have.
- Experts have approaches that have worked for other organisations.

Cons:
- The focus may not be the right focus, or the only focus that matters, for us.
- We usually don't challenge experts, even though we have intimate knowledge of our unique business.
- Experts can assume we don't need to know the thinking behind the measures, so we don't learn how to think more wisely about the measures for ourselves.
- We may not really understand the measures that are recommended to us.
- Experts often cost a lot of money.
- Experts may not understand our organisation well enough to know our uniqueness (the one-size-fits-all problem).
- Experts may not take account of how feasible it might be (or not) to bring those measures to life in our organisation.

Measure design needs to produce measures that we understand and have ownership of, and it needs to be a process that allows us to continue refining and refreshing our selection of measures as our performance and strategic direction change.

Measure design needs a better methodology.

The reason that the quick and easy methods above are used to select measures is the same reason that performance measurement is a dreaded event: people have no idea that measure design is a process of dialogue around the goals or objectives they want to measure. It is a way for them to engage in a deeper understanding of the results they are really trying to achieve, and of how they would be convinced of the degree to which they have achieved those results.

And that's why we've created an alternative method of measure design. A method that produces measures which:

- are few, not prolific
- have been thoroughly tested for their relevance or strength in tracking the goal or result they are selected for
- are supported by the people that they will affect
- encourage learning and sharing of knowledge
- assist in discovering and focusing on the unique business strategy that best suits the organisation
- are cost-effective
- have as much historic data as feasible
- will take the organisation toward its vision, not drag it back into its past
- have resulted in people (including stakeholders) having a deeper and richer understanding of what results the organisation is really trying to create
- people understand and have ownership of

This method is based on a few important principles:

1. Start with the goal, objective or result you want to achieve (and thus measure). Stay focused on this particular result while you are designing measures for it; don't let your attention wander to other results.
2. Describe this result in a lot more detail, explaining what you and others would see, hear, feel, and be doing (or even taste and smell!) if that result were happening now. Explain the differences you would notice.
3. List the potential things you could count or measure that would give you evidence of how much each of those differences was actually occurring.
4. Choose the best measures; don't wait for perfect measures.

This method, just like all the traditional methods, has its own pros and cons as well:

Pros:
- It produces very few, very meaningful measures.
- Thinking about strategy and measures is challenged and tested.
- The feasibility of measures is assessed.
- It is a conscious and deliberate approach that people have control over.
- Measures strongly link to the specific strategic outcomes they are to track.
- The bigger picture is taken into account, and links and relationships with other measures are automatically identified (e.g. cause-effect, companion, etc.).
- It is very collaborative, because it is dialogue based, so high ownership results.
- Measures are not vague ideas but very evidence based.
- The measure is designed in the context of how it will be used.
- The authenticity of each measure relative to its goal or objective can be deliberately tested.
- The type of language used to design the measures promotes a common and shared understanding of the result to be created (and this makes it easier to communicate strategy to all staff).

Cons:
- It's not easy the first few times through; it's new, and it can distract people if they forget what step they are doing and why (until they have been through it once or twice).
- It takes time; it needs good-quality dialogue to build a deeply shared understanding of what is being measured.
- It will need training or resources to teach people how to do it and why.
- Sometimes strategy needs to be altered (as this approach often deepens understanding of implications or misunderstandings of the chosen strategy or its wording).

So, you no longer need to feel compelled to take the easy way out of selecting performance measures. In a few simple steps, you can not only design useful and usable measures, but also deepen your understanding of the results you are trying to achieve!

About the author: Stacey Barr

Stacey Barr is a globally recognised specialist in organisational performance measurement. She discovered that the struggles with measuring business performance are, surprisingly, universal. The biggest include hard-to-measure goals, trivial or meaningless measures, and no buy-in from people to measure and improve what matters. The root cause is a set of bad habits that have become common practice. Stacey created PuMP, a deliberate performance measurement methodology that replaces the bad habits with techniques that make measuring performance faster, easier, more engaging, and more meaningful.

Stacey is the author of Practical Performance Measurement and Prove It!, and publisher of the Measure Up blog; her content appears on Harvard Business Review's website and in their acclaimed ManageMentor Program. Discover more about Stacey and practical performance measurement at www.staceybarr.com.

Copyright. Feel welcome to email or print this white paper to share with anyone you like, so long as you make no changes whatsoever to the content or layout.

Disclaimer. This article is provided for educational purposes only and does not constitute specialist advice. Be responsible and seek specialist advice before implementing the ideas in this white paper. Stacey Barr Pty Ltd accepts no responsibility for the subsequent use or misuse of this information.