The business questions your performance measures should answer
You can't make informed decisions if the information you're using can't answer your questions.
by Stacey Barr

introduction

The report design working group sat around the table, sifting through the draft strategic performance report to suggest how to make it more useful. Measure by measure they chatted and suggested and critiqued and debated: "this one would look better if it was a bar chart", "yeah, I like the three-dimensional bar charts", "we should add another line to this chart because it would be interesting to show", "it's pretty easy to get Excel to turn this one into a stacked bar chart, that way we could get more information onto it". Then someone asked: "hang on, what questions are we trying to answer with these measures anyway?", and there was dead silence.

are you using this kind of performance information?

Performance reports are most commonly filled with a combination of information like the following:
- tables of comparisons of this month with last month, this month with the same month last year, year to date with target
- the default bar charts that Microsoft Excel formats for you
- fancy three-dimensional, stacked bar charts
- other charts that are crowded with information that looks visually exciting and, at best, might be interesting
- occasionally some short-term trends, consisting of only 5 or 6 or barely a few more points of data
- commentary about project or initiative progress or milestone completion

How many of these kinds of information are in your performance reports? How much of that information is read, valued, validly interpreted, understood and applied to inform decisions? How aligned is this information to your organisation's strategic, tactical or operational priorities?

what's wrong with this kind of performance information?

If you answered those three questions with "quite a lot", "not very much of it" and "not very well", then you'd be fairly normal. Most organisations' performance reports are created with only a basic awareness of good business statistics, and even less awareness of the business questions the report should answer.

Most of the information provided in these reports is in the form of "limited comparisons", to borrow a phrase from Donald Wheeler (1). Limited comparisons can't really answer any business question, because they are not a representative picture of the performance results they monitor. There is always natural variation from month to month, week to week, day to day, and two points of data can never tell you what amount of variation is normal versus abnormal.

What makes matters worse is that this information often seems to be designed this way purely because it always has been. Rarely is the design of the information questioned or challenged. And rarer still is it reviewed in the context of the business questions it must answer.

Information design, particularly the visual design of quantitative information (Edward Tufte has written some amazing books on this very topic; see references 2, 3 and 4), is a real body of knowledge, linking statistical theory with cognitive theory to provide insights into how we can make information more useful and usable in decision making.
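To make "natural variation" concrete, here is a minimal sketch in Python, using invented monthly figures, of the kind of calculation behind Wheeler's process behaviour charts: the average month-to-month moving range is used to estimate natural process limits, so any single month can be judged against the whole history rather than against last month alone. The data, and the choice of an XmR-style calculation, are illustrative assumptions rather than anything prescribed in this article.

```python
# A minimal sketch of natural process limits (XmR-style), using made-up monthly data.
# The 2.66 scaling factor comes from the process behaviour chart literature (Wheeler).
orders_per_customer = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 13.9, 14.6, 15.8, 14.4, 15.2, 14.7]

mean = sum(orders_per_customer) / len(orders_per_customer)

# Average moving range: the average absolute month-to-month change.
moving_ranges = [abs(b - a) for a, b in zip(orders_per_customer, orders_per_customer[1:])]
avg_moving_range = sum(moving_ranges) / len(moving_ranges)

# Natural process limits: the band within which routine variation is expected.
upper_limit = mean + 2.66 * avg_moving_range
lower_limit = mean - 2.66 * avg_moving_range

print(f"mean = {mean:.1f}, natural limits = [{lower_limit:.1f}, {upper_limit:.1f}]")

# A new month's value outside these limits signals abnormal variation worth investigating;
# a value inside them is just routine month-to-month noise, not a trend.
```

This is exactly the kind of context a two-point "this month versus last month" comparison cannot provide: it shows how much movement is ordinary before you react to any single number.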

what are the right business questions to answer?

What is business performance management really about? Ultimately it's about business success, and providing the information to make the decisions that increase that success, now and into the future. Performance management is about three specific things.

Firstly, it's about monitoring your business's actual progress toward the outcomes (and targets) implied by your business strategy. If one of those outcomes is to increase customer loyalty, then it's about monitoring how much customer loyalty you have as time goes by (such as the average number of orders per customer per quarter), and comparing this actual level of customer loyalty with the targeted level (say 20 orders per customer per quarter).

Secondly, business performance management is about knowing which of your initiatives or projects are working and not working in making those outcomes happen. If you have an initiative around developing a customer relationship management system to improve customer loyalty, then you would expect to see that customer loyalty increases the more the customer relationship management system is implemented and used. If you don't see a change in customer loyalty despite implementing customer relationship management, then how can you say the system was working? You can't.

Thirdly, business performance management is about knowing why those things are working or not working, so you can choose better things to do, or fix the things you are doing. Perhaps the customer relationship management system isn't impacting customer loyalty because customers are already happy with their relationships with you, but just feel your products or services aren't as relevant as they expected.

This description of business performance management suggests several specific questions are important in managing a business so its success improves.

Table 1: examples of excellent questions for business performance management

Have we achieved our target?
We monitor business performance so we can know when we are actually performing at the level we need to, or want to, perform at. Targets are the description of the "need to" or "want to".

Are we progressing toward our target?
Rather than just waiting until the end of the year, or the date we wanted to achieve the target by, monitoring continuously throughout that timeframe gives us more power to influence the end result.

Are there any unintended consequences of our actions?
Chaos theory, the butterfly effect and systems thinking all tell us that there will always be some kind of flow-on effect from our actions. These flow-on effects can be anywhere from small and insignificant to shockingly dramatic.

Why are we getting the results we are getting?
This is a question that seeks information about reasons. Of all the possible reasons that you are getting a particular performance result, which reasons are the main ones, those that have most of the impact? If you are getting a good result, then knowing these reasons helps you confirm what to celebrate and keep doing more of. If you are getting a bad result, then knowing these reasons helps you modify your course of action to turn the result around.

What is likely to happen in the future?
Predictive information is some of the most valuable information in business. While no information can really do the crystal ball thing, a really good understanding of drivers (or lead indicators) can certainly give some great clues about likely future results. Thus, you can prepare for the most likely outcomes, before they happen.

design your information to answer your business questions

The types of information needed to answer these business questions are different for each question. And unless that information is designed with the question in mind, it's likely that you won't be able to answer the question, or if you do, your answer will suffer the risks of misjudgment.

One of the keys to designing the kinds of information that will support your judgment in answering these questions is to start with identifying the type of comparison you are trying to make. What do you need to compare with what, in order to answer the question? Another key is to be familiar with the kinds of qualitative and quantitative analyses that can reliably make those comparisons for you. The following sections provide some examples of how these keys can be used to design information to answer the generic types of business questions above.

Have we achieved our target?
This is basically a comparison between actual performance and targeted performance. But it's not quite as simple as that. Because business results are constantly affected day in, day out, as time goes by, this comparison can only be really valid if it takes into account the natural variation in results over time. This means that a simple comparison of actual performance for the year (such as the average number of orders per customer per month, rolled up into an annual statistic) compared to the target (of 20 orders per customer per month) is too simplistic. It doesn't take into account the very likely event that improvements may have happened within the last year, so it underestimates actual performance. A better analysis would be a run chart (1) that shows the real changes in the overall average as time goes by, and compares that latest overall average (the mean line in the run chart) with the targeted level.
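By way of illustration only, here is a minimal Python sketch of the kind of run chart described above, plotting made-up monthly values together with their overall mean line and the target level. Matplotlib is assumed simply because it is a common charting library, not because the article prescribes a tool.

```python
# A minimal run chart sketch: monthly values, their mean line, and the target level.
# The data and target below are made up purely for illustration.
import matplotlib.pyplot as plt

months = list(range(1, 13))
orders_per_customer = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 16.2, 15.8, 16.5, 17.1, 16.8, 17.4]
target = 20

mean = sum(orders_per_customer) / len(orders_per_customer)

plt.plot(months, orders_per_customer, marker="o", label="actual (orders per customer)")
plt.axhline(mean, linestyle="--", label=f"mean = {mean:.1f}")
plt.axhline(target, linestyle=":", label=f"target = {target}")
plt.xlabel("month")
plt.ylabel("average orders per customer")
plt.title("Run chart: actual vs target")
plt.legend()
plt.show()
```

The point of the mean line is that the comparison to target is made against the current overall level of performance, not against whichever single month happens to be latest.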

Are we progressing toward our target?
This question requires a comparison quite similar to the question above: actual performance compared to targeted performance. But it's not just the comparison between the mean line and the target level that matters here, it's also a comparison of how the actual overall level (the mean line) is moving as time goes by. Is it moving closer to the target level, fast enough? If you relied just on monthly comparisons to target, you'd be misled by the natural variation that happens from month to month. You need to see the big-picture pattern or trend over time.

Are there any unintended consequences of our actions?
A slightly more complex comparison is needed to answer this question. Answering this kind of business question means you need to have some idea of what kinds of unintended consequences you could have expected, and some information about the extent to which these consequences are occurring before and after you take action to achieve the performance result you want. When you have this information, it would ideally take a similar form to the information that answers the previous two questions: a run chart that shows real changes in the overall actual level as time goes by. You can then compare the patterns or trends over time of your performance result with those of the unintended consequences, and if you see some kind of correlated pattern, there's a strong clue that by achieving your performance result, you are also getting some other kind of result as a consequence. This may be a good thing, but it also may be a bad thing. And by knowing, you can take action if it's needed.

Why are we getting the results we are getting?
The comparison type here is being able to see the relative size of impact of each of a range of possible reasons for the result you are getting. So step one is to have a good idea of what those reasons could be. The second step is to source some data that lets you know how often, or to what extent, each reason has actually played out during the timeframe you monitored your performance result. A really useful analysis of such data is a Pareto chart. This shows the relative size of impact of each reason, from largest to smallest. It will highlight the 20% of reasons that are having 80% of the impact on your result. And voilà, you have a place to start investigating further, to find how you can turn your result around.
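As a hedged illustration (the reasons and counts below are invented, not drawn from the article), a basic Pareto chart can be built by sorting the reasons from largest to smallest impact and overlaying the cumulative percentage, which makes the 20% of reasons carrying 80% of the impact easy to spot.

```python
# A minimal Pareto chart sketch: reasons sorted by impact, with cumulative percentage.
# The reasons and counts below are invented for illustration only.
import matplotlib.pyplot as plt

reasons = {
    "late delivery": 48,
    "order errors": 27,
    "product quality": 12,
    "pricing queries": 8,
    "other": 5,
}

# Sort reasons from largest to smallest impact.
labels, counts = zip(*sorted(reasons.items(), key=lambda kv: kv[1], reverse=True))
total = sum(counts)
cumulative_pct = [sum(counts[: i + 1]) / total * 100 for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)
ax1.set_ylabel("count of occurrences")

# Overlay the cumulative percentage on a second axis.
ax2 = ax1.twinx()
ax2.plot(labels, cumulative_pct, marker="o", color="black")
ax2.set_ylabel("cumulative % of impact")
ax2.set_ylim(0, 110)

plt.title("Pareto chart of reasons for the result")
plt.show()
```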

What is likely to happen in the future?
The kind of comparison needed to answer questions like this is the comparison between or among a set of measures or factors or variables that you hypothesize have significant influence over the direction your performance result will head. These measures or factors or variables are your drivers, or lead indicators. You test these hypotheses through a scatter plot (entirely visual), correlation analysis (very visual, with a quantitative measure of the strength of the relationship) or regression analysis (not visual, but with a quantitative model of the relationship) to determine the strongest of these lead indicators. Then, using the run chart analysis described under the previous questions, you can interpret the emerging trends in your lead indicators and estimate or calculate the impact this will have on your performance result.
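Purely as an illustrative sketch, the correlation-analysis step might look like the following: each hypothesized driver series is correlated with the performance result, and the strongest correlation points to the best candidate lead indicator. The driver names and quarterly figures are invented for the example.

```python
# A minimal sketch of testing hypothesized lead indicators against a performance result.
# All series below are invented quarterly figures, purely for illustration.
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# The performance result we care about (e.g. orders per customer per quarter).
customer_loyalty = [12.0, 12.4, 13.1, 13.0, 13.8, 14.5, 14.9, 15.6]

# Hypothesized drivers (lead indicators), measured over the same quarters.
candidate_drivers = {
    "on-time delivery %": [88, 89, 91, 90, 93, 95, 96, 97],
    "complaints resolved within 48h %": [60, 62, 61, 65, 70, 74, 78, 80],
    "marketing spend ($k)": [50, 40, 55, 45, 60, 42, 58, 47],
}

# Rank the drivers by the strength of their relationship with the result.
for name, series in candidate_drivers.items():
    r = pearson(series, customer_loyalty)
    print(f"{name}: r = {r:.2f}")
```

A regression model could replace the correlation step when a quantitative model of the relationship is needed, as the article notes; correlation alone only ranks the candidates.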

deliberately do two things

Designing excellent information to inform decisions about business performance is not rocket science. But it usually does require some effort: effort applied to clearly articulating each business question that needs to be answered in order to understand and make the decisions, and effort applied to deliberately designing the kind of information that can adequately (even if not completely) answer those questions.

references

(1) Understanding Variation: The Key to Managing Chaos, Donald Wheeler, SPC Press, Inc., 1993
(2) The Visual Display of Quantitative Information, Edward Tufte, Graphics Press, 1983-1999
(3) Envisioning Information, Edward Tufte, Graphics Press, 1990
(4) Visual Explanations, Edward Tufte, Graphics Press, 1997

about the author

Stacey Barr is a specialist in performance measurement, helping people to move their business or organisation's performance from where it is to where they want it to be. Sign up for Stacey's free email newsletter to receive your complimentary copy of her e-book 202 Tips for Performance Measurement.