The things you want to know about metrics but you don't know what to ask!


The things you want to know about metrics but you don't know what to ask! Duvan Luong, Ph.D., Operational Excellence Networks, 3/16/2012. Contents copyright Operational Excellence Networks 2012.

Famous Quotes
"To measure is to know." "If you cannot measure it, you cannot improve it." — Lord Kelvin
"You can't control what you can't measure." — Watts Humphrey, SEI
"What gets measured gets done. What gets measured and fed back gets done well. What gets rewarded gets repeated." — John E. Jones

Definitions
Measure: to obtain measurement information about an object.
Measurement: a figure, extent, dimension, capacity, amount, etc. obtained by measuring.
Metric: a standard of measurement.

Operational Transparency
Operational transparency is the setup in which key information about a company's execution and its associated results is readily available for use, to ensure progress toward the desired objectives. Transparency is the state in which all relevant execution information is fully and freely available to everyone who needs to know. The operational transparency context includes visibility into operational performance and the ability to analyze and use execution status and progress data. Transparency is also a trust-building mechanism, generally used to "open up" the books or practices of a company to stakeholders with a "right to know".

Operational Transparency Benefits
- Improve the bottom line by reducing implementation cost and improving productivity and effectiveness.
- Align execution with the plan and the desired objectives.
- Permit business operations to run continuously, with no pauses to resolve unplanned issues.
- Provide a rational basis for selecting and prioritizing process improvements.
- Allow identification of best practices and expand and share their usage elsewhere.
- Availability of execution data/information supports better and faster budget decisions and control of processes in the organization, which helps reduce implementation risk.
- Provide accountability and incentives based on real data.
- Enable benchmarking of process performance against outside organizations.
- Availability of process cost data from past projects lets us learn to estimate costs more accurately for future projects.
- If you are in a US Federal agency, it's the law: the Government Performance and Results Act of 1993 requires a strategic plan and a method of measuring the performance of strategic initiatives.
- Raise the level of operational excellence for your company.

Establishing Operational Transparency
The effective establishment and use of a measurement (metrics) system is the key factor in achieving operational transparency. Establishing operational transparency involves:
- Ensure the desire, support, and executive sponsorship.
- Adequately allocate resources for implementing the measurement (metrics) program.
- Get the right people involved in the measurement (metrics) program.
- Document, train, and communicate the measurements (metrics).
- Ensure the measurements (metrics) are used as part of company operations.
- Emphasize project-level measurements (metrics); do not use measurements (metrics) for other purposes.
- Select, analyze, and report measurements (metrics).
- Ensure that measurement (metrics) is viewed as a tool, not the end goal. Let the data be interpreted by the people involved.
- Automate the collecting, analyzing, and reporting of measurements (metrics).
- Simplify the measurement (metrics) effort with a small number of key, simple measurements (metrics).

The How-tos of Metrics
How to:
- Use metrics?
- Define metrics?
- Collect and track metrics?
- Interpret metrics?

How to Use Metrics?
"If you don't know where you are, a map won't help." — Watts Humphrey, SEI
Use metrics as indicators of progress:
- Adoption/implementation progress
- Result achievement evaluation

Implementation Strategy Goals (diagram slide)

How to Define Metrics?
The Goals-Questions-Metrics (GQM) methodology asks: How to identify goals? What questions to ask? What metrics to select? (The slide illustrates the Goals → Questions → Metrics hierarchy; examples follow.)

How to Identify Goals?
Process:
- Brainstorm the organizational or product objectives and goals that you are trying to achieve.
- Break them into smaller chunks as needed so that they are: specific and understandable; within your power to attain; measurable; an appropriate chunk size, with a stretch; bounded by a fixed time period.
- Prioritize and choose a small set of goals.
Best practices: ask questions to clarify the goals:
- What do we want to achieve?
- What are our organizational objectives? What do they mean to you?
- Do we have any specific goals?
- Is there any dependency on company-level goals? What are the company goals?

What Questions to Ask?
Process:
- Brainstorm a list of questions for each objective/goal: what do we need to ask in order to get the information we need to manage toward this goal?
- Synthesize, sort, and prioritize the questions to get to a core set whose answers will give you sufficient information to effectively manage toward these goals.
Best practices: ask questions to learn more about the requirements for the metrics:
- What are we going to do to achieve the identified objectives/goals?
- What do we want to know about the identified goals?
- How do we know we are implementing the planned activities for the identified goals?
- How do we know we are doing the right things?
- How do we know we have achieved the set objectives?
- If we achieve those defined objectives/goals, what are the expected results? What do those results look like? Feel like?
- When do we need the metric data? Etc.

What Metrics to Select?
Process:
- Brainstorm metrics that will provide answers to the questions.
- Reduce the list to a small set of essential metrics based on the following considerations: ease of collection; reliability; usefulness (easy to understand); breadth of coverage; credibility with key targets; presentability/communicability.
Best practices: apply the following tests to the selected metrics; a good metric should satisfy many or most of them:
- Behavior tests
- Definition tests
- Communication tests
- Formatting tests

Tests for Good Metrics (1)
Behavior tests: use these questions to ensure that the metric will cause the right behavior:
- Is this metric consistent with the goals of the organization?
- What negative behaviors could result if we encourage this metric?
- Are the behaviors encouraged consistent with how individuals and teams are rewarded?
- Are the behaviors encouraged consistent with the organization's culture?
Definition tests:
- Do we know what "good" performance is?
- Is the metric common with other metrics used elsewhere?
- Is the metric clearly defined? Is it objective? Is it quantifiable as much as possible?
- Are the people whom the metric measures able to improve it?
- Does the metric promote learning and continuous improvement?
- Do the definition and formula for the metric provide visibility into what drives the metric?
- Is the metric owned by the decision makers who use it?
- Does the metric point people in the right direction for corrective action?
- Is the metric easily and objectively verifiable?
- Is the metric validly measuring what it was intended to measure?

Tests for Good Metrics (2)
Communication tests:
- Will the metric be accepted at all levels at which it is used?
- Will the metric show favorable results when things improve?
- Is the metric presented in both numerical and graphical form?
- Can the metric be summarized in an aggregate form for reporting upwards?
- Is the metric reported out at the level of the organization that can do something about improving what the metric measures?
Formatting tests:
- Is the metric easy to apply? Does a numerical formula exist?
- Is the data to construct the formula readily available?
- Is it cost-effective to gather the data and calculate the formula?
- Is the formula constructed so that it is easily understood?
- Has the type of graph for displaying the metric been identified?
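Since a good metric should pass many or most of these tests, one lightweight way to compare candidates is to score each metric by the fraction of test questions answered "yes". This scoring scheme is my illustration, not something prescribed by the slides, and the check names are hypothetical:

```python
def metric_score(answers):
    """Fraction of the yes/no test questions a candidate metric passes."""
    return sum(1 for passed in answers.values() if passed) / len(answers)

# Illustrative scoring of one candidate metric against a few of the tests.
checks = {
    "consistent_with_org_goals": True,   # behavior test
    "clearly_defined": True,             # definition test
    "accepted_at_all_levels": False,     # communication test
    "numerical_formula_exists": True,    # formatting test
}
```

A metric scoring well below its peers is a signal to rework its definition or drop it from the short list.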

Example - Goals
Project objective: implement a Formal Technical Review (FTR) process to improve product quality.
Goals:
- Reduce the number of product defects that escape to the customer.
- Reduce the number of product defects found by PV.
- Increase the defects found early in the development timeframe, before formal testing (PV).
- Use information from FTR-found defects to fine-tune later PV testing activities and to improve preventive development practices.
- Optimize the costs of defect detection.
Questions to ask about the identified goals:
- Are these goals clear to you? If not, why?
- Why do we have these objectives and goals? Are there any company-level objectives that link to them?
- What are the potential numerical goals/limits for the defect reduction goals?
- Do we have any company-level goals related to the FTR process?
- What is the timeframe for the goals?
- Do we have a plan for implementing the FTR process? What are the projected costs for the implementation?
- Who is the owner of these goals? Who needs to implement FTR? Etc.

Example - Questions for FTR Implementation Metrics
1. What is the FTR process we are going to implement?
2. How do we know we are implementing the FTR process?
3. How do we know people have the necessary skills to do FTR?
4. What is the level of adoption of FTR in the organization? What is our expectation for the adoption?
5. How do we know we have the necessary structure to support FTR implementation?
6. How do we know we are doing the right FTR activities?
7. How do we know FTR implementation produces actual improvement?
8. How much investment do we want to spend on FTR implementation?
9. Which FTR technique is better: inspection or walk-through?
10. How do we link FTR results to the overall product quality picture?
11. What are the industry-standard FTR metrics?
12. What are the industry-standard FTR metric baselines?
13. Who has to implement FTR? Who will provide the support for FTR implementation? Etc.

Example - Potential Metrics for FTR Implementation
Adoption:
1. # of engineers involved with FTR who are trained
2. % trained vs. total engineers involved in FTR activities
3. # of moderators in the organization
4. % trained vs. total moderators
5. Ratio of moderators to engineers involved in FTR activities
6. # of products with a list of key deliverables that need FTR
7. % of products with a list of key deliverables that need FTR vs. total products in the organization
8. Time spent in FTR activities
9. Amount of code (in KLOC) or documentation (pages) that received FTR during the last month
10. # of FTR sessions done during the last month
11. % of actual FTR sessions vs. planned sessions
Result:
12. Problems found by FTR activities (broken down into major vs. minor, inspection vs. walk-through)
13. # of major FTR problems closed
14. % of major FTR problems closed vs. total major problems found
15. # of major FTR problems not closed
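Most of the adoption metrics above are simple ratios of raw counts, so a small helper keeps the arithmetic consistent. This is a sketch; the counts below are illustrative (they happen to match the third-year example later in the deck), and the variable names are my own:

```python
def pct(part, whole):
    """Percentage of part over whole; 0.0 when whole is 0 to avoid ZeroDivisionError."""
    return round(100.0 * part / whole, 1) if whole else 0.0

# Illustrative raw counts for one measurement period.
trained_engineers, ftr_engineers = 150, 150    # metrics 1-2
trained_moderators, moderators = 30, 30        # metrics 3-4
actual_sessions, planned_sessions = 17, 17     # metrics 10-11

adoption = {
    "pct_engineers_trained": pct(trained_engineers, ftr_engineers),
    "pct_moderators_trained": pct(trained_moderators, moderators),
    "pct_sessions_held": pct(actual_sessions, planned_sessions),
    "engineers_per_moderator": ftr_engineers // moderators,  # metric 5, as 1 per N
}
```

Guarding the zero-denominator case matters in practice: early in an adoption effort, planned-session or moderator counts can legitimately be zero.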

FTR Questions-Metrics Mapping
This slide places the thirteen questions from the previous slide side by side with the fifteen candidate metrics above, mapping each question to the metrics that help answer it.

How to Collect and Track Metrics?
To be real and useful, metrics must be easy to collect and track. Ask questions to understand how best to collect and track the selected metrics.
Collecting metrics:
- Do you know how the data will be collected?
- Who will collect the metrics? Has that person been identified?
- How often will metrics be collected? How much metric data will be collected?
- Where will the metric data be kept? Etc.
Tracking/reporting metrics:
- Has the person who will calculate and plot the metric been identified?
- Has the person who will report the metric been identified?
- Can the metric be calculated and reported without delay?
- Will the metric be available to those who need it?
- Will the metric be reported at a pre-agreed frequency?
- Who are the audiences for the metrics report? Etc.

Example - Collecting and Tracking FTR Metrics
Collecting:
- The moderator collects and logs the following information for each FTR session: time spent (provided by the participants) and problems found during the session (with associated data such as problem type, origin, potential causes, major vs. minor, and inspection vs. walk-through).
- The department manager maintains the FTR training information for the engineers.
Tracking:
- At the beginning of each month, the organization's FTR coordinator tracks, analyzes, and reports: FTR training metrics; availability of the list of key deliverables that need FTR; actual vs. planned FTR sessions; time spent on FTR activities; problems found by FTR activities; % of major FTR problems closed vs. total major FTR problems found; and # of major FTR problems not closed.
Using:
- Organization GMs and staff review the monthly FTR metrics.
- The quality manager reviews the FTR metrics.
- The FTR coordinator uses the metrics to plan FTR implementation and process improvement.
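The moderator's per-session log and the coordinator's monthly rollup described above could be modeled along these lines. The field names and record shape are assumptions for illustration, not a format from the slides:

```python
from dataclasses import dataclass

@dataclass
class FTRSession:
    """One review session as the moderator logs it (illustrative fields)."""
    date: str           # e.g. "2012-03-05"
    technique: str      # "inspection" or "walk-through"
    hours_spent: float  # provided by the participants
    major_found: int
    minor_found: int

def monthly_rollup(sessions):
    """Start-of-month tallies the FTR coordinator reports from the session logs."""
    return {
        "sessions": len(sessions),
        "hours": sum(s.hours_spent for s in sessions),
        "major_found": sum(s.major_found for s in sessions),
        "minor_found": sum(s.minor_found for s in sessions),
    }
```

Keeping the raw session records (rather than only the monthly totals) preserves the breakdowns the deck asks for, such as inspection vs. walk-through and major vs. minor.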

Use Metrics for Improvements
Metric data, when compared against baselines and trends, should provide information about the progress and status of the measured item:
- What is the status of adoption of the measured item?
- What is the result of the measured item?
Heuristics for evaluating achievement must be defined for the collected metrics so that decisions can be made and appropriate actions taken.

Interpreting Metrics for Adoption Status (1)
Status | Awareness/Communication | Training | Deployment
Red | Some people (<50%) were informed of or knew about the practice or improvement | Few people (<30%) have sufficient training or knowledge of the practice or improvement | The practice or improvement is not used, or is rarely used by few people (<30%)
Yellow | Most people (50%-80%) were informed of or knew about the practice or improvement | Many people (30%-60%) have sufficient training or knowledge of the practice or improvement | The practice or improvement is often used by many people (30%-60%)
Green | Almost all people (>80%) were informed of or knew about the practice or improvement | Most people (>60%) have sufficient training or knowledge of the practice or improvement | The practice or improvement is always or most of the time used by most people (>60%)
Heuristics: select one cell from each column.
- If two or more cells have the same color, the overall achievement has that color.
- Special cases: R+G+G, R+Y+G, and G+Y+R give a Yellow overall achievement.
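The cell thresholds and the combination heuristic above can be expressed directly in code. This is a sketch of the slide's rules as I read them, with the listed special cases checked before the majority rule:

```python
from collections import Counter

def cell_color(pct, red_below, green_above):
    """Map a percentage to R/Y/G using one column's thresholds from the table."""
    if pct < red_below:
        return "R"
    if pct > green_above:
        return "G"
    return "Y"

def adoption_status(awareness_pct, training_pct, deployment_pct):
    """Overall adoption color per the slide's heuristic."""
    cells = sorted([
        cell_color(awareness_pct, 50, 80),   # awareness: R <50%, G >80%
        cell_color(training_pct, 30, 60),    # training:  R <30%, G >60%
        cell_color(deployment_pct, 30, 60),  # deployment: same thresholds
    ])
    # Special cases from the slide: R+G+G and R+Y+G both collapse to Yellow.
    if cells in (["G", "G", "R"], ["G", "R", "Y"]):
        return "Y"
    # Otherwise, two or more cells of the same color set the overall status.
    color, count = Counter(cells).most_common(1)[0]
    return color if count >= 2 else "Y"
```

For example, an organization at 90% awareness, 70% trained, but only 20% actually using the practice lands on Yellow overall, which matches the slide's R+G+G special case.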

Interpreting Metrics for Adoption Status (2)
Red: Reactive to problems. No systematic practice/improvement is evident; information is anecdotal. The organization is not aligned around common goals.
Yellow: Transitioning to being proactive about problems. The start of a basic systematic practice/improvement is evident. The organization is aligned around common goals.
Green: Fully proactive about problems. A full systematic practice/improvement is evident. The organization is aligned around common goals. The practice/improvement is well documented.

Interpreting Metrics for Result Status (1)
Red
- Metrics: no metrics defined; no communication about metrics to people; no infrastructure to support collecting, tracking, analyzing, and using metrics.
- Outcomes: no or little result data available; trend data are not reported or show mainly adverse trends.
Yellow
- Metrics: some key metrics defined and communicated to people; some infrastructure in place to support metrics; metrics are used for decision making.
- Outcomes: some results are available and show achievement of some objectives; trend data are available and starting to show improvement; some comparative data are available.
Green
- Metrics: metrics are defined for all key operations and communicated to people; a fully supported infrastructure is in place for metrics deployment; metrics are used for both decision making and continuous improvement.
- Outcomes: results are available for all key operations; trend data show permanent, sustained improvement; benchmarks of most results show many areas of leadership and solid achievement; results are documented for future analysis.
Heuristics: select one cell from each column.
- If all cells have the same color, the overall achievement has that color.
- If the cells have different colors, use the lower-level achievement color.
- Special case: R+G (in either order) gives a Yellow overall achievement.
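This slide's two-column heuristic (same color wins, otherwise the lower color, with Red+Green averaging to Yellow) is likewise only a few lines. A sketch of the rules as stated:

```python
RANK = {"R": 0, "Y": 1, "G": 2}  # lower rank = lower achievement level

def result_status(metrics_color, outcomes_color):
    """Combine the Metrics and Outcomes column colors per the slide's heuristic."""
    if metrics_color == outcomes_color:
        return metrics_color                       # all cells the same color
    if {metrics_color, outcomes_color} == {"R", "G"}:
        return "Y"                                 # special case: R+G -> Yellow
    return min(metrics_color, outcomes_color, key=RANK.get)  # lower color wins
```

The special case exists because the plain "lower color" rule would report Red for an organization with Green outcomes, which the slide's authors evidently considered too pessimistic.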

Interpreting Metrics for Result Status (2)
Red: The organization has little focus on metrics. Little or no result data is available.
Yellow: The organization is starting to focus on metrics. Some data are available and show some improvement.
Green: Metrics are integrated into organizational operations. Data for all key operations are available and show solid results. Metrics are used for both decision making and continuous improvement, and are documented for future data analysis.

Example - 3rd-Year Metrics for FTR Implementation
Adoption:
- 150 engineers involved with FTR are trained
- 100% trained vs. total engineers involved in FTR activities
- 30 moderators in the organization
- 100% trained vs. total moderators
- 1 moderator per 5 engineers (industry: 1 per 8)
- 15 products have a list of key deliverables that need FTR
- 450 hours spent in FTR activities
- 10 KLOC of code received FTR during the last month (23 LOC per hour)
- 17 FTR sessions done during the last month
- 100% of actual FTR sessions vs. planned sessions
Result:
- Problems found by FTR activities (major vs. minor, by technique): inspection 272 (13 major), walk-through 191 (18 major)
- 31 major FTR problems closed
- 0 major FTR problems not closed

Example: 3 Years of Metrics for TRP Implementation

Metric | First | Second | Third
1. Number of engineers involved with TRP who are trained | 30 | 90 | 150
2. % trained vs. total engineers involved in TRP activities | 20% | 60% | 100%
3. Number of moderators in the organization | 2 | 9 | 30
4. % trained vs. total moderators | 100% | 100% | 100%
5. Ratio of moderators to engineers involved in TRP activities | 1/15 | 1/10 | 1/5
6. Number of products with a list of key deliverables that need TRP | N/A | 8 | 15
7. % of products with a list of key deliverables that need TRP vs. total products | N/A | 80% | 100%
8. Time spent in TRP activities | N/A | 474 hrs | 450 hrs
9. Amount of code (KLOC) or documentation (pages) that received TRP last month | N/A | N/A | 10 KLOC (23 LOC/hr)
10. Number of TRP sessions done last month | 2 | 15 | 17
11. % of actual TRP sessions vs. planned sessions | 100% | 100% | 100%
12. Problems found by TRP activities (total/major, by technique) | N/A | Ins: 110/11, Walk: 169/22 | Ins: 272/13, Walk: 191/18
13. Number of major TRP problems closed | N/A | 26 | 31
14. % of major TRP problems closed vs. total major problems found | N/A | 80% | 100%
15. Number of major TRP problems not closed | N/A | 7 | 0

Example - Metrics for FTR Implementation (after 3 years)
[Charts: "XXX Inspection Results", "XXX Walk-through Results", and "SNSL Review Results", with C/S defect counts, plotting monthly (August-June) inspection effort in hours, defects found, and defect discovery rate.]

Summary
The effective establishment and use of a measurement (metrics) system is the key factor in achieving operational transparency, and operational transparency is the cornerstone of organizational operational excellence. Measurement (metric) data becomes information when it inspires action. When that information is easily and always accessible, an organization can be optimized around people acting in fully informed and aligned ways. With new insight into the time, cost, and quality of capabilities across the company, it becomes easier to address the business imperatives and focus areas that will improve operational excellence. The company also needs to ensure that operational issues are visible and addressed before they turn into costly problems.