Agenda. Measuring in Traditional vs. Agile. The Human Side of Metrics



3 Agenda
Measuring in Traditional vs. Agile
The Human Side of Metrics


5 We Need Tangibles
As gauges or indicators: for status, quality, doneness, cost, etc.
As predictors: what can we expect in the future?
As decision-making tools: can we release yet?
A visual way to peer into a mostly non-visual world, because we don't completely understand what's going on in the software/project and we need to.

6 History
Tons of research, mostly based on industrial metrics.
Implementation of metrics in project management has grown exponentially.
Hasn't really affected project success. What a metric!

7 Modern Apps Require Modern Metrics
Application software development is evolving into a strategic business process.
Shift the conversations with business stakeholders toward value delivered.
Traditional metrics assume slow, controlled, predictable change; agile projects use different measures.


11 Agile vs. Traditional
Agile: short sprints; inspection every sprint; inherent risk mitigation; builds quality in; chief metric is working software.
Traditional: long time horizon; intangible for months; manual risk mitigation; sacrifices quality by fixing schedule/scope; many metrics used and needed.

12 Are metrics abused?

13 Consider (2 min)
How much time does your team devote to preparing metrics in a month?
Who are the metrics for? Do they help you?
What would you consider time wasters?

14 What Do We Measure (source: Jim Highsmith)

15 Scrum Builds Quality In
Definition of Done + Acceptance Criteria → quality.
Sprint Review + stakeholder and customer feedback → quality.

16 The only metric that really matters is what I say about your product.


18 Hawthorne Effect
When you measure something, you influence it. You can exploit this effect in a positive way.
Most traditional metrics have a negative Hawthorne effect.
Gaming = Hawthorne effect × deliberate personal gain.
"Tell me how you will measure me and I will tell you how I will behave." —Goldratt

19 Hawthorne Effect
Where have you seen this in software development?
Where have you experienced gaming? What have you gamed?
What do you measure now that might have negative Hawthorne effects or easily be gamed?

20 Hawthorne Effect
Try: identifying positive/negative Hawthorne effects on each metric that exists; measuring things you want more of; a no-questions-asked policy for reporting gaming (so you can simply stop wasting time gathering that metric).
Avoid: using metrics with negative Hawthorne effects; easily gamed systems; measuring things you really want more of but that don't really have an effect on outcomes.

21 Measure Up
Austin Corollary: you get what you measure, and only what you measure; you tend to lose the things you cannot measure: collaboration, creativity, happiness*, dedication to customer service.
This suggests measuring up: measure the team, not the individual; measure the business, not the team.
Measuring up helps keep the focus on outcomes, not output.

22 Measure Up
What are some possible outcomes of the following common metrics?
Lines of code
Defects/person
Defects/week
Velocity

23 Measure Up
How about these?
Accepted Features or Features/Month
Revenue or Revenue/Feature
Customer Retention or Churn Rate
Net Promoter Score
Happiness*
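
The arithmetic behind a couple of these customer-side metrics is simple enough to script. A minimal sketch, using standard definitions of NPS and churn rate and entirely made-up survey responses and customer counts:

```python
def net_promoter_score(scores):
    """NPS = % promoters (scores 9-10) minus % detractors (scores 0-6) on a 0-10 survey."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def churn_rate(customers_at_start, customers_lost):
    """Fraction of customers lost over the period."""
    return customers_lost / customers_at_start

survey = [10, 9, 8, 7, 10, 3, 9, 6, 10, 8]          # hypothetical survey responses
print(f"NPS: {net_promoter_score(survey):+.0f}")     # +30
print(f"Monthly churn: {churn_rate(500, 15):.1%}")   # 3.0%
```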

24 Measure Up
Try: customer-reported defects; team throughput; accepted features; customer LTV; value.
Avoid: defects during development; capacity/efficiency; velocity (or worse: LoC); new customers; cost.

25 The Measurement Paradox
Product development is a complex system; metrics used in isolation don't measure what you think they do.
Stakeholders are focused on the system, so the rule of WYSIATI (what you see is all there is) applies.
Beware of low-hanging fruit: Value of Measurement = 1 / Ease of Measuring.
"Not everything that can be counted counts, and not everything that counts can be counted." —Albert Einstein

26 Measurement Paradox
Try: measuring up; making measurements visible only at the appropriate level; measuring what really matters and has a direct line-of-sight contribution to outcomes.
Avoid: "if we just had more data"; management by metrics; sets of easy-to-gather metrics that purport to tell you something about the system/outcome.


28 Guiding Principles
We no longer view or use metrics as isolated gauges, predictors, or decision-making tools; rather, they indicate a need to investigate something and have a conversation, nothing more.
We realize now that the system is more complex than could ever be modeled by a discrete set of measurements; we respect this.
We understand there are some behavioral psychology concepts associated with measuring people and their work; we respect this.

29 No Single Prescription
What really matters? Listen to the customer. Trends over static numbers. Will this help us be more agile?
For each metric, let's ask: What is this really measuring? Who is the metric for? Who should see it? What behaviors will this drive? What's the risk of negative Hawthorne effects or gaming? Are we measuring at the right level? Up?

30 Metrics According to Project Complexity Source: Cynefin Framework by David Snowden

31 Appropriate Measures
Simple: concentrate on business value metrics.
Complicated: measures shift from business value to progressive quality and efficiency.
Complex: employ strongly progressive metrics.
Chaotic: focus on reducing risk.


33 Metrics for the Team
These are primarily for the team (they can be communicated to management): Sprint Burndown, Velocity, Release Burndown.
From the management level, intense focus on these, or incentivizing them, is not good.
Allow the team to use empirical data, and to remain transparent and honest.

34 Sprint Burndown

35 Velocity

36 Release Burndown
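
The three charts above (sprint burndown, velocity, release burndown) are not reproduced in this transcription. As a rough sketch of the arithmetic behind them, assuming story points as the unit and using made-up sprint data:

```python
# Hypothetical per-sprint data: story points completed each sprint, and the release backlog.
completed_per_sprint = [21, 18, 24, 20, 23]   # points accepted per sprint
release_backlog = 200                         # total points planned for the release

# Velocity: points completed per sprint; a rolling average smooths out noise.
velocity = sum(completed_per_sprint[-3:]) / 3
print(f"Rolling 3-sprint velocity: {velocity:.1f} points")

# Release burndown: remaining work after each sprint, plus a naive forecast.
remaining = release_backlog
for sprint, done in enumerate(completed_per_sprint, start=1):
    remaining -= done
    print(f"Sprint {sprint}: {remaining} points remaining")

sprints_left = remaining / velocity if velocity else float("inf")
print(f"Forecast: ~{sprints_left:.1f} more sprints at current velocity")
```

These are trend indicators for the team, not targets to be incentivized from above.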

37 Metrics for Management
These are for the team and for management: Working Software, Throughput, Happiness*.
They are higher-level measurements (measure up!) with positive Hawthorne effects.

38 Working Software Can everybody confidently give the thumbs up for the increment?

39 Throughput
Measures how much stuff is getting done, adding value, and is the right "stuff".
We need to view team AND business throughput simultaneously, and be careful with correlation vs. causation.
An empirical way to gauge value for spend, in place of direct capacity or productivity measures.

40 Throughput
What does this mean to you?
How do you define "the right stuff"? How would you measure it?
What does value mean in your context?

41 Throughput: Team
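
The team-throughput chart on this slide is not reproduced here. One simple way to approximate the idea, assuming "done" means accepted by the Product Owner and using hypothetical per-sprint counts:

```python
from statistics import mean

# Hypothetical counts of accepted (done, valuable) items per sprint.
accepted_items = {"S1": 6, "S2": 7, "S3": 5, "S4": 9, "S5": 8}

throughput = mean(accepted_items.values())    # items finished per sprint
print(f"Average team throughput: {throughput:.1f} accepted items/sprint")

# The trend matters more than any single number ("trends over static numbers").
trend = list(accepted_items.values())
print("Improving" if trend[-1] > trend[0] else "Flat or declining")
```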

42 Throughput: Business
Revenue: if we're delivering features all the time, how is that affecting revenue? Are our development efforts affecting revenue, or is it something else?
Revenue/Feature: revenue-data-driven decision making.
Split (A/B) testing: does variant A or B result in more revenue?
Cohort analysis: how is revenue changing across cross-sections of prospects/customers?
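
As a rough illustration of revenue-data-driven decision making (not the presenter's tooling), here is a sketch of a revenue-per-feature check and a naive A/B comparison, with made-up numbers:

```python
# Hypothetical monthly revenue attributed to each shipped feature.
revenue_by_feature = {"search": 12000, "exports": 4500, "dark_mode": 300}
for feature, revenue in sorted(revenue_by_feature.items(), key=lambda kv: -kv[1]):
    print(f"{feature}: ${revenue:,}/month")

# Naive A/B split: compare revenue per visitor for variants A and B.
# (A real test would also check statistical significance.)
variant_a = {"visitors": 5200, "revenue": 15600.0}
variant_b = {"visitors": 5100, "revenue": 17850.0}
rpv_a = variant_a["revenue"] / variant_a["visitors"]
rpv_b = variant_b["revenue"] / variant_b["visitors"]
print(f"Revenue/visitor: A=${rpv_a:.2f}, B=${rpv_b:.2f} -> ship {'B' if rpv_b > rpv_a else 'A'}")
```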

43 Throughput: What to Look For

44 Happiness*
It's really about motivation and treating people well.
Motivation: autonomy, mastery, purpose.
Treating people well: trust, compensation, work environment, training, support.

45 Continuous Improvement
Value delivered
Quality
Predictability
Team cohesiveness


47 Consider (2 min)
How is success defined in your organization?
What drives investments?
How agile are you?

48 A Matter of Managerial Culture
Evidence-Based Management (EBMgt) has roots in the medical profession: the application of direct, objective evidence* by managers to make decisions.
For software development, EBMgt is employed to maximize the value of software to the entire organization.
Evidence, broadly construed, is anything presented in support of an assertion:
Direct evidence is the strongest type of evidence; it provides direct proof of the validity of the assertion.
Circumstantial evidence is the weakest type; it is merely consistent with the assertion, but doesn't rule out contradictory assertions.
*Source: Wikipedia

49 Direct Evidence of an Organization's Value

50 Direct Evidence of Value, Measured by Outcomes
Current Value: Revenue per Employee, Product Cost Ratio, Employee Satisfaction, Customer Satisfaction
Time to Market: Release Frequency, Release Stabilization, Cycle Time
Ability to Innovate: Installed Version Index, Usage Index, Innovation Rate, Defects
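
A couple of these outcome measures are straightforward ratios. A minimal sketch with made-up inputs; the formulas below are plausible readings of the metric names, not the EBMgt guide's exact definitions:

```python
# Hypothetical organization-level inputs.
annual_revenue = 24_000_000
employees = 120
customers_total = 800
customers_on_latest_release = 560

# Assumed formulas: revenue divided by headcount, and share of customers on the newest version.
revenue_per_employee = annual_revenue / employees
installed_version_index = customers_on_latest_release / customers_total

print(f"Revenue per employee: ${revenue_per_employee:,.0f}")      # $200,000
print(f"Installed version index: {installed_version_index:.0%}")  # 70%
```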

51 Circumstantial Evidence Is Supportive, Not Direct
Evidence of the effectiveness of teams and practices is circumstantial evidence.
Diagnosing organizational patterns helps in analyzing direct evidence.

52 Relating value to organizational patterns
[Diagram: Measure (direct evidence) informs priority; Diagnose (circumstantial evidence); Improve influences outcomes.]


54 EBMgt measures the value of the organization.

55 EBMgt diagnoses likely contributors, and discovers capabilities you can build on.

56 EBMgt helps you improve by using evidence as a driver of change

57 Specific Metrics Measured

58 For more info Get the ebook about Evidence-Based Management at


61 Cargo Cults & False Idols


64 Shifting Mindsets
What's the opposite of a fragile, defect-ridden, return-to-sender, zero-sales, customers-hate-you, all-around-bad product?
1st premise: Zero defects! Meets requirements!
Better premise: high value / high revenue, high customer satisfaction, a quick-to-change agile product.

65 The Customer
Measure up. Start with the customer.
Build it quickly enough and often enough to make measuring on the build side irrelevant; focus measurements on the customer side.
The only metric that really matters is what your customers say about your product. What are they saying about yours?

66 Thank you