So Dave, why are you looking to go agile?


3 My objective with this session is to engage all of you in a conversation about how we define Agile Success, and how we might measure the degree of success we're realizing. But before starting that conversation, I'd like to share with you a story that has had a profound effect on how I think about agility, measurement and continuous improvement. This is the story of Dave. Dave was one of the first clients I worked with as a young agile coach. He is the owner of a small software company of about 100 people. He's crazy smart, a great guy, and one of the best all-around businessmen I've worked with in my career. I met Dave for the 1st time in a sales meeting. One of our sales reps brought me in to help explain Agile to Dave, discuss what an agile transformation might look like, and see if our coaching services might be able to help them in their journey. So, I of course opened our conversation with a sage and leading question -

4 So Dave, why are you looking to go agile?

5 Dave's response surprised me. To be honest, I don't want to. It sounds hard, and honestly I have other things I need to be doing.

6 However, John (Development Manager) and Chris (Sales Rep) tell me that Agile will help me realize meaningful business results. THAT, I'm interested in, so I'm considering making an investment of my time and energy in Agile, but what I care about isn't Agile, it's business success.

7 So, I told Dave all about the wonderful business results that other companies had realized due to their Agile efforts...

8 And I described the beautiful simplicity of the agile model, and why it works. After I'd spent about 20 minutes waxing poetic, Dave told me that he was interested enough to give it a try, but before we did anything he wanted to know something.

9 How are we going to measure the success of our Agile implementation?

10 So I whipped out my Agile Maturity Assessment (this isn't mine, it's Rally's, but mine was rather similar). I told him about how we would measure their adoption and use of key agile practices to gauge the overall health of their agile implementation.

11 But Dave held up his hand to stop me. I think you're putting the cart before the horse though, Isaac. You're measuring how Agile we are, or how mature our agile implementation is. But that's not really the point, is it?

12 I'm not interested in measuring how well we're doing agile - at least that's not my primary interest. What I'm interested in is measuring how well agile is doing for us.

13 Now, to be honest, in that moment, Dave blew my mind. I'd never really thought about measuring agile that way before. So, I fell back on an age-old consultant trick - the consultant stare - where I gaze fixedly at the client, nodding my head slowly, and perhaps thoughtfully rubbing my chin. And it worked, as it always does, because he continued.

14 You see Isaac, I'm not a software engineer by training, and I never really studied business. My degree is in chemistry, and so I kind of tend to see everything in the context of an experiment. As I see it...

15 You've got a hypothesis, and I'd like to test it.

16 Your hypothesis is that by applying agile practices, I will realize positive business results. How shall we test it? In that moment, Dave pretty much redefined the way I think about Agile. Because this is the essence of what it is to BE agile. To be a learning organization. To focus on continuous, incremental improvement. Agile organizations are in a constant state of curiosity and experimentation. Constantly questioning their assumptions and constraints, and challenging themselves to seek out opportunities to learn. To run experiments.

17 So, how would we structure an experiment to test our hypothesis?
1st - we define our dependent variable: what experimental outcomes we expect, and how we can observe/measure those outcomes.
2nd - we isolate our independent variable: the lever that we wish to move, the ingredient we believe, by adding or subtracting, will impact our dependent variable.
3rd - we run an experiment: we manipulate the independent variable in an experimental group and don't manipulate it in a control group.
4th - and finally, we assess the results. And this is important, because the dependent variable not moving the way we expected it to is not a failure if we come at this from an experimental perspective. Disproving a hypothesis is as valid and valuable a learning as proving it, and may even yield more insights and new questions than if the expected outcome had been realized!
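To make Dave's four steps concrete, here is a minimal sketch in Python of how such an experiment might be assessed. The teams, the pair-programming lever, and the cycle-time numbers are hypothetical illustrations, not data from this engagement.

```python
# Minimal sketch of the four-step experiment described above.
# All names and numbers are hypothetical.
from statistics import mean, stdev

# 1st - dependent variable: cycle time in days per work item (lower is better)
experimental = [4.1, 3.5, 5.0, 3.8, 4.4]   # teams where we applied the lever
control      = [5.9, 6.3, 5.2, 6.8, 5.5]   # teams left as-is

# 2nd - independent variable: the lever we manipulated (here, pair programming)
lever = "pair_programming"

# 3rd - the experiment has run; 4th - assess the results
diff = mean(control) - mean(experimental)
print(f"Lever: {lever}")
print(f"Experimental mean cycle time: {mean(experimental):.1f} days (sd {stdev(experimental):.1f})")
print(f"Control mean cycle time:      {mean(control):.1f} days (sd {stdev(control):.1f})")
print(f"Observed improvement: {diff:.1f} days")
# A real assessment would also ask whether this difference is larger than
# normal team-to-team variation before accepting or rejecting the hypothesis.
```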

18 So, what are our Dependent Variables? What business outcomes define agile success for us? Yours will be unique, but I've found that 95% of organizations define business value as some combination of these 6 factors:
- Productivity
- Quality
- Responsiveness
- Customer Satisfaction
- Employee Satisfaction
- Predictability

19 And what specific hypotheses do we have? What agile levers do we believe we can apply to impact our business outcomes?
- TDD?
- CI?
- Co-located Teams?
- CSM-trained Scrum Masters?
- On-site Product Owners?
- Pair Programming?
- etc.

20 So, now that we have this model of experimentation and of measuring dependent and independent variables, let's talk about how we measure agile success. How do we measure Productivity? Productivity is defined (economically) as:
- Quantity of Valuable Output / Quantity of Costly Input
Input for software development is primarily the cost of personnel. Output can be measured a number of ways, with 2 major categories:
Units of Work, which might be:
- Story Points (though these can be hard to standardize across teams or to equate to non-agile teams)
- Function Points (famously difficult and expensive to measure)
- Lines of Code (easily available and unambiguous, but possibly promoting some unhealthy behaviors)
Units of Value, which might be:
- Money (from the business case for the initiative, or some percentage of that, or perhaps even actual revenue generated)
- Value Points (a relative unit of Value for an initiative and/or a Release)
My preference is for Value-based Productivity, and I have a particular affinity for defining that in monetary terms. However, any of these measures creates interesting insights into team performance.
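As a minimal sketch of the economic definition above, assuming value is expressed in monetary terms (my stated preference), the figures below are invented:

```python
# Value-based productivity: valuable output divided by costly input.
def productivity(value_delivered: float, personnel_cost: float) -> float:
    """Economic productivity: quantity of valuable output / quantity of costly input."""
    return value_delivered / personnel_cost

# Hypothetical quarter: $600k of business-case value delivered by a team
# whose fully loaded personnel cost was $450k.
quarter_value = 600_000   # units of value (here, dollars)
quarter_cost = 450_000    # cost of personnel (dollars)

print(f"Productivity: {productivity(quarter_value, quarter_cost):.2f} dollars of value per dollar of cost")
# Swapping the numerator for story points or lines of code gives the
# work-based variants above, with the caveats noted for each.
```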

21 Quality:
- Number of Defects
- Defect Density
- Defect Arrival / Kill Rate
- Code Complexity
- Etc.
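A short sketch of two of these quality measures; the function names and sample numbers are mine, not part of any particular tool:

```python
# Two simple quality measures from the list above, with made-up inputs.
def defect_density(defects: int, kloc: float) -> float:
    """Defects per thousand lines of code."""
    return defects / kloc

def net_defect_trend(arrived: int, killed: int) -> int:
    """Arrival vs. kill rate for a period: positive means the open-defect
    backlog is growing, negative means it is shrinking."""
    return arrived - killed

print(defect_density(defects=42, kloc=120.0))    # 0.35 defects per KLOC
print(net_defect_trend(arrived=18, killed=25))   # -7: killing faster than arriving
```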

22 Responsiveness: Cycle Time and/or Lead Time, for different work item types - New Features; Change Requests; Defects (by severity).
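To illustrate the distinction, here is a small sketch of lead time versus cycle time for a single work item; the WorkItem structure and the dates are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class WorkItem:
    kind: str         # e.g. "New Feature", "Change Request", "Defect (Sev 1)"
    requested: date   # when the request arrived
    started: date     # when the team began work
    delivered: date   # when it shipped

    @property
    def lead_time(self) -> int:
        """Customer's view: request to delivery, in days."""
        return (self.delivered - self.requested).days

    @property
    def cycle_time(self) -> int:
        """Team's view: start of work to delivery, in days."""
        return (self.delivered - self.started).days

item = WorkItem("New Feature", date(2013, 3, 1), date(2013, 3, 18), date(2013, 3, 27))
print(item.kind, "lead time:", item.lead_time, "days; cycle time:", item.cycle_time, "days")
# Reporting these separately per work item type gives the responsiveness
# picture described above.
```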

23 Customer Satisfaction: Net Promoter Score

24 Employee Satisfaction: Net Promoter Score
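Since both the customer and employee satisfaction slides lean on Net Promoter Score, here is a minimal sketch of the standard NPS calculation; the survey responses are made up:

```python
# Standard Net Promoter Score from 0-10 survey responses.
def nps(scores: list[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

customer_scores = [10, 9, 8, 7, 9, 4, 10, 6, 9, 8]
print(f"Customer NPS: {nps(customer_scores):+.0f}")   # +30 for this sample
# The same calculation applied to an employee survey gives the
# employee satisfaction variant.
```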

25 This is one I'm still working on. Existing waterfall Predictability measures - Cost and Schedule conformance - are inadequate.
- Team throughput (velocity) is variable, which creates a cone of uncertainty related to how much work will be completed by when.
- Estimated size of work items is variable (the Fibonacci sequence makes this clear), which creates a cone of uncertainty as the total body of work becomes larger and larger.
- Estimates of scope and/or schedule should, if predictable, form the mid-point of a normal distribution of likely actual outcomes. However, traditional software development organizations almost never have a normal distribution of actuals around their estimates.
I have not come up with a really good metric for Predictability that takes these aspects into account, and I would love to get all of your thoughts and help in developing one, because, like it or not, this IS an important Business Success criterion for most organizations.
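As one hedged illustration of the velocity-driven cone of uncertainty described above - not the missing Predictability metric itself - here is a sketch of a simple Monte Carlo forecast built from hypothetical historical velocities:

```python
# Monte Carlo sketch: variable velocity turned into a probabilistic forecast.
# Historical velocities and remaining scope are made up.
import random

historical_velocity = [23, 31, 18, 27, 25, 22, 29]   # story points per sprint
remaining_points = 180

def sprints_to_finish() -> int:
    """One simulated future: draw sprint velocities from history until done."""
    done, sprints = 0, 0
    while done < remaining_points:
        done += random.choice(historical_velocity)
        sprints += 1
    return sprints

runs = sorted(sprints_to_finish() for _ in range(10_000))
p50, p85 = runs[len(runs) // 2], runs[int(len(runs) * 0.85)]
print(f"50% confidence: done within {p50} sprints")
print(f"85% confidence: done within {p85} sprints")
# The spread between these percentiles is the cone of uncertainty; a
# predictability metric might track how well actuals land inside it.
```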

26 So, I hope this has been helpful for you. I want to emphasize that these may or may not be YOUR Business Outcomes, or your metrics for measuring those Outcomes. And the potential levers I've mentioned for impacting those Outcomes are just my hypotheses, based on what I've seen with other organizations. Is that causal relationship applicable in your organization? I don't know, but I'd like to test that hypothesis...
