A Business Case for Software Process Improvement


Paul Evans, Black Diamond Software
Copyright 2003, Black Diamond Software, Inc.

Summary

There is substantial evidence that a successful software process improvement (SPI) program will result in improvements in productivity, quality, schedule, and business value. There is also evidence that most SPI programs fail. This paper presents evidence, gathered from published studies of successful projects, that sheds light on the impact of SPI programs in terms of productivity, quality, cost, and return. It also discusses the risks associated with SPI programs and how they can be mitigated. We argue that high returns are possible for SPI programs if known risks are mitigated. In addition, we present a set of specific recommendations that will help ensure the success of an SPI program.

Introduction

First we survey the literature on SPI programs. The projects described are generally wildly successful. The various studies measure cost, productivity, quality, and financial benefit in somewhat different ways, but there is no mistaking the fact that most document very significant returns. The studies usually try to measure the cost, benefits, and return on investment (ROI) of SPI efforts. The costs generally come from training, Software Engineering Process Groups (SEPGs) that develop and maintain the process, ongoing quality reviews of project artifacts, and tools used to help manage, design, build, and document in an intelligent way. The literature also yields insights into where the value comes from: there are practices, described below, which, when they are institutionalized and made part of everyday operations, yield improvements in productivity and quality.

The benefits are usually estimated using some variant of the following analysis. One source of benefit is reduced defects. First, defect measures are defined, e.g. defects per thousand lines of code (for code) or defects per artifact (for artifacts like requirements documents). Second, the average cost of fixing a defect in each phase is estimated; this includes rework and re-testing. As process maturity improves over time and additional measurements are taken, defect rates go down. The benefit is calculated by multiplying the difference in defect rates by the cost of fixing a defect.
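The defect-reduction benefit calculation described above can be sketched in a few lines. All of the rates and dollar figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical illustration of the defect-reduction benefit calculation:
# benefit = (reduction in defect rate) * system size * average fix cost.

def defect_reduction_benefit(old_rate, new_rate, kloc, cost_per_defect):
    """Value of defects avoided as maturity improves.

    old_rate, new_rate: defects per thousand lines of code (KLOC)
    kloc: system size in KLOC
    cost_per_defect: average cost to find and fix one defect (rework + re-test)
    """
    defects_avoided = (old_rate - new_rate) * kloc
    return defects_avoided * cost_per_defect

# Suppose defect density drops from 6.0 to 4.0 defects/KLOC on a
# 100 KLOC system, with an average fix cost of $1,500 per defect:
benefit = defect_reduction_benefit(6.0, 4.0, 100, 1500)
print(f"${benefit:,.0f}")  # → $300,000
```

In practice the fix cost varies by phase (a defect found in requirements is far cheaper than one found after release), so real analyses apply a per-phase cost rather than a single average.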

A second source of financial benefit is increased productivity. Measures of output such as lines of code or function points per month go up with maturity level. Also, reuse becomes more predominant at higher maturity levels and contributes to productivity. Benefit is calculated by multiplying the improvement in production by the average labor rate.

Costs of SPI generally include training, maintaining a Software Engineering Process Group (SEPG), the cost of process assessments, and the cost of inspections and reviews. ROI is calculated based on the benefits and costs over time. Reported returns of 400% to 800% on dollars invested in SPI are not uncommon.

Of course, not all SPI programs succeed. Anecdotal evidence and analyses by organizations like the Gartner Group indicate a significant failure rate. Case studies of the failures are generally not written up in the SPI literature. We will present material from the analysts and from personal experience describing the risks to SPI programs and what it takes to carry them out successfully. Lastly, we will present some simple improvements that are practically guaranteed to yield significant benefits quickly, as well as improvements that will provide added benefits over a longer period of time.

Literature Survey

The relationship between software process maturity, quality, and productivity has been studied extensively. In this section, we review the results of several studies. Though the studies differ in the details, they show very substantial improvements in productivity and quality as well as very large returns on investment. Many of the studies are oriented around the Software Engineering Institute's Capability Maturity Model (CMM). This model describes five levels of process maturity; each level is associated with particular portions of the process that mark a logical progression up from the next lowest level. Many of the studies document the results of progression through the levels.
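The productivity-benefit and ROI arithmetic that these studies use can be sketched as follows. Every figure in this example is hypothetical, chosen only to show the shape of the calculation:

```python
# A simplified sketch of the benefit and ROI arithmetic used in SPI studies.
# All staffing, salary, and cost figures below are invented for illustration.

def annual_productivity_benefit(staff, avg_salary, gain):
    """Value the extra output at the average labor rate: a 35% productivity
    gain by the same staff is worth roughly 35% of their labor cost."""
    return staff * avg_salary * gain

def roi(total_benefit, total_cost):
    """Return on investment, expressed as a percent return on dollars spent."""
    return (total_benefit - total_cost) / total_cost * 100

# Hypothetical 50-engineer shop with a 35% productivity gain:
benefit = annual_productivity_benefit(staff=50, avg_salary=100_000, gain=0.35)
# Hypothetical SPI cost: per-engineer training plus SEPG/assessment overhead
cost = 50 * 1_500 + 245_000
print(f"benefit ${benefit:,}, ROI {roi(benefit, cost):.0f}%")
# → benefit $1,750,000, ROI 447%
```

Real studies accumulate benefits and costs over several years rather than a single year, but the structure of the calculation is the same.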

Summary of CMM Levels

In Level 1 (Initial), projects rely on the skill and individual efforts of software engineers. There is no defined process that is consistently followed by different project teams. Activity is chaotic, and success depends on heroic efforts of individuals. Measurement and reporting, if they take place at all, vary from project to project.

Level 2 (Repeatable) focuses on several practices that bring projects under management control and make it possible for an organization to achieve repeatable results. The main areas addressed are project management, requirements, configuration management, and quality assurance.

Level 3 (Defined) is about institutionalizing the good practices of Level 2. That is, these practices need to be documented and teams need to be trained in them.

In Level 4 (Managed), the organization institutes a quality and metrics management program. The software process as well as the products produced are quantitatively managed and controlled.

In Level 5 (Optimizing), continuous process improvement is enabled by responding to quantitative feedback from the process and by systematically testing innovative ideas and technologies. (SEI 94)

Software Engineering Institute Study

In one of the early classic studies of the impact of CMM, the Software Engineering Institute (SEI) studied 13 companies using CMM. The companies differed significantly in size, application domain, and approach to process improvement. To compensate for differences in scale, organization, and environment, results were measured within each organization over time. A summary of the study results is in the table below.

Category                                                      Range                  Median
Total yearly cost of SPI activities                           $49,000 - $1,202,000   $245,000
Years engaged in SPI
Cost of SPI per software engineer                             $490 - $2,004          $1,375
Productivity gain per year                                    9% - 67%               35%
Early detection gain per year (defects discovered pre-test)   6% - 25%               22%
Yearly reduction in time to market                            15% - 23%              19%
Yearly reduction in post-release defect reports               10% - 94%              39%
Business value [1] of investment in SPI
(value returned on each dollar invested [2])

[1] Business value includes savings from productivity gains and from having fewer defects. It does not include secondary benefits of earlier time to market.
[2] Costs include Software Engineering Process Groups (SEPG), assessments, and training. They do not include staff time to put new procedures in place.

The study shows the improved productivity, decreased defects, and high return on investment that are possible in a favorable environment. (SEI 95)

Raytheon

Raytheon used a cost of quality framework to measure and evaluate their performance. Quality is the degree of conformance to a requirement, design, or specification. Total cost of quality consists of cost of conformance plus cost of nonconformance. Cost of conformance in software consists of appraisal costs (reviews, walkthroughs, initial testing) and prevention costs (training, methodology, policies and procedures, data gathering and analysis, quality reporting). Cost of nonconformance consists of rework to correct defects, re-reviews, re-tests, revising documentation, and external failures.

The process now encompasses a 1,200-person software engineering lab. Over the 2 1/2 year period at the start of their SPI program, cost of nonconformance (i.e. rework, re-testing) dropped approximately in half, from 45% to 20% of project cost. Productivity increased by 70%. Average project cost went from 42% over budget to less than 5% over. Defect density dropped almost in half. ROI on the SPI effort was estimated at 7.7 to 1.
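The cost-of-quality framework above (total cost of quality = cost of conformance + cost of nonconformance) is easy to express concretely. The category percentages below are invented for illustration, not Raytheon's figures:

```python
# A minimal sketch of a cost-of-quality breakdown, with hypothetical numbers.
# All values are percentages of total project cost.

conformance = {
    "appraisal": 10.0,    # reviews, walkthroughs, initial testing
    "prevention": 5.0,    # training, methodology, data gathering and analysis
}
nonconformance = {
    "rework": 15.0,       # fixing defects, re-reviews, re-tests, re-documenting
    "external_failures": 5.0,
}

total_cost_of_quality = sum(conformance.values()) + sum(nonconformance.values())
print(f"total cost of quality: {total_cost_of_quality:.0f}% of project cost")
# → total cost of quality: 35% of project cost
```

The point of the framework is that SPI shifts spending from the nonconformance bucket (rework) into the much cheaper conformance bucket (appraisal and prevention), shrinking the total.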

(CROS 0302)

General Dynamics Decision Systems (GDDS)

The three software development organizations within GDDS have about 360 developers and have been assessed at CMM Level 5. Their metrics databases contain history from Level 2 on. The following table contains the productivity and quality figures associated with Levels 2 through 5 (rework, defect density, and relative productivity figures did not survive in this copy).

CMM Level   Phase Containment Effectiveness [3]   Rework   Defect Density per KSLOC   Productivity Relative to Level 2
2           25.5%
3           41.5%
4           62.3%
5           87.3%

[3] Problems detected divided by problems inserted in a phase.

About 2.5% of base staffing is dedicated to process improvement efforts. GDDS estimated that the ROI of going from Level 2 to Level 3 was 167%. Going from Level 3 to Level 4 generated an additional ROI of 109%. (IEEE 0997)

Motorola

Motorola has a relatively sophisticated quality measure. First, they make a distinction between problems that are detected in the same phase as they are introduced (errors) and problems that escape until a later phase (defects). They measure defects per million assembly-equivalent lines of code (MAELOC [4]). They also measure a variable called cycle time, defined as the ratio of the time to complete a baseline project to the time to complete a new project. For example, if the baseline project took six months and the new project was completed in two months, the cycle time factor would be 3. Quality and productivity were measured by CMM level along these dimensions:

CMM Level   Defects per MAELOC   Cycle Time Factor   Relative Productivity

[4] Million assembly-equivalent lines of code.

ROI on Motorola's investments in SPI was calculated to be over 600%.
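Motorola's cycle-time factor, as defined above, is just the ratio of the baseline duration to the new project's duration. Restating the worked example from the text:

```python
# The cycle-time factor defined above: baseline project duration divided by
# the duration of the new (improved-process) project.

def cycle_time_factor(baseline_months, new_months):
    return baseline_months / new_months

# The example from the text: six-month baseline, two-month new project.
print(cycle_time_factor(6, 2))  # → 3.0
```

A factor above 1.0 means the organization is delivering comparable work faster than its baseline.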

Benefits of Successful SPI Programs

The studies above and others document the benefits of successful SPI programs. Typical increases in productivity are on the order of 30% to 35% per year. Reported return on investment (ROI) figures are in the neighborhood of 500%. Annual reduction in post-release defect reports is typically around 40%. In most cases, the improvement continues year after year over the duration of the study, even for organizations that have been practicing CMM for five or more years. To summarize, the benefits of SPI programs are:

- High return on investment
- Increased productivity
- Shorter schedules
- Fewer defects
- Cost savings
- More accurate estimates and schedules

Value of SPI

A main source of value in SPI programs is higher productivity. Higher productivity is a result of fewer defects inserted, more defects caught earlier, and more reuse. More mature software organizations insert fewer defects since they use more formal software development methods. They also find more defects earlier due to more formal inspection processes. In general, the longer a defect goes undetected, the more it costs to fix. If fewer defects are inserted and they are found earlier, there is far less rework and re-testing. The result is improved productivity, lower cost, and reduced schedule. Improved reuse tends to come later in SPI efforts because it depends on other process areas being established for it to be effective.

Inspections

Inspections should be formal processes intended to find defects in software and other artifacts at or near the point of insertion of the defect. The cost of finding and fixing defects early is 10 to 100 times less than during testing or after release. Many defects stem from enhancement requests that would have been handled had more care been taken in requirements gathering; better requirements management and change management can mitigate this problem. Inspections, where they are practiced, consume about 15% of total project cost (FAGAN 86). However, it has been estimated that 30 hours are recouped for each hour of inspection. An additional benefit of inspections is that developers who participate produce fewer defects in future work than otherwise.
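The 30-to-1 payback figure cited above translates into a simple net-savings estimate. The project figures below are hypothetical:

```python
# A rough sketch of the inspection payback cited above: roughly 30 hours
# of downstream rework and testing recouped per hour spent inspecting.

def inspection_payback(inspection_hours, recoup_ratio=30):
    """Net hours saved, after subtracting the hours spent inspecting."""
    return inspection_hours * recoup_ratio - inspection_hours

# Hypothetical project that invests 200 hours in code and document reviews:
print(inspection_payback(200))  # → 5800
```

Even if the true ratio for a given organization is a fraction of 30, the payback remains strongly positive, which is why inspections are among the quickest wins in an SPI program.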

Reuse

Reuse has the obvious benefit of reducing the amount of new code to be written. Reuse also lowers defects, since the work product is cleaned up more each time it is reused and is more reliable going forward. The result is a 40% to 60% increase in productivity. There are costs to reuse. First, designing and coding software to facilitate reuse is more expensive than otherwise (estimates are 111% to 200% as much). Second, there has to be infrastructure in place that facilitates finding code to reuse, as well as the time to search for modules to reuse before coding something new.

Barriers to reuse are largely cultural (CARD 94). They include the consumer's skill and knowledge (or lack thereof) about reuse. There are also issues around how well the producer and consumer requirements match up, as well as real architectural issues that impede reuse. Despite research on how to evaluate compatibility and the development of best practices for integrating components, projects that attempt to reuse components (particularly purchased components) frequently run into unexpected difficulties, incompatibilities, and behaviors. Despite the costs and difficulties involved, over 10 years, ROI on reuse ranged from 210% to 410%.

Cleanroom Process

While they are not in as widespread use as inspections and reuse, cleanroom techniques show promise as a way to increase productivity. They do this by reducing defects inserted to nearly zero. Cleanroom development places greater emphasis on design and verification rather than testing. It has the side benefit that errors that do occur are easier to fix than in traditional methods. The Cleanroom process is built upon function theory, in which programs are treated as mathematical functions. Design takes place in a series of iterations in which functions are specified in a top-down fashion, with increasing detail in each iteration. Time spent on design and specification is greater than in traditional approaches, but productivity is better and much less time is spent in test. Overall, lifecycle cost is much lower because there are fewer latent errors, and the programs are easier to modify since greater care went into the design.

Secondary Sources of Value

Secondary sources of value from SPI programs include improved customer satisfaction, better staff morale, reduced overtime, reduced turnover, and lower absenteeism. In addition, quicker turnaround means shorter time to market and faster returns on software investments. Secondary sources of benefits are usually not counted in return on investment calculations, but could be the most important reasons to improve software processes.
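The reuse economics described earlier (a cost premium to build reusable code, repaid each time the component is reused) lend themselves to a simple break-even calculation. All dollar figures below are hypothetical:

```python
# A back-of-the-envelope break-even sketch for reuse, using the cost-premium
# range quoted earlier (building for reuse costs roughly 1.11x to 2x as much
# as single-use code). All project figures are invented for illustration.
import math

def reuses_to_break_even(build_cost, reuse_premium, cost_per_reuse_avoided):
    """How many reuses pay back the extra cost of building reusably."""
    extra_cost = build_cost * (reuse_premium - 1.0)
    return math.ceil(extra_cost / cost_per_reuse_avoided)

# Hypothetical component: $50K to build single-use, 1.5x premium to make
# reusable, and each later reuse avoids $10K of new development:
print(reuses_to_break_even(50_000, 1.5, 10_000))  # → 3
```

This is why reuse pays off only later in an SPI effort: the premium is paid up front, and the organization needs the cataloging and search infrastructure in place before the avoided-cost side of the ledger starts accumulating.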

Cost of SPI

SEI and Jones studies on cost are in relatively good agreement. They used different indices to measure maturity, but the general conclusion is that it takes 2 to 4 years and costs between $6K and $19K per developer in SPI (training, SEPG, assessments, SDLC, etc.) and between $5K and $25K in tools (DACS 01) to get from an initial state to optimizing (in CMM terms). Several studies estimate the ongoing cost of SPI (not including tools) to be between 1.5 and 2.5 percent of base staffing. Some of the studies are not as thorough in quantifying the costs of SPI as they are the gains in productivity and quality. Based on my experience, there are a number of costs that are underestimated or not accounted for in the studies. Communication to promote acceptance of the process is one. Infrastructure, maintenance, and support for tools is another.

Return on SPI

Returns of 400% to 600% (and higher) are frequently reported. I believe that measurements of benefits are fairly accurate but, as indicated earlier, cost estimates are probably low. My conclusion is that returns are still very substantial, especially considering the secondary sources of value, but probably not 500%.

Risk in SPI Programs

As demonstrated above, there is a compelling case that successful SPI programs pay. Yet such programs are risky. The Gartner Group has estimated that 70% of SPI programs fail within three years. Major causes of failure are:

- Lack of Organizational Focus and Vision: an inability to articulate a vision or set direction. Such organizations do not know what needs to change, why it needs to change, or how they will know if they are successful.

- Failure to Assess: With models such as the Software Engineering Institute's (SEI's) Capability Maturity Model (CMM) readily available, it's tempting for many application development (AD) leaders to assume a "ready, fire, aim" mentality and rush off to make improvements. Unfortunately, this misplaced faith in a "one size fits all" model often results in the AD organization doing the wrong things better, while taking too long to do them. (GART SP3JUL02)

- Lack of Enforcement: Monitoring compliance with the process is fundamental to success. If no one is enforcing the process, project teams will tend to drift and much of the benefit can be lost.

- Lack of Measurement: Measurement is really the only way to know how the organization is doing and whether changes that are made improve the situation. Even a relatively simple measurement program can be invaluable by providing evidence that SPI efforts are working.

- Failure to Market: People leading the SPI effort must understand who will be asked to change their behavior and what the new behavior is expected to be. They should look at the benefits and barriers to change and define the specific activities that will ensure that each group agrees to change.

- Misunderstanding the Role of Tools: Many believe that a tool of some sort will provide a universal remedy. However, SPI efforts are primarily about understanding and modifying the behavior of people. Process provides a framework for behavior; technology provides support for the process.

- Lack of Stamina: Process improvement is a marathon, not a sprint. According to data acquired from the SEI's own assessors, the median time for organizational improvement from Level 1 to Level 2 is 27 months. From there, moving to Level 3 is likely to take another one to two years. (GART SP6OCT98) Some organizations obviously do better than the averages, and measurable improvements start much sooner, but unrealistic expectations about how long it will take can doom an SPI program.

- Failure to Maintain: A software process that is not maintained withers. The process needs to evolve to meet the changing requirements of technology and the business environment. Failure to evolve the process contributes to lack of acceptance over time, since the process becomes less relevant to the needs of the organization.

Awareness of the risks goes a long way towards helping to mitigate them. Leaders of an SPI effort can develop a Risk Management Plan to prepare contingency plans. The SPI team needs to have access to the expertise to address these risks. If the right skills are not available internally, be prepared to engage outside experts to help.

The Road to Success

Some Quick, Sure Payoffs

Depending on the circumstances and goals, there are several measures that should be considered at the outset of an SPI program. The first is doing an assessment. By evaluating current practices in the organization against best practices, the most promising areas for gaining high returns can be determined. If your organization is not doing frequent inspections of both code and documents (e.g. requirements documents and design models), it will pay to develop review mechanisms as quickly as possible. This will yield tangible benefits very rapidly.

Another activity that can start to pay off quickly is measurement. We've all heard the maxim that the measured variable improves. The key is to start simple. When delivery dates are compared with planned milestone dates and actual hours are matched to estimates, the variances inevitably start to close, plans and estimates become more accurate, and productivity improves.
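The simple measurement program suggested above (planned vs. actual dates, estimated vs. actual hours) needs nothing more than basic arithmetic. The milestone data below is invented for illustration:

```python
# A minimal sketch of simple variance tracking: compare actual delivery
# dates and hours against the plan. All milestone data is hypothetical.
from datetime import date

def variances(planned_date, actual_date, est_hours, act_hours):
    """Schedule slip in days and effort variance as a percentage."""
    slip_days = (actual_date - planned_date).days
    effort_var_pct = (act_hours - est_hours) / est_hours * 100
    return slip_days, effort_var_pct

milestones = [
    # (name, planned date, actual date, estimated hours, actual hours)
    ("design complete", date(2003, 3, 1), date(2003, 3, 15), 400, 520),
    ("code complete",   date(2003, 5, 1), date(2003, 5, 10), 900, 980),
]

for name, planned, actual, est_h, act_h in milestones:
    slip, effort_var = variances(planned, actual, est_h, act_h)
    print(f"{name}: slipped {slip} days, effort variance {effort_var:+.0f}%")
```

Tracking even this much, consistently, gives the evidence base needed to show whether the variances are closing over time.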

To make these changes effective and lasting, it is not sufficient to talk about them in a staff meeting or even to provide formal training. They need to be institutionalized in a process that people can reference at any time. And someone needs to be checking that the process, however formal or informal it may be, is being followed and that metrics are being collected.

The Longer Term

The long-term payoffs from process improvement have been documented above. The major risks to such efforts have also been covered. The key to success is to understand how the risks are manifested in your organization so they can be effectively mitigated. Here are some recommendations for avoiding the big risks:

- Start with realistic expectations about what it's going to take to achieve the benefits. Management involvement and commitment is cited in many studies as critical to success. The results of the studies above provide realistic ranges of the time and costs involved. Do not be put off by the timeframes cited to go from one CMM level to another; benefits can begin to accrue soon after an SPI program is started.

- Use a documented process, tailored to the organization and maintained regularly. If you do not have a well-defined and accepted process, start with an industry-standard process framework and tailor it to your organization. [See the White Paper Software Process Improvement Through Process Customization.]

- Understand that changing process means changing the organizational culture. Changing process means changing how people work and how they are measured. There will be resistance, and actions will have to be taken to overcome it.

- Tailor the process to meet the needs of the organization. Involve thought leaders in the tailoring effort so that the organization really owns the new process. This will build support and acceptance while the process is evolving.

- Support the process with tools. Good tools improve understanding, quality, and productivity. For example, they take the tedium out of maintaining documentation, which in the absence of a tool is a redundant and error-prone task.

- Measure your performance. At some level, process improvement activity has to make economic sense. Measurement of time, production, and quality is the only way to demonstrate economic benefit. A well-documented track record is extremely useful in defending a budget and negotiating with end users.

- Enforce the process. When the minimum level of process formality has been defined, all projects should conform to it. Inspections will keep everyone marching to the same beat.

- Evolve the process. Keep it relevant and in sync with new techniques and technologies and what project teams really need to do.

Conclusion

This paper has provided evidence of the high returns of successful projects from several representative studies. Considerable detail on the sources of value and costs of these projects was also provided. Failures of SPI programs are not as rigorously documented, but a list of major risks was presented based on Gartner surveys and anecdotal evidence.

Recommendations on how to avoid the major risks were also presented. Our conclusion is that returns on SPI programs can be very high if the known risks are mitigated.

References

CARD 94: D. Card and E. Comer, "Why Do So Many Reuse Programs Fail?", IEEE Software, September 1994.

CROS 0302: M. Diaz and J. King, "How CMM Impacts Quality, Productivity, Rework, and the Bottom Line", Crosstalk, March 2002.

CURTIS 00: B. Curtis, "The Cascading Benefits of Software Process Improvement", presented at Profes 00, June 2000.

DACS 99: T. Gibbon, "A Business Case for Software Process Improvement Revised", A DACS State-of-the-Art Report, Air Force Research Laboratory - Information Directorate, 9/30/1999.

FAGAN 86: M. E. Fagan, "Advances in Software Inspections", IEEE Transactions on Software Engineering, Vol. SE-12, No. 7, July 1986.

GART RN9APR02: M. Hotle, "Why Process Improvement Efforts Fail", Gartner Group, Tactical Guidelines, TG.

GART SP6OCT98: M. Hotle, "Climbing the CMM: How Long Can You Tread Water?", Gartner Group, Strategic Planning, SPA.

GART SP3JUL02: M. Hotle, "Correct Diagnosis Is Crucial to a Successful SPI Effort", Gartner Group, Strategic Planning, SPA.

IEEE 0394: R. C. Linger, "Cleanroom Process Model", IEEE Software, March 1994.

IEEE 0997: M. Diaz and J. Sligo, "How Software Process Improvement Helped Motorola", IEEE Software, September/October 1997.

IEEE 0399: N. Fenton and M. Neil, "A Critique of Software Defect Prediction Models", IEEE Transactions on Software Engineering, Vol. 25, No. 3, May/June 1999.

IEEE 00: B. K. Clark, "Quantifying the Effects of Process Improvement on Effort", IEEE Software, November/December 2000.

JONES 96: C. Jones, "The Economics of Software Process Improvement", Computer, January 1996.

KANER 96: C. Kaner, "Quality Cost Analysis: Benefits and Risks", January 1996.

NASA 00: W. Harrison, D. Raffo, J. Settle, and N. Eickelmann, "Adapting Financial Measures: Making a Business Case for Software Process Improvement", NASA's IV&V Facility, Fairmont, WV, 2000.

NSQE 02: D. O'Neil, "National Software Quality Experiment", 2002.

ONEIL 01: D. O'Neil, "Return on Investment Using Software Inspections", 10/9/01.

SEI 94: J. Herbsleb, A. Carleton, J. Rozum, J. Siegel, and D. Zubrow, "Benefits of CMM-Based Software Process Improvement: Initial Results", CMU/SEI-94-TR-013, August 1994.

SEI 95: T. Haley, B. Ireland, E. Wojtaszek, D. Nash, and R. Dion, "Raytheon Electronic Systems Experience in Software Process Improvement", CMU/SEI-95-TR-017, November 1995.

SEI 96: R. C. Linger and C. J. Trammell, "Cleanroom Software Engineering Reference Model Version 1.0", CMU/SEI-96-TR-022, November 1996.

SEI 01: M. Paulk, "Investing in Software Process Improvement: An Executive Perspective", Software Engineering Institute, July 2001.