Implementation Guidance for the Accelerated Improvement Method (AIM)


James McHale
Timothy A. Chick
Eugene Miluk

December 2010

SPECIAL REPORT CMU/SEI-2010-SR-032

Software Engineering Process Management
Unlimited distribution subject to the copyright.

This report was prepared for the

SEI Administrative Agent
ESC/XPK
5 Eglin Street
Hanscom AFB, MA

The ideas and findings in this report should not be construed as an official DoD position. It is published in the interest of scientific and technical information exchange. This work is sponsored by the U.S. Department of Defense. The Software Engineering Institute is a federally funded research and development center sponsored by the U.S. Department of Defense.

Copyright 2011 Carnegie Mellon University.

NO WARRANTY

THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.

Internal use. Permission to reproduce this document and to prepare derivative works from this document for internal use is granted, provided the copyright and "No Warranty" statements are included with all reproductions and derivative works.

External use. This document may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other external and/or commercial use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu.

This work was created in the performance of Federal Government Contract Number FA C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at

For information about SEI publications, please visit the library on the SEI website.

Table of Contents

Acknowledgments
Executive Summary
Abstract
Structure and How To Use This Document

1 Introduction to the Accelerated Improvement Method (AIM)
  1.1 The Need for AIM
  1.2 What is AIM?
    1.2.1 Rapid Deployment
    1.2.2 CMMI
    1.2.3 TSP: Team Software Process
    1.2.4 SCAMPI
    1.2.5 Six Sigma
  1.3 How Does AIM Work?
    1.3.1 TSP Training: The Personal Software Process (PSP)
    1.3.2 TSP Measurement Framework
    1.3.3 TSP Coaching and Self-Directed Teams
    1.3.4 Comprehensive Quality Management
    1.3.5 Project Team-Focused Improvement Strategy

2 Using AIM
  2.1 Securing and Maintaining Executive Sponsorship
  2.2 Characterizing Current and Future Capability and Performance
  2.3 Identifying, Training, and Launching Pilot Projects
  2.4 Identifying, Training, and Launching the PG
  2.5 Evaluating Pilot Projects and Planning Further Rollout
  2.6 Building a Culture of Excellence and Continuous Improvement

3 Specific Guidance for AIM Implementers
  3.1 Certified Experts: A Caveat and Disclaimer
  3.2 Three Use Cases
  3.3 Which Version of TSP Supports AIM?
  3.4 Implementation Approach
  3.5 Measurement and Analysis
  3.6 Organizational Policy
  3.7 Process Group Strategy and Coaching
  3.8 The Support PAs
    3.8.1 Configuration Management (CM) and GP 2.6
    3.8.2 Process and Product Quality Assurance (PPQA) and GP 2.9
    3.8.3 Decision Analysis and Resolution
  3.9 Stakeholder Involvement: GP 2.7
  3.10 What About the Other GPs?
  3.11 Engineering Practices
  3.12 Additional Guidance for TSP Teams Concerning Six Sigma
  3.13 Preparing for High Maturity

Appendix A: Activity Charts for Selected Processes
Appendix B: TSP+ Scripts for Process Operation and Team Operations
Appendix C: Goal-Question-(Indicator)-Metric Examples
Appendix D: Process Improvement Proposals (PIPs)
References/Bibliography

List of Figures

Figure 1: TSP Launch Meetings (Source: [Humphrey 2010])
Figure 2: Project Team Implementation Cycles
Figure 3: The TSP Development Cycle Structure


List of Tables

Table 1: Role-Based Training
Table 2: Selected TSP+ Activity Flows


Acknowledgments

The authors wish to acknowledge the following people for their early and critical work in using TSP to implement first the CMM for Software and then CMMI-DEV, for enabling and championing this work at critical junctures, and for innovating and validating many of the methods described in this report: the extended process improvement teams at the NAVAIR SSAs for AV-8B, P-3C, and E-2C, especially Jeff Schwalb and Lisa Pracchia; David Webb and the many other pioneers at Hill Air Force Base; the talented people at Advanced Information Systems and Quarksoft; Jenna Fleshman and Jason Huibregtse and the team of the CGI Federal Inc. Software Engineering Division; Dr. Fernando Jaimes Pastrana and Rafael Salazar Chávez of Tecnológico de Monterrey; Oscar Mondragon Campos of the Tecnológico de Monterrey SIE Center, and Pedro Beltran Ortiz and Edgar Fernández Rodríguez of SILAC Ingeniería de Software; and our friend and former teammate Noopur Davis of Davis Systems. To you, and to the many unnamed others in these and other organizations who have worked both harder and smarter than we have in this area, we say thank you.


Executive Summary

This report describes the Accelerated Improvement Method (AIM), which implements CMMI practices rapidly, reliably, and with high performance. AIM combines the Team Software Process (TSP) and tailored SCAMPI appraisals with elements of Six Sigma and other techniques to achieve typical project productivity gains of 30% while reducing delivered defect rates by 80% and conforming to CMMI maturity level 3 practices, nominally within 18 months for small-to-medium-sized organizations. AIM starts up quickly with known costs and proceeds project by project until the entire organization has been transformed.

This document provides guidance to AIM implementers, but it also provides background information for executives, line managers, and other affected parties. It is a companion to the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010].


Abstract

This report is a description of and aid for implementing the Accelerated Improvement Method (AIM), and is a companion to the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM). The intended audience is anyone responsible for implementing CMMI using the Team Software Process (TSP), Six Sigma, and other methodologies: management sponsors and champions; line and support management directly affected by such changes; process group leads and members responsible for implementing such changes; and the team leaders and developers enacting such new methods in concert and combination with their existing practices. This guide is not exhaustive; rather, it is a starting point on the road to using CMMI and related technologies to help organizations achieve business objectives using world-class process management techniques.


Structure and How To Use This Document

Section 1 of this report provides a context for and a general description of AIM and its components. Section 2 describes a series of overlapping role-based execution threads for AIM usage; the roles are those typically found in an organization implementing AIM. Section 3 provides specific guidance to process and development groups on AIM implementation issues that they are likely to face.

Four appendices are included. Appendix A contains a series of process flow diagrams tying together specific roles and activities for certain process elements of TSP+, the main implementation component of AIM. Appendix B contains a few important process elements in TSP+. Appendix C contains sample templates from the GQ(I)M paradigm that is central to Goal-Driven Measurement, as described in the SEI course Implementing Goal-Driven Measurement and in several SEI publications [Park 1996, Goethert 2004]. Appendix D is a series of Process Improvement Proposals (PIPs) that were used to identify gaps in CMMI coverage in previous versions of TSP.

We recommend the following sections of this report for our readers:

- for management sponsors and champions: all of Section 1 and at least Sections 2.1, 2.2, and 2.6
- for line and support management: all of Sections 1 and 2
- for most process group leads and members: the entire report excluding Appendix D, with special attention to Section 2 and parts of Section 3 and Appendices A, B, and C as needed, in addition to the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010]
- for development team leaders and members: Sections 1 and 2, plus specific parts of Section 3 and Appendices A, B, and C as needed, in addition to the above
- for process groups and development teams with prior TSP experience: Appendix D


1 Introduction to the Accelerated Improvement Method (AIM)

This report describes the Accelerated Improvement Method (AIM), a rapid deployment of high-performance CMMI practices, with the Team Software Process (TSP), tailored SCAMPI appraisals, and elements of Six Sigma as the core technologies of an approach that addresses maturity levels 2 and 3 and provides significant support for the higher maturity levels. This approach builds upon field experience by SEI staff, client organizations, and others that have recognized the potential in using these technologies together. CMMI provides an organizational what-to-do viewpoint, while the other practices bring organizational, project team, and individual how-to-do-it viewpoints as well as critical feedback, with the following features.

- The approach allows more rapid implementation than CMMI norms, hence Accelerated Improvement Method, or AIM. The nominal AIM timeframe for a small-to-medium-sized development organization to achieve CMMI maturity level 3 is 18 months, less than half the time normally attributed to the IDEAL-based improvement approach.
- The AIM approach achieves excellent results in terms of measurable project performance improvements, beginning with the first project. Predictable schedules and costs, a 30% improvement in productivity, and 80% fewer delivered defects are common results.
- Initial pilot projects can begin within weeks of a decision to proceed.
- CMMI implementation proceeds on a project-by-project basis, rather than using a maturity level (ML) or process area (PA) approach, although certain groupings of PAs are naturally addressed by AIM. The project-by-project approach assures constrained, identifiable, and incremental costs, with measurable results that justify those costs.
- AIM provides a path to sustain and improve upon excellent, in some cases world-class, results, while building the internal capability to support the new way of working.

AIM thus addresses the ongoing debate in the CMMI community of performance vs. compliance. Does an organization implementing CMMI target the achievement of a particular CMMI maturity level? Or does the organization instead use CMMI as a guide for improving performance in terms of critical business measures such as cost, schedule, and quality? AIM recognizes that framing the debate as an either-or proposition creates a false, perhaps even dangerous, choice, and instead affirms that an organization can and must do both to gain maximum advantage from any such technology investment.

However one chooses to frame this debate, and whatever method one might employ, CMMI implementations generally face multiple related issues that influence the choices that organizations make. How much will this improvement effort cost? How long will it take? How much better will the organization's performance be once a certain CMMI maturity level is achieved? What is this change in performance worth to the business? What must happen to sustain these changes once they are made? AIM provides answers to these questions based on customer experience, not on academic projections.

1.1 The Need for AIM

Many CMMI improvement efforts rely to some extent on the IDEAL change model [McFeeley 1996]. IDEAL dates back at least to the early 1990s, when it was associated with implementing the original CMM for Software, or SW-CMM [Paulk 1994]. Newcomers to CMMI-based improvement inevitably learn IDEAL principles, since it is the one change management approach mentioned in the standard SEI Introduction to CMMI training. But this also implies that newcomers almost inevitably accept the historical assumptions and limitations built into many early IDEAL implementations, especially the inherited staged-model mindset of the SW-CMM.

Assuming that SEI data is reflective of early improvement approaches, the semi-annual CMMI Maturity Profile exposes some of those limitations. One such historical limitation is that CMMI (staged) implementation almost necessarily proceeds maturity level by maturity level, and associated with that procession is an average of one to two years elapsed time per level [SEI 2010]. The improvements from the earlier Software Maturity Profile based on the SW-CMM, most notably a reduction in the time from ML1 to ML2 (from a 19-month average over 175 instances to a 4.5-month average for just 6 instances), are probably due to most companies starting on this path not bothering to pay for an appraisal that they expect to show ML1. This is evidenced by the very small number of official ML1-to-ML2 transitions.

Paradoxically, some of IDEAL's limitations are due to the fact that it is a generic model for improvement and change. IDEAL makes few (if any) assumptions either about where an organization begins or about what methods it uses. As such, even the best of implementations can require a lot of time to gather data, survey a situation, set goals, and build a plan of attack. Lacking specific guidance on methods, or when the one-to-two-years-per-level mindset colors planning assumptions, an improvement team such as a process group (PG) can consume many months acquiring or developing this information. In contrast, the AIM approach uses known, trainable methods with known performance characteristics, methods which seem to work in a wide variety of development environments.

1.2 What is AIM?

AIM is composed of five key elements that work together to accomplish its goals:

1. a strategy of rapid deployment, project team by project team
2. the current version of CMMI-DEV, in particular the specific and generic goals and practices at maturity levels 2 and 3, as the reference model of best practices [CMMI-DEV 2010]
3. the Team Software Process (TSP) [Humphrey 2000], beginning with training in the Personal Software Process (PSP) [Humphrey 1995], which provides project teams and individuals, including the process group (PG), with a transparent operational framework of process, measurement, and management practices consistent with CMMI
4. tailored SCAMPI appraisals [SCAMPI 2006b], guided by the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010], which provides standard SCAMPI-C reference information to practitioners and appraisers; one or more targeted SCAMPI B/C events at intermediate points during implementation; and, if desired, a SCAMPI A [SCAMPI 2006a] to verify maturity level attainment
5. methods from the Six Sigma discipline for analyzing operational data and then identifying and acting upon improvement opportunities [Motorola 2010]

These five elements blend together to produce superior results, from the first pilot project through deployment across the organization.

1.2.1 Rapid Deployment

The logic for AIM rapid deployment proceeds as follows.

1. Use proven methods with known performance characteristics (TSP, CMMI, SCAMPI, and Six Sigma).
2. Characterize current performance and gather information on existing process assets.
3. Train developers and their direct managers quickly in TSP methods on a project-by-project basis. In parallel, train PG and other involved personnel in TSP and CMMI as appropriate.
4. Identify pilot projects and launch each project as a TSP team as soon as training for the team is complete. One of the early teams trained and launched this way, although preferably not the first, is the organization's process group (PG).
5. Gather data on both performance and conformance (the latter through a tailored SCAMPI B appraisal), and make adjustments as each new project is trained and launched.
6. Use data and experience from early pilots to plan and implement broader adoption within the organization. Introduce Six Sigma training and techniques as appropriate.
7. If necessary, confirm CMMI compliance with a SCAMPI A appraisal.

Even the smallest organizations can take advantage of early projects to expose potential internal experts to the various technologies. Building internal capability is crucial to the long-term viability of these techniques in any organization.

1.2.2 CMMI

CMMI, including its predecessor CMM for Software, is probably the most widely used family of improvement models for the development of software and software-intensive systems. As a model of best practices, CMMI describes characteristics of such practices, and though examples of implementation techniques are numerous, by design the model itself avoids any recommendation of how one should design and implement such practices. The intent of AIM is to provide just such a set of recommendations, though not an exclusive set.

For example, CMMI does not specify an appraisal method, stating only that one should "appraise the organization's processes periodically and as needed to maintain an understanding of their strengths and weaknesses" (OPF SP 1.2, CMMI-DEV V1.2). SCAMPI appraisals are most commonly used for formal appraisal in this regard. Many others, some adapted from SCAMPI or other techniques, some home-grown, are used both formally and informally, often in conjunction with SCAMPIs of one flavor or another, in order to implement the intent of the model. AIM specifies the formal use of tailored SCAMPI appraisals in several modes, but other techniques can and should be used to supplement such activities.

AIM's scope with respect to CMMI is limited and specific: CMMI-DEV V1.2 maturity level 3, excluding Supplier Agreement Management (SAM). However, this scope does not imply that one cannot implement SAM or higher maturity processes within AIM, only that AIM practices make no specific claim that they implement any of that PA's practices. This also does not imply that AIM might somehow become obsolete with the next version of CMMI-DEV, or even that, for example, one should avoid attempts to implement CMMI-SVC practices using AIM. As of this writing, CMMI V1.3 is in its final stages of development, with minimal changes at maturity levels 2 and 3. Furthermore, the constellation architecture of the CMMI models specifies 16 of the 22 PAs in CMMI-DEV as core PAs that are common to CMMI-DEV, CMMI-SVC (CMMI for Services), and CMMI-ACQ (CMMI for Acquisition), 12 of those at ML3. All 12 core PAs are within the scope of AIM, as is the structure provided by all 12 generic practices at capability levels 2 and 3 that are common to all PAs, perhaps making AIM a good starting point to address not only CMMI-DEV (the current focus) but CMMI-SVC and CMMI-ACQ as well.

1.2.3 TSP: Team Software Process

As implied by its name, the Team Software Process component of AIM deals directly with the most generic units of implementation: project teams and their members. In some sense, the most fundamental objective of CMMI is to improve the performance of project teams by changing the behavior of those teams' individual members. Beginning with training in the Personal Software Process (PSP) and carrying over into actual development projects, TSP team members implement a task-oriented framework of processes and management and measurement practices that provide ongoing performance feedback to both the individual and the team. This approach enables rapid and profound changes in behavior, as reflected by measured performance improvements.

PSP and TSP were designed to guide not only software development, but any structured intellectual activity. For example, TSP has been adapted for use in the development of video games, for which a typical team includes game designers, graphic designers, writers, artists, and musicians, all of whom outnumber the software developers [Bala 2007]. PSP and TSP principles have also been used to create most of the training, presentation, and print materials used to spread TSP knowledge. As of this writing, TSP pilot projects exist for services applications and for development teams using SEI architecture methods. All of these are in addition to the expected uses of TSP in more traditional software development venues such as banking and finance, industrial and embedded control systems, shrink-wrapped software, and IT applications.

TSP has been in use for over 10 years with excellent results [Sasao 2010, Nichols 2009, Wall 2007, Davis 2004]. Capers Jones recently identified TSP/PSP as one of the top development methods in use today across small, medium, and large systems; in fact, it is the only method that ranked either first or second in all project size categories [Jones 2009]. TSP's built-in performance measurement framework is used continuously to verify implementation and to identify opportunities for further improvement.

1.2.4 SCAMPI

As mentioned above, SCAMPI appraisals are a key component of AIM implementation. If necessary, an early SCAMPI B/C appraisal may be used to survey organizational practices. At intermediate stages, SCAMPI B/C is used to verify implementation of other AIM practices, to identify both potential weaknesses to be corrected and unique adaptations that have the potential to improve the common practice, and to verify that, where AIM and organizational practices have combined, those combinations are CMMI-compliant. To verify the attainment of a particular CMMI maturity level, the only official option is a SCAMPI A appraisal led by an SEI-certified Lead Appraiser.

The Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) provides a practice-by-practice breakdown of expected AIM artifacts through CMMI maturity level 3 [Miluk 2010]. These artifacts are the expected results of TSP implementation, without consideration of any other valuable practices that might exist on a team or in an organization.
Obviously, these expectations should be adjusted based on an organization's particular implementation, as well as on the results of any intermediate SCAMPI B/C appraisals.

1.2.5 Six Sigma

Six Sigma methods have a documented history of use in CMMI-based improvement, especially for the attainment of higher maturity levels [EB 2008, Habib 2008, Siviy 2008, Siviy 2005]. However, the use of these methods in AIM is not to address CMMI maturity levels 4 and 5, but rather to provide a framework for evaluating the often voluminous data provided by TSP teams, and then to take action on the opportunities presented by that data.

The Six Sigma concentration on the Voice of the Customer (VoC) is usually the most immediately useful method within the AIM framework. In addition to its obvious use in Requirements Development (RD) and Validation (VAL) within CMMI, VoC is a valuable tool for eliciting stakeholder needs in preparation for a TSP launch. Beyond that, Six Sigma deployment typically proceeds after some significant number of teams and projects have completed cycles and projects in an organization. Both DMAIC (Define, Measure, Analyze, Improve, Control) and DFSS (Design for Six Sigma) methods are usable within AIM. The general recommendation is to begin with DMAIC unless an experienced Six Sigma Black Belt is available to lead DFSS efforts.

1.3 How Does AIM Work?

While the five major AIM components must work together and be balanced properly for any particular organization, it is fair to say that TSP implementation is the key to AIM effectiveness. By themselves, TSP practices address a large majority of CMMI specific practices. In addition, there are five significant distinguishing features of the TSP which, while consistent with CMMI practices, go significantly beyond the requirements of the model. The total effect of these practices is to approach the CMMI ideal of the right functionality delivered defect-free, on time, and on budget. These TSP features are:

1. Personal Software Process (PSP) training
2. the TSP measurement framework
3. TSP coaching and self-directed teams
4. comprehensive life-cycle quality practices
5. a project-team-focused improvement strategy

These features reflect the view that software-intensive systems are produced by knowledge work. As Watts Humphrey observed, the primary rule for managing knowledge work is that managers can't manage it; the workers must manage themselves. In order to manage their own work, software-intensive development teams must be motivated properly, make their own plans, negotiate their own commitments, track these plans and commitments, and manage their own quality. These five features of TSP are essential for its use by and integration into AIM. Therefore, it is essential that those implementing AIM understand something of these features.

1.3.1 TSP Training: The Personal Software Process (PSP)

It is fair to say that TSP, the Team Software Process, would not exist without PSP, the Personal Software Process. PSP development preceded that of TSP by several years [Humphrey 2000]. But more than that, both in philosophy and in operational detail, PSP training and practices are foundational to TSP.

PSP, as the full name implies, teaches process skills at the individual level. PSP is taught as a series of progressively more comprehensive operational development processes, beginning with a plain-vanilla framework usable by any competent software developer, with basic measurements of time and defects. Developers write a series of small programs, gathering data along the way on their own performance, adding size measurement and estimation practices, test and design documentation, personal reviews, and personal planning (a partial list). Each of the PSP practices is used, relied upon, or built upon in the TSP.

1.3.2 TSP Measurement Framework

The TSP measurement framework is the first obvious way in which TSP builds upon PSP skills. Strictly speaking, TSP measurement adds no new data to the set specified by PSP (time on task, defects, product size, and task completion date). However, dates are not used in any substantive way in PSP training, whereas, of course, they are critical to most real-world projects. These basic data requirements pre-date the PSP, finding their source in the early days of SEI measurement research [Carleton 1992].

Unique to TSP, these data originate with individual developers in real time. The data then aggregate at the team level, and are used at least weekly for the purposes of project and quality management, both by the project team collectively and by the team members individually. Thus, the measurement framework provides enormous leverage to project teams to manage their own work and their own projects, and to the teams, projects, and organization as a whole to analyze and improve process performance.

The questions of what data to report to which layer of management, and how often to report, must of course be answered by any CMMI effort. In TSP, however, this becomes more a question of how to store, select, and summarize data, rather than the more usual case of having to figure out what data to collect and how to collect it in the first place. Another SEI technology, Goal-Driven Measurement (GDM) [Park 1996], also a descendant of the original SEI measurement research, may be useful within this context as well, by providing documented coverage of CMMI Measurement and Analysis (MA) practices [Goethert 2004] and a convenient way to summarize TSP measurement usage within an organization (see Appendix C).

1.3.3 TSP Coaching and Self-Directed Teams

TSP teams are self-directed, but this does not mean that there is no identified leader of the team. The TSP team leader is designated by and responsible to the management chain, just as in more traditional authority structures. However, TSP is based on the idea that because software is knowledge work, it can be managed effectively only by the people who are doing the work. The structure of the TSP team thus is not the traditional one, with a team leader exerting a command-and-control management style, but rather a more egalitarian one, with the team leader as a first-among-equals directing and coordinating the team. Self-directed TSP teams rely on their PSP skills and the TSP measurement framework to plan and track their own work, usually more effectively than with traditional team leadership.
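To make the two preceding subsections concrete, here is a minimal sketch, with an invented record layout and invented numbers, of how the four base measures might roll up into a team's weekly status. It is illustrative only; actual TSP teams use purpose-built tools and forms rather than ad hoc code.

```python
# Illustrative sketch only: a minimal weekly roll-up of the four TSP base
# measures (time on task, defects, product size, task completion dates).
# The record layout and sample numbers are invented, not the TSP tool's.

from dataclasses import dataclass

@dataclass
class Task:
    owner: str
    planned_hours: float
    actual_hours: float     # time on task, from individual time logs
    planned_value: float    # percent of the total plan this task represents
    done: bool              # completion date recorded when True
    defects_found: int      # from individual defect logs

def weekly_summary(tasks: list[Task]) -> dict:
    """Aggregate individual data to the team level, as a TSP team
    does for its weekly meeting."""
    earned = sum(t.planned_value for t in tasks if t.done)
    planned_hours = sum(t.planned_hours for t in tasks)
    actual_hours = sum(t.actual_hours for t in tasks)
    return {
        "earned_value_pct": round(earned, 1),
        "hours_actual_vs_plan": round(actual_hours / planned_hours, 2),
        "defects_logged": sum(t.defects_found for t in tasks),
    }

tasks = [
    Task("dev-a", planned_hours=12, actual_hours=14, planned_value=3.0, done=True,  defects_found=4),
    Task("dev-b", planned_hours=8,  actual_hours=6,  planned_value=2.0, done=True,  defects_found=1),
    Task("dev-c", planned_hours=10, actual_hours=9,  planned_value=2.5, done=False, defects_found=2),
]
print(weekly_summary(tasks))
# {'earned_value_pct': 5.0, 'hours_actual_vs_plan': 0.97, 'defects_logged': 7}
```

The point of the sketch is that the weekly team view is pure aggregation: no new data is collected beyond what individuals already log for themselves.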
The most visible vehicle for planning is the TSP launch, an intense series of meetings lasting from one to five days, depending on the size and number of team members and project teams involved, as well as on the scope and duration of the project.

Figure 1: TSP Launch Meetings (Source: [Humphrey 2010])

During a launch, a TSP team chooses team roles, defines its own objectives for achieving management goals, identifies work products, defines a technical strategy, selects or defines a development process, and builds both an overall plan for the project and a detailed near-term plan (down to the level of several weekly tasks for the individuals on the team). The TSP team also identifies and evaluates risks to the plan, prepares a briefing report for management, and then presents the plan to management for negotiation and approval. After the launch, the team generates weekly reports of the data that they collect using the measurement framework (both individually and aggregated as a team), and uses the data to track progress and risks and to identify and respond to issues proactively, including replanning as necessary. All of these activities are recognizable practices within various CMMI Project Management PAs.

While effective TSP team leaders are often effective coaches to their teams, the TSP defines a separate role for a designated coach. The TSP coach is critical to the success of the self-directed team in the launch, periodic relaunches, and execution of the project. The role of the coach is perhaps best explained by analogy to sports teams. A coach typically does not take the field, but rather focuses, especially before and after the game, on the skills and processes used by the team and its members. While it was originally thought that the need for an independent coach would diminish as a team leader gained experience with TSP methods, in practice launches and relaunches (at a minimum) seem to demand an independent process-expert role that cannot be filled effectively by the team leader, especially for larger teams. One notes, however, that an effective team leader seems able to function, at least in routine situations, as a coach outside of the launch environment, allowing scarce TSP coaches to extend their circle of influence more widely.

1.3.4 Comprehensive Quality Management

Both CMMI and TSP have their roots in the manufacturing quality movement. The fundamental idea, that rework is waste that can be measured and intelligently reduced, goes back at least as far as Dr. Walter Shewhart [Shewhart 1980] (originally published in 1931), and carries on through the work of W. Edwards Deming [Deming 1982], Philip Crosby [Crosby 1980], and many others. The idea was used by Watts Humphrey and his colleagues at IBM to address software quality issues; he brought it with him when he founded the SEI's Software Engineering Process Management (SEPM) program in 1986 and led the development of the original CMM for Software. The formulation of Total Quality Management (TQM) [Deming 1982] and Six Sigma practices [Motorola 2010], other systems of practice derived from manufacturing quality ideas, was also underway in this same general period.

The systematic identification and elimination of waste underlies the fact that "faster, better, cheaper" is not so much a choice of alternatives as a recipe for improvement. In other words, while one might be able to choose faster or better or cheaper at the expense of one or both of the others, there is almost always a way to choose better that is also faster and cheaper. Significantly, in software, the ways that are at once better, faster, and cheaper can be known and implemented only by those who actually perform the work in question, that is, by the knowledge workers themselves. This idea is fundamental to AIM methods and results.

When TSP teams produce overall and detailed plans in a launch, they don't just plan what to do, they also plan how well they are going to do it: when they are likely to inject defects and how many, when they are likely to discover and remove those defects, and how many defects will leak through the final stages of integration and testing and ultimately be delivered to the customer. This planning and its subsequent execution is not a utopian exercise dedicated to some impossible ideal of perfection, but rather a hard-nosed business practice that recognizes that the largest variations, and therefore the largest controllable costs and some of the biggest project risks, are embedded in the issue of quality.

The theme of quality management is woven throughout the various threads of the CMMI-TSP-Six Sigma tapestry. In PSP training, every defect found is cataloged and later analyzed for use in building and updating review checklists. The defect density in test and the amount of time spent finding and fixing defects in test (i.e., rework) are primary indicators of both product and process quality. This idea is carried through intact into the TSP measurement framework and, as mentioned previously, is a primary focus of self-directed team planning, execution, and coaching throughout the development life cycle. When enough data accumulates across projects to define relationships between effort, schedule, and quality, Six Sigma methods analyze those relationships, looking for opportunities to make process changes that will improve performance across all projects. These changes can take the form of eliminating common issues with current process execution (for example, by formalizing the different perspectives to be covered in a design inspection) or of modeling or piloting the use of a new tool or method (which can be as trivial as specifying the use of a hitherto unused feature in an existing development environment, or as substantial as specifying a method for architectural design).

1.3.5 Project Team-Focused Improvement Strategy

Many CMMI implementation efforts attempt to move an entire organization up the maturity ladder level by level, or perhaps PA by PA in slightly more enlightened instances. Part of the rationale for this lock-step approach is that moving one part of an organization very far above another can result in organizational dysfunction, for example, by imposing different operational requirements upon otherwise similar projects, or by having closely cooperating projects working in dissimilar ways. The TSP approach, as depicted in Figure 2, is to move entire projects to new, more efficient behaviors representing significantly higher maturity levels very rapidly.

Figure 2: Project Team Implementation Cycles (select the team(s); train managers and developers; launch and coach; refine and evaluate the approach; repeat)

The experience of TSP projects in organizations at all maturity levels is that the potential issues related to this approach are easily managed, for four reasons. First, pilot projects by definition are trying something new in the organization, and clear communications concerning what is being attempted and why typically result in a willingness to accommodate necessary changes, at least on a trial basis. Second, because TSP, and now AIM, offers an operational system of practices, the critical internal interfaces required for smooth project operation are well-defined. A third and related reason is that TSP mechanisms are also defined for multiple-team projects, so that dealing with interface issues in such cases does not require case-by-case definition. Finally, because TSP is a defined and measured operational system, each project serves as an example of model behavior, not to mention potentially exemplary results.

The project team focus is the key factor enabling the rapid deployment that is a hallmark of AIM. A significant majority of CMMI practices are project-level practices. In AIM, many of the practices that apply to organizational activities are implemented by the PG team. Thus, by training and launching team by team as quickly as training resources and project cycles permit, and by including the PG as one of those teams, maximum diffusion of AIM methods is achieved in minimum time.
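Before turning to usage, a small worked example may help ground the quality planning described in Section 1.3.4. The sketch below propagates assumed defect injection rates and phase yields through a simple phase model to predict delivered defects. All of the numbers are invented for illustration; real TSP teams derive such parameters from their own historical PSP/TSP data.

```python
# Hedged sketch: propagating planned defect injection and removal rates
# through development phases to estimate delivered (escaped) defects.
# All rates below are invented; they are not TSP-standard values.

phases = [
    # (phase, defects injected per KLOC, removal yield of this phase)
    ("design",           20.0, 0.00),
    ("design review",     0.0, 0.55),
    ("code",             80.0, 0.00),
    ("code review",       0.0, 0.60),
    ("compile",           0.0, 0.50),
    ("unit test",         2.0, 0.45),
    ("integration test",  0.0, 0.40),
    ("system test",       0.0, 0.40),
]

def delivered_defects_per_kloc(phases):
    remaining = 0.0
    for name, injected, removal_yield in phases:
        remaining += injected               # defects entering this phase
        remaining *= (1.0 - removal_yield)  # defects escaping this phase
    return remaining

print(f"predicted escapes: {delivered_defects_per_kloc(phases):.1f} defects/KLOC")
# -> predicted escapes: 3.9 defects/KLOC
```

The same arithmetic run backward is what makes the plan actionable: raising an assumed review yield in the model shows directly how many test and field defects that improvement would buy.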

2 Using AIM

Many who wish to use the AIM technologies only want to know one thing: what to do. This section describes six overlapping execution threads that generally begin in the order presented, then run in parallel with one another over the course of months and years as an organization adopts these methods and adapts them for use in its unique circumstances. Although each thread is described separately here, the various threads constantly interact during the course of AIM introduction and ongoing execution, as will become apparent in the following descriptions. The AIM execution threads are:

1. Securing and maintaining executive sponsorship
2. Characterizing current and future capability and performance
3. Identifying, training, and launching pilot projects
4. Identifying, training, and launching the PG
5. Evaluating pilot projects and planning further rollout
6. Building a culture of excellence and continuous improvement

2.1 Securing and Maintaining Executive Sponsorship

One way in which AIM is very like other improvement initiatives is in its need for executive sponsorship. Sponsorship demonstrates explicit interest on the part of senior management in creating and sustaining an organization that exhibits the high-performance characteristics of TSP teams, using methods that will be recognized as CMMI-compatible by a competent SCAMPI appraisal team. For AIM, serious interest is signaled by attendance at one-day seminars such as Implementing CMMI for High Performance or the TSP Executive Strategy Seminar. Each seminar conveys some of the information in the present document as it builds the case for implementing CMMI according to these recommendations. When presented at a customer site, a seminar is often coupled with a half-day planning session that begins the process of formulating specific, measurable goals for the AIM effort, identifying any relevant baseline information in the organization, and identifying likely candidates for pilot projects and the PG.

However, active participation in such a seminar or funding a proposal to implement AIM is ultimately not enough to demonstrate management's commitment to these methods. It is their ongoing involvement in these efforts that ultimately signals to the organization, "This is how we do things now." Sustained sponsorship can be seen in at least two ways: thoughtful policy statements (both official and informal) and regular reviews. It is no accident that each of these is reflected in the CMMI generic practices GP 2.1 Establish an Organizational Policy and GP 2.10 Review Status with Higher Level Management, which apply to every process area in the CMMI.

The AIM recommendation is to start with a very broad policy statement from management that indicates the general direction in which the organization is heading over the next several months and years with respect to disciplined practices in the areas of the CMMI process categories: project management, process management, engineering, and support. As the pilot projects and PG work begin to instantiate AIM practices within the organization and these results are reviewed with senior management, more specific policies tailored to the organization's best results are crafted, usually by the PG, but always with strong input from project personnel. In other words, the people most affected by the policies (guiding their use of these methods) will write those policies, while regular reviews of the results from usage experience will explain and justify the policies for senior management to approve and then advocate across the organization.

2.2 Characterizing Current and Future Capability and Performance

When formulating specific performance goals for almost any endeavor, the question usually arises: compared to what? "Increase productivity by 50%," "Reduce delivered defects by 50%," and "Decrease cycle time by 20%" are all performance goals that presume one knows how productive a project or organization currently is, how many defects are currently delivered, and what the cycle time currently is. Even "Achieve CMMI Maturity Level 2 by date X" (a process capability goal) implies that, if measured by a SCAMPI A, an organization is currently at ML 1.

As implied above, both project performance and process capability are valid targets for improvement. AIM takes the view that these are not just compatible but complementary, at least when performance is pursued using the known-capable methods recommended in this report. How well the organization and typical projects perform currently should inform any such goals and also act as a starting point for realistic improvement efforts. While many software-intensive organizations cannot put a number on productivity or delivered defects or cycle times, that does not imply that such numbers do not exist; it may mean simply that they need to be retrieved from their current hiding places. It may be that performance and capability can be assumed at the start and validated as the effort progresses.

For example, it was a significant advance for the CMMI community to recognize that there is little value in doing a full SCAMPI A in an organization that is clearly at maturity level 1, especially when the same starting-point information could be gathered with a less formal, and less costly, SCAMPI B. When doing SCAMPIs as part of an AIM effort, it often makes sense to delay even the formality of a SCAMPI B until sometime after pilot projects have started, and then to include in the appraisal projects both inside and outside the current scope of AIM. This contrast provides clear context for the organization's pre-AIM capabilities.

Characterizing performance can be more difficult. Postmortems that gather and begin to summarize and analyze individual project performance are a standard feature of AIM projects, but knowing, or even making an intelligent guess as to, where the organization was prior to the AIM effort can be challenging. Perhaps the development part of the organization doesn't have its productivity or delivered-defect numbers, but the testing part of the organization is likely to have some relevant information from which one might be able to derive a useful performance baseline, however narrow that might be.
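For instance, the sketch below, using invented numbers and field names, shows the kind of narrow baseline that might be derived from test-phase and accounting records alone: the share of effort consumed by formal testing, a delivered-defect density, and a test-phase escape rate.

```python
# Hedged sketch with invented numbers: deriving a narrow performance
# baseline from routine records (timecards, test logs, defect tracker).

project = {
    "total_effort_hours":  9200,   # from timecards, whole project
    "test_effort_hours":   3100,   # timecard hours booked to formal test
    "defects_found_test":   410,   # from the test defect log
    "defects_found_field":   55,   # customer-reported, first year
    "new_changed_kloc":      48,   # from version-control counts
}

test_share = project["test_effort_hours"] / project["total_effort_hours"]
delivered_density = project["defects_found_field"] / project["new_changed_kloc"]
# Fraction of defects present at test entry that escaped to the field:
escape_rate = project["defects_found_field"] / (
    project["defects_found_test"] + project["defects_found_field"])

print(f"effort spent in formal test: {test_share:.0%}")    # 34%
print(f"delivered defects: {delivered_density:.1f}/KLOC")  # 1.1/KLOC
print(f"test-phase escape rate: {escape_rate:.0%}")        # 12%
```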
Basic accounting information, such as timecards and actual budget and schedule results, can be used to estimate, for example, the percentage of time and effort typically spent in formal testing compared to the overall project. Even qualitative instruments such as surveys can be used, especially before and after pilot projects, to help assess performance improvements from the adoption of AIM techniques.

If an organization desires or requires CMMI validation recognized by the wider world, SCAMPI A is the only official method. As stated previously, the AIM target is CMMI maturity level 3, but the implementing organization must understand the implications of preparing for and successfully executing a SCAMPI A, as well as the limitations of AIM out of the box. The most apparent path to this is a series of SCAMPI B and C activities, both formal and informal, and of varying scope, tailored to assessing how well the organization has implemented AIM with respect to CMMI, including a search for problems with current practices and opportunities for improvement.

A SCAMPI A requires an SEI-certified SCAMPI Lead Appraiser (LA), and the selection of the LA is an important step in this process, regardless of whether the organization is using AIM or some other approach to implementing CMMI-compatible practices. The LA and the organization should be on the same page from the beginning, including letting the LA know up front how the organization is approaching CMMI implementation. The LA may or may not lead the SCAMPI B/C activities, depending on a host of factors including cost and scheduling, but the data gathered and the lessons learned in these activities usually inform and guide SCAMPI A preparations. The Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] provides CMMI-referenced information on standard AIM practices that should prove useful to Lead Appraisers, appraisal team members, PG members, and other personnel involved in the effort.

2.3 Identifying, Training, and Launching Pilot Projects

The execution thread of identifying, training, and launching pilots is perhaps the best understood thread, in that it has been used for over a decade for TSP introduction and has extensive description elsewhere [Humphrey 2011]. The key challenge is identifying pilot projects that are both likely to succeed and likely to be seen as valid examples for the rest of the organization to follow. The number, kind, and scope of pilot projects vary according to the organization's size in terms of project mix, the range of project sizes and types, typical durations, and the type and number of development staff. Once pilot projects have been identified, they can proceed rapidly. Training for the various roles involved in development projects is well-defined and can be taught in parallel, although there is a preference to deliver Leading a Development Team first, to make managers and team leaders aware of the training that their personnel will soon receive.

Table 1: Role-Based Training

  Course                       Target Audience                        Duration
  Leading a Development Team   Team leaders, line managers            3 days
  PSP Fundamentals*            Software developers                    5 days
  TSP Team Member Training     All other development team personnel   3 days

  * PSP Advanced is highly recommended for software developers once they have had some experience applying the skills and principles taught in PSP Fundamentals.

Project teams should be trained together whenever possible, and project launches should follow closely after the training. One should note that preparation and the actual launch events typically involve managers above the first level, often up to the senior executive level, who explain the organization's goals and reasons for AIM implementation as well as the goals and particulars of each project being launched. This is an intended overlap between the Executive Sponsorship thread and the Training and Launching threads. Regular reviews with sponsoring management must also become a regular AIM feature (and part of the Continuous Improvement thread), at least quarterly; monthly is common, and bi-weekly or even weekly reviews have occurred, especially during pilot projects. The process scripts TOPS and TOPS7 in Appendix B provide the best summary of activities surrounding the preparation, launch, and execution of AIM development teams.

2.4 Identifying, Training, and Launching the PG

Organizations pursuing CMMI-based improvements are often admonished to treat process improvement like a project, but concrete guidance on exactly how to do this does not often follow. AIM requires that the PG be trained and launched exactly like every other team, although with very different goals and requirements compared to most projects. As the organization's experience with accumulated data from AIM projects grows, the PG will require additional training in Six Sigma methods in order to analyze, characterize, and act upon the large amounts of data that AIM projects produce.

The minimum initial training for the PG is Introduction to CMMI-DEV and either PSP Fundamentals or TSP Team Member Training. Implementing Goal-Driven Measurement is highly recommended. Depending on the size and needs of the organization and the preferences of individual members, some members of the PG may further develop their CMMI expertise. Some may choose additional coursework leading to certification to teach the Introduction to CMMI class, or even to become a SCAMPI Lead Appraiser or SCAMPI B/C Team Leader, while others may choose to follow the path to become PSP Instructors or TSP Coaches. Planning and developing the internal capability of the target organization to deliver training and coaching services is a paramount responsibility of the PG under AIM, and maps directly to the strategic training needs under the CMMI process area Organizational Training (OT), as well as to tactical needs identified by projects (see PP SP 2.5 and GP 2.5 for relevant PAs).

Typical initial PG goals include addressing the organization's strategic training needs, including AIM-related skills. Having some level of internal TSP coaching capability, for example, is a hallmark of success for most organizations, so either hiring or identifying and developing internal coaches is a critical activity. Internal resources also help to speed the rate of implementation through the organization.

Another major PG responsibility is the establishment and maintenance of organizational process assets as described in the CMMI process area Organizational Process Definition (OPD). Obviously, the AIM process assets become part of this definition; however, even the most chaotic of organizations typically has some core processes that can rightly be identified as good practices to be preserved and built upon. In fact, the AIM approach to the Engineering process category (see Section 3.11) specifies that the organization's existing development practices are treated as the starting point in those areas. The initial pilot projects in any organization will typically identify and perhaps formalize these practices, which should in turn simplify the PG's task of capturing these practices and related execution data for use by other projects.
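As a flavor of what a Goal-Driven Measurement work product can look like, the sketch below encodes one invented goal-question-indicator-metric chain. It is illustrative only; the actual GQ(I)M templates appear in Appendix C.

```python
# Invented example of one GQ(I)M chain, in the spirit of the Appendix C
# templates. This is illustrative, not an SEI-provided artifact.

gqim_example = {
    "goal": "Reduce defects delivered to system test by 25% within a year",
    "questions": [
        "How many defects enter system test per KLOC today?",
        "Which phases remove the fewest defects relative to plan?",
    ],
    "indicators": [
        "trend chart of defects/KLOC entering system test, by project",
        "bar chart of phase yields, planned vs. actual",
    ],
    "metrics": [  # all already available from the TSP measurement framework
        "defects logged per phase",
        "new and changed KLOC per component",
        "time on task per phase",
    ],
}

for question in gqim_example["questions"]:
    print("Q:", question)
```

Note the direction of derivation: the goal drives the questions, the questions drive the indicators, and only then are the metrics selected, which in an AIM context usually means selecting from data TSP teams already collect.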

The final major responsibility of the PG relates to the CMMI process area Organizational Process Focus (OPF), which deals with identifying strengths, weaknesses, and improvement opportunities in the organization's process capabilities; planning and implementing process improvements; and deploying process assets across the organization while incorporating the lessons learned by the projects. In essence, this puts the PG in the position of coordinating all of the threads of AIM execution on an ongoing basis. A general description of PG startup and ongoing execution can be found in process scripts POPS and POPS7 in Appendix B.

2.5 Evaluating Pilot Projects and Planning Further Rollout

Certainly after the presumed successful use of AIM methods on pilot projects, and sometimes even earlier (especially on larger projects), an organization may decide to adopt AIM across the organization, or to proceed to a second round of pilot projects reflecting a broader range of projects in the organization. In the smallest organizations, broad adoption may be an accomplished fact, because the entire organization has already been trained and is using the methods, or it may simply be a matter of training the rest of the personnel and launching one or two more teams. Larger organizations face the problems of sequencing ongoing training requirements, plus deciding when and how to launch TSP techniques with the remaining teams. And all organizations have the ongoing issue of providing the skilled TSP and CMMI expertise necessary to maintain the AIM implementation, and eventually of broadening the scope of data analysis, possibly with Six Sigma techniques.

All AIM projects, and even cycles within those projects as depicted in Figure 3, end with postmortem (PM) events. These postmortem events characterize project performance for use by the team doing the next cycle or project, and feed into the organization's long-term memory for use by future teams and possibly by the PG or another group for further analysis. A general description of a TSP cycle can be found in process script CYCLE in Appendix B. While fulfilling particular CMMI practices (notably GP 3.2 for many PAs), an important part of the PM is gathering PIPs (Process Improvement Proposals) that reflect problems and observations made by development teams in the course of doing their work. These can range from complaints about defect tracking that lead to a simple standard for using an existing field in an existing tool, to a suggestion to develop an entire new process discipline, such as formal architecture methods, in order to avoid entire sets of problems encountered by large-scoped projects.
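At bottom, a PIP is a small structured record. The sketch below illustrates the kind of fields such a record might carry; the field names here are invented for illustration, and Appendix D reproduces actual PIPs.

```python
# Illustrative PIP record; field names are invented for this sketch,
# not taken from the TSP forms (see Appendix D for actual PIPs).

from dataclasses import dataclass

@dataclass
class ProcessImprovementProposal:
    pip_id: str
    submitter: str
    process_element: str       # e.g., a script, form, or checklist name
    problem: str
    proposed_change: str
    expected_benefit: str
    status: str = "submitted"  # submitted -> evaluated -> piloted -> adopted/rejected

pip = ProcessImprovementProposal(
    pip_id="PIP-042",
    submitter="team-lead-3",
    process_element="defect log form",
    problem="No standard way to record the component in which a defect was found.",
    proposed_change="Adopt a controlled value list for the existing 'component' field.",
    expected_benefit="Defect data can be aggregated by component across projects.",
)
```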

Figure 3: The TSP Development Cycle Structure

Planning and executing a broad rollout of AIM methods, especially in a larger organization, can fall either to the existing PG or to a related team of outside resources that have the capacity, for example, to train a large number of personnel quickly and then to launch those people as TSP teams, using process assets that may or may not have been updated by pilot project teams. Even in the case of outside resources, the PG should be involved in every step of the process, if only because they will have to live with the results. The PG can also warn of potential pitfalls that only organization insiders would recognize, and help to navigate around them. In the best case, the PG will be heavily involved in training and launching new teams, especially in preparing and advising them on the use of the organization's standard process assets.

2.6 Building a Culture of Excellence and Continuous Improvement

Ideally, there comes a time when the entire existing organization has been trained, mechanisms have been implemented to train new developers and managers, projects new and old contribute to the organization's process assets, the organization has been officially appraised at CMMI ML3, and management talks the talk and walks the walk. While this is a significant achievement, AIM can and should be used to do more. If an organization says "we're good enough," then, in some sense, a business opportunity may be passed up, and a potentially significant risk may be forming.

The business opportunity is in the potential of the organization to go far beyond its previous performance, based on detailed quantitative knowledge of its own capabilities. This could mean using Six Sigma methods to achieve a high CMMI maturity level, or it could mean simply using those methods to focus narrowly on the aspects of project execution that matter most to the business. Is there an opportunity to "lean" the process to achieve significantly faster throughput, and therefore significantly higher productivity? Is there an opportunity to reduce industry-best defect densities to levels that are truly world-class, and thereby differentiate a product in an extraordinary way? Only an organization with hard data that relates its processes to its performance can answer these questions adequately, and then execute based on the answers.

On the opposite side of the business opportunity is the risk. If an organization decides not to pursue improvement and wants current performance merely to be sustained, what signal does that send to employees and customers alike? In a dynamic and competitive world, is it even possible to maintain performance when new competitors, new technologies, and new circumstances constantly redefine where the bright line of excellence is drawn? The AIM philosophy is that continuous improvement is not an option, because the world in which we live moves so quickly that one must move quickly ahead or be left behind.

Building a culture of excellence is the most difficult and also the most important long-term execution thread in AIM. Consider what can happen in its absence: a key executive leaves and the replacement focuses on short-term gains; a middle manager nearing retirement decides to go with the flow; a team leader loses focus because her boss doesn't seem to care; her team loses focus because its leader is distracted. If people revert to even some of their former behaviors, much of the performance improvement and business benefit of implementing AIM can be quickly lost. It is the job of all parties (sponsoring executives, middle and line managers, team leaders, the PG, the developers) not just to do their jobs, but also to demand excellence of themselves and their co-workers. This job demands, in turn, not just executing the existing AIM methods, but adapting and improving the methods for better performance on the next project as compared to the current one. CMMI can provide a framework, TSP can provide implementation excellence, SCAMPI can verify compliance, and Six Sigma methods can help with analysis of the data and the formulation of new goals and process changes to achieve them; but the motivation to use these methods to find new potential improvements in cycle time or delivered defects or project predictability, beyond what was achieved just last year, comes from the people using the methods.

3 Specific Guidance for AIM Implementers

This section provides specific information for those charged with implementing AIM practices. Like the blind men describing the various parts of an elephant, in some cases there is a strong conceptual thread that is obviously carried from one topic to another, while in others the only common thread is that the description is of another aspect of AIM. The following topics need not be implemented or even read in the given order; however, the authors recommend reading them all through once and then taking action on the ones that apply, considering specific information from the appendices and from other relevant sources. This is generally the work of the PG in any organization, possibly with the use of expert resources from within or outside of the organization. The Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] is a necessary companion document.

AIM exists because of the many documented instances of TSP being used to implement CMMI (and previously, SW-CMM) practices, most often in concert with other known good practices like Six Sigma, Goal Driven Measurement, or locally grown procedures [McHale 2004, Saint-Armand 2007, Seshagiri 2009]. The basic combination of a proven starting point of operational practices and a deliberately general set of best-practice descriptions allows broad choices for implementers. Having clear, documented links between the two provides both an introduction to the main implementation vehicle in CMMI terms and a worked example of high-performance CMMI practices and artifacts.

3.1 Certified Experts: A Caveat and Disclaimer

It is important to work with certified experts in PSP, TSP, CMMI and SCAMPI, and Six Sigma when implementing AIM. While the ultimate goal of AIM is for an organization to achieve self-sufficiency, every success story that the authors can point to involves the proper training, coaching, consulting, and other expertise in preparing, coaching, executing, and evaluating the recommended practices that make up AIM. In other words, while there are known cases where the path to expertise and excellence has been shortened, there are no examples where the path can be short-cut. Many known failures lack one or more of these apparently critical requirements for success, while every known success relies on the use of qualified experts. Perhaps, as time passes, a target organization will customize and adapt AIM practices until what began as something called AIM is almost unrecognizable, and the organization's own people will become the experts. However, that progression seems somewhat risky, if not downright unlikely, if the starting point is very far from the basics of the component technologies. The path described in this report involves both a known start and a known end. One of the basic tenets of TSP applies here, namely, that the fastest and cheapest way to implement is usually to do it right in the first place.

3.2 Three Use Cases

AIM identifies three common use cases for combining CMMI and TSP technologies.

1. CMMI has some significant implementation in the organization, and TSP is being introduced.

2. TSP has some significant implementation in the organization, and CMMI is being introduced more explicitly.

3. The organization has no significant implementation of or experience with either CMMI or TSP, and therefore both must be introduced.

The expectation is that it should be obvious to any organization that decides to use AIM which use case (UC) applies in the current instance. AIM is designed generally for the third use case, an organization without CMMI or TSP experience, with the first two UCs having already implemented perhaps the only workable shortcut to the desired end.

Each use case presents its own unique profile of problems and opportunities, and in fact the three mentioned above represent an entire class of usages across a spectrum. For example, the CMMI-extant organization (UC1) can be at any CMMI maturity level, ML1 to ML5, and the precise details of the existing practices strongly influence how TSP introduction and adaptation should proceed. TSP-extant organizations (UC2) may have implemented such discipline across all development projects or just a few, and may or may not have their own TSP coaches on staff. Even UC3 organizations that have no formal CMMI or TSP programs in place may possess other existing good practices, like Six Sigma or Scrum or even home-grown methods, that must be recognized and incorporated into the new AIM practices. For all use cases, regardless of experience with CMMI and TSP, the size and culture of the organization and the average and extreme project durations must also be accommodated during implementation. These are just a few of many relevant factors.

One important activity under any use case based on an existing implementation of either TSP or CMMI is the verification of existing practices. One may assume from conversation and observation that, for example, an organization has significant experience and expertise in TSP. However, the quality of the TSP implementation, which can only be known by a detailed assessment of that implementation, can make huge differences in the necessary course of action going forward. If the TSP implementation is in some way superficial or incomplete, or simply dates to a much earlier version of TSP than the current one, planning and execution of AIM may be best served by first upgrading to the current version, with particular emphasis on any uncovered shortcomings in the prior implementation.

3.3 Which Version of TSP Supports AIM?

TSP has been extended formally for AIM as a result of a formal gap analysis, in the mode of a SCAMPI C analysis, which was conducted at the SEI. As a result of that analysis, a series of Process Improvement Proposals (PIPs) were generated to address specific gaps or groups of gaps. For example, the lack of policy statements in TSP resulted in a gap for GP 2.1 of every PA in the scope of the analysis. The original observations noting these gaps, as well as revised observations that point to the present document (for example, for policy guidance), are recorded in the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010]. Copies of the original PIPs are available in Appendix D.

In many cases, the PIPs were written referencing the basic process elements that make up TSP: scripts that describe processes at an expert level, checklists that guide critical procedures or capture critical information or both, forms that capture process data, guidelines that provide additional information on specific topics, and specifications that describe things like reports, documents, or roles.
TSP also calls for the creation of local standards (such as a coding or design standard, defect classification standard, or documentation standard) that are, in some sense, another kind of specification. Another process element is the tool set one uses to implement one or more aspects of the process, preferably in an automated, integrated way; however, such automation usually enhances or

takes the place of some step or steps in a script, or captures data specified for a given form or according to a given standard. A list of the TSP process assets that were the target of the gap analysis is available in the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010].

The new version of TSP that resulted from these observations and PIPs is popularly called TSP+. This version is available only to SEI Partners for TSP. The popular naming derives from the idea that this version is everything that was present previously in both released and unreleased versions of TSP (such as TSPm, which can be used to address the IPPD aspects of the CMMI), plus the added and modified process elements necessary to address certain practices or groups of practices in the CMMI. In several cases where an existing process element was modified in accordance with CMMI guidance, the resulting change also had the effect of addressing a known problem or issue with TSP implementation.

3.4 Implementation Approach

The general approach to implementing any AIM practice is to first recognize that there may be existing practices within the organization that are perfectly adequate as they stand, or that may be partially workable in conjunction with AIM practices. A primary purpose of the pilot projects is to highlight such existing practices and then to determine how to integrate AIM practices effectively with them. Furthermore, it is not primarily the AIM experts who make such determinations; rather, it is the pilot teams themselves, including the PG, guided by the experts. The bias in AIM implementation is that the working processes belong to the team; therefore, the team must decide which existing practices should be used within the AIM framework for the management, measurement, and execution of its work. Such practices may be used as they are, may require modification for use within AIM, or may actually be replaced by a relevant AIM practice.

If a practice is, in fact, not measuring up, the AIM measurement framework usually demonstrates that fact in a literal sense. For example, if an inspection process exists and the team wishes to continue to use it instead of the TSP inspection process, the team plans and tracks such inspections just like any other piece of work. However, if that inspection process doesn't work very well in terms of defect yield or defects found per unit time or some other relevant measure of effectiveness, the team will have objective data that not only justifies a change, but demands one (a sketch of such a comparison follows below).

Very often, a new AIM practice must learn to co-exist with existing practices. A fairly common example is that of existing project management practices. A medium-to-large-sized organization often sets up some kind of project management office (PMO) to ensure that adequate, consistent planning and tracking efforts are applied to every project. AIM teams create and update their plans using the TSP launch/relaunch processes, and track progress at the weekly team meetings. Someone, usually the planning manager or team leader, must then provide the right information in the right format in the right timeframe to the PMO [Chick 2006]. This is usually a matter of selecting the proper subset of existing data needed by the PMO, since AIM teams typically generate far more detailed data than the typical PMO requires.
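To make the inspection comparison above concrete, the sketch below computes two common effectiveness measures, yield and defects found per review hour, from summary defect counts. The record layout and the numbers are illustrative assumptions for this report, not TSP+ assets; a real team would pull equivalent fields from its TSP workbook.

```python
# Minimal sketch: comparing inspection effectiveness from defect-log data.
# The record layout is an illustrative assumption; real TSP teams would
# extract equivalent fields from their team workbooks.

def inspection_yield(found_in_inspection, escaped_downstream):
    """Percentage of defects present at inspection time that were caught."""
    total = found_in_inspection + escaped_downstream
    return 100.0 * found_in_inspection / total if total else 0.0

def defects_per_hour(found_in_inspection, inspection_hours):
    """Defect-removal rate for the inspection process."""
    return found_in_inspection / inspection_hours if inspection_hours else 0.0

# Hypothetical data for two candidate inspection processes.
legacy = {"found": 18, "escaped": 22, "hours": 10.0}   # existing practice
tsp    = {"found": 30, "escaped": 10, "hours": 12.0}   # TSP-style inspection

for name, d in (("existing", legacy), ("TSP", tsp)):
    print(f"{name}: yield={inspection_yield(d['found'], d['escaped']):.0f}%, "
          f"rate={defects_per_hour(d['found'], d['hours']):.1f} defects/hr")
```

On numbers like these, a team arguing to keep its legacy inspection would have to explain a markedly lower yield at a comparable removal rate; that is the sense in which the data not only justifies a change but demands one.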
Many of the remaining topics in this section cover CMMI PAs that should be approached in just this way: look for overlaps and potential conflicts between existing organizational practices and AIM practices, and use pilot project experience and expertise to determine how these practices will adapt and co-exist going forward. Even after full implementation of AIM methods is achieved, this is a good approach to take when implementing process changes. Attempting across-the-board implementation of

new practices in all but the smallest (one- and two-team) organizations does not usually end well. Major changes to a project team's process should be made during the team's launch or relaunch as part of its project execution cycle (see script CYCLE in Appendix B). Forcing organizational process changes in mid-cycle can be extremely disruptive to a project's productivity. By introducing such changes during a team's natural planning phase, the team is able to anticipate the process changes and account for them in the negotiated commitments made with management and other relevant stakeholders.

3.5 Measurement and Analysis

On the surface, there should be few issues surrounding implementation of the Measurement and Analysis (MA) PA of CMMI in an AIM-based effort. Measurement is a hallmark of the TSP. Direct process measurements at the most fundamental level (the individual developer) are introduced and trained intensively in the PSP and demanded daily on a task-by-task basis of TSP team members, and certain fundamental analyses of this data occur weekly on every TSP project. Few existing organizations have even attempted measurements at this detailed level. Despite this, initial coverage of the CMMI specific practices (SPs) of this PA was not perfect, due largely to the lack of explicitly documented measurement objectives, the subject of MA SP 1.1. (See PIP MA-1 in Appendix D.)

Fortunately, there are at least two non-exclusive methods available to deal with this issue. First, Goal Driven Measurement training provides a template that captures the information needed for the Goal-Question-Indicator-Metric, or GQ(I)M, paradigm [Park 1996]. In fact, the indicator templates capture much more data than is actually needed for the narrow purpose of meeting MA SP 1.1, as explained in Applications of the Indicator Template for Measurement and Analysis [Goethert 2004]. The additional information also provides an organizational guide for many activities of the PG with respect to the measurement repository (OPD SP 1.4) and the many related specific and generic practices, especially planning for and monitoring project data (PP SP 1.3 and PMC SP 1.4), and using and contributing to organizational process assets (IPM SPs 1.2 and 1.6 and the corresponding GPs 3.1 and 3.2, for any activities planned and tracked by TSP teams).

A second, more narrowly focused method, used with good results on an AIM pilot project [Fleshman 2010], is to address the issue of measurement objectives in a policy statement. This method has the advantage of making such a policy specific and actionable, while also addressing MA GP 2.1. As implied above, the two methods can co-exist in any given implementation. This is one of the many implementation choices to be made in AIM, usually by the PG.

Finally, the basic TSP data and project-level analysis should be the starting point for MA in an organization, not the end point. The indicator templates provide a known path to a wider world of effective organizational use of measurement. This becomes more explicit at higher levels of CMMI maturity (outside of the current scope of AIM); however, almost any thoughtful application of Six Sigma methods creates actions and artifacts that will show well during a SCAMPI appraisal and, more importantly, provide value to the organization. In a broader sense, the way in which AIM addresses MA is indicative of the general AIM philosophy and implementation strategy.
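To make the indicator-template idea concrete in data terms, the following sketch models a deliberately stripped-down template as a structure that records a measurement objective alongside the question, indicator, and metrics that trace to it. The field set is a simplifying assumption made for this illustration; the published GQ(I)M template captures considerably more information, as noted above.

```python
from dataclasses import dataclass, field

@dataclass
class IndicatorTemplate:
    """A deliberately minimal subset of the GQ(I)M indicator template.

    Only the fields needed to document a measurement objective (MA SP 1.1)
    are shown; the full template also covers data elements, collection
    procedures, analysis, assumptions, and more.
    """
    objective: str                 # the documented measurement objective
    question: str                  # the question the indicator answers
    indicator: str                 # the chart or report that is produced
    metrics: list[str] = field(default_factory=list)  # base/derived measures

# Hypothetical example entry; objective text is invented for illustration.
schedule_indicator = IndicatorTemplate(
    objective="Deliver committed features on the negotiated schedule",
    question="Is earned value tracking planned value week over week?",
    indicator="Weekly cumulative EV vs. PV chart from the team workbook",
    metrics=["task hours (plan/actual)", "earned value", "planned value"],
)
```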
The core of the implementation, proven in the field for over a decade, begins with PSP training and TSP implementation. The SCAMPI C observations in the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] show a potential shortcoming, and a solution is provided (i.e., a gap is filled for MA SP 1.1) based on the existing

practices, or in this instance, by GQ(I)M. By using the CMMI specific and generic practices and the accompanying observations as a guide, an alternative and perfectly valid solution is formulated to address not just this particular gap, but the MA GP 2.1 gap as well. The SCAMPI appraisal that verifies a particular method or methods as compliant with the model does not distinguish between AIM and locally developed methods.

3.6 Organizational Policy

The subject of organizational policy arises throughout AIM implementation, not only because there is a generic practice for policy (GP 2.1) applicable to every PA in the CMMI, but also because of the links to management's intentions for the usage of the AIM process assets and the data that their usage creates. The general AIM recommendation is that the policies related to AIM reflect the stages of AIM introduction and an evolving understanding of how AIM practices are being used in the organization. Therefore, policy statements must be revisited with reasonable frequency, probably every six months or so, during the first few years of AIM usage.

For example, an initial policy could be in the form of a memo from the executive sponsor, stating an intention to pilot AIM usage within the organization, stating the overall goals for AIM implementation, and naming a responsible manager and/or group. Six months later, when pilot projects are well underway or finishing, when the PG is readying to launch, and when the organization understands how to integrate AIM practices with existing project management practices, quality assurance practices, and parts of an engineering life cycle, related individual policy statements might be issued; this experience will be reflected in a more sophisticated set of policies. In another six months, after one or two cycles of PG efforts and a broadened set of development projects under the revised policies, the PG would draft a small but comprehensive set of five to ten new policies that match the organization's goals with the project management, process management, engineering management, and various support activities that represent the new standard of expected behavior.

Organizational policy under AIM should evolve as organizational practice evolves, and the organization is best served if it develops and adapts such policy internally. AIM therefore offers no generic policy statements. Rather, as the organization pilots, uses, and adapts AIM methods, it shapes policy statements to direct the expected "new normal" behaviors.

3.7 Process Group Strategy and Coaching

In an early step of the TSP launch process that AIM teams use to plan their work, the team in question must decide upon an overall strategy for its work. While there are many available options and many decisions to be made, the AIM framework requires an initial strategy that is compatible with the CMMI model, the SCAMPI appraisal method, and the TSP operational practices. The two most obvious implementation strategies, built into CMMI itself, reflect the staged and continuous representations of the model. However, both of these approaches are most often attempted (perhaps after appropriate pilot activities) across an entire organization. AIM follows a project-by-project approach, but this should not imply that the structure of the CMMI is ignored. Rather, the PG must utilize both the structure of the model and the nature of AIM methods, in conjunction with any local practices, to craft a reasonable strategy.
Inherent in the project-by-project approach of AIM is a strong emphasis on the project management PAs as a group. TSP covers these reasonably well [McHale 2004, Davis 2002]. Concurrent with this is a strong emphasis on the activities being planned and tracked, which, for the noted references,

corresponds to the engineering PAs, especially at ML3. The largest concentration of gaps is clustered in the ML3 process management PAs (OPF, OPD, OT). The AIM approach of launching the PG like any other team in the organization exercises the project management strengths again, but shifts the focus from engineering development to process development, thus addressing many gaps in an organized way.

The PG models the desired behavior of teams within AIM in several ways. First, the PG is trained, launched, managed, and coached just like any other development team in the organization. Second, the nature of PG work forces an adaptation of TSP methods that serves as an exemplar to encourage other teams to adapt and thus truly own their team processes. A process development process, for example, is provided as a starting point for any such activities, along with additional team roles that serve the unique purposes of the PG. Some of these are reflected in the charts in Appendix A dealing with Process Definition and Organizational and Project Training.

Coaching the PG represents a challenge, partly because such a team is unique in an organization, but also because of the breadth and depth of knowledge needed by the coach. Ideally, the coach would be both a qualified TSP coach and deeply experienced (if not formally qualified) in CMMI-based improvement and appraisal. However, as of this writing, there are few individuals worldwide who possess both skill sets. More likely, a partnership between a TSP coach and a CMMI expert will be the normal minimum expert team. The Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010] and the current document are intended to help bridge that particular gap.

The PG coach should recognize that almost nothing the PG does affects only its own work. The target PAs of the PG (OPF, OPD, and OT) connect explicitly to generic practices (GPs 2.5, 3.1, and 3.2) that potentially span all engineering activities of development teams, not to mention the many related specific practices in the project management PAs. In addition, some PGs take on parts of one or more of the support PA functions as a matter of efficiency, further strengthening their connections across projects via the corresponding GPs (see below). Finally, the PG must recognize that its behavior will be viewed by the rest of the organization as a model for development teams. It therefore becomes even more important for the PG to function effectively, and in some sense publicly, as an AIM team, since a valid measure of its success will be how effectively the other development teams function in the new environment.

3.8 The Support PAs

The CMMI support PAs covered by AIM include Measurement and Analysis (MA, addressed above), Process and Product Quality Assurance (PPQA), Configuration Management (CM), and Decision Analysis and Resolution (DAR). During the initial gap analysis that formed the starting point for the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010], PIPs were generated to address gaps found in each of these areas. These PIPs are included in Appendix D as an aid to organizations executing UC2, namely implementing AIM on an existing base of TSP practices, since this information should help to highlight changes to their TSP baseline.
For both PPQA and CM, the PIPs reflect the fact that the project-oriented TSP processes basically acknowledged the existence of, and interactions with, these PA practices, but did not attempt to implement them directly. For example, CM is acknowledged by the designation of the TSP support manager role as having principal CM duties for the team, and by direct CM items being planned during the TSP launch, especially launch meeting 3. Quality assurance activities are spread across several roles and are distinguished at least partially by process QA (TSP coach role, TSP team leader role, process

and quality manager roles) and product QA (TSP team leader, test manager role, and other roles depending upon the products in question). However, when applying SCAMPI-like tests for the existence and adequacy of process artifacts, it became apparent that some activities that TSP considered optional became, if not required, at least very strongly suggested in order to leave an artifact trail for a SCAMPI appraisal team. Thus, the answer to the question of whether TSP team role activities related to these practices should be included in a team member's individual workbook changes from "it's optional" or "it's up to the individual" to "yes, this work should be planned and tracked just like any other." By planning and tracking these tasks in individual workbooks, an artifact trail for specific and generic practices is created as the work is done.

3.8.1 Configuration Management (CM) and GP 2.6

Configuration Management (CM) provides the best example of how AIM and existing practices should co-exist. It is almost unthinkable that any modern software development organization has no existing configuration management practice. Excellent CM tools are available commercially and freely in open source, and rudimentary CM is built into many modern development environments. However, it is almost impossible for any tool to fully implement even one specific practice in any PA, let alone the seven within CM, without mindful direction from skilled practitioners.

The Configuration Management chart in Appendix A shows the formal AIM solution, a solution that goes in a different direction than the one specified in PIP CM-1. The AIM solution includes primary interactions between someone wishing to update or place an artifact under configuration control (most typically a product owner) and the support manager role, with additional interactions with the design manager and process manager roles, and the CCB (Configuration Control Board, an entity defined in the TSP launch if not already in existence). Certain TSP+ process assets referred to in the chart provide specific guidance and capture necessary information. If a team executes the actions on this chart appropriately and consistently, along with the other actions specified in the support manager role description and the TSP launch scripts, and enacts well-composed development scripts, compliance with the CM practices is virtually assured.

However, AIM provides a purely paper-based solution, while real-world CM for almost any project is going to use one or more of the aforementioned automated tools, and possibly well-established procedures for their use. The AIM recommendation is not to throw out the existing CM, but rather to understand the requirements of missing or inadequate CMMI practices and what is provided by the relevant AIM specifics, to use that understanding along with whatever AIM assets make sense under the circumstances, and then to devise a solution that adds value for the team and organization while fully conforming to CMMI. The least-effort outcome is that existing CM practices suffice both for the team's and organization's purposes and for CMMI; this is perfectly acceptable under AIM. In another case, some element of CMMI is lacking in the existing implementation, and between the AIM recommendation and the actual text of the CMMI, some intelligent selection or adaptation of AIM parts will fit the need and provide benefit to the effort.
This is the preferred implementation mode for AIM practices beyond the basics of TSP introduction and SCAMPI appraisals: understand what already exists, understand the intent of the model and what is missing or inadequate, understand what AIM can provide, and then formulate and verify a solution that provides value to the project team and to the organization.
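As one illustration of such a blended solution, the sketch below automates the sort of check a support manager might run against a configuration item log: every stored item should carry the approvals that the AIM flow calls for. The item layout and approval names are assumptions made for illustration, not the literal TSP+ form definitions (CIR, LOGCI).

```python
# Sketch: audit that each stored configuration item (CI) carries the
# approvals the AIM flow expects before release. Field names here are
# illustrative assumptions, not the literal TSP+ CIR/LOGCI form fields.

REQUIRED_APPROVALS = {"product_owner", "quality_manager", "process_manager"}

def audit_configuration_items(ci_log):
    """Return (ci_name, missing_approvals) pairs for non-compliant items."""
    findings = []
    for ci in ci_log:
        missing = REQUIRED_APPROVALS - set(ci.get("approvals", []))
        if missing:
            findings.append((ci["name"], sorted(missing)))
    return findings

# Hypothetical log entries for two configuration items.
ci_log = [
    {"name": "build-script v1.4", "approvals": ["product_owner",
                                                "quality_manager",
                                                "process_manager"]},
    {"name": "SRS draft 0.9",     "approvals": ["product_owner"]},
]

for name, missing in audit_configuration_items(ci_log):
    print(f"{name}: missing approvals {missing}")
```

Whether such a check runs inside an existing CM tool or over the paper forms, the point is the same: the team verifies its own conformance rather than waiting for an appraisal to discover a gap.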

3.8.2 Process and Product Quality Assurance (PPQA) and GP 2.9

The subject of QA (quality assurance) is often a sensitive one, especially with development teams. No activity in an organization seems more prone to abuse, whether by a mindless check-the-box mentality or by a punitive, audit-like approach. In fact, a common anxious question in PSP and TSP training is, "What is management going to do with this data?" The AIM approach to quality assurance is multi-layered and somewhat diffuse and, with proper safeguards in place, should co-exist smoothly with existing QA practices and allay any concerns about management's use of detailed personal data. Rather than implementing the somewhat obvious suggestion in PIP QA-1, AIM emphasizes the use of existing roles and processes, in a manner fully consistent with their intended use, to ensure that processes are consistently executed in order to deliver excellent products.

As mentioned previously, process quality assurance and product quality assurance are addressed separately but in similar manners. Both begin with the individual team member, who executes a defined, measured process to produce a product to a specified standard. Parts of that process typically specify personal reviews by the developer, team inspections, and some level of testing by the developer (product checks). The process manager role monitors compliance with the plan (process checks). The team leader, who is ultimately responsible to management for the product, and the TSP coach, who is ultimately responsible for the process, each have their own overlapping checks. The team leader runs the weekly data-driven team meetings, sees how the product development is progressing, and hears reports from the process, quality, and test managers. The TSP coach, following a coaching plan developed during the launch, sits in on early weekly meetings (and occasionally may do so later in the project) and reviews workbooks on a periodic basis. On request from a role manager or the team leader, or based on the workbook review, the coach may conduct a TSP checkpoint, with or without a formal report to management, in order to identify any issues with gathering data or following the agreed-to processes. The coach may also do specific role-based coaching for the team leader and team members in the use of a tool or specific processes, or with respect to executing a particular team role. Significant artifacts for a SCAMPI appraisal include role descriptions for the TSP coach, team leader, team members, and other relevant roles; workbooks that document and track the coaching plan and the various team roles; weekly meeting minutes that record the role manager reports and any QA issues; and the TSP coach's checkpoint reports.

This, then, is the basic model for QA in AIM: three tiers of overlapping checks, first by the individual team member against his or her own process and plan, second by the team roles, and finally by the team leader and TSP coach. Non-compliances can be reported at any level and escalated as necessary; however, the emphasis and preference is that non-compliances are corrected immediately by the people involved. If a process cannot be followed consistently, it is changed. If a tool is being used improperly, remedial instruction is provided immediately. If a team is generally unable to follow its plan, usually this is because the plan is outdated or based on false assumptions, in which case a replan or relaunch is necessary.
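A minimal sketch of the first tier, assuming a simplified task-record layout (real TSP tool workbooks differ): flag tasks whose records lack the data a defined, measured process requires, so that non-compliances surface immediately to the people involved.

```python
# Sketch: first-tier process check over a member's task records.
# Field names are illustrative assumptions; real TSP tool workbooks differ.

def process_check(tasks):
    """Flag tasks missing the data a defined, measured process requires."""
    issues = []
    for t in tasks:
        if t.get("actual_minutes") is None:
            issues.append((t["name"], "no time logged"))
        if t["phase"] in ("personal review", "inspection") \
                and t.get("defects_recorded") is None:
            issues.append((t["name"], "no defect data for a review phase"))
    return issues

# Hypothetical workbook extract.
tasks = [
    {"name": "parser design review", "phase": "personal review",
     "actual_minutes": 45, "defects_recorded": 3},
    {"name": "parser code inspection", "phase": "inspection",
     "actual_minutes": 90, "defects_recorded": None},
]

for name, problem in process_check(tasks):
    print(f"{name}: {problem}")
```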
This model is also compatible with a more traditional approach to quality assurance. The typical AIM team creates detailed records of its activities as a matter of course and on a daily basis, and early organizations implementing CMMI (and before it, SW-CMM) certainly took this traditional approach, utilizing existing QA personnel and procedures [Wall 2007]. However, this did not change

or diminish the approach described above. AIM teams take responsibility for the quality of what they deliver, and recognize explicitly that such quality relates directly to the processes that they use, and to the fidelity and discipline with which they execute those processes.

3.8.3 Decision Analysis and Resolution

Decision Analysis and Resolution (DAR) is a somewhat odd process area. It has no obvious relation to generic practices, unlike the other support PAs. It lives alone in the support category at ML3. Yet in some ways, it is the most useful and ubiquitous PA of all. Consider that an organization just starting out on a process improvement effort must, rather obviously, decide what improvement model or models to use as a guide, and what methods to use to implement those models. This seems like an important enough decision to establish evaluation criteria and methods, identify and evaluate alternatives, and then select from among those alternatives how to proceed. In other words, most of DAR can be executed before the effort formally begins.

In fact, this pattern repeats over and over again in a process improvement setting, a project management setting, or a development setting. Any decision important enough in terms of money, personnel, strategic direction, or any number of business or technical considerations can benefit from a formal decision analysis process consistent with DAR, and many such processes are possible. TSP+ provides a script and accompanying form to guide a fairly generic DAR capability, but many different decision analysis disciplines may (and perhaps should) exist in any organization. Sometimes an entire project is launched that recognizably fits the DAR specific practices, for example, to choose between two competing high-speed communications products for a business-critical real-time transaction-processing system. The organization's PG and its TSP coach should be aware of the DAR script and use it as appropriate, while keeping in mind that it is one minimally acceptable example of formal decision analysis. When building project plans for development teams and for the PG, decision points can often be identified that would benefit from the use of DAR, while incidentally creating the artifacts of interest to a SCAMPI team. Tool decisions, project strategy decisions, architectural decisions, AIM rollout decisions: all of these are opportunities to exercise the DAR script and benefit from the formal analysis.
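To show what a generic DAR evaluation can look like in data terms, the sketch below scores alternatives against weighted criteria, the pattern underlying most formal decision matrices. All criteria, weights, and scores are invented for illustration; the TSP+ DAR script and form define their own layout.

```python
# Sketch of a weighted decision matrix, the pattern underlying most
# formal DAR evaluations. All criteria, weights, and scores here are
# invented for illustration; the TSP+ DAR script/form define their own.

criteria = {"throughput": 0.4, "vendor support": 0.25,
            "integration cost": 0.2, "maturity": 0.15}

# Scores on a 1-5 scale for two hypothetical communications products.
alternatives = {
    "Product A": {"throughput": 5, "vendor support": 3,
                  "integration cost": 2, "maturity": 4},
    "Product B": {"throughput": 4, "vendor support": 4,
                  "integration cost": 4, "maturity": 3},
}

def weighted_score(scores):
    """Sum of criterion weight times the alternative's score."""
    return sum(weight * scores[c] for c, weight in criteria.items())

# Rank alternatives from highest to lowest weighted score.
for name, scores in sorted(alternatives.items(),
                           key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The value of the formality lies less in the arithmetic than in the recorded criteria, weights, and alternatives, which are exactly the artifacts a SCAMPI team looks for.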
3.9 Stakeholder Involvement GP 2.7

PIP PP-3 in Appendix D points out a paper weakness in the previous version of TSP, namely that there was no consistent documented method of planning the involvement of identified stakeholders. The phrase "paper weakness" is used because, in practice, stakeholder involvement was typically a strength of TSP: the transparency of its project status simplified the involvement of relevant parties at critical points in the project. However, "typically" does not mean "always," and almost every active TSP coach can name an instance when some relevant stakeholder slipped through the planning cracks, which caused a problem later on. The Stakeholder Involvement chart in Appendix A describes the AIM approach to identifying and planning interactions with important stakeholders by linking their involvement with specific TSP+ process elements (form RSIM, Relevant Stakeholder Involvement Matrix) to specific TSP+ team roles (form SRAM, Stakeholder Role Assignment Matrix). (A complete list of TSP+ process elements is provided in the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010].) This reflects a common way that TSP teams have dealt with their stakeholders in the past but, as with most of AIM, it is not intended as the exclusive method of dealing with this issue. Many

robust project management disciplines, formal and informal, explicitly recognize the importance of stakeholder identification and involvement.

3.10 What About the Other GPs?

Generic practices (GPs) are, in some sense, part of the skeletal structure of the CMMI. Every PA includes them, and they form a logical, repeatable pattern that reflects an organizing philosophy underlying CMMI. Whether the activities to be performed are engineering, project or process management, or support, those activities should be assigned to properly trained and equipped personnel, be planned and tracked based on practices that have worked before, have important artifacts placed under appropriate configuration control, be subjected to objective scrutiny, be summarized for and reviewed by senior management, and then be made available for analysis and possible future use across the organization.

PSP training and TSP teams show their true power in this regard. The basics of using defined processes that are planned, performed, measured, and tracked are fundamental to working on a TSP team, and all of these are evident in the artifacts that such teams naturally produce. Obviously, this reflects well when a SCAMPI appraisal team looks at such a team's activities as measured against the yardstick of CMMI. However, the team is not working in this fashion in order to look good in the SCAMPI; it works this way because that gets the job done right the first time.

3.11 Engineering Practices

The issue of implementing engineering practices, especially in the light of a SCAMPI appraisal, is in some ways the easiest for AIM to address, and in some ways the hardest. It is easiest in the sense that, as will be explained below, there is relatively little in the way of standard AIM process assets to understand and implement. But it is hardest in the sense that the full implementation of engineering practices as specified in the PAs for Requirements Management (REQM, 5 SPs), Requirements Development (RD, 10 SPs), Technical Solution (TS, 8 SPs), Product Integration (PI, 9 SPs), Verification (VER, 8 SPs), and Validation (VAL, 5 SPs), plus the associated generic practices, could represent the largest amount of implementation effort by an organization using AIM.

PIPs ENGR-1, ENGR-2, ENGR-3, and REQM-1 (shown in Appendix D) describe the issues with the engineering-related TSP+ scripts. The process scripts that cover an idealized engineering life cycle, namely a CYCLE script (see Appendix B) that points to a DEV script for new development and a MAINT script for maintenance work, each of which in turn points to several other subordinate scripts for particular parts of each life cycle, are fairly high-level and generic, and eventually refer to artifacts such as an Engineering Requirements Specification (ERS), System Requirements Specification (SRS), or Software Design Specification (SDS). These artifacts have no existence in TSP outside of these scripts: no templates, no specifications, no exemplars, only names. The assumption is that these are placeholder scripts and artifacts that would be supplanted by an organization's existing engineering life cycles, including templates, exemplars, and specifications as needed. In most cases, this is exactly how the engineering scripts are used, with occasional reference to them as checklists to ensure that the local engineering processes used by the team didn't miss any items deemed important enough to be mentioned by AIM.
However, when evaluating the baseline TSP, or even TSP+, for SCAMPI C purposes, the paper-only evaluation came up short [Miluk 2010].

In deciding how to address these gaps, there is no intention within AIM to direct engineers, developers, and the teams that they work on. Rather, it is part of the team's job, as in standard TSP, to define the way that it works (which can mean simply using the organization's existing practices within the TSP management and measurement framework) and then to capture those definitions. The PG provides support as needed and then collects such definitions for possible inclusion in the organization's process asset library, along with the associated estimated and actual performance measurements.

So, while a large majority of process management, project management, and support CMMI practices are addressed directly by AIM, there are no shortcuts to implementing engineering practices in any given organization. The development team, the PG, the TSP coach, and the CMMI expert must work together, often with additional parties from within the organization. This is especially critical in larger organizations. As a general rule, the larger an organization becomes, the less likely it is that development teams, or even the PG, will have the entire scope of engineering practice within their direct control. Marketing is likely to be heavily involved in requirements elicitation and validation. A project management office may own critical pieces of the requirements management process. A test department may own the later parts of the life cycle. In these cases, the relevant parts of the life cycle guided by CMMI specific practices must be addressed by more conventional means; however, the development team is well advised to exercise the stakeholder involvement mechanisms described above to engage other groups and individuals constructively.

The good news here is that competent development organizations, almost by definition, will already be performing a significant number of the 45 CMMI engineering specific practices. AIM training and methods provide a generic practice framework that is known to work with a wide range of methods, from agile practices of all descriptions to the most demanding aspects of formal architecture, implementation, and security methods. The first pilot projects are usually critical for characterizing engineering practices by providing the disciplined framework that focuses on producing the artifacts and activity trail that SCAMPI appraisals look for.

3.12 Additional Guidance for TSP Teams

For organizations with a pre-existing TSP implementation, even the best of such implementations must look for the differences in TSP+. These begin with launch preparations, which have become more extensive and explicit, in part to ensure that CMMI-visible practices move from the category of "implicit/usually done" to "explicit/always done." The experienced TSP coach, team leader, and development team should pay particular attention to the updated PREPL checklist and the more recent PREPT script. It is also advised that the TSP coach and the team's process manager take time to study the changes made to existing TSP scripts, forms, specifications, and so on, as depicted in the activity charts shown in Appendix A. In a similar vein, TSP role-based activities whose planning and tracking were previously left to the discretion of teams and individuals (whether done formally or informally) are now directed to be captured formally.
The original role manager responsibilities, with only a few updates, are essential to the performance of many CMMI specific and generic practices, as indicated throughout the Guide for SCAMPI Appraisals: Accelerated Improvement Method (AIM) [Miluk 2010]. This also puts added emphasis on the roles of the TSP coach and team leader and has led, in the case of the TSP coach, to a coach role specification. Thus TSP+ corrects an historical irony, namely the lack of a TSP role description

for the coach. The TSP coach role is described in an entire book [Humphrey 2006] and in the five-day SEI course TSP Coach Training, but previously had no specification within TSP.

3.13 Concerning Six Sigma

Six Sigma has a long pedigree in manufacturing, longer than CMMI has been used for software and systems engineering (even if one counts the CMM for Software along with CMMI). Training programs for the various colors of Six Sigma belts vary widely, but are nevertheless grounded in the same basic statistical and procedural techniques incorporated into CMMI, especially at the higher maturity levels, and are widely available. Tens of thousands have been trained in and have used these techniques with extraordinary results.

Yet very little specific guidance is given here for the use of Six Sigma methods beyond the most general suggestions of timing and expertise. The reasons for this are likely obvious to Six Sigma Black Belts and other experts, who know only too well that the proper application of the methods requires a sophisticated understanding of the particulars of the organizations, the projects, the processes, and the data available. By the time such an expert knows enough about such things to guide intelligently, that person needs no further general instruction from this document.

However, a few more words to others involved in such an effort may be useful. As discussed below, Six Sigma provides a likely bridge to CMMI high maturity once the basic goals of AIM (achieving high performance in business terms with solid CMMI ML3 conformance) are met. However, some Six Sigma techniques can also be of great value in achieving those goals in the first place. For example, Voice of the Customer techniques can be used to good effect for at least the Requirements Development and Validation practices. This is, of course, only one example. Consult the local Six Sigma Black Belt for further guidance.

3.14 Preparing for High Maturity

As mentioned previously, the targeted CMMI scope of AIM excludes maturity levels 4 and 5, collectively and commonly referred to as high maturity. The reasons for this exclusion are historical and practical. Historically, while Six Sigma methods have been used successfully to move organizations to high maturity and to enhance the operations of those already at ML5 [Stoddard 2008], the majority of TSP-using organizations that have achieved ML4 or better did so with the CMM for Software (SW-CMM) as a reference model. This document relies heavily on the experiences of TSP users for useful information. The experience base for CMMI high maturity using TSP is simply insufficient.

Even assuming that the information were available, practical considerations argue against including high maturity guidance here. CMMI high maturity requires that baselines and models specific to the data and processes of a particular organization be accumulated and developed. It seems likely that effective guidance for such activities would be quite lengthy and, in light of the discussion above concerning Six Sigma's use, possibly unnecessary.

However, it also seems clear that the detailed data provided by TSP teams provides a more solid foundation for building the baselines and rich statistical models for CMMI high maturity practices than measurement regimens built only on team- or project-level data. For example, individual time and defect data are used to update personal review checklists using only the most basic statistical

techniques. Not only is such data already an invaluable business and professional asset, but the potential of using it, aggregated across and between project teams and properly segregated and analyzed using Six Sigma and other techniques, seems enormous.
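The kind of basic technique meant here can be as simple as a Pareto count over a developer's own defect log: the most frequent defect types earn a place on the personal review checklist. The defect-type labels below are illustrative.

```python
# Sketch: the "most basic statistical technique" for updating a personal
# review checklist -- a Pareto count of a developer's own defect log.
from collections import Counter

# Hypothetical personal defect log (types are illustrative labels).
defect_log = ["interface", "logic", "interface", "syntax", "logic",
              "interface", "data", "logic", "interface"]

def checklist_items(log, top_n=3):
    """Return the defect types that most deserve a checklist entry."""
    return [dtype for dtype, _ in Counter(log).most_common(top_n)]

print("Add to personal review checklist:", checklist_items(defect_log))
```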

Appendix A: Activity Charts for Selected Processes

The following activity charts are a useful aid to understanding several of the new process capabilities within TSP+. Note that they do not cover all of the new capabilities. The charts generally show how affected roles (listed at the top of each chart) use a script, form, or checklist to perform the intended activities. The activities have been designed to add value to team activities while complying with relevant CMMI practices.

Table 2: Selected TSP+ Activity Flows

Configuration Management (CM)
Description: Two charts documenting the intended flow of configuration management activities
Affected TSP roles: Support manager, team member/product owner, quality and process managers, CCB (Configuration Control Board, team level)

CM Configuration Change Requests
Description: Documents the flow for formal change requests
Affected TSP roles: Requestor, support manager, affected project team, CCB, affected product owner or role manager

Stakeholder Involvement
Description: Documents the usage of the new stakeholder involvement mechanisms
Affected TSP roles: Team members, team leader, process group, team management, process manager

Process Definition
Description: Four charts documenting the many interactions between the various roles involved in formal process definition
Affected TSP roles: Team members, process and support managers, process group, management

Organizational and Project Training
Description: Four charts showing the many interactions between the various roles involved in planning and tracking training activities
Affected TSP roles: Team members, team leader and/or management, training manager (new role on the process group), process group, training attendees and instructors

Periodic Review of Training Matrix
Description: An additional chart for training activities to close the loop on the training activities
Affected TSP roles: Team members, team leader and/or management, training manager (new role on the process group), process group, training attendees and instructors

Decision Analysis and Resolution
Description: One way to perform DAR, certainly not the only way
Affected TSP roles: Team or team member, decision owner, decision participants and stakeholders, process manager

[Activity chart: Configuration Management, part 1 of 2. Roles: team member/product owner/team; support manager (expanded role); quality manager (expanded role); process manager; Configuration Control Board (CCB). The chart traces configuration planning from the Team Launch Preparation Checklist (PREPT) through launch meetings 3, 6, and 9 (scripts LAU3, LAU6, LAU9; form CIBPS, updated at each meeting, with the Management Briefing Guidelines as an alternative at meeting 9), and then the release flow for a configuration item: when the criteria to configure an item are met, the team member completes a Configuration Identification Release form (CIR) and obtains approvals; the product owner concurs with the release, the quality manager confirms a quality product, the process manager confirms it was created by following the process, and the CCB approves or returns the item to the team member for rework.]

[Activity chart: Configuration Management, part 2 of 2. Same roles as part 1. After approval, the support manager stores the configuration item (CI); if configuration change requests were implemented, the Configuration Change Request form (CCR) is updated with the implementation date and the Configuration Change Request Log (form LOGCCR) is updated with the CI status; the Configuration Item Log (form LOGCI) and form CIBPS are updated, all forms are stored, and a CM audit is performed (script CMAUDIT).]


[Activity chart: Stakeholder Involvement. Roles: team member/team; team leader; process group; management; planning manager. During launch preparation (script and checklist PREPT), the team leader reviews the Relevant Stakeholder Involvement Matrix (form RSIM) with the team, process group, and management for completeness, and individuals are mapped to the roles identified in form RSIM using the project's Stakeholder Role Assignment Matrix (form SRAM); the team reviews and updates both forms in launch meeting 8 (script LAU8); management and the process group review the RSIM and SRAM for completeness and approval in launch meeting 9 (script LAU9); and the approved RSIM and SRAM are stored in the project notebook.]


[Activity chart: Process Definition (first of four charts). Roles: team member/team; process manager; process group (new role); management; support manager. The process manager leads the team in defining the development process by reviewing the PSSP (form CIBPS) to determine whether the standards meet project needs and how they must be tailored or deviated from (Standard Process Deviation Summary, form SUMPD); the team plans to obtain process deviation approvals and to develop, modify, or obtain each missing process element, updating form CIBPS; proposed changes to the PSSP are prepared in launch meeting 8 (script LAU8, in accordance with the TSP Management Briefing Guidelines) and presented in launch meeting 9 (script LAU9) using form SUMPD; management and the process group lead initially agree to the proposed changes pending formal process deviation approvals (form SPDR), and the process elements affected by each deviation are documented, including the rationale for the deviation and the evaluation criteria for approval.]


[Activity chart: Organizational and Project Training (new specification TRN), waiver flow. Roles: team member/team; training manager (new role); team leader or management; training attendee or instructor; process group. A member who has not met all training requirements but has received similar training (or does not need the training) submits a Training Waiver Request form (TRNWR); the training manager and the team leader or management evaluate the request; an approved waiver is recorded in the Team Member Training Log (form LOGTRNM), while a denied request means the member must plan to take the required training; the training manager notifies members of any additional unsatisfied training requirements.]

[Activity chart: Organizational and Project Training, completion and survey flow. After training is delivered, the training manager updates all individual training logs (form LOGTRNM) and sends each attendee a Training Survey form (TRNSUR); responses are collected into the Training Survey Summary form (SUMTRNS), which the instructor, process group, and management review for consideration in future training improvements; the SUMTRNS, TRNSURs, and associated TRNSI forms are stored, and the Training Request Log (form LOGTRNR) is updated; members are notified of any unsatisfied training requirements.]

[Activity chart: Organizational and Project Training, alternative-training flow. When a requirement can be met by on-the-job, computer-based, or self-directed training, the member submits an On-the-Job Training form (TRNOTJ); the training manager and the team leader or management evaluate and approve or deny the request; results are recorded in the Team Member Training Log (form LOGTRNM), and the training manager notifies members of any additional unsatisfied training requirements.]

[Activity chart: Organizational and Project Training, periodic review of the Training Matrix. The training manager must periodically review the Training Matrix (form TRNM) with teams, projects, the process group, and management for completeness and approval; any changes made to form TRNM are propagated to the individual Team Member Training Logs (form LOGTRNM), and members are notified of unsatisfied training requirements.]
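A minimal sketch of the gap check implied by the review above, assuming simplified layouts for the matrix and logs (the actual TRNM and LOGTRNM forms are richer): compare each member's completed training against the requirements for that member's role and report what remains unsatisfied.

```python
# Sketch of the training-matrix review implied by the chart above.
# Matrix and log layouts are illustrative, not the TRNM/LOGTRNM forms.

training_matrix = {          # role -> required courses (illustrative)
    "developer": {"PSP Fundamentals", "TSP Team Member"},
    "team leader": {"TSP Team Member", "Leading Development Teams"},
}

training_logs = {            # member -> (role, completed or waived courses)
    "Lee": ("developer", {"PSP Fundamentals"}),
    "Pat": ("team leader", {"TSP Team Member",
                            "Leading Development Teams"}),
}

def unsatisfied_requirements(matrix, logs):
    """Return member -> still-required courses, omitting satisfied members."""
    gaps = {}
    for member, (role, done) in logs.items():
        missing = matrix[role] - done
        if missing:
            gaps[member] = sorted(missing)
    return gaps

for member, courses in unsatisfied_requirements(training_matrix,
                                                training_logs).items():
    print(f"Notify {member}: still required {courses}")
```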


Appendix B: TSP+ Scripts for Process Operations and Team Operations

The high-level scripts guiding the process team and the other development teams in an organization are included here to aid TSP coaches and TSP-trained process groups in understanding their roles in the overall AIM implementation process.

TSP+ Process Operations Script POPS

Purpose:
- To guide an organization in improving its software performance
- To couple the TSP, CMMI, and Lean Six Sigma improvement initiatives

Entry Criteria: Management seeks to improve the performance of its software-related operations.

General: This script assumes a TSP/CMMI implementation. The use of Lean Six Sigma methods and tools is optional.

Step 1: Contact
- Determine how the TSP and CMMI can address management's concerns.
- Talk with working-level engineers and managers to build interest.

Step 2: Awareness
- Expose senior management to the opportunity.
- Describe the potential benefits and the operating-level interest.
- Provide credible references.

Step 3: Obtain Sponsorship
- Hold a one-day executive seminar for the senior managers and executives, which includes a CMMI overview.
- Hold a half-day planning workshop to identify initial areas for TSP trial use.
- Identify a manager to be responsible for the transition plan and execution.

Step 4: Develop Transition Plan
The responsible manager produces a plan for TSP trial use:
- arranges for qualified training and coaching support
- schedules managers and team members for training
- schedules initial TSP launches, checkpoints, and relaunches

Step 5: TSP Trial Use
The organization conducts trial TSP projects (TOPS script):
- trains the engineers and managers
- launches the teams and regularly reviews performance
- identifies internal candidates to be initial PSP instructors and TSP coaches

Step 6: Evaluation
The organization assesses team and TSP performance and decides to proceed with the process improvement initiative:
- identifies a manager to lead the long-term improvement effort and become the process group team lead
- allocates initial resources
- issues a policy describing the process improvement initiative and its importance to the business
- defines management responsibilities

Step 7: Adoption: Process Group Formation
The process group team lead, with the help of the TSP coach (POPS7 script):
- produces a process improvement plan proposal
- reviews this plan proposal with management and gets their approval
- recruits a staff and trains that staff in CMMI, PSP, and TSP
- launches the process group team to plan and execute the process improvement plan proposal

Step 8: Adoption: Institutionalize TSP
Working with project management, the process manager:
- develops a TSP introduction plan and schedule for each project team
- assists projects in launching and running TSP projects (TOPS script)

Step 9: Continuing Improvement
Using the project's needs as a guide and reviewing the Process Group Roles and Responsibilities specification, set priorities and build and execute a plan (script CYCLE) based on organizational business objectives for:
- defining team and organizational processes
- establishing and maintaining a process asset library

63 Continuing Review - providing continuing training and coaching guidance - obtaining needed tools and methods - providing tool and method training and support - assessing the organization annually to identify further improvement needs Apply Lean Six Sigma methods and tools to improve current process performance. Management annually reviews the organization s software operations. - analyzes cost and benefit data - obtains customer, manager, and engineer feedback on improvement results adjusts the improvement program to address identified problems and capitalize on new improvement opportunities during Meeting 1 of the process group s (re)launches (script CYCLE) CMU/SEI-2010-SR

TSP+ Process Group Formation Script POPS7

Purpose
To guide the process group (PG) team lead in planning for and staffing a process improvement group

Entry Criteria
- The organization has initiated a software process improvement program.
- Management has issued a process improvement policy statement and named a responsible manager.
- Initial process improvement resources have been allocated.

Step 1: Define Responsibilities
The PG team lead, with help from the TSP coach
- documents the principal responsibilities of the job and proposed group (specification Process Group (PG) Roles and Responsibilities)
- reviews PG roles and responsibilities with existing staff groups (for example, configuration management, quality assurance, and test)
- reviews PG roles and responsibilities with development department managers
- revises PG roles and responsibilities based on the review results
- reviews PG roles and responsibilities with senior management for approval

Step 2: Develop Improvement Plan
The TSP coach works with the PG team lead in developing a proposal for the process improvement work.
- tasks to be performed
- training, support, and assistance to be provided
- resources needed, both full and part time
- proposed recruiting schedule
- proposed task schedule

Step 3: Obtain Plan Approval
The PG team lead
- reviews the proposal with existing staff groups (for example, configuration management, quality assurance, and test)
- reviews the proposal with development department managers
- obtains agreement from the development and staff groups to provide the needed part-time process improvement resources
- revises the proposal based on the review results
- reviews the proposal with senior management for approval

Step 4: Recruit Initial Staff
The PG team lead recruits the process staff.
- obtains a core staff of process experts (trained and experienced if possible)
- utilizes part-time support from the development groups where planned
- recruits experienced professionals from internal development groups where possible
- maintains a mix of engineering, process, technology, and management skills

Step 5: Train Initial Staff
Using internal skills where possible, the PG team lead trains the process staff in the key process improvement technologies.
- the Capability Maturity Model (CMMI, P-CMM, and so forth)
- the Personal Software Process (PSP)
- the Team Software Process (TSP)
- Lean Six Sigma methods and tools for subsequent improvement analysis
- the principal tools and methods used in the organization

Step 6: Process Group Team Launch
The process group launches in order to plan and execute the approved process improvement proposal (script LAU).

Exit Criteria
The PG has successfully launched.

TSP+ Team Operations Script TOPS

Purpose
To guide managers, teams, and engineers in introducing and using the TSP process

Entry Criteria: Trial Use
- Senior management has participated in the TSP executive strategy seminar and planning workshop and supports TSP introduction.
- An initial TSP trial program has been approved.

Entry Criteria: TSP Adoption
To move beyond trial use and start broad TSP introduction
- TSP trial use has been successful.
- A TSP adoption plan has been developed and approved.
- Initial PSP instructors and TSP coaches have been selected and are scheduled for training.

Entry Criteria: TSPm Initial Use
To launch a TSPm multi-team or distributed multi-team
- TSP trial use has been successful.
- The organization has at least one authorized TSP coach on its staff, and enough additional coaches are to be trained to support team operation.

General
To use the TOPS script, one or more of the entry criteria must be satisfied.

Step 1: Team Formation and Training
For each TSP or TSPm team, a project is identified and the staff has been trained.
- All managers on each project or in its management chain are TSP trained before the launch.
- All project software professionals are PSP trained before the launch.
- All other project professionals are trained in the personal process before the launch.

Step 2: Launch Preparation
- Prepare to launch each TSP or TSPm team (checklist PREPL).
- For each TSPm multi-team or distributed multi-team, also follow scripts PREP and PREPW during launch preparation.

Step 3: TSP Cycle
Follow script CYCLE until project conclusion in order to guide the team in
- launching the project
- executing the team's detailed planning
- undergoing a checkpoint (script CHECKPOINT)
- conducting a phase, cycle, or project postmortem
- preparing for subsequent relaunches (if needed)

Step 4: Multi-Team (TSPm) Project Operations
For multi-teams only, follow script TOPS4 concurrently for each sub-team following the TSP cycle (script CYCLE) in order to manage the sub-team interdependencies and the overall project.

Exit Criteria
- Project completed with team and team member plan and actual data
- Project data filed in the project notebook (specification NOTEBOOK)
- Final project report prepared and presented to management

TSP+ Project Operations Script TOPS4

Purpose
To guide managers, team leaders, and team members in managing a TSPm project

Entry Criteria
The project has completed a launch or relaunch.

Step 1: The Leadership Team Launch
Following the project (re)launch and under the guidance of a qualified TSP coach, the leadership team holds a one-day launch (script LTL).
- reviews the organization's and this project's goals
- develops the leadership team's management strategy
- defines the role manager teams' goals and responsibilities
- allocates tasks and responsibilities among the leadership team members

Step 2: Leadership Team Operations
Following the leadership team launch, the leadership team
- manages the project and each sub-team in performing its work
- holds weekly meetings to review project status and issues (script WEEKL)
- provides guidance to the role manager team launches (script RTL)
- meets at least monthly with each role manager team (script WEEKLR)
- regularly reports to senior management and the customer on project status and progress (specification STATUS)

Step 3: Sub-team Operations
Following the team and sub-team (re)launches, each sub-team follows its defined process and detailed plan in doing its work.

Step 4: Role Manager Team Launches
Soon after the team (re)launch and under the guidance of a TSP coach, each role manager team holds a two-day launch (script RTL).
- the leadership team defines its goals for the role manager team
- the role managers establish their strategy and plan to meet these goals
- the role managers allocate tasks and responsibilities among team members
- the role managers prepare and review their plan with the leadership team

Step 5: Quarterly or Monthly Management Reviews
The leadership team and senior management periodically
- review project status, progress, and projections
- assess the team for quality-level performance
- assess the team for expert-level performance
- identify issues and problems and assign responsibilities

Step 6: Periodic Customer Reviews
The leadership team regularly reviews status and progress with the customer.
- planned versus actual performance
- outstanding issues and problems, actions planned, and assistance needed

Exit Criteria
- Project or project phase completed with team and team member plan and actual data filed in the project notebook (specification NOTEBOOK)
- Final project report prepared and presented to management

TSP Cycle Script CYCLE

Purpose
- To guide teams through the use of a defined and structured process, with repeatable and measurable steps, that provides rapid feedback on the quality of the product and progress toward completion
- To guide teams in establishing a shared understanding of the work and how it is to be done, including a common understanding of the team goals, team member roles, the product or components to be produced, available resources and existing constraints, and measures of success
- To provide the mechanisms required for a team to practice self-management

Entry Criteria
- All team members have been adequately trained in the use of PSP and TSP.
- All team members and the team leaders have been identified and allocated to the project.
- A qualified TSP coach is available to guide and coach the team through the TSP cycle.

General
- Depending on the size and needs of the project, a TSP cycle can range from a few weeks to a few months.
- Depending on a project's overall duration and needs, the team may choose to use phases, cycles, or both in determining when it needs to conduct a (re)launch. A phase represents a part of the development lifecycle, such as the implementation phase, and a cycle represents the time between planning horizons. A phase can encompass several cycles, just as a cycle can encompass several phases.

Step 1: Team (Re)Launch (LAU/LAUm or REL/RELm)
- During the launch, the team learns from management what it is supposed to do, makes a plan for doing the desired work, and then reviews the plan with management. The two desired outcomes of the launch are an approved team plan for producing a particular product, comprising both the overall project plan and a detailed next-phase plan, and a jelled, self-directed team.
- During a relaunch (script REL or RELm), the team members update their overall plan and develop a new next-phase plan based on what they have done since the initial launch or the prior relaunch. The team has already committed to management what it intends to do and, if that commitment is unchanged, the members do not need to repeat the management meetings. However, if the project has changed in any significant way (such as changes to the product requirements, the team membership, the project schedule, or the project scope), then the relaunch should be regarded as a new project launch, and all of the meetings (script LAU or LAUm) and activities should be held.

Step 2: Plan Execution
- The team executes the cycle plan created during the (re)launch, making updates or changes to the plan as necessary.
- The team uses scripts DEV and MAINT to guide it in developing, maintaining, and enhancing software-intensive products.
- The team meets weekly (script WEEK) to ensure that all team members understand current project status and know what to do next.
- The team leader conducts periodic management and customer status meetings (script STATUS).

Checkpoint
About a month into the TSP cycle or halfway through the cycle, whichever is shorter, the TSP coach leads the team through a checkpoint (script CHECKPOINT).

Step 3: Cycle or Project Postmortem
- The cycle postmortem is held before any subsequent launch or relaunch and includes only the data on the work completed during the earlier project phases or cycles. The focus of these postmortems is to evaluate interim project status and calibrate planning parameters to revise goals and improve performance in subsequent cycles (see script PM).
- The project postmortem is conducted at the end of the project and includes the full product or project data. Organizational process baseline data may be updated at this time (see script PM).

Exit Criteria
- A completed high-quality product
- A project summary report (see specification SUMMARY)
- PIPs for all identified process improvements

Appendix C: Goal-Question-(Indicator)-Metric Examples

The following GQ(I)M templates are in no sense the minimum necessary to fully understand and implement measurement under AIM; they are included here simply as examples of what can be done. The first indicator covers TSP-style earned value management, while the second gives a more traditional view of earned value at the organization level. A reasonably complete set of indicators would likely also include indicators for time on task, planned versus actual quality in several dimensions, and the TSP Quality Profile.

Date: 6-Sep-2005
Indicator Name/Title: Earned Value

OBJECTIVE
To determine the current schedule status of a project and to estimate a likely completion date.

QUESTIONS
- Where is the project with respect to its current and original schedule?
- When is the project likely to finish?

VISUAL DISPLAY
[Chart: cumulative earned value by week, 8/16/2004 through 2/14/2005, plotting the cumulative planned value, cumulative actual earned value, cumulative predicted earned value, and the baseline cumulative plan value.]

PERSPECTIVE
Project team, team leader, project manager, program manager

INPUTS

Data Elements (list all the data elements used in the production of the indicator):
- Estimated task duration (hours)
- Estimated hours available each week (hours)
- Estimated task completion date (week)
- Actual task duration (hours)
- Actual hours worked each week (hours)
- Actual task completion date (week)

Definition: Precisely define each data element used, or point to where the definition can be found.

DATA COLLECTION
- How: Manual
- When/How Often: As defined tasks begin and end; recorded contemporaneously
- By Whom: Individual developers
- Forms: Personal time log, personal task list

DATA REPORTING
- Responsibility for Reporting: Team members, team plan manager, team leader, project manager
- By/To Whom: By the project team to the team leader, by the team leader to the project manager, and by the project manager to the program manager
- How Often: Weekly

DATA STORAGE
- Where: Team project data folder (virtual project notebook)
- How: Network save with automatic backups
- Security: Only the team leader, team members, and appropriate IT personnel have access to raw data
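The data elements above map naturally onto a pair of small record types, one per task and one per week. The following Python sketch is illustrative only; the class and field names are assumptions, not part of the TSP forms.

    # Illustrative sketch (not from the report): one way to record the
    # data elements listed above. All names are assumptions.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TaskRecord:
        name: str
        estimated_hours: float                    # estimated task duration (hours)
        planned_finish_week: int                  # estimated task completion date (week)
        actual_hours: Optional[float] = None      # actual task duration (hours)
        actual_finish_week: Optional[int] = None  # actual task completion date (week)

    @dataclass
    class WeekRecord:
        week: int
        estimated_hours_available: float  # estimated hours available that week
        actual_hours_worked: float = 0.0  # actual hours worked that week, from the personal time log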

ALGORITHM
Individual earned values are computed as:
- [estimated EV per task] = [estimated time per task] / sum of ([estimated time per task] over all tasks)
- [actual EV per task] = [estimated EV per task]
- [weekly estimated EV] = sum of ([estimated EV per task] over all tasks estimated to finish that week)
- [weekly actual EV] = sum of ([actual EV per task] over all tasks actually finished that week)

ASSUMPTIONS
None

ANALYSIS
- Calculation of how many weeks ahead of or behind schedule the project is
- Calculation of how many weeks likely remain until the project finishes
- Helps to determine whether the project should be replanned or relaunched

INTERPRETATION
- Compare the current plan and projected EV track to the baselined EV projection (included on the chart).
- Compare to similar finished projects in the database.

PROBING QUESTIONS
- When did the team launch? Relaunch? Replan?
- Why did the baseline change? (Change in scope, poor estimates for size or hours available, poor estimates for productivity, change in team size)

EVOLUTION
Add guard bands to the projected EV line showing the earliest and latest dates possible given the raw data.

FEEDBACK GUIDELINES
None

X-REFERENCES
- Time on task
- Time distribution profile
- Defect distribution profile
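As a concrete illustration of the algorithm, the following Python sketch computes the weekly estimated and actual EV from a task list. The tuple layout and function name are assumptions introduced here for the example; note that a task earns its estimated value in the week it actually finishes, regardless of the hours it actually took.

    # Illustrative sketch of the EV algorithm above. Each task is given as
    # (estimated_hours, planned_finish_week, actual_finish_week or None);
    # the layout and names are assumptions, not part of the report.
    def weekly_earned_value(tasks):
        total_estimated = sum(est for est, _, _ in tasks)
        planned, actual = {}, {}
        for est, planned_week, actual_week in tasks:
            ev = est / total_estimated  # estimated EV per task, as a fraction of total work
            planned[planned_week] = planned.get(planned_week, 0.0) + ev
            if actual_week is not None:  # actual EV per task = estimated EV,
                actual[actual_week] = actual.get(actual_week, 0.0) + ev  # earned when the task finishes
        return planned, actual  # weekly estimated EV, weekly actual EV

    # Example: 40 estimated hours across three tasks; the week-2 task is unfinished.
    planned, actual = weekly_earned_value([(10, 1, 1), (20, 2, None), (10, 3, 2)])
    print(planned)  # {1: 0.25, 2: 0.5, 3: 0.25}
    print(actual)   # {1: 0.25, 2: 0.25}

Accumulating these weekly values produces the cumulative planned and actual EV curves shown in the visual display, and comparing the two curves week by week yields the weeks-ahead-or-behind analysis described above.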

Indicator Name/Title: Earned Value Management (Cost and Schedule)
Date: 6-Sep-2005

OBJECTIVE
To monitor contract performance for contracts that use Earned Value Management (EVM). This indicator tracks the Cost Performance Index (CPI) and the Schedule Performance Index (SPI) in relation to their target values.

QUESTIONS
Are the CPI and the SPI within their target areas?

VISUAL DISPLAY
[Chart: CPI and SPI plotted against the stoplight thresholds and target area below.]
- Green: CPI ≥ 1.0 and SPI ≥ 1.0
- Yellow: CPI < 1.0 or SPI < 1.0
- Red: CPI < 0.9 or SPI < 0.9
- Target area: CPI = 0.95 to 1.08; SPI = 0.95 to 1.08
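A minimal sketch of the stoplight rule follows. The report does not restate the index formulas; CPI = earned value / actual cost and SPI = earned value / planned value are the standard EVM definitions, and the function and variable names here are assumptions introduced for the example.

    # Illustrative sketch of the stoplight thresholds above. CPI and SPI use
    # the standard EVM definitions (CPI = EV / actual cost, SPI = EV / planned
    # value), which are assumed rather than quoted from the report.
    def evm_status(earned_value, actual_cost, planned_value):
        cpi = earned_value / actual_cost
        spi = earned_value / planned_value
        if cpi < 0.9 or spi < 0.9:
            color = "Red"
        elif cpi < 1.0 or spi < 1.0:
            color = "Yellow"
        else:
            color = "Green"
        in_target = 0.95 <= cpi <= 1.08 and 0.95 <= spi <= 1.08
        return cpi, spi, color, in_target

    print(evm_status(earned_value=95.0, actual_cost=100.0, planned_value=100.0))
    # (0.95, 0.95, 'Yellow', True)

Note that the thresholds and the target area overlap by design: a project can be Yellow on the stoplight scale (CPI or SPI slightly below 1.0) while still falling within the 0.95 to 1.08 target area.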