MEASURING PROCESS CAPABILITY VERSUS ORGANIZATIONAL PROCESS MATURITY


Mark C. Paulk and Michael D. Konrad
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890

This work is sponsored by the U.S. Department of Defense. Published in the Proceedings of the 4th International Conference on Software Quality, Washington, DC, 3-5 Oct 1994. JPO # 94-314.

Abstract

The SEI's Capability Maturity Model for Software has popularized the notion of measuring the software process maturity of organizations. ISO's SPICE project is currently creating a set of international standards for software process management that harmonize existing approaches. One of the objectives is to create a way of measuring process capability without relying on a specific approach such as the SEI's maturity levels. The approach selected is to measure the implementation and institutionalization of specific processes: a process measure rather than an organization measure. Maturity levels can be viewed as sets of process profiles using this approach. This paper discusses the capability levels being developed by SPICE and how the CMM's maturity levels would map onto them.

1 Introduction

The Capability Maturity Model for Software (CMM) developed by the Software Engineering Institute (SEI) has popularized the notion of measuring the software process maturity of organizations. Based on the work of the SEI and other appraisal method developers (e.g., Trillium, Software Technology Diagnostic, Bootstrap, Healthcheck, etc.), the International Organization for Standardization (ISO) is currently creating a set of international standards for software process assessment that harmonize existing approaches. One of the objectives of the ISO effort is to create a way of measuring process capability without relying on a specific approach to improvement such as the SEI's maturity levels. The approach selected is to measure the implementation and institutionalization of specific processes: a process measure rather than an organization measure. Maturity levels can be viewed as sets of process profiles using this approach. This paper discusses the capability levels being developed by ISO and how the CMM's maturity levels would map onto them.

1.1 The CMM

The CMM [1, 2, 3] describes the principles and practices underlying software process maturity and is intended to help software organizations improve the maturity of their software processes along an evolutionary path from ad hoc, chaotic processes to mature, disciplined software processes. The CMM is organized into five maturity levels.

At the Initial level (1), the software process is characterized as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort and heroics. At the Repeatable level (2), basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications. At the Defined level (3), the software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software. At the Managed level (4), detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled. At the Optimizing level (5), continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.

Except for the Initial level, each maturity level is decomposed into several key process areas that indicate the areas an organization should focus on to improve its software process. The key process areas at the Repeatable level focus on the software project's concerns related to establishing basic project management controls. They are Requirements Management (RM), Software Project Planning (SPP), Software Project Tracking and Oversight (SPTO), Software Subcontract Management (SSM), Software Quality Assurance (SQA), and Software Configuration Management (SCM). The key process areas at the Defined level address both project and organizational issues, as the organization establishes an infrastructure that institutionalizes effective software engineering and management processes across all projects. They are Organization Process Focus (OPF), Organization Process Definition (OPD), Training Program (TP), Integrated Software Management (ISM), Software Product Engineering (SPE), Intergroup Coordination (IC), and Peer Reviews (PR). The key process areas at the Managed level focus on establishing a quantitative understanding of both the software process and the software work products being built. They are Quantitative Process Management (QPM) and Software Quality Management (SQM). The key process areas at the Optimizing level cover the issues that both the organization and the projects must address to institutionalize continuous and measurable software process improvement. They are Defect Prevention (DP), Technology Change Management (TCM), and Process Change Management (PCM).

For convenience, the key process areas are organized by common features. The common features are attributes that indicate whether the implementation and institutionalization of a key process area is effective, repeatable, and lasting. The five common features are Commitment to Perform, Ability to Perform, Activities Performed, Measurement and Analysis, and Verifying Implementation.
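For readers who find it easier to see this structure as data, the level-to-area breakdown above can be laid out in a few lines of code. The sketch below is illustrative only: the maturity levels, key process area names, and common features come from the text above, while the Python layout and the helper function are conveniences of this sketch, not part of the CMM itself.

```python
# Illustrative sketch only: the CMM's structure as plain data, using the
# maturity levels and key process area names listed above. The nesting is a
# simplification for exposition, not an official SEI data format.

CMM_KEY_PROCESS_AREAS = {
    2: ["Requirements Management", "Software Project Planning",
        "Software Project Tracking and Oversight", "Software Subcontract Management",
        "Software Quality Assurance", "Software Configuration Management"],
    3: ["Organization Process Focus", "Organization Process Definition",
        "Training Program", "Integrated Software Management",
        "Software Product Engineering", "Intergroup Coordination", "Peer Reviews"],
    4: ["Quantitative Process Management", "Software Quality Management"],
    5: ["Defect Prevention", "Technology Change Management",
        "Process Change Management"],
}

# Every key process area is organized by the same five common features.
COMMON_FEATURES = ["Commitment to Perform", "Ability to Perform",
                   "Activities Performed", "Measurement and Analysis",
                   "Verifying Implementation"]

def key_process_areas_through(level: int) -> list[str]:
    """Key process areas an organization must satisfy to reach a maturity
    level: all areas at that level and every level below it (level 1 has none)."""
    return [kpa for lvl in sorted(CMM_KEY_PROCESS_AREAS)
            if lvl <= level
            for kpa in CMM_KEY_PROCESS_AREAS[lvl]]

print(len(key_process_areas_through(3)))  # 13 key process areas apply at level 3
```

An actual satisfaction check for a maturity level would, in addition, examine each common feature of every applicable key process area; the sketch only captures the grouping.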

The CMM document design criteria include templates that describe the practices each common feature would normally be expected to contain. For example, Verifying Implementation typically contains practices on senior management oversight, project management review, and SQA audits. The Activities Performed common feature contains the process-specific practices that make the process unique.

Because the key process areas reside at a single level, as processes mature they are described in different key process areas at different levels. For example, Software Project Planning and Software Project Tracking and Oversight at level 2 evolve into Integrated Software Management at level 3, are further refined in Quantitative Process Management at level 4, and are an aspect of Process Change Management at level 5. Processes that are not described in the CMM at a particular maturity level may nonetheless exist and evolve. For example, the engineering processes for requirements analysis, design, code, and test are not described until the Defined level, yet organizations at lower maturity levels do perform these activities. The CMM focuses on systematically attacking the major problems facing organizations at each maturity level; thus the CMM describes key process areas and key practices rather than every process. As one moves up the maturity levels, the number of key process areas drops off noticeably at levels 4 and 5. One reason is that the higher-level areas can be applied generically to the activities described in the lower-level areas.

1.2 ISO's SPICE Project

ISO is developing a suite of standards on software process assessment under the rubric of SPICE (Software Process Improvement and Capability dEtermination) [4]. The core set of SPICE products comprising the proposed software process assessment standard is:

- Introductory Guide
- Baseline Practices Guide
- Assessment Instrument
- Process Assessment Guide
- Process Improvement Guide
- Process Capability Determination Guide
- Assessor Training and Qualification Guide

The Baseline Practices Guide (BPG) defines, at a high level, the goals and fundamental activities that are essential to good software engineering. The BPG is the SPICE equivalent of the SEI's CMM. The BPG describes what activities are required, not how they are to be implemented. The baseline practices may be extended through the generation of application- or sector-specific Practice Guides to take account of specific industry, sector, or other requirements. The CMM is an example of a sector-specific Practice Guide for large, software-intensive projects and organizations. The product managers for the BPG are Al Graydon (Canada) and Mark Paulk (USA). The SPICE products are planned to be released as ISO technical reports in early 1995.

A pilot testing period will follow, and the revised products will be submitted for ballot as ISO standards, as appropriate, in the 1996 time frame. At this writing, the BPG has not been published as an ISO technical report; the discussion in this paper is therefore based on a draft report, which is reasonably stable at this time.

2 Assessing Process Capability in the BPG

The BPG categorizes processes into five process areas. The Customer-Supplier process area consists of processes that directly impact the customer, support development and transition of the software to the customer, and provide for its correct operation and use. The Engineering process area consists of processes that directly specify, implement, or maintain a system and software product and its user documentation. The Project process area consists of processes that establish the project, and coordinate and manage its resources to produce a product or provide services which satisfy the customer. The Support process area consists of processes that enable and support the performance of the other processes on a project. The Organization process area consists of processes that establish the business goals of the organization and develop process, product, and resource assets which will help the organization achieve its business goals. The processes associated with each of these areas are summarized in Figure 1 later in this paper.

Each process in the BPG can be described in terms of base practices, which are the software engineering or management activities that address the purpose of a particular process (similar to the Activities Performed in the CMM). Process areas, processes, and base practices provide a grouping by type of activity. These processes and activities characterize performance of a process, even if that performance is not systematic. Performance of the base practices may be ad hoc, unpredictable, inconsistent, poorly planned, and/or result in poor-quality products, but those work products are at least marginally usable in achieving the purpose of the process. Implementing only the base practices of a process may be of minimal value and represents only the first step in building process capability, but the base practices represent the unique, functional activities of the process when instantiated in a particular environment.

The SPICE project did not want to tie the BPG directly to the SEI maturity levels. There are many possible ways of describing organizational and process evolution; there are several variations of the SEI's maturity levels (e.g., in Trillium) as well as radically different approaches. An architecture was proposed at the June 1993 plenary by Mark Paulk that applies the SEI's maturity level concepts at the individual process level. This proposed architecture for the BPG is expressed in terms of capability levels, common features, and generic practices. A capability level is a set of common features (sets of activities) that work together to provide a major enhancement in the capability to perform a process. Each level provides a major enhancement in capability over that provided by its predecessors in the performance of a process, and together the levels constitute a rational way of progressing through the practices.

Capability levels provide two benefits: they acknowledge dependencies among the practices of a process, and they help an organization identify which improvements it might perform first, based on a rational sequence of process implementation.

There are six capability levels in the BPG. The Not-Performed level (0) has no common features; there is general failure to perform the base practices of the process. At the Performed-Informally level (1), the base practices of the process are generally performed. The performance of these base practices may not be rigorously planned and tracked; performance depends on individual knowledge and effort, but there is general agreement that the action is performed as and when required. At the Planned-and-Tracked level (2), performance of the base practices in the process is planned, tracked, and verified. At the Well-Defined level (3), base practices are performed according to a well-defined process using approved, tailored versions of standard, documented processes. At the Quantitatively-Controlled level (4), detailed measures of performance are collected and analyzed, which leads to a quantitative understanding of process capability and an improved ability to predict performance and objectively manage the process. At the Continuously-Improving level (5), continuous process improvement against quantitative process effectiveness and efficiency goals is enabled by quantitative feedback from performing the defined processes and from piloting innovative ideas and technologies.

Note the similarity between these capability levels and the maturity levels in the CMM. The relationship is clear, but capability levels are distinctly different from maturity levels. Capability levels are applied on a per-process basis; organizational maturity levels can be defined as a set of profiles for these processes. It is also practical to add level 0: a specific process may not be performed at all.

A common feature in the BPG is a set of practices that address the same aspect of process implementation or institutionalization. BPG common features are clustered according to capability levels. A gap analysis can be performed on missing or incomplete features during an assessment to identify the major inhibitors to effective use of the process. Common features, if properly defined in the BPG, are always applicable to any process. A generic practice is an implementation or institutionalization practice (activity) that enhances the capability to perform any process. Capability levels, common features, and generic practices in the BPG are summarized in Table 1.

Table 1. Capability Levels, Common Features, and Generic Practices in the BPG.

Level 0, Not-Performed: (no common features)

Level 1, Performed-Informally:
- Base Practices are Performed: Perform the process.

Level 2, Planned-and-Tracked:
- Planning Performance: Allocate resources; Assign responsibilities; Document the process; Provide tools; Ensure training; Plan the process.
- Disciplined Performance: Use plans, standards, and procedures; Do configuration management.
- Tracking and Verifying Performance: Verify process compliance; Audit work products; Track with measurement; Take corrective action.

Level 3, Well-Defined:
- Defining a Standard Process: Standardize the process; Tailor the standard process.
- Performing the Defined Process: Use a well-defined process; Perform peer reviews; Use defined data.

Level 4, Quantitatively-Controlled:
- Establishing Measurable Quality Goals: Establish quality goals.
- Objectively Managing Performance: Determine process capability; Use process capability.

Level 5, Continuously-Improving:
- Establishing Process Effectiveness Goals: Establish process effectiveness goals; Continuously improve the standard process.
- Improving Process Effectiveness: Continuously improve the defined process.
- Preventing Defects: Perform causal analysis; Eliminate defect causes.

The generic practices characterize good process management that results in an increasing process capability for any process. A planned, well-defined, measured, and continuously improving process is consistently performed as the generic practices are implemented for a process. This process capability is built on the foundation of the base practices that describe the unique, functional activities of the process. A process can be characterized as being at a capability level if it has implemented the generic practices for that capability level and all lower capability levels.

3 Mapping Organizational Maturity onto Process Profiles

Mapping organizational maturity in the CMM into the BPG architecture should be viewed from two perspectives. First, conceptually the capability levels correspond closely to the maturity levels, and many of the generic practices correspond to key process areas in the CMM. For example, "Verify process compliance" and "Audit work products" correspond to the software quality assurance practices in Verifying Implementation in the CMM; "Standardize the process" corresponds to Organization Process Definition in the CMM; Objectively Managing Performance corresponds to Quantitative Process Management in the CMM; and Preventing Defects corresponds to Defect Prevention in the CMM. Second, some capability level 2 processes in the BPG correspond to maturity level 2 key process areas in the CMM, some capability level 3 processes correspond to maturity level 3 key process areas, etc. A profile of these correspondences is shown in Figure 1.
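The two rules just stated, that a process sits at a capability level only if it has implemented the generic practices for that level and all lower levels, and that an organizational maturity level can be defined as a set of process profiles, can be made concrete in a short sketch. The code below is illustrative only: the generic practice subsets are abbreviated from Table 1, the process names are taken from Figure 1, and the example target profile is a hypothetical simplification rather than the actual CMM-to-BPG correspondence shown in Figure 1.

```python
# Illustrative sketch only: rating one process's capability level from its
# implemented generic practices, then checking a maturity "profile" over
# several processes. The practice sets are abbreviated and the target
# profile is hypothetical; neither is taken verbatim from the BPG or CMM.

# Generic practices required at each capability level (abbreviated subset).
GENERIC_PRACTICES = {
    1: {"perform the process"},
    2: {"plan the process", "track with measurement", "verify process compliance"},
    3: {"standardize the process", "use a well-defined process"},
    4: {"establish quality goals", "determine process capability"},
    5: {"establish process effectiveness goals", "perform causal analysis"},
}

def capability_level(implemented: set[str]) -> int:
    """A process is at level N if it implements the generic practices of
    level N and of every lower level (level 0 means not performed)."""
    level = 0
    for lvl in sorted(GENERIC_PRACTICES):
        if GENERIC_PRACTICES[lvl] <= implemented:
            level = lvl
        else:
            break
    return level

def meets_maturity_profile(process_ratings: dict[str, int],
                           target_profile: dict[str, int]) -> bool:
    """A maturity level viewed as a set of process profiles: every process
    named in the target must reach its target capability level."""
    return all(process_ratings.get(proc, 0) >= lvl
               for proc, lvl in target_profile.items())

# Hypothetical example: a "level 2-like" target profile over two BPG processes.
ratings = {
    "Establish Project Plan": capability_level(
        {"perform the process", "plan the process",
         "track with measurement", "verify process compliance"}),
    "Perform Configuration Management": capability_level(
        {"perform the process"}),
}
level2_profile = {"Establish Project Plan": 2,
                  "Perform Configuration Management": 2}
print(ratings)                                          # capability 2 and 1
print(meets_maturity_profile(ratings, level2_profile))  # False: one process lags
```

In this view, moving from one maturity level to the next amounts to raising the target capability of a particular set of processes, which is how Figure 1 presents the correspondence between the two models.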

4 Conclusion

The BPG is still a work in progress, and the road to international standardization is an arduous one. Although the SEI is working with ISO's SPICE project to build the best possible international standard, our participation does not imply a commitment to use the standards eventually approved. That decision will depend on the final standard, on prototypes of version 2 of the CMM, on field trials of the prototypes, and on feedback from the CMM user community. Re-architecting the CMM according to the BPG approach would involve significant structural changes to the CMM, but one of the deferred change requests for version 1.0 of the CMM was to have key process areas span maturity levels. This architectural change would address that request and perhaps communicate process capability and organizational maturity issues more effectively.

5 References

1. Mark C. Paulk, Bill Curtis, Mary Beth Chrissis, and Charles V. Weber, "Capability Maturity Model for Software, Version 1.1," Software Engineering Institute, CMU/SEI-93-TR-24, DTIC Number ADA263403, February 1993.
2. Mark C. Paulk, Charles V. Weber, Suzanne M. Garcia, Mary Beth Chrissis, and Marilyn W. Bush, "Key Practices of the Capability Maturity Model, Version 1.1," Software Engineering Institute, CMU/SEI-93-TR-25, DTIC Number ADA263432, February 1993.
3. Mark C. Paulk, Bill Curtis, Mary Beth Chrissis, and Charles V. Weber, "Capability Maturity Model, Version 1.1," IEEE Software, Vol. 10, No. 4, July 1993, pp. 18-27.
4. Mark C. Paulk and Michael D. Konrad, "An Overview of ISO's SPICE Project," American Programmer, Vol. 7, No. 2, February 1994, pp. 16-20; reprinted in the Standards column of IEEE Computer, Vol. 27, No. 4, April 1994, pp. 68-70.

[Figure 1. A Profile of BPG Processes and CMM Maturity Levels. The figure lists each BPG process by process area and marks, at capability levels 0 through 5, where the corresponding CMM key process areas from the Repeatable (2), Defined (3), Managed (4), and Optimizing (5) maturity levels fall. The processes shown are:

Customer-Supplier: Acquire Software Product and/or Service; Establish Contract; Manage Customer Requirements and Requests; Perform Joint Audits and Reviews; Package, Deliver, and Install the Software; Support Operation of Software; Provide Customer Service; Assess Customer Satisfaction.

Engineering: Develop System Requirements and Design; Develop Software Requirements; Develop Software Design; Implement Software Design; Integrate and Test Software; Integrate and Test System; Maintain System and Software.

Project: Plan Project Life Cycle; Establish Project Plan; Build Project Teams; Manage Requirements; Manage Quality; Manage Risks; Manage Resources and Schedule; Manage Subcontractors.

Support: Develop Documentation; Perform Configuration Management; Perform Quality Assurance; Perform Problem Resolution; Perform Peer Reviews.

Organization: Engineer the Business; Define the Process; Improve the Process; Perform Training; Enable Reuse; Provide Software Engineering Environment; Provide Work Facilities.]