Sampling for Software Process Assessments, Evaluations, and Appraisals. Dr. Mark C. Paulk 13 October 2017


What Is An Assessment?
An appraisal of an organization's current software process for self-improvement by a trained team of experienced software professionals. It is based on:
- review of 4 to 6 representative projects
- responses to the maturity questionnaire
- in-depth discussions with project managers and practitioners
- collective knowledge and experience of the assessment team

The Evolution of CMMs
- Software process maturity framework, 1987
- Software CMM v1.1, 1993
- Software Acquisition CMM; Systems Engineering CMM; over 60 different CMMs, staged and continuous
- eSCM for Service Providers v1, 2001
- CMMI for Development v1.1, 2002
- CMMI for Acquisition; CMMI for Services
- eSCM for Service Providers v2, 2004
- CMMI for Development v1.3, 2010

The Evolution of Assessments (1 of 2)
Software process assessments (SPA): software process improvement, 1987 to 1995
- "What are your problems?" map problems to the maturity framework
- emphasis on organizational intervention
Software capability evaluations (SCE): source selection, contract monitoring
- audits against the maturity framework, 1988 to 1995
- audits against the Software CMM, 1995 to 2003
- emphasis on a level playing field (fairness)
Appraisals = assessments and evaluations

The Evolution of Assessments (2 of 2)
CMM-based appraisals for internal process improvement (CBA IPI): software process improvement, 1995 to 2003
- audit process against the Software CMM
- identify problems, map against the Software CMM as appropriate
- report CMM and non-CMM findings
- emphasis on reliable and consistent results
Standard CMMI appraisal method for process improvement (SCAMPI): process improvement, 2003 to present
- audits against CMMI

Assessment Versus Evaluation
Issue: Assessment / Evaluation
- Use: process improvement / source selection
- Objective: assess current practice / substantiate practice
- Improvement goal: catalyst for improvement / evaluate commitment
- Output: input for action plan / performance risk
- Range of findings: non-CMM findings possible / CMM findings only
- Style: collaborative / audit-oriented
- Focus of results: applies to organization / predict next project
- Status of results: confidential / known to DoD

SEI's SCAMPI v1.3
Standard CMMI Appraisal Method for Process Improvement (SCAMPI) A, Version 1.3: Method Definition Document, Carnegie Mellon University, Software Engineering Institute, CMU/SEI-2011-HB-001, March 2011.
"Sampling is planned quantitatively based on the diversity of unique process implementations within the appraisal scope, with the goal of both ensuring a representative sample of the organizational unit and optimizing the effort for collection and analysis of objective evidence."

Organizational Analysis
The scope of what "organization" means can cover:
- Multiple business units, product lines, domains (type of work)
- Customer segmentation
- Sizes of project and/or product, duration
- Software engineering methods and environments
- Geographical sites (location)
- Workforce mixes (experience, ...)
- Organization vs. project-level processes

SCAMPI Sampling Units
- Basic units ~ projects + support functions (organizational functions)
- The organizational scope of the appraisal is selected as a representative sample of the organizational unit, based on sampling factors that reflect meaningful differences in the conditions under which work is performed.

SCAMPI Sampling Example
Basic units by location and customer type:
- New York: 7 commercial, 2 government
- Cincinnati: 5 commercial, 0 government
- Denver: 11 commercial, 5 government
Total number of basic units / basic units sampled:
- New York, Commercial: 7 / 1
- New York, Government: 2 / 1
- Cincinnati, Commercial: 5 / 1
- Denver, Commercial: 11 / 2
- Denver, Government: 5 / 1

The Problem
Is this an adequate sampling approach? Performing assessments in large, complex organizations quickly becomes cost prohibitive if every component (entity) must be assessed for each criterion.
Sampling methodology goals:
- Reduce the number of entities that must be assessed, especially for assessments with a large number of entities
- Keep the risk of incorrectly certifying an organization that is not really compliant at or below 10% under most circumstances

Sampling
Sampling allows one to specify the amount of error one can tolerate, thus maintaining the desired accuracy while substantially reducing the time and effort involved.
Four to six representative projects might be considered a convenience sample. Covering all aspects of the organization and its projects is a distinct improvement, but it does not address the question of adequate coverage.

Statistical Sampling Prerequisites
- Randomization: every sampling unit (project) must have the same likelihood of being selected as part of any given sample.
- Stratified random sample: a sample obtained by separating the population elements into nonoverlapping groups (strata) and then selecting a simple random sample from each stratum.
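As a concrete illustration, the sketch below draws a stratified random sample over the location-by-customer-type cells of the earlier SCAMPI example. The entity names and the fixed seed are illustrative assumptions; the per-stratum sample sizes are the ones shown on that slide.

```python
import random

# Strata from the earlier SCAMPI example (location x customer type);
# the entity names are made up for illustration.
strata = {
    ("New York", "Commercial"): [f"NY-C-{i}" for i in range(1, 8)],     # 7 basic units
    ("New York", "Government"): [f"NY-G-{i}" for i in range(1, 3)],     # 2 basic units
    ("Cincinnati", "Commercial"): [f"CIN-C-{i}" for i in range(1, 6)],  # 5 basic units
    ("Denver", "Commercial"): [f"DEN-C-{i}" for i in range(1, 12)],     # 11 basic units
    ("Denver", "Government"): [f"DEN-G-{i}" for i in range(1, 6)],      # 5 basic units
}
# Basic units sampled per stratum, as shown on the example slide.
sample_sizes = {
    ("New York", "Commercial"): 1,
    ("New York", "Government"): 1,
    ("Cincinnati", "Commercial"): 1,
    ("Denver", "Commercial"): 2,
    ("Denver", "Government"): 1,
}

rng = random.Random(2017)  # fixed seed so the draw is reproducible
sample = {
    stratum: rng.sample(units, sample_sizes[stratum])  # simple random sample within each stratum
    for stratum, units in strata.items()
}
for stratum, units in sample.items():
    print(stratum, units)
```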

No Guarantees
Assumption: the ability within a stratum to detect a single entity that is not compliant was not a design goal. The design goal is a high likelihood of finding at least one noncompliant entity when at least two exist within a stratum.
- It does not guarantee finding a single non-compliant entity.
- It does not guarantee finding multiple non-compliant entities.

Perfect Team Assumption
Any entity that is truly not compliant and is selected in the sample will be detected.
The Ultimate Sampling Objective
Incorrectly passing a single Practice, while undesirable, is not as serious as incorrectly passing an organization at any given Capability Level.

Sampling Method
- Uses a variation of stratified random sampling: entities are placed in similar strata by Practice (84 Practices in eSCM-SP).
- Stratifying at the Practice level allows us to differentiate while still achieving maximum efficiency for Practices that are implemented across large parts of the organization (e.g., HR, Contracting).

Stratifying the Population
- Identify the entities that are in scope for the assessment.
- Organize entities into strata that have similar characteristics (organizational analysis): consistent process implementations within a stratum (homogeneous subgroups).
- Stratification is the responsibility of the Lead Assessor and is approved by the certification body.

Example Stratification for 63 Entities
[Figure: 63 entities in total. 28 Practices are assessed against a single organization-wide stratum of all 63 entities; 17 Practices against two strata of 40 and 23 entities; and 39 Practices against four strata of 15, 25, 17, and 6 entities.]

Sampling Table Design
- Designed to control the probability of an organization incorrectly achieving a Capability Level for which it was not truly compliant.
- Set the sample sizes such that the conditional probability is less than 10% that no Practice noncompliances will be detected, given that Practice noncompliances exist.
- If any sample would have a 70% or better chance of detecting at least one noncompliant entity when there are two or more noncompliant entities in the stratum, we will achieve a minimum of a 91% probability of detecting at least one Practice noncompliance.

eSCM Sampling
If any sample would have a 70% or better chance of detecting at least one noncompliant entity when there are two or more noncompliant entities in the stratum, we will achieve a minimum of a 91% probability of detecting at least one Practice noncompliance.

Number of Entities → Sample Size
1-3 → All       21-22 → 10      39-40 → 18
4-7 → 3         23-24 → 11      41-42 → 19
8-9 → 4         25-27 → 12      43-44 → 20
10-11 → 5       28-29 → 13      45-46 → 21
12-13 → 6       30-31 → 14      47-49 → 22
14-16 → 7       32-33 → 15      50 and up → 23
17-18 → 8       34-35 → 16
19-20 → 9       36-38 → 17
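The slides do not show the derivation of this table, but it is consistent with a simple hypergeometric model: the chance of catching at least one of exactly two noncompliant entities in a simple random sample of n from N is 1 - C(N-2, n)/C(N, n). The sketch below, under that assumed model, checks that each bounded row of the table meets the 70% threshold (strata of 1-3 entities are taken in full, and "50 and up" caps the sample at 23).

```python
from math import comb

def p_detect(N: int, n: int) -> float:
    """Probability of sampling at least one of exactly 2 noncompliant
    entities when n of the N entities in a stratum are drawn at random."""
    return 1.0 - comb(N - 2, n) / comb(N, n)

# Bounded rows of the published table: (smallest N, largest N, sample size).
TABLE = [
    (4, 7, 3), (8, 9, 4), (10, 11, 5), (12, 13, 6), (14, 16, 7),
    (17, 18, 8), (19, 20, 9), (21, 22, 10), (23, 24, 11), (25, 27, 12),
    (28, 29, 13), (30, 31, 14), (32, 33, 15), (34, 35, 16), (36, 38, 17),
    (39, 40, 18), (41, 42, 19), (43, 44, 20), (45, 46, 21), (47, 49, 22),
]

for low, high, n in TABLE:
    # The worst case in each row is the largest stratum, so check N = high.
    # Exact integer comparison: P(miss both) = C(N-2, n)/C(N, n) <= 0.3.
    assert 10 * comb(high - 2, n) <= 3 * comb(high, n), (low, high, n)
    print(f"{low:>2}-{high:<2} entities, sample {n:>2}: P(detect) = {p_detect(high, n):.3f}")
```

Running it shows every row sits at or just above 0.70, matching the 70% design criterion quoted on the slide.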

A Simple eSCM Sampling Example
Stratum   Entities in Stratum   Number of Practices   Sample Size
Org       93                    17                    23
S1        43                    67                    20
S2        25                    67                    12
S3        17                    67                    8
S4        8                     67                    4
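The sample sizes in this example follow directly from the sampling table on the previous slide. A small lookup sketch (the function name and data layout are mine; the thresholds are just a restatement of that table) reproduces them:

```python
# Thresholds restated from the eSCM sampling table: (largest N in row, sample size).
_ROWS = [(3, None), (7, 3), (9, 4), (11, 5), (13, 6), (16, 7), (18, 8), (20, 9),
         (22, 10), (24, 11), (27, 12), (29, 13), (31, 14), (33, 15), (35, 16),
         (38, 17), (40, 18), (42, 19), (44, 20), (46, 21), (49, 22)]

def sample_size(entities: int) -> int:
    """Sample size for a stratum of the given size; strata of 1-3 entities
    are assessed in full, and '50 and up' caps the sample at 23."""
    for upper, size in _ROWS:
        if entities <= upper:
            return entities if size is None else size
    return 23

# Strata from the simple eSCM example: organization-wide plus S1-S4.
for name, n_entities in [("Org", 93), ("S1", 43), ("S2", 25), ("S3", 17), ("S4", 8)]:
    print(name, n_entities, sample_size(n_entities))  # prints 23, 20, 12, 8, 4
```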

Failed Level Detection
Because a single failed practice is sufficient to fail at any Level, detecting Level failure is the same as detecting Practice failure.
[Figure: probability of detecting at least one Practice noncompliance (y-axis, 0.6 to 1.0) plotted against the number of failing Practices (x-axis, 1 to 10).]
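The slides do not spell out how this curve is computed, but the quoted figures are consistent with treating detection of each failing Practice as an independent event with probability at least 0.7, so the chance of catching at least one of k failing Practices is at least 1 - 0.3^k; for k = 2 that lower bound is 0.91, matching the 91% claim. A minimal sketch under that independence assumption:

```python
# Lower bound on detecting at least one of k failing Practices, assuming each
# failing Practice's sample independently detects it with probability >= 0.7.
for k in range(1, 11):
    print(k, round(1 - 0.3 ** k, 4))  # 0.7, 0.91, 0.973, 0.9919, ...
```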

Typical Results
- With a full census, the earlier example would have required 5,292 Entity-Practice pairs to be reviewed.
- With the sampling plan, only 2,307 need to be reviewed, a 56.4% reduction.
[Figure: the 63-entity stratification from the earlier example, repeated.]
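These totals can be reproduced from the stratification figure and the sampling table. The sketch below restates the 63-entity example (the grouping of Practices over strata is my reading of the figure) and recomputes the census and sampled Entity-Practice pairs.

```python
# The 63-entity example: each group of Practices is assessed over its own strata.
# (practice count, list of stratum sizes) -- my reading of the stratification figure.
groups = [
    (28, [63]),             # 28 Practices, one organization-wide stratum
    (17, [40, 23]),         # 17 Practices, two strata
    (39, [15, 25, 17, 6]),  # 39 Practices, four strata
]
# Sample sizes from the eSCM sampling table for those stratum sizes.
table = {63: 23, 40: 18, 23: 11, 15: 7, 25: 12, 17: 8, 6: 3}

census = sum(p * sum(strata) for p, strata in groups)                     # 84 * 63 = 5292
sampled = sum(p * sum(table[s] for s in strata) for p, strata in groups)  # 2307
print(census, sampled, f"{1 - sampled / census:.1%} reduction")           # 5292 2307 56.4%
```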

Conclusions
- It is possible to significantly reduce the effort required to evaluate organizations with multiple entities while still controlling for the number of false positives.
- While the reduction is significant in relative terms, the remaining effort for large organizations is still substantial in absolute terms.

eSCM References
- E.B. Hyder, K.M. Heston, M.C. Paulk, and W.E. Hefley, The eSourcing Capability Model for Service Providers, Van Haren Publishing, Zaltbommel, Netherlands, 2009.
- D.M. Northcutt and M.C. Paulk, "Statistical Sampling for Process Assessments," ASQ Software Quality Professional, 12 (4), September 2010, pp. 19-28.
By sampling we mean the process of drawing an inference about some population of interest by examining only a subset of the population.

Questions?