Objective Demonstration of Process Maturity Through Measures

Objective Demonstration of Process Maturity Through Measures
2000 PSM Users Group Conference
Vail, Colorado, 24-28 July 2000
Presented by: Perry R. DeWeese, Lockheed Martin
7 July 2000

Agenda:
- Concept
- Background
- Alternative Evaluation Technique
- Recommendations
- Summary

Concept
- An acquirer should be able to authenticate a supplier's process maturity and capability through the use of measures.
- A mature organization (acquirer or supplier) should have in place a measurement program that provides objective insight into the organization's and each project's issues and requirements.
- A mature organization routinely uses these measures to manage its projects and can demonstrate their use to control process performance.
- An organization that does not have an effective measurement program is not operating at SEI SW-CMM Level 3.

Background
5000.2 Final Draft, June 2000, Section 1, 1.1.2, Defense Acquisition System:
"The outcome of systems acquisition is a system that represents a judicious balance of cost, schedule, and performance in response to the user's expressed need; that is interoperable with other systems; that uses proven technology, open system design, available manufacturing capabilities or services, and smart competition; that is affordable; and that is supportable."

Background
Current DoD Policy:
"Software systems shall be designed and developed based upon software engineering principles. This includes the selection of contractors with domain experience in developing comparable software systems, a successful past performance record, and a demonstrable, mature software development capability and process. It also requires a software measurement process to plan and track the software program, and to assess and improve the development process and the associated software product. At a minimum, full compliance with SEI CMM Level 3, or its equivalent level in an approved evaluation tool, is the Department's goal." (J.S. Gansler)
"In addition, I request your assistance in defining a single recommended evaluation technique so these tools will provide consistent results and will not burden our contractor community with multiple evaluation types." (Delores M. Etter)

Background
Current Evaluation Practices:
- Software Capability Evaluation (SCE): provides empirical evidence of the contractor's ability to create a software product that meets technical requirements and the program's cost and schedule. The SCE gives the government the information needed to determine the risk associated with the contractor's development process.
- Software Development Capability Evaluation (SDCE): a methodology for assessing a contractor's capabilities in software and software-related systems engineering disciplines. The SDCE is based on the premise that contractors who have defined their software development plans, identified the software engineering processes, tools, and technologies they will use on a given program, and have past experience using those processes, tools, and technologies present a lower risk to the program than contractors who have not.
This presentation focuses on SCEs.

Background
Current Evaluation Practices: Software Capability Evaluation (SCE)
- Based on the SEI SW-CMM
- Uses a variant of the CBA-IPI assessment model
- Usually part of the source selection process
- The SCE team performs the evaluation by reviewing contractor-provided information and conducting interviews
- Includes site visits and interviews with candidate contractors
- A subset of the contractor's projects is evaluated (selected for similarity to the proposed project)
- The evaluation should reveal specific strengths and weaknesses associated with the proposed project

Background
Source Selection Criteria - Determining Best Value:
- Best Value Award: the expected outcome of an acquisition that, in the Government's estimation, provides the greatest overall benefit in response to the requirement (FAR 2.101).
- Other allowable approaches:
  - Full Tradeoff Process
  - Performance Price Tradeoff Process
  - Lowest Price Technically Acceptable Process
- Evaluation Process: used to validate and confirm the offeror's written proposal. Evaluation Notices are used to identify deficiencies and allow offerors to revise their proposals.
- Objective: to maximize the government's ability to obtain best value, based on the requirements and the evaluation factors in the solicitation.

Background
Source Selection Criteria - Determining Best Value:
Evaluation Factors (the factors are of equal importance):
1. Mission Capability (sub-factors listed in descending order of importance)
   - Sub-Factor 1: Management
   - Sub-Factor 2: System Development
   - Sub-Factor 3: Software
   - Sub-Factor 4: Production and Installation
   - Sub-Factor 5: Small Business Subcontracting
2. Cost/Price
3. Past Performance
4. Proposal Risk

Background
SCE Issues:
- Does the SCE provide the PM the information needed to make the right acquisition decision?
- Significant cost of SCEs, for both Government and contractor:
  - $60k - $80k per government evaluation team
  - $50k - $300k for contractor preparation/support
- Team preparation and evaluation period
- Evaluation model is data intensive
- Reuse of SCE results by other programs and agencies

Background
Similar problems with CMM software process assessments:
- Cost: $94k - $271k internal; $40k - $137k external
- Data intensive
- Information for the decision maker

Alternative Evaluation Technique
Goal: To establish an equivalent Level 3 evaluation technique that:
- provides credible information on process and technical capability, product quality, cost/schedule realism, past performance, etc.
- reduces the cost of ongoing evaluations, with minimum disruption to the program
- provides reliable and objective information to decision makers at the right time
- reuses information that has significant utility to management, at little or no additional cost
- supports the acquisition and supply of best value

Alternative Evaluation Technique
The purpose is to change the evaluation technique:
- The current CBA-IPI based evaluation technique is too expensive for ongoing use and does not provide adequate information about the contractor's ability to satisfy Defense Acquisition requirements.
- The intent is to provide useful objective evidence at little or no additional cost.
- The evaluation team needs the ability to observe the performance of the process and how the process influences the success of the project.
The purpose is not to replace contractor evaluations, but to provide an objective basis for evaluating a contractor's software process and product capability/maturity and its ability to satisfy requirements.

Alternative Evaluation Technique
Proposed Alternative:
- Establish process and product measures that represent the institutionalization of Level 3 behavior, i.e.:
  - The organization's standard software process is routinely used across projects
  - Use of the process, as indicated by measures and other objective evidence, provides organization and project management insight into each project's progress
  - Measures are used to plan, control, and assess
- Perform contractor evaluations based on this measurement set (see the sketch below)
- Contractors that do not satisfy the Level 3 criteria are high risk
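For illustration only, a minimal sketch of how an evaluation rule over such a measurement set might be expressed; the measure names, thresholds, and the notion of a "required measurement set" are assumptions for this sketch, not part of the proposed method or any SEI or DoD definition:

```python
# Hypothetical sketch: flag a project as high risk when required measures are
# missing or when agreed performance indices fall outside agreed bounds.
from dataclasses import dataclass, field

REQUIRED_MEASURES = {"cpi", "spi", "requirements_volatility"}  # assumed set

@dataclass
class ProjectMeasures:
    name: str
    values: dict = field(default_factory=dict)  # e.g. {"cpi": 0.96, "spi": 1.02}

def evaluate(project: ProjectMeasures, lower: float = 0.9, upper: float = 1.1) -> str:
    """Return a risk statement based on missing or out-of-bounds measures."""
    missing = REQUIRED_MEASURES - project.values.keys()
    if missing:
        return f"high risk: missing measures {sorted(missing)}"
    out_of_bounds = {
        m: v for m, v in project.values.items()
        if m in ("cpi", "spi") and not (lower <= v <= upper)
    }
    if out_of_bounds:
        return f"high risk: out-of-bounds indices {out_of_bounds}"
    return "acceptable risk based on the agreed measurement set"

print(evaluate(ProjectMeasures("Project A", {"cpi": 0.95, "spi": 1.03,
                                             "requirements_volatility": 0.08})))
```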

Alternative Evaluation Technique
Candidate Approach:
- Start with a set of processes and objective evidence
- Use the contractor's existing product and process measures
- Ensure the current measurement program provides insight into each project's cost, schedule, progress, quality, etc. on all programs
- Analyze the measures for gaps to ensure these measures and the objective evidence apply to the achievement of Level 2 and 3 SW-CMM KPA goals (a gap-analysis sketch follows below)
- Obtain acceptance of these measures through agreed-to criteria
- Develop training for analyzing and interpreting the measures
- Provide training materials
- Implement on pilot organizations/projects
- Update measures
- Document and release
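As a sketch of the gap-analysis step, the mapping below pairs SW-CMM KPA goals with the measures and objective evidence that cover them and reports any goal left uncovered; the goal identifiers and coverage entries are illustrative placeholders, not actual analysis results:

```python
# Hypothetical sketch of the gap analysis: map KPA goals to existing measures
# and objective evidence (OE), then list the goals with no coverage.
coverage = {
    "RM Goal 1":  ["number of requirements", "peer review records"],
    "RM Goal 2":  ["baseline change requests", "SRS", "PMP"],
    "SPP Goal 1": ["size plan vs. actual", "basis of estimate"],
    "SPP Goal 2": ["CPI/SPI", "project plans"],
    "SPP Goal 3": [],  # commitment: no direct measure identified yet
}

gaps = [goal for goal, evidence in coverage.items() if not evidence]
for goal, evidence in coverage.items():
    status = ", ".join(evidence) if evidence else "GAP - no measure or OE identified"
    print(f"{goal}: {status}")
print("Goals needing new or extended measures:", gaps or "none")
```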

Alternative Evaluation Technique
Example of Goal/Measurement Analysis (per CMM KPA: goals, key measures, implementation objective evidence, why the measures/OE meet the goals, and gaps):

Requirements Management
- Goals:
  - Goal 1: System requirements allocated to software are controlled to establish a baseline for software engineering and management use.
  - Goal 2: Software plans, products, and activities are kept consistent with the system requirements allocated to software.
- Key Measures: number of requirements; number of requirement changes; number of Configuration Change Board meetings
- Objective Evidence: inspection (peer review) records (G1, A1); document review sign-off records (G1, A1); Software Requirements Specification (G2, A2); requirements traceability records (G2, A2); Program Management Plan (G2, A3); baseline change requests (G2, A3); Baseline Change Board minutes (G2, A3)
- Why the measures/OE meet the goals: Peer review or document review sheets demonstrate that software engineering has reviewed the requirements before they are incorporated in the program. Baseline change requests and board meetings demonstrate that changes to requirements necessitate a change to plans, products, and activities. The SRS and PMP demonstrate the effect of changes.
- Gaps: none

Software Project Planning
- Goals:
  - Goal 1: Software estimates are documented for use in planning and tracking the software project.
  - Goal 2: Software project activities and commitments are planned and documented.
  - Goal 3: Affected groups and individuals agree to their commitments related to the software project.
- Key Measures: CPI/SPI; size (SLOC/FP, requirements, test cases) plan vs. actual; TPM plan vs. actual; granularity (number of WBS ID elements, number of CSCs); risks (number open, closed, mitigated, accepted); quality (number of QA audits plan vs. actual; number of non-compliances open, closed, escalated; % compliance)
- Objective Evidence: plans; basis of estimate; reviews; approvals/signatures on plans and estimates; the project's defined software process
- Why the measures/OE meet the goals: Many of these measures (CPI, SPI, size, TPM) are directly specified in activities that support the goals. Goals 2 and 3 are more directly supported by the objective evidence than by the metrics.
- Gaps: Commitment is difficult to measure directly. It can be inferred (perhaps) from CPI/SPI and non-compliances (assuming the process requires commitment).

Software Project Tracking and Oversight
- Goals:
  - Goal 1: Actual results and performance are tracked against the software plans.
  - Goal 2: Corrective actions are taken and managed to closure when actual results and performance deviate significantly from the software plans.
  - Goal 3: Changes to software commitments are agreed to by the affected groups and individuals.
- Key Measures: size, cost, schedule, computer resources
- Objective Evidence: Software Development Plan (G1, Act 1, 13); planned vs. actual data (G1, Act 5, 6, 7, 8, 11); samples of project replanning data (G2, Act 11); meeting minutes and memos (G3, Act 3)
- Why the measures/OE meet the goals: The plan documents the means for tracking progress and conducting reviews; planned vs. actual data provides sample tracking metric data collected during project performance; minutes and memos provide samples of senior management reviews of changes to commitments to external organizations.
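Several key measures in the analysis above (CPI, SPI, plan vs. actual) come directly from earned value data. As a reminder of how they are derived, here is a small sketch using the standard definitions CPI = EV / AC and SPI = EV / PV; the monthly figures are made up purely for illustration:

```python
# Earned value indices referenced as key measures: CPI = EV / AC, SPI = EV / PV.
def cpi(earned_value: float, actual_cost: float) -> float:
    """Cost Performance Index: value earned per dollar spent."""
    return earned_value / actual_cost

def spi(earned_value: float, planned_value: float) -> float:
    """Schedule Performance Index: value earned vs. value planned to date."""
    return earned_value / planned_value

ev, ac, pv = 420.0, 455.0, 400.0  # $K earned, spent, and planned to date (illustrative)
print(f"CPI = {cpi(ev, ac):.2f}")  # < 1.0 indicates a cost overrun
print(f"SPI = {spi(ev, pv):.2f}")  # > 1.0 indicates ahead of schedule
```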

Alternative Evaluation Technique
Going Forward:
- Obtain Government endorsement for the concept
- Establish a joint Government/Industry team:
  - Government: acquisition and users
  - Industry: Level 4 and 5 organizations
  - V&V: PSM Users Group
- Project span: 6 months
- Detailed project planning to follow

Alternative Evaluation Technique
Benefits of this approach:
- The evaluation technique uses a contractor's routine measurement process and existing objective evidence, with extensions to provide information on defense acquisition and DoD policy requirements
- Measures provide an objective basis to predict and determine the contractor's ongoing satisfaction of project requirements
- Measures are agreed to by the acquirer and the contractor
- Measures provide real information (observations) that is actionable by decision makers
- Level 3 is required to demonstrate organizational implementation of this evaluation technique

Recommendations
- Initiate the project
- Solicit government and industry measurement expert participation
- Form a team with the right people
- V&V to ensure satisfaction of project requirements
- Report back in three months to demonstrate proof of concept

Summary
- Both Government and Industry need a practical approach to authenticate process maturity
- Current methods are expensive and do not provide decision makers with instant and continuous objective information
- The results will improve the SCE process and are extensible to CMMI
- This measurement-based technique has practical application for:
  - DCMA's CMM Based Insight (CBI)
  - OSD's Program Analysis and Evaluation measurements