The ROI of CMMI: Using Process Simulation to Support Better Management Decisions


Pittsburgh, PA 15213-3890 The ROI of CMMI: Using Process Simulation to Support Better Management Decisions David M. Raffo, Ph.D., Visiting Scientist, Software Engineering Institute; Associate Professor. Sponsored by the U.S. Department of Defense. © 2005 by Carnegie Mellon page 1

Acknowledgements Portions of this tutorial are reprinted and presented with the permission of. page 2

Introductions and Logistics page 3

Introductions Workshop leader introductions Participant introductions Name Position Expectations - What do you want to get out of the workshop? - Do your expectations match the workshop agenda? page 4

Logistics Workshop time/duration Rest Rooms Breaks Smoking Rules Phones Messages page 5

Workshop Approach Lecture/presentation Examples Ask questions Participate!!! page 6

Audience Executives/leaders of organizations seeking to understand - The costs/benefits of CMMI-based process improvement - How to quantify them - How simulation can help them achieve higher CMMI levels Executives/leaders seeking to benchmark their processes and performance against industry Process improvement/EPG personnel seeking ways to communicate more effectively with senior management about the costs/benefits of CMMI-based process improvement Personnel seeking to transition to the CMMI, or to implement higher-maturity process areas Personnel working to define processes and estimate performance based upon quantitative measurements. page 7

Overview Process simulation is a high-leverage way to determine which process improvement opportunities are likely to have the best outcome Goals of the tutorial: Familiarize participants with process simulation What, Why, How Show participants how to utilize simulation results to support process improvement decisions This tutorial will focus on one simulation method, the Process Tradeoff Analysis Method (PTAM), and will briefly touch on others page 8

Overview The tutorial is not intended to be comprehensive; some topics are presented at a high level only No knowledge of simulation or finance is assumed page 9

Agenda 1. Introduction: What is Process Simulation? 2. Motivation: Why do Process Simulation? 3. Overview of Process Simulation Alternatives 4. How do we build process simulation models? 5. Process Tradeoff Analysis Method (PTAM) 6. Examples of Process Simulation Applications in Industry and Government. 7. Wrap-Up/ Conclusions page 10

1 Introduction: What is Process Simulation? page 11

What Is a Simulation Model? A simulation model is a computerized model (not a maturity model) designed to display significant features of the dynamic system it represents. Simulations are generally employed when - behavior over time is of particular interest or significance, and - the economics or logistics of manipulating the system being modeled are prohibitive Common purposes of simulation models are: - to provide a basis for experimentation, - to predict behavior, - to answer what if questions, - to teach about the system being modeled. page 12

Process Simulation Models Process simulation models focus on the dynamics of software and systems development, maintenance and acquisition. They represent the process - as currently implemented (as-is, aspracticed, as-documented), or - as planned for future implementation (tobe) The models represent only selected relevant aspects of a defined process. page 13

Simulation Features Uses graphical interfaces Utilizes actual data/metrics Predicts performance Supports "what if" analyses Supports business case analyses Reduces risk page 14

Company Strategy (Competitive Advantage, Customer Value, Improving Operations) + Industry Standards (CMMI, Six Sigma, ISO) → Set of Potential Process Changes (many choices; which one(s) to choose? Need to focus efforts to be successful) → Process Simulation: evaluate impact on process performance → Performance Measures: Cost, Quality, Schedule (which change will provide the greatest improvement? what is the financial impact?) → Financial Benefits: NPV, ROI page 15

2 Motivation: Why Do Process Simulation? page 16

Benefits of Process Simulation

Option               Total Effort (PM)   Rework Effort,        Project Duration   Cost/Revenue Delta     Total Injected   Corrected   Escaped   Field-Defect Rework   Implementation   NPV         ROI
                     (Dev Eff + Dev Rwk) Devel Defects (PM)    (Calendar Months)  due to Duration Change Defects          Defects     Defects   Effort (PM)           Costs ($)
0 Base Case          200                 90                    18                 $0.00                  1150             990         160       40                    $0.00            n.a.        n.a.
1 Implement QFD      190                 75                    17.5               $0.00                  1150             1020        130       30                    $100,000         $165,145    15%
2 Implement VOC      185                 75                    17                 $100,000               1150             1050        100       20                    $120,000         $185,231    29%
3 Add QuARS Tool     175                 65                    16                 $300,000               1150             1090        60        10                    $80,000          $289,674    88%
4 Eliminate          230                 130                   22                 $(400,000)             1150             900         250       80                    $0.00            -$378,043   -129%
5 Additional Process

page 17

Benefits of Process Simulation Decision Support and Tradeoff Analysis Sensitivity Analysis What if Supports Industry Certification and process improvement programs including CMMI, Six Sigma, and others Benchmarking Design and Define Processes/Metrics Bring Lessons Learned Repositories Alive Can save cost, effort, and expertise Can be used to address project manager concerns such as. page 18

Software Project Manager Concerns What development phases are essential? Which phases could be skipped or minimized to shorten cycle time and reduce costs without sacrificing quality? Are inspections worthwhile? What is the value of applying automated tools to support development activities? How do we predict the benefit associated with implementing a process change? How do we prioritize process changes? How do we achieve higher levels of the CMMI? What is the level of risk associated with a change? page 19

3 Overview of Alternative Process Simulation Approaches page 20

Alternative Process Simulation Approaches Modeling Paradigms Knowledge-Based Systems Agent Based State-Based Discrete Event System Dynamics Hybrid Research Outlets Software Process: Improvement and Practice Journal of Systems and Software Tools Arena ProModel Extend Stella VenSim Research tools Conferences Winter Simulation Conference ProSim SEPG SSTC page 21

Alternative Process Simulation Approaches Knowledge-Based Systems Person-in-the-loop Fine level of granularity Supports process enactment Agent-Based Systems Fine level of granularity Supports detailed work interactions State-Based Systems Captures flow of control (work activities, parallelism) well Multi-view graphical representations Difficult to capture task, work package and resource details page 22

Alternative Process Simulation Approaches Discrete Event Simulation Able to represent richness of processes, work packages and resources Good for modeling quantitative process performance Good tool support System Dynamics Captures feedback well Often used for high level qualitative issues Hybrid Captures best aspects of Discrete Event and System Dynamics Models are complex Being used to predict performance of multi-site development page 23
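To make the discrete event paradigm concrete, here is a minimal Python sketch: a toy tandem-queue model in which work packages flow through a single-server Code station and then a single-server Inspect station. The station names, exponential service times, and parameters are illustrative assumptions, not material from the tutorial.

```python
import random

def simulate_pipeline(num_packages, code_hours, inspect_hours, seed=1):
    """Toy discrete-event sketch: packages are coded sequentially, then
    wait for a single inspector. Service times are drawn from assumed
    exponential distributions. Returns the makespan in hours."""
    rng = random.Random(seed)
    code_free = 0.0     # time the Code server next becomes free
    inspect_free = 0.0  # time the Inspect server next becomes free
    for _ in range(num_packages):
        code_free += rng.expovariate(1.0 / code_hours)       # code the package
        start = max(code_free, inspect_free)                 # queue for inspection
        inspect_free = start + rng.expovariate(1.0 / inspect_hours)
    return inspect_free  # when the last package clears inspection
```

Running the model across many seeds yields a distribution of makespans, which is the kind of quantitative process-performance output these approaches are used to produce.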

Common Applications of Each Approach STRAT PLAN MGMT IMPR UNDR TRAIN KBS X X Agent Based X X State-Based X X X X Discrete Event x X X X X X System Dynamic X x x X X Hybrid X X X X X X page 24

4 How to Build Process Simulation Models page 25

How It Works [Diagram: the software development process (Project is Approved → Func Spec + FS Insp → H Lev Design + HLD Insp → L Lev Design + LLD Insp → Code Dev + Code Insp → Create UT Plan + Insp UT Plan → Unit Test / Follow UT Plan → Development Complete → Functional Test → System Test → Unit Test Complete → Release to Customers → Field Support and Maintenance), together with a proposed process change, project data, and process and product model parameters, feeds a SW process simulation model; the model predicts process performance (cost, quality, schedule) and supports better process decisions] page 26

Process Tradeoff Analysis Method (PTAM) Based on extensive research into software process modeling conducted in academia, the SEI, and industry Provides a graphical user interface and models software processes Integrates SEI methods to define processes and supports CMMI PAs Integrates metrics related to cost, quality, and schedule into an understandable project performance picture Predicts project-level impacts of process improvements in terms of cost, quality and cycle time Supports business case analysis of process decisions - ROI, NPV and quantitatively assessing risk page 27

Process Tradeoff Analysis Method (PTAM) Reduces risk associated with process changes by predicting the probability of improvement Saves time, effort and expertise over other methods page 28

5 The Process Tradeoff Analysis Method (PTAM) page 29

Process Tradeoff Analysis (PTA) Method Company Strategy (Competitive Advantage, Customer Value, Improving Operations) + Industry Standards (CMM, ISO 9000) → Set of Potential Process Changes (many choices; which one(s) to choose? Need to focus efforts to be successful) → Process Simulation: evaluate impact on process performance → Performance Measures: Cost, Quality, Schedule (which change will provide the greatest improvement? what is the financial impact?) → Financial Benefits: NPV, ROI page 30

Overview of PTAM Set-up Phase Set the Goal of the Modeling Effort Specify Questions for the Model to Address Define Process Performance Measures Identify Input Parameters Gather Information Modeling Phase Analysis Phase page 31

Set-up Phase
What decision(s) am I trying to make? - Goal: major objective(s) for the model
What questions does management have? - Questions: define key questions to address
What do I need to know to answer the questions? - Performance Measures: metrics/model outputs designed to address the key questions
What information should we collect? - Input Data: data and information needed to calibrate and estimate the performance measures
page 32

Overview of PTAM Set-up Phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Define Process Performance Measures - Identify Input Data Gather Information Modeling Phase Analysis Phase page 33

Why Simulate? There are a variety of reasons / purposes for undertaking process simulation. CMMI-Based Process Improvement - Strategic management - Planning - Control and operational management - Technology adoption - Understanding - Training and learning page 34

CMMI Based Process Improvement CMMI Levels 4 and 5 Process simulation helps to fulfill PAs (OID, CAR, OPP and QPM - Sub Goals and Generic Goals) CMMI Levels 2 and 3 Process simulation can be used to evaluate alternative process choices (RD, TS, PI, V&V, RM, SAM, PPQA, and CM) Process simulation helps to fulfill PAs (OPF, OPD, OT, IPM, Risk, DAR, PP, PMA, MA, PPQA Multiple Sub Goals and Generic Goals ) page 35

Case Study: Organizational Setting Leading software development firm Peak staffing of 60 developers on project Assessed at strong Level 2 of CMM/CMMI Experienced development staff 5 th release of commercial project Data available in electronic and paper form: quantitative and qualitative; professional estimates used to fill in gaps Active SEPG page 36

Case Study: Validation and Verification Problem: The organization was releasing defective products and had high schedule variance. Why? Unit Test was the main defect removal stage, and it was performed unreliably. Built a model of the large-scale commercial development process - Based on actual project data - Predicted project performance in terms of effort, task duration and delivered defects - Part of a full business case analysis that determined the financial performance of the process change page 37

Process Overview - 1 [Diagram of the field study life cycle, AS-IS process: Project is Approved → Func Spec + FS Insp → HL Design + HLD Insp → LL Design + LLD Insp → Code + Code Insp → Unit Test Execution (Development Complete) → Functional Test → System Test (Unit Test Complete) → Release to Customers → Field Support and Maintenance; in parallel, Test Plan + TP Insp and Test Case + TC Insp feed System Test; the tasks affected by the process change are highlighted] page 38

Process Overview - 2 [Diagram of the code and unit test segment: Begin Code Development → Code Dev → Code Development is Complete → Code Insp → Code Inspection is Complete → Unit Test Execution → Unit Test Complete; Begin Functional Testing. The proposed change adds: Create Unit Test Plans (conducted during Code Development); Prep, Insp, and RWK UT Plans (conducted as part of the regular Code Inspection); Follow UT Plan (followed while conducting Unit Test)] page 39

Overview of PTAM Set-up Phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Determine Organizational Scope - Define Process Performance Measures - Identify Input Data Gather Information Modeling Phase Analysis Phase page 40

Specify Questions Based upon the goal/purpose of the simulation model, specific management questions can be identified Questions should point to specific answers that management would like to obtain It should be recognized that the model may not be able to answer, or even address, all of the questions The questions should document the full scope of issues and information that need to be incorporated into the decision-making process Questions document the use case page 41

Example Questions What is the optimal V&V strategy for a specific project? For our organizational process? Would it be better to use Requirements process A or B for this new project? What combination(s) of V&V techniques enable us to meet or exceed the quality goals for the system? Which alternative is best? Given a budget of X dollars, what V&V activities should be conducted? What is the value of applying automated tools to support development activities? What is the level of Risk associated with a change? page 42

Case Study: Questions Investigated Will the process change improve project performance? What is the cost the firm is currently paying by conducting Unit Tests incorrectly? Is partial implementation of the proposed process change possible? How would potential learning curve effects affect the performance of the process change? Would alternative process changes offer a greater improvement? Can the project benefit from reusing process artifacts? page 43

Overview of PTAM Set-up Phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Define Process Performance Measures - Identify Input Data Gather Information Modeling Phase Analysis Phase page 44

Define Process Performance Measures Main output measures of the simulation Should capture management interests and interests of engineers responsible for implementing the process changes. Must enable the questions to be answered Helps focus data collection and modeling efforts. Should be defined as early as possible on the project page 45

Examples of Common Performance Measures Typical performance measures include the following: effort / cost cycle-time (a.k.a. interval, duration, schedule) defect level staffing requirements staff utilization rate cost / benefit, return on investment throughput / productivity queue lengths (backlogs) page 46

Case Study: Performance Measures Cost Person-Months of Development, Inspection, Testing and Rework effort Equivalent Manpower (Staffing levels) Implementation costs Quality Number of delivered defects by type Schedule Months of Effort page 47

Overview of PTAM Set-up Phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Define Process Performance Measures - Identify Input Data Gather Information Modeling Phase Analysis Phase page 48

Input Data (1 of 2) Input data are used to predict the performance measures. Can be derived from the organization - Current baseline - Exemplary projects - Pilot data Can also be derived from - Expert opinion - Industry data from comparable organizations Best judgments to describe the state of your organization page 49

Input Data (2 of 2) Examples: process documents and assessments amount of incoming work effort based on size (and/ or other factors) defect detection efficiency effort for rework based on size and number of defects defect injection, detection and removal rates decision point outcomes; number of rework cycles hiring rate; staff turnover rate personnel capability and motivation, over time resource constraints frequency of product version releases page 50

Case Study: Input Data CMM Level 2+ organization Process documents and assessments Project Size Productivity Earned Value by phase Total number of defects removed Defect injection, detection and correction rates Effort and schedule data Defect detection and rework costs page 51

Overview of PTAM Set-up Phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Define Process Performance Measures - Identify Input Data Gather Information - Gather qualitative and quantitative data about processes and products from variety of sources in variety of forms Modeling Phase Analysis Phase page 52

Overview of PTAM Set up phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Define Process Performance Measures - Identify Input Data Gather Information Modeling Phase Analysis Phase page 53

Process Models First, create the graphical model Quantitative portion of the simulation model can be theoretical or data driven - Data driven models analyze actual data from past projects using statistical techniques such as correlation coefficients and regression. - Theoretical models are independent of data (relationships) Process simulation can incorporate many kinds of analytical models (data driven or theoretical) - COCOMO, SLIM - Reliability - Other Regression, Queuing and others page 54
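As an illustration of the data-driven case, the sketch below calibrates a simple linear effort-versus-size relationship from past-project data via least squares, in the spirit of the regression techniques mentioned above. The helper name and the normal-equations approach are an assumed minimal example, not PTAM's actual calibration machinery.

```python
def fit_effort_model(sizes, efforts):
    """Data-driven calibration sketch: least-squares fit of
    effort = a + b * size from past-project data, using the
    normal equations in pure Python. Returns (a, b)."""
    n = len(sizes)
    sx, sy = sum(sizes), sum(efforts)
    sxx = sum(x * x for x in sizes)
    sxy = sum(x * y for x, y in zip(sizes, efforts))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope: effort per unit size
    a = (sy - b * sx) / n                          # intercept: fixed overhead effort
    return a, b
```

For data lying exactly on a line, e.g. `fit_effort_model([10, 20, 30], [25, 45, 65])`, the fit recovers a = 5 and b = 2.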

Case Study: Build the Graphical Model [Diagram of the field study life cycle, AS-IS process: Project is Approved → Func Spec + FS Insp → HL Design + HLD Insp → LL Design + LLD Insp → Code + Code Insp → Unit Test Execution (Development Complete) → Functional Test → System Test (Unit Test Complete) → Release to Customers → Field Support and Maintenance; in parallel, Test Plan + TP Insp and Test Case + TC Insp feed System Test; the tasks affected by the process change are highlighted] page 55

Case Study: Simplified Error Model Undetected errors from the previous phase and errors injected in this phase enter the phase: Perform Work → Verify Work → Rework Detected Errors. Errors detected during verification are removed; undetected errors flow to the next phase. Verification Efficiency = Errors Detected / Total Errors Present page 56
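The simplified error model can be chained across phases. The following sketch applies the verification-efficiency relation phase by phase; the function name and the numbers in the example are illustrative, not the case study's actual data.

```python
def propagate_defects(phases):
    """Chained version of the simplified error model. Each phase is a
    (defects_injected, verification_efficiency) pair, where efficiency
    = errors detected / total errors present. Detected errors are
    removed; undetected errors flow into the next phase.
    Returns (total_detected, escaped_to_field)."""
    undetected = 0.0
    total_detected = 0.0
    for injected, efficiency in phases:
        present = undetected + injected   # carried-over plus newly injected
        detected = efficiency * present   # removed by this phase's verification
        total_detected += detected
        undetected = present - detected   # escapes to the next phase
    return total_detected, undetected
```

For example, two phases each injecting defects and detecting half of what is present, `propagate_defects([(100, 0.5), (50, 0.5)])`, yield 100 detected and 50 escaped.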

More Detailed Error Model [Diagram: errors injected, detected, and undetected across six phases - SW Req. Analysis & Preliminary Design (Dev TA, TA Rev, Rwk), SW Detailed Design (Dev UA, UA Rev, Rwk), Coding (Code, Code Rev, Rwk), Unit Test (UT, Rwk), Process Test (PT, Rwk), and Integration & Formal Test (Int Test, Rwk); undetected errors from each phase flow into the next] page 57

Linking Effort, Duration, and Staffing [Diagram linking model quantities: inputs include Size, Productivity, Duration, Task Offset, Earned Value, Staffing, and Detection and Rework Costs; computed outputs (marked *) include Effort, Project and Activity Duration, and Equivalent Manpower] page 58

Overview of PTAM Set up Phase - Set the Goal of the Modeling Effort - Specify Questions for the Model to Address - Define Process Performance Measures - Identify Input Data Gather Information Modeling Phase Analysis Phase page 59

Analysis Phase Once the model results are validated and viewed as being credible, they can be used to support decisions. Major Steps Evaluate Baseline Process Alternatives Determine Tradeoff Rule(s) Conduct Sensitivity Analyses Select Alternative(s) for Implementation page 60

Project Level Outputs Which Alternative to Choose?

CONFIG     Delivered Defects   Life Cycle Effort   Project Duration
W W N N    13.4                51.72               17.81
F F N N    12.6                52.83               17.26
W W N W     9.1                48.79               14.92
W W W W     6.6                47.25               12.85
F F F F     3.3                48.60               12.11

page 61

Comparison by Mean Difference

CONFIG     Reduced Defects   Reduced Effort   Reduced Duration
W W N N     0.00              0.00             0.00
F F N N     0.80             -1.11             0.55
W W N W     4.34              2.92             2.89
W W W W     6.82              4.47             4.96
F F F F    10.18              3.12             5.71

page 62

Case Study: Baseline Comparison [Figure 2 - Performance Measure Distributions: distribution plots of Quality (REM_ERR), Schedule (TOT_DUR), and Effort (TOT_EFF) for CUM = 50 and CUM = 80] REM_ERR = number of remaining errors; TOT_DUR = total project duration (in days); TOT_EFF = total staff effort (in days); CUM = cumulative error detection capability (% of initial errors detected); 50 = "AS-IS" No Inspection baseline; 80 = "TO-BE" Inspection baseline page 63

Analysis Phase Evaluate Baseline Process Alternatives Determine Tradeoff Rule(s) Conduct Sensitivity Analyses Select Alternative(s) for Implementation page 64

Determine Tradeoff Rule(s) Which alternative is best? Need to reduce multiple performance measures to one decision statistic that can be used to rank process alternatives. Possible Options Utility functions Financial measures (e.g. Net Present Value (NPV), Internal Rate of Return (IRR aka ROI), etc.) Optimization techniques (e.g. Data Envelopment Analysis (DEA)) Analytic Hierarchy Process (AHP) Combination page 65

Financial Measures of Performance Gets management interest (and excitement) Supports building a business case Trick is to convert performance measures to cash equivalents Examples: - Net present value (NPV) - Internal rate of return (IRR aka ROI), etc. - Discounted Payback period page 66
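A minimal NPV helper illustrates the conversion; the cash-flow figures in the example are hypothetical, not from the case study.

```python
def npv(rate, cash_flows):
    """Net present value of a series of period cash flows, with
    cash_flows[0] occurring at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical process change: $100,000 implementation cost now,
# $60,000 in yearly savings for three years, discounted at 15%.
value = npv(0.15, [-100_000, 60_000, 60_000, 60_000])
```

A positive result means the discounted savings exceed the implementation cost at the chosen discount rate, which is the basic business-case test.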

Determining Financial Benefits Need to reduce all benefits to cash equivalents Implementation costs are easy to include Effort is a straightforward conversion Some measures can be converted to effort (e.g. the number of customer defects is converted to the effort to correct them) Other measures (e.g. time to market) can be difficult to convert. page 67

Ranking by Financial Performance

Rank   CONFIG     NPV(15%) Mean   NPV(15%) StDev   Pr(NPV<0) (Risk)
1      F F F F    $362,291.35     $118,344.45       0.11%
2      W W W W    $253,041.92     $68,513.12        0.08%
3      W W N W    $157,874.18     $44,518.84        0.09%
4      F F N N    $27,836.80      $26,910.00       15.15%
5      W W N N    $0.00           NA               NA

page 68
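The Pr(NPV < 0) risk statistic can be estimated by Monte Carlo simulation. The sketch below assumes normally distributed yearly savings; the distribution shape, function name, and all figures are hypothetical, not the case study's model.

```python
import random

def prob_npv_negative(impl_cost, mean_savings, sd_savings,
                      rate=0.15, years=3, trials=20_000, seed=7):
    """Monte Carlo sketch of Pr(NPV < 0): draw yearly savings from an
    assumed normal distribution and report the share of trials in which
    the discounted cash flows fail to repay the implementation cost."""
    rng = random.Random(seed)
    negative = 0
    for _ in range(trials):
        flows = [-impl_cost] + [rng.gauss(mean_savings, sd_savings)
                                for _ in range(years)]
        value = sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))
        if value < 0:
            negative += 1
    return negative / trials
```

Tight savings distributions relative to the implementation cost drive the estimated risk toward zero, mirroring the low Pr(NPV<0) values for the top-ranked configurations.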

Case Study: Cash Flows [Cash flow timeline: implementation costs of $31,650 at time = 0; field service effort savings of $105,569 and life cycle effort savings of $4,591 realized at time = 21 months; 21 months = TO-BE component duration of 16.44 months × 1.25; evaluation horizon of 33 months = project duration of 21 months + 12 months] page 69

Case Study: Results The process change offered significant reductions in remaining defects, staff effort to correct field detected defects, and project duration. The expected ROI was 56% for a typical 30 KLOC release. Pilot implementations indicated that the process change provided a 37% ROI even under worst case conditions. page 70

Analysis Phase Evaluate Baseline Process Alternatives Determine Tradeoff Rule(s) Conduct Sensitivity Analyses Select Alternative(s) for Implementation page 71

Conduct Sensitivity Analyses "What if" analyses allow managers to apply the model to evaluate the proposed process change(s) under different business conditions and assumptions Provides added insight into, and confidence in, the potential process change page 72
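A one-at-a-time parameter sweep is the simplest form of such a what-if analysis. The toy single-step defect model below is an assumed illustration: the 1150 injected defects echo the earlier example table, but the model and the baseline efficiency of 0.6 are hypothetical.

```python
def escaped_defects(injected, efficiency):
    """Toy what-if model: defects escaping one verification step."""
    return injected * (1.0 - efficiency)

# One-at-a-time sensitivity sweep around an assumed baseline
# verification efficiency of 0.6, for 1150 injected defects.
sweep = {eff: escaped_defects(1150, eff) for eff in (0.4, 0.5, 0.6, 0.7, 0.8)}
```

Plotting or tabulating `sweep` shows how sensitive the escaped-defect count is to the efficiency assumption, which tells management how much the conclusion depends on that input.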

Case Study: Questions Investigated Will the process change improve project performance? What is the cost the firm is currently paying by conducting Unit Tests incorrectly? Is partial implementation of the proposed process change possible? How would potential learning curve effects affect the performance of the process change? Would alternative process changes offer a greater improvement? Can the project benefit from reusing process artifacts? page 73

Case Study: Results Obtained Compressing Unit Test causes significant increases in schedule (+18%) and effort costs (+8%) during the later testing phases and reduces overall product quality (+48% increase in defects). Partial implementation of the process change is possible for complex portions of the code; estimated ROI is 72%. Potential learning curve effects significantly enhance the performance of the process change, with an expected ROI of 72% assuming only moderate improvements. page 74

Case Study: Results Obtained Improving inspections would be a more effective process improvement than the Creating Unit Test Plans process change. Reusing the Unit Test Plans on the next development cycle provided an overall ROI of 73% (compared to 56% expected improvement without reuse) page 75

Analysis Phase Evaluate Baseline Process Alternatives Select Evaluation Method and Criteria Conduct Sensitivity Analyses Select Alternative(s) for Implementation page 76

Select Alternative(s) for Implementation Process simulation can be used to estimate the ROI and risk Results are traded off against other factors not included in the model, such as budget and political considerations Utilize all the information at hand (quantitative and qualitative) to choose the best alternative page 77

6 Examples of Process Simulation Applications in Industry and Government page 78

NASA IV&V Mission: To independently verify and validate software on all missions that are life critical or have significant vehicle cost involved. Problem: Limited resources to conduct IV&V. Critical need to deploy IV&V in most effective manner possible (biggest return on investment) Goal to optimize IV&V within a project and across projects. page 79

NASA IV&V Description of Model Based on IEEE 12207 Software Development Process Tuned for large-scale NASA projects (Size >100 KSLOC) (uses actual data) 8 major life cycle phases; 86 process steps Includes IV&V Layer Compares alternative IV&V configurations (ROI) page 80

NASA Model Includes IV&V Layer with IEEE 12207 Software Development Lifecycle page 82

IV&V Layer: Select criticality levels for IV&V techniques using pull-down menus page 83

A Look Inside the Model page 84

What is IV&V?
- IV&V = Independent Verification and Validation
- Performed by one or more independent groups
- Can be employed at any phase, with different levels of coverage
page 85

IV&V Techniques
- Traceability Analysis
- Software Design Evaluation
- Interface Analysis
- Criticality Analysis
- Component Test Plan Verification
- V&V Test Design Verification
- Hazard Analysis
- etc.
page 86

Importance/Benefits: Enduring Needs at the IV&V Level
- IV&V new business planning (independent bottom-up cost estimation for NASA projects and for IV&V)
- IV&V policy research (IV&V strategies for alternative NASA project types)
- IV&V services contract bid support
- IV&V services replanning
- Cost/benefit evaluation of new technologies and tools
- Space science data mining
page 87

Macro IV&V Questions
- What is the optimal IV&V strategy for a given NASA project or NASA project type?
- What combination(s) of IV&V techniques enable us to meet or exceed the quality assurance goals for the system?
- Given a budget of X dollars, what IV&V activities should be conducted?
- What if the complexity or defect profiles for a particular project were different than expected?
- How is the duration of the IV&V effort impacted by the overall staffing level for the project?
page 88
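The budget question above is a small portfolio-selection problem once the simulation has attached a cost and an estimated benefit to each technique. A minimal sketch, using exhaustive search over subsets; the technique names, costs, and defect-reduction values are hypothetical placeholders, not outputs of the NASA model:

```python
from itertools import combinations

# Hypothetical IV&V techniques with illustrative cost (person-months)
# and estimated latent-defect reduction from a simulation run.
techniques = {
    "traceability_analysis":  (4.0, 30),
    "design_evaluation":      (6.0, 45),
    "interface_analysis":     (3.0, 20),
    "test_plan_verification": (5.0, 35),
}

def best_portfolio(budget):
    """Exhaustively pick the subset maximizing defect reduction within budget."""
    best, best_value = (), 0
    names = list(techniques)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(techniques[t][0] for t in subset)
            value = sum(techniques[t][1] for t in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return best, best_value

print(best_portfolio(budget=10.0))
```

Exhaustive search is fine for a handful of techniques; with many techniques the same question becomes a 0/1 knapsack problem and would be solved by dynamic programming instead.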

Preliminary Study
- Use the model to quantitatively assess the benefits of performing IV&V on software development projects.
- Compare the benefit of applying IV&V activities at different phases and in combination.
page 89

Impact of IV&V at Different Points in the Development Process

Result Comparison
Case  Configuration           Total Effort     Rework Effort    Duration  Corrected Defects  Latent Defects
                              (Person-Months)  (Person-Months)  (Months)  (Number)           (Number)
1     Baseline                346.26           201.65           58.42     6,038.26           629.48
2     IV&V at Validation      355.35           210.75           59.95     6,113.79           574.17
3     IV&V at Code            334.13           189.53           57.38     6,134.84           573.49
4     IV&V at Design          327.93           183.33           56.56     6,123.11           581.27
5     IV&V at Requirements    326.82           182.21           56.40     6,078.87           600.04

% Improvement Compared to the Baseline
Case  Configuration           Total Effort     Rework Effort    Duration  Corrected Defects  Latent Defects
2     IV&V at Validation      -2.63%*          -4.51%*          -2.63%*   +1.25%             +8.79%*
3     IV&V at Code            +3.50%*          +6.01%*          +1.77%    +1.60%             +8.90%*
4     IV&V at Design          +5.29%*          +9.09%*          +3.17%*   +1.41%             +7.66%*
5     IV&V at Requirements    +5.62%*          +9.64%*          +3.46%*   +0.67%             +4.68%*
page 90
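The percentage-improvement rows can be recomputed directly from the means. A minimal sketch, with the values transcribed from the table and a sign convention inferred from it (improvement is a decrease for effort, duration, and latent defects, but an increase for corrected defects); the recomputed values match the table to within rounding:

```python
baseline = {"total_effort": 346.26, "rework": 201.65, "duration": 58.42,
            "corrected": 6038.26, "latent": 629.48}
ivv_at_req = {"total_effort": 326.82, "rework": 182.21, "duration": 56.40,
              "corrected": 6078.87, "latent": 600.04}

# "Lower is better" for effort, duration, and latent defects;
# "higher is better" for corrected (found-and-fixed) defects.
LOWER_IS_BETTER = {"total_effort", "rework", "duration", "latent"}

def pct_improvement(base, case, metric):
    """Percent improvement of a simulated case over the baseline mean."""
    delta = base - case if metric in LOWER_IS_BETTER else case - base
    return 100.0 * delta / base

for m in baseline:
    print(f"{m:12s} {pct_improvement(baseline[m], ivv_at_req[m], m):+6.2f}%")
```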

Impact of IV&V Techniques in Combination

Result Comparison
Case  Configuration                 Total Effort     Rework Effort    Duration  Corrected Defects  Latent Defects
                                    (Person-Months)  (Person-Months)  (Months)  (Number)           (Number)
1     Baseline                      346.26           201.65           58.42     6,038.26           629.48
6     IV&V at Code and Validation   342.14           197.54           58.78     6,203.66           524.96
7     IV&V at Req and Code          316.15           171.55           54.41     6,170.94           547.74
8     Two IV&V Techniques at Code   327.10           182.50           57.54     6,180.22           540.60

% Improvement Compared to the Baseline
Case  Configuration                 Total Effort     Rework Effort    Duration  Corrected Defects  Latent Defects
6     IV&V at Code and Validation   +1.19%           +2.04%           -0.63%    +2.74%             +16.60%*
7     IV&V at Req and Code          +8.69%*          +14.93%*         +6.86%*   +2.20%             +12.99%*
8     Two IV&V Techniques at Code   +5.53%*          +9.50%*          +1.50%    +2.35%             +14.12%*
page 91

Rapidly Deployable Software Process Simulation Models and Training
Goal: To create a flexible decision-support tool that can easily be used to support better project management, planning, and tracking by quantitatively assessing the economic benefit of proposed process alternatives.
Motivation: Companies need to get useful results from simulation models quickly.
page 92

Rapidly Deployable Process Models
[Diagram: Software development process flow, from "Project is Approved" through Functional Spec, FS Inspection, High-Level Design, HLD Inspection, Low-Level Design, LLD Inspection, Code Development, Code Inspection, Unit Test, Functional Test, System Test, and Release to Customers, followed by Field Support and Maintenance. The proposed process change inserts Create UT Plan, Inspect UT Plan, and Follow UT Plan before Unit Test. The life-cycle model is composed of generic process blocks (REQ, DES, IMP, TEST, CUST, TP, TCG) driven by generalized equations.]
page 93
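The generic process blocks can be thought of as parameterized phases that accumulate effort and pass undetected defects downstream. A minimal sketch of that idea; all the phase parameters here are hypothetical placeholders, not values from the actual rapidly deployable models:

```python
import random

# Minimal phase-based model in the spirit of the generic blocks above
# (REQ, DES, IMP, TEST). All rates are hypothetical, for illustration only:
# (name, effort per KLOC in person-months, defects injected per KLOC,
#  fraction of outstanding defects detected in the phase)
PHASES = [
    ("REQ",  0.8,  5.0, 0.3),
    ("DES",  1.2,  8.0, 0.4),
    ("IMP",  2.0, 15.0, 0.5),
    ("TEST", 1.5,  0.0, 0.7),
]

def simulate(size_kloc, seed=None):
    """One run: returns (total effort in person-months, latent defects)."""
    rng = random.Random(seed)
    effort, outstanding = 0.0, 0.0
    for name, epk, inject, detect in PHASES:
        effort += epk * size_kloc * rng.uniform(0.9, 1.1)  # effort uncertainty
        outstanding += inject * size_kloc                   # defects injected
        found = outstanding * detect                        # defects detected
        effort += 0.05 * found                              # rework per fix
        outstanding -= found
    return effort, outstanding  # defects still outstanding escape to the field

print(simulate(100, seed=1))
```

A real process simulation model would add feedback loops (rework re-entering earlier phases), stochastic defect flows, and calibration against project data; this sketch only shows the block-and-equation structure.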

Simulation Dashboard page 94

Demonstration page 95

7 Wrap-up / Conclusions page 96

Conclusions
Process simulation modeling has been used successfully to quantitatively address a variety of issues, from strategic management to process understanding. Key benefits include:
- Decision support and tradeoff analysis
- Sensitivity analysis ("what if")
- Supports industry certification and process improvement programs, including CMMI, Six Sigma, and others
- Supports CMMI at all levels, 2 through 5
- Design and define processes
- Benchmarking
- Can address project manager concerns
- Supports project management and control
page 97

Conclusions
- The Process Tradeoff Analysis Method (PTAM) provides a tested approach for developing models and utilizing the results
- Not a silver bullet
- Focus on RAPID DEPLOYMENT:
  - Reducing the cost and time to develop models
  - Making models easier to use
  - No simulation expert needed
page 98

The End Questions? page 99

Contact Information
David M. Raffo, Ph.D.
Associate Professor; Visiting Scientist, Software Engineering Institute; President, Vaiyu, Inc.
raffod@pdx.edu
503-725-8508 / 503-939-1720
page 100