From Scientific Methods to High Maturity with PSP(SM)/TSP(SM): High Maturity Made Easy!

James Over, Software Engineering Institute. E-mail: jwo@sei.cmu.edu
Yoshihiro Akiyama, Ph.D., Kyushu Institute of Technology & Next Process Institute Ltd. E-mail: y.akiyama@ieee.org
September 17, 2009. Sponsored by the U.S. Department of Defense.
2007 by Carnegie Mellon University / Next Process Institute Ltd.

Trademarks and Service Marks
The following are service marks of Carnegie Mellon University: Team Software Process(SM), TSP(SM), Personal Software Process(SM), PSP(SM). The following are registered trademarks of Carnegie Mellon University: Capability Maturity Model Integration, CMMI.

The Road to High Maturity: PSP/TSP
We all know the characteristics of a high-maturity process. We can get to high maturity through scientific measures that inform our decisions and our work as the work progresses. Let's talk about high maturity and the barriers to getting there.

Quantitative Project Management: CMMI
SG 1: The project is quantitatively managed using quality and process-performance objectives.
SP 1.4-1: Manage Project Performance. Monitor the project to determine whether the project's objectives for quality and process performance will be satisfied, and identify corrective action as appropriate.
SG 2: The performance of selected subprocesses within the project's defined process is statistically managed.
To implement these, use Process Performance Baselines (PPBs) and Process Performance Models (PPMs).

Purpose of Organizational PPMs and PPBs
To engineer successfully:
1. Define aggressive but achievable objectives.
2. Develop a plan with sufficient detail that you can commit to the objectives.
3. Measure and control the process variations and the project progress.
4. Adjust the plan, as needed, to meet the objectives.

What Makes This Difficult?
You must have measures to estimate, evaluate, and control the quality and process performance of a project. However, measurement poses difficult challenges:
- Sufficiency of data at planning time
- Definition of the data and subprocesses needed to monitor performance variation
- Accuracy of the data used for analysis and control
- Context of the data must be understood
- Size of the data set may be insufficient for applying statistical methods

The TSP/PSP Approach
TSP and PSP fill the gap:
- A measurement and planning framework
- Estimation and planning for individuals and teams
- Evaluation of plan and performance
- Monitoring of quality
We can apply these to both individuals and projects.

PSP Principles
The quality of a software system is determined by its worst components. A developer learns skills for continual process improvement:
- Learn to estimate, measure, track, analyze, and make a detailed plan
- Habitually log data: time, defects, and size
- Understand current performance and performance variation
- Improve the process using Process Improvement Proposals (PIPs)
- Establish quality before testing

PSP: The Planning Framework
(Flow diagram: user needs feed requirements definition and a conceptual design; from the conceptual design, size is estimated using a size database and resources are estimated using a productivity database; a schedule is produced from the resources available; the product is developed under monitoring through delivery; and the size, resource, and schedule data feed process analysis and tracking reports.) The conceptual design shows how building blocks (parts) bridge the requirements and the products to be developed.

The Size-Estimation Dilemma in Project Planning
Early in a project, very little information is available for estimating, yet accurate effort estimates are needed to make accurate plans and commitments. The Proxy-Based Estimating (PROBE) method makes accurate early estimation possible.

PSP PROBE Method: Proxy Size Metric
Early project challenge: no precise sizes are known for the parts to be developed. Fuzzy logic helps specify a relative size for each type of object: VS, S, M, L, VL. (Example historical part sizes: 2, 5, 25, 30, 80, and 100 LOC.) A log-normal size distribution is assumed, with the five categories placed at Avg-2σ, Avg-σ, Avg, Avg+σ, and Avg+2σ. The relative size metric is then used to calculate each part's size.
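A sketch of how the relative-size categories on this slide could be computed from historical part sizes, assuming the log-normal fit the slide describes (the function name and data handling are ours, not part of PSP's defined scripts):

```python
import math
import statistics

def relative_size_table(part_sizes_loc):
    """Fit a log-normal model to historical part sizes (LOC) and return the
    five PROBE relative-size categories VS, S, M, L, VL, placed at
    avg-2s, avg-s, avg, avg+s, avg+2s in log space, as on the slide."""
    logs = [math.log(s) for s in part_sizes_loc]
    avg = statistics.mean(logs)
    s = statistics.stdev(logs)
    offsets = {"VS": -2, "S": -1, "M": 0, "L": 1, "VL": 2}
    # Exponentiate back out of log space to get LOC values per category.
    return {name: math.exp(avg + k * s) for name, k in offsets.items()}

# Historical LOC per part, using the example sizes from the slide.
table = relative_size_table([2, 5, 25, 30, 80, 100])
for name in ("VS", "S", "M", "L", "VL"):
    print(f"{name}: {table[name]:.1f} LOC")
```

With this data the medium (M) category lands near 20 LOC, the geometric mean of the samples; the spread of the categories reflects the variation in the historical data.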

PSP PROBE Method: Proxy Size Estimate
(Chart: estimated size versus actual size, planned versus actual.) Estimation variations at the elementary level balance out at the total level.
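The balancing-out effect is a standard statistical one: independent per-part estimation errors partially cancel in the sum, so the total's relative error is roughly 1/sqrt(n) of a single part's. A small simulation sketch (all numbers hypothetical):

```python
import random
import statistics

random.seed(1)

# Hypothetical setup: 16 parts, each truly 100 LOC, and each part's
# estimate off by an independent random error of up to +/-25%.
N_PARTS, TRUE_SIZE, REL_ERR = 16, 100.0, 0.25
TRIALS = 2000

part_rel_errors, total_rel_errors = [], []
for _ in range(TRIALS):
    estimates = [TRUE_SIZE * (1 + random.uniform(-REL_ERR, REL_ERR))
                 for _ in range(N_PARTS)]
    # Relative error of one part vs. relative error of the project total.
    part_rel_errors.append(abs(estimates[0] - TRUE_SIZE) / TRUE_SIZE)
    total = sum(estimates)
    total_rel_errors.append(abs(total - N_PARTS * TRUE_SIZE) / (N_PARTS * TRUE_SIZE))

print(f"mean |error| of a single part: {statistics.mean(part_rel_errors):.1%}")
print(f"mean |error| of the total:    {statistics.mean(total_rel_errors):.1%}")
```

With 16 parts the total's error comes out around a quarter of a single part's, which is why part-level estimates that look noisy can still yield a usable project-level plan.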

PROBE Method: PPMs on Size and Time Using the Correlation Between Estimate and Actual
Regression on the estimate-versus-actual data yields a 70% prediction interval: a size range and a time range.

PPB and PSP
(Diagram: the size estimate feeds a regression on historical size data and a regression on historical time data; each regression yields a prediction interval, and together they define the process performance range, i.e., the baselines, for this project against its objectives.) The PROBE method helps agile planning with accuracy!
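A sketch of the regression and prediction-interval computation PROBE uses, stdlib only (the function name and sample data are hypothetical; the Student-t quantile is passed in rather than computed, since computing it needs a statistics library):

```python
import math
import statistics

def probe_interval(x_hist, y_hist, x_new, t_value):
    """PROBE-style linear regression of actuals (y) on estimates (x), plus a
    prediction interval for a new estimate x_new.  t_value is the Student-t
    quantile for the desired interval; e.g. the 0.85 quantile with n-2
    degrees of freedom gives a 70% two-sided interval."""
    n = len(x_hist)
    xbar, ybar = statistics.mean(x_hist), statistics.mean(y_hist)
    sxx = sum((x - xbar) ** 2 for x in x_hist)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(x_hist, y_hist))
    b1 = sxy / sxx                    # regression slope
    b0 = ybar - b1 * xbar             # regression intercept
    y_hat = b0 + b1 * x_new           # point prediction
    sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(x_hist, y_hist))
    sigma = math.sqrt(sse / (n - 2))  # standard deviation of the residuals
    half = t_value * sigma * math.sqrt(1 + 1 / n + (x_new - xbar) ** 2 / sxx)
    return y_hat - half, y_hat, y_hat + half

# Hypothetical history: estimated vs. actual size (LOC) for five past parts.
est = [10, 20, 30, 40, 50]
act = [12, 24, 33, 46, 55]
low, mid, high = probe_interval(est, act, 35, t_value=1.25)  # t(0.85, df=3)
print(f"predicted size: {mid:.1f} LOC, 70% interval: [{low:.1f}, {high:.1f}]")
```

The same computation run on time data instead of size data gives the time range; the interval width is the project's process-performance range from the baselines.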

TSP Principles
- Use a self-directed team to manage knowledge work. Knowledge work must be managed by the team and the individuals who actually do the work.
- The TSP launch process creates a self-directed team.
- A detailed plan is developed before committing objectives to management and the customer.
- Team management is accomplished through TSP weekly meetings and management reporting.

TSP Launch Process
1. Establish product and business goals
2. Assign roles and define team goals
3. Produce development strategy and process
4. Build overall and near-term plans
5. Develop the quality plan
6. Build individual and consolidated plans
7. Conduct risk assessment
8. Prepare management briefing and launch report
9. Hold management review
Launch postmortem.
A qualified coach guides the team through a defined process to develop its plan and to negotiate that plan with management. Ref: SEI course, Leading a Development Team.

Project-Level Data Decomposed to and Reaggregated from Individual Data in TSP
(Diagram: requirements, business goals, and team goals enter the launch/relaunch; the self-directed project team, consisting of the team leader and members A through E, produces a team data/plan; each member maintains an individual data/plan, and status updates are reaggregated into the team data/status/plan.)

TSP-Derived Measures for Subprocesses, Generated from the Team Data in Real Time
(Diagram: from the team data/plan and the individual data/plans, subprocess measures are derived in real time: quality of component, schedule, yields (defect filtering), estimate accuracy, and rate (speed); these are reported weekly to management against the project's requirements, goals, objectives, and risks.)

TSP Week: Weekly Progress Status, Variance in Task-Hour Goal and Project End Date
(Chart: task hours planned at start versus currently, the current week number, and the project completion date as baselined at start, as top-down planned at start, and as now projected.)

TSP Week: Weekly Progress Status, Variance in Schedule Hours
Week: 61.6 hours worked this week, which is 61% less than planned. Cycle: 318.7 hours worked in total, which is 39% less than planned.

TSP Week: Weekly Progress Status, Variance in Weekly Completed Tasks
Week: 36% under plan for this week. Cycle: 22% under plan for this cycle.

TSP Week: Weekly Progress Status, Variance in Accuracy of Estimates
Task hours: an 18% underestimate of task hours. Schedule hours: a 39% underestimate of available task hours per week.

TSP Week: Weekly Progress Status, Variance in Progress
Question: when will this project complete? Schedule growth = 1.39 / 1.18 = 1.17; a net 17% schedule growth is expected.
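The schedule-growth arithmetic can be reproduced in a few lines (a sketch; reading the two variances as multiplicative factors is our interpretation of the slide):

```python
# 39% underestimate of available schedule hours and 18% underestimate of
# task hours, expressed as multiplicative factors as on the slide.
schedule_hours_factor = 1.39
task_hours_factor = 1.18
schedule_growth = schedule_hours_factor / task_hours_factor
print(f"expected schedule growth: {schedule_growth:.2f}")  # prints 1.18
```

The slide truncates the ratio to 1.17; either way, the projected end date slips by roughly 17-18% of the remaining schedule.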

Quantitative Management in TSP
A TSP team strives to manage its planned schedule using the following:
- Workload growth rate
- % task hours added to the baseline
- Review rate
- Defect injection and removal rates
- Process and phase yields, etc.
These may be used to generate PPMs and PPBs.

TSP Quality Profile
The Quality Profile and Process Quality Index (PQI) are useful tools for quantitative management, but they must be used carefully. Example:
1. An excellent PSP engineer was assigned to develop the six components in the example below.
2. Only one defect was reported in the IT phase.
3. Component1-1 and Component1-2 are closely related.
4. The TSP planning and quality parameters were used for the first time.
Structure: Subsystem contains Component1-1, Component1-2, Component2, Component3, Component4, and Component5.

TSP Quality Profile for a Component
- Design time: Min[1, DLD/CD]
- Design review time: Min[1, 2 x DLDR/DLD]
- Code review time: Min[1, 2 x CDR/CD]
- Unit test quality: Min[1, 10/(5 + UT defects/KLOC)]
- Compile quality: Min[1, 20/(10 + compile defects/KLOC)]
PQI is the product of the five indexes. (DLD = detailed-design time, DLDR = detailed-design review time, CD = code time, CDR = code review time.) Ref: A Self-Improvement Process for Software Engineers, Addison-Wesley, 2006.
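The five indexes and their product can be computed directly. A sketch (the function and parameter names are ours; times may be in any consistent unit):

```python
def pqi(dld, dldr, cd, cdr, compile_defects_per_kloc, ut_defects_per_kloc):
    """Process Quality Index from the slide's five component indexes.
    dld/dldr/cd/cdr are phase times: detailed design, design review,
    code, and code review."""
    design_time = min(1.0, dld / cd)                 # design time vs. code time
    design_review = min(1.0, 2 * dldr / dld)         # review >= half of design
    code_review = min(1.0, 2 * cdr / cd)             # review >= half of coding
    compile_quality = min(1.0, 20 / (10 + compile_defects_per_kloc))
    ut_quality = min(1.0, 10 / (5 + ut_defects_per_kloc))
    return design_time * design_review * code_review * compile_quality * ut_quality

# Design time matching code time, reviews at half the phase time, and a
# clean compile and unit test give the maximum PQI of 1.0.
print(pqi(dld=4, dldr=2, cd=4, cdr=2,
          compile_defects_per_kloc=0, ut_defects_per_kloc=0))
# Skimpy design and reviews plus many defects drag the index down sharply.
print(round(pqi(dld=1, dldr=0.25, cd=4, cdr=0.5,
                compile_defects_per_kloc=30, ut_defects_per_kloc=15), 3))
```

Because PQI is a product, a single weak index (for example, a skipped design review) collapses the whole score, which is exactly the behavior the Quality Profile is designed to expose.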

The Selected Six Components
Component2: PQI = 0.38. Component1-1: PQI = 1.00. Component1-2: PQI = 1.00. Component3: PQI = 0.21. Component4: PQI = 1.00. Component5: PQI = 0.65.

PQI vs. Post-Development Defects
(Chart: post-development defects/KLOC, 0 to 25, plotted against PQI, 0 to 0.6; high quality is expected at the higher PQI values.) Reference: Watts Humphrey, Winning with Software, Addison-Wesley, 2001.

c-Control Chart to Determine Defect-Risk Components
(Chart: 1/PQI, 0 to 5, plotted for Assembly1 through Assembly6 against USL and LSL limits; low-PQI components (PQI < 0.4) plot high on 1/PQI, and high-PQI components (PQI > 0.4) plot low.)
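For reference, the standard c-chart places its limits at three sigma around the mean count; the slide applies the same chart idea to 1/PQI rather than raw defect counts. A sketch with hypothetical per-assembly defect counts:

```python
import math

def c_chart_limits(counts):
    """Standard c-chart limits for per-unit counts: centerline c-bar,
    UCL = c-bar + 3*sqrt(c-bar), LCL = max(0, c-bar - 3*sqrt(c-bar))."""
    c_bar = sum(counts) / len(counts)
    spread = 3 * math.sqrt(c_bar)
    return max(0.0, c_bar - spread), c_bar, c_bar + spread

# Hypothetical defect counts for six assemblies.
counts = [4, 2, 5, 3, 16, 4]
lcl, center, ucl = c_chart_limits(counts)
# Flag the assemblies (1-indexed) falling outside the control limits.
flagged = [i + 1 for i, c in enumerate(counts) if not lcl <= c <= ucl]
print(f"c-bar={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, flagged: {flagged}")
```

An out-of-limit point marks an assembly whose defect behavior is inconsistent with the rest of the process, i.e., a candidate defect-risk component worth investigating.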

Does a High PQI Mean the Component Is Really Good?
Reasons a defective component can still have a high PQI:
- Defects are present but were not found during compile and unit test.
- Half of the development work time was spent in ineffective review.
- Design and code progressed with missing components that are identified in later phases.
- Defects may not have been recorded, etc.

Proposal: An Experience-Based Rule to Identify a Defect-Risk Component with a High PQI Value
1. Compare the "time in phase, actual %" data for similar components worked on by a PSP engineer.
2. Identify a component that shows a different pattern in the "time in phase, actual %" distribution, especially in the later phases. This may be considered a defect-risk component.
3. Group the components that are closely related to the defect-risk component. Each component in the group may be a defect risk even if it has a high PQI value.

"Time in Phase, Actual %" Analysis of the Six Components
(Chart: actual hours/LOC by phase; Component1-2 shows a strong anomaly.)
1. Component1-2 is a defect-risk component.
2. If Component1-1 and Component1-2 are closely related, Component1-1 may also be a defect risk.

Summary
- The CMMI high-maturity practices, i.e., the PPMs and PPBs, are easily implemented with PSP and TSP.
- Project-level PPMs and PPBs are generated by aggregating individual-level time, defect, and size data for estimating and quantitatively managing a project.
- Prediction intervals, control charts, and significance tests are used to present variation in estimating, monitoring, and evaluating, respectively; TSP data are used to derive them.
- A subprocess must not only be within specification but should also be stable. (A high PQI does not necessarily guarantee that the component is defect free.) Stability should be examined by the developers.

Questions?
Thank you for your attention. Contact information:
James Over, Software Engineering Institute, Carnegie Mellon University, jwo@sei.cmu.edu
Yoshihiro Akiyama, Ph.D., Kyushu Institute of Technology & Next Process Institute Ltd., y.akiyama@ieee.org