Calibration and Historical Data (Chapter 8)


(Three Dilbert cartoons by Scott Adams)

Calibration and Historical Data (Chapter 8)
Calibration is used to convert counts into estimates.
Example: counting defects late in the project.
- Given: 400 open defects.
- 250 defects have already been corrected, at an average of 2 hours per defect (the calibrated rate).
- Estimate of the hours needed to close the remaining defects: 400 × 2 = 800 hours.
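A minimal sketch of this calibration arithmetic in Python. The 500 total hours is an assumed figure, chosen only to be consistent with the slide's 2-hours-per-defect rate:

```python
# Calibrate an hours-per-defect rate from work already done, then apply
# it to the remaining open defects.
hours_spent = 500.0        # assumed: total hours spent on corrected defects
defects_corrected = 250

hours_per_defect = hours_spent / defects_corrected   # 2.0 hours per defect
open_defects = 400
remaining_effort = open_defects * hours_per_defect   # 800.0 hours
print(f"{remaining_effort:.0f} hours to close {open_defects} open defects")
```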

Calibration Data: Three Kinds
- Industry Data: data from other organizations that develop the same kind of software.
- Historical Data: past data from the organization doing the estimating.
- Project Data: data generated earlier in the project being estimated.

Historical (Your Own) Data
Estimates made using historical data are better than estimates made using any other type of data or estimating technique. Exceptions:
- For small projects, individual capabilities are key, and they may vary from project to project.
- For large projects, organizational influences may dominate.
- Large projects in trouble may not benefit from adding even capable people.
So what are these organizational influences?

Organizational Influences
- How complex is the software? What is the execution-time constraint? What reliability is required? How much documentation is required? How precedented is the application? (These are Cocomo II influence factors.)
- Can the organization commit to stable requirements, or must the project team deal with volatile requirements throughout the project?
- Is the project manager free to remove a problem team member from the project, or do the organization's Human Resources policies make it difficult or impossible to remove a problem employee?
- Is the team free to concentrate on the current project, or are team members frequently interrupted with calls to support production releases of previous projects?

More Influences
- Can the organization add team members to the new project as planned, or does it refuse to pull people off other projects?
- Does the organization support the use of effective design, construction, quality assurance, and testing practices?
- Does the organization operate in a regulated environment (for example, under FAA or FDA regulations) in which certain practices are dictated?
- Can the project manager depend on team members staying until the project is complete, or does the organization have high turnover?
Aside: what does high turnover tell you?

Influence Factors (e.g., Cocomo)
- Influence factors add "control knobs" to the estimation model.
- Many are related to personnel (requirements analysts, programmers, etc.).
- The Cocomo rating method uses industry-wide percentile groupings. For programmers: 90th percentile, 55th percentile, 25th percentile, 15th percentile.
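A rough sketch of how such knobs enter a Cocomo-style model: nominal effort is A × Size^B, and each influence factor contributes an effort multiplier. A = 2.94 is Cocomo II's published nominal constant, but the exponent here is fixed rather than derived from scale factors, and the multiplier values are invented for illustration, not Cocomo II's calibrated tables:

```python
import math

def cocomo_style_effort(ksloc: float, a: float, b: float, ems: dict) -> float:
    """Effort (staff months) = A * Size^B * product of effort multipliers."""
    return a * ksloc ** b * math.prod(ems.values())

# Illustrative multipliers: < 1 reduces effort, > 1 increases it.
ems = {
    "programmer_capability": 0.88,    # above-average team
    "required_reliability": 1.10,     # high-reliability product
    "requirements_volatility": 1.15,  # volatile requirements
}
print(round(cocomo_style_effort(50, a=2.94, b=1.10, ems=ems), 1))
```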

"... and then there is the sell"
- Upper management attacks the "fat" in the manager's estimate.
- In a hypothetical manager/executive meeting dialog (page 94), the executive's suggested (really, demanded) changes to the influence factors reduced the estimated effort by 23% and reduced the ratings of the engineers by 39%.
- A benefit of historical data: you don't need to estimate whether your programmers are above or below average.
- Example: a company has averaged 300 to 450 LOC per staff month, and used an assumed 400 LOC per staff month.
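A minimal sketch of estimating from that calibrated productivity; the 10,000-LOC project size is an invented input:

```python
# Effort from size, using the organization's own calibrated productivity.
loc_per_staff_month = 400     # assumed value within the historical 300-450 range
project_size_loc = 10_000     # invented project size

effort = project_size_loc / loc_per_staff_month
low, high = project_size_loc / 450, project_size_loc / 300
print(f"{effort:.1f} staff months (historical range: {low:.1f} to {high:.1f})")
```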

What Data to Collect?
Start with a small set; it may be all you will need:
- Size (LOC or something else you can count)
- Effort (staff months or staff years)
- Time (calendar months)
- Defects (grouped by severity/priority)
Collect data from more than one project: numbers calibrated from a single project can be off by a factor of 2!
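One way to keep such records consistent across projects is a small, fixed structure. This is just an illustrative sketch, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    """The small historical-data set the slide suggests, per project."""
    name: str
    size_loc: int               # counted under written-down counting rules
    effort_staff_months: float
    schedule_months: float
    defects_by_severity: dict   # e.g. {"critical": 3, "major": 17, "minor": 40}

# Invented example record.
history = [
    ProjectRecord("billing-v2", 42_000, 110.0, 14.0,
                  {"critical": 5, "major": 60, "minor": 200}),
]
```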

Lines of Code (LOC)
Yes, there are other things to count (later chapters). But what are the definitional issues?
- Do you count all code, or only code that's included in the released software? (For example, do you count scaffolding code, mock object code, unit test code, and system test code?)
- How do you count code that's reused from previous versions?
- How do you count open source code or 3rd-party library code?
- Do you count blank lines and comments, or only non-blank, non-comment source lines?
- Do you count class interfaces?
- Do you count data declarations?
- How do you count lines that make up one logical line of code but that are broken across multiple lines for the sake of readability?
One possible set of answers is sketched below.
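A minimal sketch of one concrete answer: count only non-blank, non-comment physical lines. It assumes '#'-style comments, so it only suits Python-like sources; a real counter must also handle block comments, strings containing '#', and so on:

```python
def count_sloc(path: str) -> int:
    """Count non-blank, non-comment physical source lines."""
    sloc = 0
    with open(path) as f:
        for line in f:
            stripped = line.strip()
            if stripped and not stripped.startswith("#"):
                sloc += 1
    return sloc
```

Which rules you choose matters less than writing them down and applying them the same way on every project you collect data from.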

Subjectivity Creeps In
- Estimates cannot control for what went wrong in the last project by assuming it will not happen again; what went wrong is most often not something the project manager has control over.
- Remember: by using historical data, you assume that the next project will be just like the last project, although this data may be the best you can get.

Effort?
- Do you count time in hours, days, or some other unit?
- How many hours per day do you count? A standard 8 hours, or the actual hours applied to the specific project?
- Do you count unpaid overtime?
- Do you count holidays, vacation, and training? Do you make allowances for all-company meetings?
- What kinds of effort do you count? Testing? First-level management? Documentation? Requirements? Design? Research?
- How do you count time that's divided across multiple projects?
- How do you count time spent supporting previous releases?
- How do you count time spent supporting sales calls, trade shows, and so on?
- How do you count travel time?
- How do you count "fuzzy front-end" time, the time spent firming up the software concept before the project is fully defined?
One concrete set of answers is sketched below.
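A sketch of converting logged hours to staff months under explicit, written-down rules. The rule choices here (8-hour days, about 21.7 workdays per month, unpaid overtime counted) are one possible answer to the questions above, invented for illustration:

```python
HOURS_PER_STAFF_MONTH = 8 * 21.7   # ~173.6 hours under these rules

def staff_months(logged_hours: float, unpaid_overtime: float = 0.0) -> float:
    """Normalize raw hours to staff months, counting unpaid overtime."""
    return (logged_hours + unpaid_overtime) / HOURS_PER_STAFF_MONTH

print(round(staff_months(1500, unpaid_overtime=120), 2))   # ~9.33
```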

Issues Related to Calendar Time
Many of the same questions apply when deciding what calendar time to count: which activities (requirements, design, research) are on the clock, how to treat time divided across multiple projects or spent supporting previous releases, sales calls, and trade shows, and whether the "fuzzy front-end" time counts toward the schedule.
NOTE: Don't forget what you are estimating; that is how you tell what data to collect. You are not fishing.

More Issues: Defect Measures
Finally, defect counts may vary by a factor of 2 or 3 depending on what's counted as a defect. Do you count:
- All change requests as defects, or only those that are classified as defects rather than feature requests?
- Multiple reports of the same defect as a single defect or as multiple defects?
- Defects that are detected by developers, or only those detected by testers?
- Requirements and design defects that are found prior to the beginning of system testing?
- Coding defects that are found prior to the beginning of alpha or beta testing?
- Defects reported by users after the software has been released?
One way to make the chosen rules explicit is sketched below.
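A sketch of encoding one possible set of counting rules (exclude feature requests, collapse duplicate reports). The record fields and the rule choices are invented for illustration:

```python
def count_defects(change_requests: list[dict]) -> int:
    """Count defects under explicit rules: no feature requests, dedupe reports."""
    counted_ids = set()
    for cr in change_requests:
        if cr["kind"] != "defect":                        # exclude feature requests
            continue
        counted_ids.add(cr["duplicate_of"] or cr["id"])   # collapse duplicates
    return len(counted_ids)

crs = [
    {"id": 1, "kind": "defect",  "duplicate_of": None},
    {"id": 2, "kind": "defect",  "duplicate_of": 1},      # same underlying defect
    {"id": 3, "kind": "feature", "duplicate_of": None},
]
print(count_defects(crs))   # 1
```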

Collect Data as You Go
- Short-term memory makes it difficult to reconstruct the past; collect data during the project or immediately after.
- Collecting data periodically during the project also lets you calibrate elapsed-time measures, for example the time between different categories of problem reports.

What to Calibrate
Here are some examples of models you could create:
- Our developers average X lines of code per staff month.
- A 3-person team can deliver X stories per calendar month.
- Our team is averaging X staff hours per use case to create the use case, and Y hours per use case to construct and deliver it.
- Our testers create test cases at a rate of X hours per test case.
- In our environment, we average X lines of code per function point in C# and Y lines of code per function point in Python.
- On this project so far, defect-correction work has averaged X hours per defect.
A sketch of deriving one such rate appears below.
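A minimal sketch of deriving "hours per test case" from collected records; the data values are invented:

```python
# Each sample: (test cases created, staff hours spent). Invented data.
samples = [(30, 45.0), (22, 35.5), (41, 70.0)]

total_cases = sum(cases for cases, _ in samples)
total_hours = sum(hours for _, hours in samples)
hours_per_test_case = total_hours / total_cases
print(f"{hours_per_test_case:.2f} hours per test case")   # ~1.62
```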

Beware of Scaling Up (or Down)
Scaling errors may be large: diseconomies of scale can appear even with small variations in project size or team size.

Team Size | Average Stories Delivered per Calendar Month
1         | 5
2-3       | 12
4-5       | 22
6-7       | 31
8         | No Data
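A quick per-person reading of the table's own numbers (the low/high split within each team-size range is my assumption). The spread within each row shows how sensitive a "stories per person" calibration is, and the 8-person row has no data at all, so extrapolating there is a guess:

```python
# (low team size, high team size, stories delivered per month), from the table.
rows = [(1, 1, 5), (2, 3, 12), (4, 5, 22), (6, 7, 31)]
for low, high, stories in rows:
    print(f"{low}-{high} people: {stories/high:.1f} to {stories/low:.1f} "
          f"stories per person per month")
```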

Using Project Data to Refine Your Estimates
The sooner you can use project data for calibration and estimating, the sooner your estimates will become more accurate: the Cone of Uncertainty narrows. So switch away from organizational (historical) data or industry-average data as soon as possible.

Calibration with Industry Data (pages 100-101)
- Figure 8-1 (industry averages, 25th to 75th percentiles, for effort in staff months and schedule in months): effort ranges from 160 down to 50 staff months, a factor-of-3 difference.
- Figure 8-2 (the organization's own historical data, same percentiles and measures): effort ranges from 95 down to 70 staff months, a factor-of-1.4 difference.

Review of Studies on Estimation Accuracy
- When estimation models were NOT calibrated to the organization's historical data, expert estimates were more accurate than the models.
- When estimation models were calibrated with the organization's historical data, the models were as good as or better than expert estimates.
- Historical data works better than industry-wide data; not surprising.