University of Southern California Center for Software Engineering (CSE)


1 COCOMO II: Airborne Radar System Example
Dr. Ray Madachy, C-bridge Internet Solutions / Center for Software Engineering
15th International Forum on COCOMO and Software Cost Modeling, October 24, 2000

2 Outline
- Overview of the Airborne Radar System (ARS)
- Demonstrate progressive usage of different COCOMO sub-models within an evolutionary spiral development process
- Cover estimation of reuse, modification, COTS, and automated translation
- Show how an aggregate estimate is refined in greater detail
- Show some estimation aids

3 ARS System Overview
[System diagram: a display workstation and console exchanges user input and graphics commands with the central computer; the central computer sends radar control commands to, and receives radar data from, the radar unit computer and radar; other user commands and other sensor data also feed the central computer.]

4 Software Components
- Radar Unit Control: controls radar hardware
- Radar Item Processing: extracts information from returned radar data to identify objects
- Radar Database: maintains radar object tracking data
- Display Manager: high-level display management
- Display Console: user input device interface and primitive graphics processing
- Built In Test: hardware monitoring and fault localization

5 COCOMO Coverage in Evolutionary Lifecycle Process

Development Phase | ARS Product Milestone | COCOMO Sub-model
Inception | Prototype | Applications Composition
Elaboration | Breadboard | Early Design
Construction | Initial Operating Capability (IOC) | Post-Architecture*

* both top-level and detailed estimates shown

6 Prototype Size and Effort

Component | Size (Application Points) | Estimated Reuse | New Application Points (NAP)
Radar Unit Control | | % | 8.8
Radar Item Processing | | % | 45.5
Radar Database | 23 | 0 % | 23
Display Manager | 42 | 0 % | 42
Display Console | 17 | 0 % | 17
Built In Test | 0 | 0 % | 0
TOTAL | | | 136.3

Productivity is high at 25 NAP/PM
Effort = NAP / Productivity = 136.3 / 25 = 5.45 PM (or 23.6 person-weeks)
Personnel = 23.6 person-weeks / 6 weeks ~ 4 full-time personnel
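The prototype calculation above can be sketched as a few lines of arithmetic. The NAP values, productivity, and 6-week schedule are from the slide; the 4.33 weeks-per-month conversion is an assumption consistent with the slide's 23.6 person-week figure.

```python
# Applications Composition (Application Point) effort sketch for the
# ARS prototype. NAP values and productivity are taken from the slide;
# WEEKS_PER_MONTH is an assumed conversion factor.

nap = {
    "Radar Unit Control": 8.8,
    "Radar Item Processing": 45.5,
    "Radar Database": 23,
    "Display Manager": 42,
    "Display Console": 17,
    "Built In Test": 0,
}

PRODUCTIVITY = 25       # NAP per person-month (high rating)
WEEKS_PER_MONTH = 4.33  # assumed calendar conversion

total_nap = sum(nap.values())                  # 136.3
effort_pm = total_nap / PRODUCTIVITY           # ~5.45 person-months
effort_pw = effort_pm * WEEKS_PER_MONTH        # ~23.6 person-weeks
staff = effort_pw / 6                          # 6-week schedule

print(f"Total NAP: {total_nap:.1f}")
print(f"Effort: {effort_pm:.2f} PM ({effort_pw:.1f} person-weeks)")
print(f"Full-time staff over 6 weeks: ~{round(staff)}")
```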

7 Scale Factors for Breadboard

Factor | Rating
Precedentedness (PREC) | Nominal
Development Flexibility (FLEX) | Low
Risk/Architecture Resolution (RESL) | High
Team Cohesion (TEAM) | Nominal
Process Maturity (PMAT) | Nominal

8 Early Design Cost Drivers for Breadboard

Factor | Rating
Product Reliability and Complexity (RCPX) | High
Required Reuse (RUSE) | Very High
Platform Difficulty (PDIF) | High
Personnel Capability (PERS) | High
Personnel Experience (PREX) | Nominal
Facilities (FCIL) | Nominal
Schedule (SCED) | Nominal

9 Breadboard System Size Calculations

(Table columns: Component, Size (SLOC), Language, Type, REVL (%), DM, CM, IM, AA, SU, UNFM, AAM, Equivalent Size (SLOC). Most numeric entries did not survive transcription; the recoverable entries are listed below.)

- Radar Unit Control: Ada 95
- Radar Item Processing: 4500 SLOC; Ada 95 sub-parts of type New, Reused, New, and Translated
- Radar Database: 6272 SLOC; Ada 95 sub-parts of type New and Modified
- Display Manager: Ada 95/C New, Ada 95/C Reused, C COTS
- Display Console: 5400 SLOC and 2876 SLOC; C and microcode
- Built In Test: 4200 SLOC; Ada 95/assembler New, COTS, New
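The AAM and Equivalent Size columns on this slide come from the COCOMO II reuse model. As a sketch of how those columns are filled in, using the published COCOMO II.2000 formulas; the DM/CM/IM/AA/SU/UNFM parameter values in the example call are illustrative assumptions, not the slide's actual entries:

```python
# COCOMO II.2000 reuse model: Adaptation Adjustment Modifier (AAM)
# and equivalent-size calculation, with REVL inflation.

def aam(dm, cm, im, aa, su, unfm):
    """Adaptation Adjustment Modifier.

    dm, cm, im: % of design / code / integration effort modified
    aa:   assessment & assimilation increment (0-8)
    su:   software understanding increment (10-50)
    unfm: programmer unfamiliarity (0.0-1.0)
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        return (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    return (aa + aaf + su * unfm) / 100

def equivalent_sloc(adapted_sloc, revl=0, **params):
    """Equivalent new size, inflated for requirements evolution (REVL)."""
    return adapted_sloc * aam(**params) * (1 + revl / 100)

# e.g. the 6272 SLOC of modified Ada 95 (Radar Database), with
# assumed parameter values for illustration:
size = equivalent_sloc(6272, revl=10,
                       dm=15, cm=30, im=50, aa=4, su=30, unfm=0.4)
print(f"Equivalent size: {size:.0f} SLOC")
```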

10 Early Design Estimate for Breadboard
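The estimate itself was not transcribed. As a sketch of how it is produced, the COCOMO II Early Design effort equation can be evaluated with the Breadboard scale-factor ratings from slide 7, using the published COCOMO II.2000 weights; the size and effort-multiplier product below are placeholder assumptions, not the slide's actual numbers:

```python
# COCOMO II Early Design effort: PM = A * Size^E * (product of EMs),
# with E = B + 0.01 * (sum of scale-factor weights).

A, B = 2.94, 0.91  # COCOMO II.2000 calibration constants

# Published COCOMO II.2000 weights for the Breadboard ratings (slide 7):
scale_factors = {
    "PREC": 3.72,  # Nominal
    "FLEX": 4.05,  # Low
    "RESL": 2.83,  # High
    "TEAM": 3.29,  # Nominal
    "PMAT": 4.68,  # Nominal
}

E = B + 0.01 * sum(scale_factors.values())  # scale exponent

size_ksloc = 30.0  # placeholder equivalent size in KSLOC (assumption)
em_product = 1.2   # placeholder product of the 7 Early Design multipliers

effort_pm = A * size_ksloc**E * em_product
print(f"E = {E:.4f}, effort = {effort_pm:.1f} person-months")
```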

11 ARS Full Development for IOC
- Use the Post-Architecture estimation model: same general techniques as the Early Design model for the Breadboard system, except with elaborated cost drivers
- Two estimates are demonstrated: top-level and detailed
  - scale drivers apply to the overall system in both estimates
  - cost drivers are rated for the aggregate system in the top-level estimate (17 ratings)
  - cost drivers are refined for each individual software component in the detailed estimate (17 * 6 components = 102 ratings)

12 Post-Architecture Estimate for IOC (Top-level)

13 Organizational Ratings Profile

Scale-driver ratings for the MegaSpace COCOMO projects (ADAPT, BST, BST-2, ADM, LITS, MINK) on the Very Low to Extra High scale. The column placement of each project was partly lost in transcription; the rating anchors for each driver are:

- PREC (precedentedness): thoroughly unprecedented / largely unprecedented / somewhat unprecedented / generally familiar / largely familiar / thoroughly familiar
- FLEX (development flexibility): rigorous / occasional relaxation / some relaxation / general conformity / some conformity / general goals
- RESL (architecture/risk resolution: % significant module interfaces specified, % significant risks eliminated): little (20%) / some (40%) / often (60%) / generally (75%) / mostly (90%) / full (100%)
- TEAM (team cohesion): very difficult interactions / some difficult interactions / basically cooperative interactions / largely cooperative / highly cooperative / seamless interactions
- PMAT (process maturity): weighted average of "yes" answers to CMM maturity questionnaire

14 Post-Architecture Estimate for IOC (Detailed)

15 Incremental Component Structure

16 Increment Phasing

17 Increment Summary
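The increment summary itself was not transcribed. One simple way to roll up per-increment Post-Architecture estimates into an IOC total, sketched under stated assumptions (the increment names, sizes, breakage percentages, and calibration values below are all illustrative, not the slide's data):

```python
# Naive incremental roll-up sketch: estimate each increment with the
# Post-Architecture equation, inflating later increments' size for
# breakage to earlier increments' code, then sum the efforts.

increments = [
    # (name, equivalent KSLOC, breakage % against earlier increments)
    ("Increment 1", 12.0, 0),
    ("Increment 2", 10.0, 8),
    ("Increment 3", 9.0, 5),
]

A, E, em_product = 2.94, 1.10, 1.0  # illustrative calibration values

total = 0.0
for name, ksloc, breakage in increments:
    adj_size = ksloc * (1 + breakage / 100)   # breakage-adjusted size
    pm = A * adj_size**E * em_product         # Post-Architecture effort
    total += pm
    print(f"{name}: {pm:.1f} PM")
print(f"Total: {total:.1f} PM")
```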

18 Summary and Conclusions
- We provided an overview of the ARS example from Chapter 3
- We demonstrated using the COCOMO sub-models for differing lifecycle phases and levels of detail; the estimation model was matched to the known level of detail
- We showed increasing the level of component detail in the Post-Architecture estimates
- Incremental development was briefly covered