RESULTS OF DELPHI FOR THE DEFECT INTRODUCTION MODEL


(SUB-MODEL OF THE COST/QUALITY MODEL EXTENSION TO COCOMO II)

Sunita Devnani-Chulani
USC-CSE

Abstract

In software estimation it is important to recognize the strong correlation between cost, schedule and quality: they form three sides of the same triangle. Beyond a certain point (the "Quality is Free" point), it is difficult to increase the quality without increasing either the cost or the schedule, or both, of the software under development. Similarly, the development schedule cannot be drastically compressed without hampering the quality of the software product and/or increasing the cost of development. Software estimation models can play an important role in facilitating the balance of these three factors. This paper presents an initial version of the Defect Introduction sub-model of the empirical quality modeling extension to the existing COCOMO II software cost estimation model. The quality model predicts the number of residual defects/KSLOC (thousands of Source Lines of Code) or defects/FP (Function Points) in a software product. It applies in early activities such as analysis and design, as well as in later stages for refining the estimate when more information is available. It enables what-if analyses that demonstrate the impact of various defect removal techniques and the effects of personnel, project, product and platform characteristics on software quality. It also provides insights for determining ship time, assessing quality investment payoffs and understanding quality strategy interactions. The quality model has two sub-models, namely the Defect Introduction Model and the Defect Removal Model. This paper focuses on the initial version of the Defect Introduction Model, which is the result of a two-round Delphi analysis. It discusses in depth the Defect Introduction rate sensitivities of the various COCOMO II parameters and gives a detailed explanation of the rationale behind the suggested numeric ratings associated with each of the parameters.

Keywords : Software Metrics, Software Quality Models, Software Defect Introduction, Software Defect Removal, Software Estimation

1. Introduction

1.1 Definitions

This section defines the terms used throughout the paper.

Quality : The term quality has been used in several different contexts and has many possible definitions. Many of these definitions tend to make quality an intangible, abstract entity and hence difficult to quantify. It is very important to define quality in a credible way in order to develop a model that predicts the final quality of a product. Hence, to facilitate quantification, quality is defined for the purposes of this model in terms of the number of residual defects/KSLOC or defects/FP.

Software Defects : In 1945 a moth got stuck in a relay of Admiral Grace Hopper's Mark II computer, causing it to crash, and the word "bug" was coined. As software matured, software engineers defined terms such as error, fault, failure and defect [Lyu, 1995]. For the model discussed in this paper, a software defect is a divergence of the actual outcome from the desired outcome; it is a feature that causes incorrect results or causes the system to fail.

Software Defect Introduction, Detection and Removal : Introduction is the activity by which a software defect is originated. Detection is the activity by which a software defect is first discovered and reported; software defects are detected mainly during reviews and testing. Removal is the activity by which a discovered defect is repaired (or fixed).

Classification of Software Defects : Software defects can be classified in the following two ways:

Based on Severity : The following classification is adapted from JPL's Formal Inspections for Software Development reference [JPL, 1992], System Software Research Center.
Major : An error that would cause a malfunction or prevent attainment of an expected or specified result, or any error that would in the future result in an approved change request or failure report.
Minor : A violation of standards, guidelines, or rules that would not result in a deviation from requirements if not corrected, but could result in difficulties in terms of operations, maintenance, or future development.
Trivial : Editorial errors such as spelling, punctuation, and grammar that do not cause errors or change requests.

Based on Origin : A software defect can be categorized based on the activity in which it was introduced or, more precisely, the activity in which it originated. The four types of defect artifacts used in this research are Requirements Defects (e.g., leaving out a required Cancel option on an input screen), Design Defects (e.g., an error in an algorithm), Coding Defects (e.g., looping 9 instead of 10 times) and Documentation Defects (e.g., incorrect instructions in the User's Manual for canceling an operation). Capers Jones [Jones, 1994], in the glossary of his book under Defect Origins, also has a category called "bad fixes". For the model discussed in this paper, bad fixes are accounted for by the varying ratings of certain Defect Driver Attributes. For example, if Analyst Capability is set to Very High rather than Very Low, there will be fewer defects introduced in fixing defects (i.e., fewer bad fixes). This will cause the baseline level of the number of defects introduced in the Design

Activity to reduce significantly. Several other Defect Driver Attributes, discussed in Section 5, also account for bad fixes.

Defect Driver Attribute : A Defect Driver Attribute is a factor that is significant in the defect introduction and removal activities. It has a significant impact on the quality of a software project, e.g., Disciplined Methods (DISC) such as the Cleanroom software development approach [Dyer, 1992] and the Personal Software Process (PSP) [Humphrey, 1995].

Defect Rate Multiplier : Each factor or Defect Driver Attribute (23 in total) has a set of multipliers mapped to the set of project ratings for that attribute, e.g., DISC (explained in Section 5) = Very High has a Defect Rate Multiplier of 0.67 for Requirements Defects.

Range : The Range of a Defect Driver Attribute is defined as the ratio between the largest Defect Rate Multiplier and the smallest Defect Rate Multiplier. It is an indicator of the relative ability of the attribute to affect software quality.

1.2 Topics Addressed

Section 2 describes the background model which forms the basis of the Cost/Quality Model. Section 3 presents the formulation of the initial version of the Defect Introduction Model, introduced in Section 2 as a sub-model of the composite Cost/Quality Model. Section 4 discusses the Delphi process used to arrive at the numerical values associated with the various Defect Driver Attributes. Section 5 presents the Delphi results. Finally, Section 6 concludes with the ongoing research on the Defect Removal Model and future plans for calibrating and validating the model.

2. Background Model

The quality model is an extension to the existing COCOMO II model [USC-CSE a; Boehm et al, 1994]. It is based on the Software Defect Introduction and Removal Model described in Software Engineering Economics [Boehm, 1981], which is analogous to the tank-and-pipe model introduced by Capers Jones [Jones, 1975]. In this model, defects conceptually flow into a holding tank through various defect-source pipes and are drained off through various defect-elimination pipes, as shown in Figure 1.

Figure 1 : Software Defect Introduction and Removal Model. Defects enter the tank through Requirements (5/KDSI), Design (25/KDSI), Code (15/KDSI) and Documentation (15/KDSI) pipes, for an overall defect rate of 60/KDSI, and are drained through defect-elimination pipes (automated requirements aids, independent requirements V&V activity, simulation, design inspections, field testing), each with a characteristic percentage of defects eliminated and cost; whatever is not drained remains as residual software defects.

Each of the defect-removal techniques has a characteristic production function: a plot of the effort expended on the activity versus the benefit achieved in terms of the percentage of defects removed. Ideally, selecting a particular level of investment for each of the defect-removal activities should determine the final quality of the software project (number of defects / size). However, combining the production functions is not easy, due to the high correlation between several defect-removal techniques. In order to develop a composite defect-removal production function, it is important to understand the relative overlap of different defect-removal activities in removing defects belonging to the same class. Several studies done in the late 1970s and early 1980s [Jones, 1978; Thayer, 1978; Boehm, 1980] reported the percentage of errors removed by each error-removal activity (such as artifact inspections, peer reviews, simulation, unit testing, coverage testing, formal methods, etc.), but they did not report production functions for these activities. Table 1 shows the strengths and weaknesses of inspection and unit testing. The two activities have a very strong overlap in removing the most common class of defects: simple programming blunders and logic defects. Figure 2 shows that if 40% of the programming effort is invested in each of inspection and unit testing, then 135% of the defects (60% by unit testing and 75% by inspection) are removed, which is obviously not possible. This clearly illustrates the high correlation between the two defect-removal activities and the difficulty of developing a composite production function.

Table 1 : Strengths and Weaknesses of Inspection and Unit Testing

Inspections
Strengths : Simple programming blunders, logic defects; Developer blind spots; Interface defects; Missing portions; Specification defects
Weaknesses : Numerical approximations; Program dynamics

Unit Test
Strengths : Simple programming blunders, logic defects; Numerical approximations; Program dynamics
Weaknesses : Developer blind spots; Interface defects; Missing portions; Specification defects

Figure 2 : Production functions for Code Inspection and Unit Test (percentage of defects removed versus percentage of programming effort invested).

3. A-Priori Defect Introduction Model

Software Cost/Quality Model : As described in Section 2, the Software Cost/Quality Model is composed of (i) a Defect Introduction Model and (ii) a Defect Removal Model. The Defect Introduction Model is the primary focus of this paper; the Defect Removal Model is still in its early research phase and will be published in the near future.

3.1 Defect Introduction Model : Defects can be introduced in several activities of the software development life cycle and are classified based on their origin (refer to Section 1). The total number of defects introduced is the sum of the Requirements, Design, Coding and Documentation defects. For each type of defect, Defect Introduction is formulated as:

No. of Requirements Defects Introduced = A_req * (Size)^B * QAF_req
No. of Design Defects Introduced = A_des * (Size)^B * QAF_des
No. of Coding Defects Introduced = A_cod * (Size)^B * QAF_cod
No. of Documentation Defects Introduced = A_doc * (Size)^B * QAF_doc

where A_req/des/cod/doc are multiplicative constants that will be calibrated against actual project data, QAF_req/des/cod/doc is the Quality Adjustment Factor (discussed in detail later), and Size is the size of the software project measured in terms of KSLOC (thousands of Source Lines of Code) [Park, 1992], Function Points [IFPUG, 1994] or Object Points [Banker et al, 1992].

For simplicity, the exponent B (which is analogous to the Scale Factors in COCOMO II) has been set to 1. Further investigation of this parameter is required but can be carried out only when enough project data is available. In light of publications such as [Banker et al, 1994] and [Gulledge, Hutzler, 1993], it is unclear whether Defect Introduction rates will exhibit economies or diseconomies of scale: if Size doubles, will the Defect Introduction rate increase by more than twice the original rate (indicating diseconomies of scale and causing B to be greater than 1), or by a factor less than twice the original rate (indicating economies of scale and causing B to be less than 1)?

The total number of defects introduced is then the aggregation of the four types of defects:

Total Number of Defects Introduced = No. of Requirements Defects Introduced + No. of Design Defects Introduced + No. of Coding Defects Introduced + No. of Documentation Defects Introduced

Summarizing the above equations, we have

Total Number of Defects Introduced = Sum_j [A_j * (Size)^B * QAF_j] = (Size)^B * Sum_j [A_j * QAF_j]

where A_j is the calibration constant for the jth artifact, B is provisionally set to 1, Size is the size of the software project measured in terms of KSLOC, Function Points or Object Points, and QAF_j is the Quality Adjustment Factor for each type of artifact (Requirements, Design, Coding, Documentation).
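The formulation above is directly computable. Below is a minimal sketch in Python (the paper itself defines no code; the function and variable names are illustrative assumptions, and the A_j values must come from calibration against actual project data):

    # Sketch of the Defect Introduction Model of Section 3.1.
    ARTIFACTS = ("requirements", "design", "coding", "documentation")

    def defects_introduced(size, A, QAF, B=1.0):
        """Estimated defects introduced per artifact type.

        size : project size in KSLOC (or FP / Object Points, used consistently)
        A    : dict of multiplicative calibration constants A_j
        QAF  : dict of Quality Adjustment Factors QAF_j
        B    : scale exponent, provisionally set to 1 in this paper
        """
        return {j: A[j] * (size ** B) * QAF[j] for j in ARTIFACTS}

    def total_defects_introduced(size, A, QAF, B=1.0):
        # Aggregation over the four artifact types: (Size)^B * sum_j A_j * QAF_j
        return sum(defects_introduced(size, A, QAF, B).values())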

The Quality Adjustment Factor accounts for differences in personnel, project, platform and product characteristics. It reflects the product under development and is either a product of 23 Defect Rate Multipliers (DRMs) 1 for the COCOMO II Post-Architecture Model or a product of 13 Defect Rate Multipliers for the COCOMO II Early Design Model. These DRMs are categorized as shown in Table 2. 2

Table 2 : Post-Architecture Model and Early Design Model Defect Rate Multipliers

Product (Post-Architecture) : Required Software Reliability (RELY); Data Base Size (DATA); Required Reusability (RUSE); Documentation Match to Life-Cycle Needs (DOCU); Product Complexity (CPLX)
Product (Early Design) : Product Reliability and Complexity (RCPX); Required Reusability (RUSE)

Platform (Post-Architecture) : Execution Time Constraint (TIME); Main Storage Constraint (STOR); Platform Volatility (PVOL)
Platform (Early Design) : Platform Difficulty (PDIF)

Personnel (Post-Architecture) : Analyst Capability (ACAP); Programmer Capability (PCAP); Applications Experience (AEXP); Platform Experience (PEXP); Language and Tool Experience (LTEX); Personnel Continuity (PCON)
Personnel (Early Design) : Personnel Capability (PERS); Personnel Experience (PREX)

Project (Post-Architecture) : Use of Software Tools (TOOL); Multisite Development (SITE); Required Development Schedule (SCED); Disciplined Methods (DISC) 3
Project (Early Design) : Facilities (FCIL); Required Development Schedule (SCED); Disciplined Methods (DISC)

Scale Factors (both models) : Precedentedness (PREC); Development Flexibility (FLEX); Architecture/Risk Resolution (RESL); Team Cohesion (TEAM); Process Maturity (PMAT)

1 The definitions in Section 1 describe Defect Rate Multipliers as values assigned to Defect Driver Attributes. For the simplicity of the model, the two terms may be used as synonyms; the difference is that a Defect Driver Attribute is a qualitative parameter that has a numeric Defect Rate Multiplier assigned to it.
2 The four categories of Defect Rate Multipliers are the same as those for the COCOMO II Effort Multipliers.
3 This Defect Rate Multiplier is in addition to the 22 COCOMO II Effort Multipliers.

For each type of artifact j,

QAF_j = Product (i = 1 to N) of DRM_ij

where N = 23 or 13 and DRM is a Defect Rate Multiplier.

3.2 Example of the Application of the Defect Introduction Model : The model described above can be illustrated using a simple example. Let's say calibration results yield A_req = 2, A_des = 1.5, A_cod = 2.5, A_doc = 2 for the four multiplicative constants, and the size of the software product is 4 KSLOC. The baseline level of defects introduced for each type of artifact can then be computed as shown:

No. of Requirements Defects Introduced = 2 * 4 * 1 = 8
No. of Design Defects Introduced = 1.5 * 4 * 1 = 6
No. of Coding Defects Introduced = 2.5 * 4 * 1 = 10
No. of Documentation Defects Introduced = 2 * 4 * 1 = 8
Total No. of Defects Introduced = 32

Note that the QAF for each type of artifact is 1, as we are computing the baseline level of defect introduction. Now suppose the same product is developed by analysts rated at the 90th percentile. This would cause the ACAP rating to be Very High. As shown in Section 5, ACAP = VH would result in QAF_req = 0.75, QAF_des = 0.83, QAF_cod = 0.90 and QAF_doc = 0.83, with every other parameter set at Nominal. Now the level of defect introduction will be as follows:

No. of Requirements Defects Introduced = 2 * 4 * 0.75 = 6
No. of Design Defects Introduced = 1.5 * 4 * 0.83 = 4.98
No. of Coding Defects Introduced = 2.5 * 4 * 0.90 = 9
No. of Documentation Defects Introduced = 2 * 4 * 0.83 = 6.64
Total No. of Defects Introduced = 26.62

The above example clearly shows the impact Analyst Capability has on the Defect Introduction rate: the total number of defects introduced has been reduced by a factor of about 1.2.
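The worked example above can be reproduced with the hypothetical helpers sketched after Section 3.1 (the constants are the paper's illustrative values, not calibrated results):

    # Reproduce the Section 3.2 example.
    A = {"requirements": 2.0, "design": 1.5, "coding": 2.5, "documentation": 2.0}
    nominal_qaf = {j: 1.0 for j in ARTIFACTS}                 # all drivers Nominal
    acap_vh_qaf = {"requirements": 0.75, "design": 0.83,      # ACAP = Very High,
                   "coding": 0.90, "documentation": 0.83}     # all else Nominal

    baseline = total_defects_introduced(4, A, nominal_qaf)    # -> 32.0
    improved = total_defects_introduced(4, A, acap_vh_qaf)    # -> 26.62
    print(baseline / improved)                                # -> approx. 1.20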

4. Delphi Process

As explained in Section 3, the Quality Adjustment Factor in the Post-Architecture Model is a product of 23 Defect Rate Multipliers (DRMs). DRM ratings range from VL (Very Low) to XH (Extra High) and, depending on their corresponding values, either increase or decrease the level of defect introduction as compared to the Nominal level. For the empirical model formulation it was essential to assign numerical values to each of the ratings of the 23 DRMs. Based on expert judgment, an initial set of values was proposed for the model, and the Delphi technique [Helmer, 1966] was used to reach further group consensus. On the verbal suggestion of other experts who have used the Delphi technique, the author initiated a two-round Delphi process. The steps taken for the entire process are outlined below:

Round 1
1. Provided participants with the Round 1 Delphi questionnaire, with a proposed set of values for the Ranges (see Section 1)
2. Received 9 responses
3. Ensured validity of responses by correspondence
4. Did simple analysis

Round 2
1. Provided participants with the Round 2 Delphi questionnaire, based on the analysis of Round 1
2. Repeated steps 2, 3 and 4 (above)
3. Converged to the final Delphi results

An example of how the Delphi process was carried out is shown using the DRM Programmer Capability (PCAP). In Step 1 of Round 1, the author provided the participants with a proposed set of values for the Ranges, called the Initial Ranges; for example, the Initial Range for Code Defects due to PCAP was 1.33/0.75 = 1.77, the ratio of the proposed VL and VH multipliers. After about 3 weeks, the author received the responses from the participants. After validating the responses by direct correspondence with some of the participants, the author did a simple analysis and derived the median and range of the responses. These results were provided to the participants, showing each participant's own response against the median and range of the other participants' responses; this was the first step of Round 2. After a turnaround of about 2 weeks, the Round 2 results came in and a similar analysis (as in Round 1) was done. The median of the Round 2 responses became the Final Range.
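The per-round "simple analysis" is essentially a median-and-spread computation over the expert responses, done separately for each driver and artifact type. A minimal sketch (the response values are invented for illustration; the paper does not list the individual responses):

    import statistics

    # Hypothetical Round 1 responses for the PCAP Range on Code Defects.
    responses = [1.5, 1.6, 1.7, 1.75, 1.75, 1.75, 1.8, 1.9, 2.0]

    median = statistics.median(responses)       # group estimate fed back in Round 2
    spread = (min(responses), max(responses))   # shown alongside each expert's answer
    print(median, spread)                       # -> 1.75 (1.5, 2.0)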

It was observed that the range of responses in Round 2 was typically narrower than the range in Round 1. The results of the Delphi process are described in detail in Section 5. New DRMs for the several rating levels are computed using the Final Ranges. For example, the Final Range for Code Defects due to PCAP is 1.75. The Very Low and Very High ratings associated with PCAP for Coding Defects are computed as shown below:

DRM(Very Low) = sqrt(Final Range) = sqrt(1.75) = 1.32, and
DRM(Very High) = 1/DRM(Very Low) = 1/1.32 = 0.76,

where Final Range = DRM(Very Low) / DRM(Very High), i.e., 1.32/0.76 = 1.75.

The resulting PCAP Defect Rate Multipliers for each rating level (Very High, High, Nominal, Low, Very Low) and type of artifact are presented in Section 5.
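The symmetric construction of the extreme multipliers from a Final Range can be stated in two lines. A sketch under the relationships given above (the function name is ours):

    import math

    def drm_extremes(final_range):
        """Very Low / Very High DRMs from a Final Range.

        By construction DRM(VL) * DRM(VH) = 1 and DRM(VL) / DRM(VH) = Range,
        hence DRM(VL) = sqrt(Range) and DRM(VH) = 1 / DRM(VL).
        """
        drm_vl = math.sqrt(final_range)
        return drm_vl, 1.0 / drm_vl

    print(drm_extremes(1.75))   # -> approx. (1.32, 0.76), the PCAP Code values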

5. Defect Introduction Rate Sensitivity - Post-Architecture Model

The Delphi process outlined in Section 4 was completed successfully. This section discusses in detail each of the 23 DRMs and its impact on defect introduction for each type of defect artifact.

5.1A Personnel Factors

This section discusses the personnel factors (Analyst Capability, Programmer Capability, Applications Experience, Platform Experience, Language and Tool Experience, Personnel Continuity) that have a significant impact on the quality of a software product.

Analyst Capability (ACAP)

Table 4 represents the ratings 4 for the level of capability of the software analysts.

Table 4 : Analyst Capability (ACAP) Defect Rate Multipliers
ACAP level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Example : Consider a software project which has 100 Requirements Defects and analysts of Nominal capability working on it. Suppose the team of analysts is now replaced by Very High rated analysts, and assume that all other parameters remain unchanged. Then, using Table 4, the number of Requirements Defects will be reduced to 75 (100 * 0.75). The reasons for the decrease in the number of defects introduced in the Requirements activity are presented in Table 5: the 25% decrease in Requirements Defects is due to the Very High capability of the analysts causing fewer Requirements understanding defects and fewer Requirements completeness and consistency defects, as compared to the Nominal level.

4 These ratings are based on expert judgment. They are used as initial ratings and will be adjusted after a couple of Delphi rounds with several other experts in the field.

Table 5 : Analyst Capability (ACAP) Differences in Defect Introduction

VH :
Requirements (0.75) : Fewer Requirements understanding defects; fewer Requirements completeness and consistency defects.
Design (0.83) : Fewer Requirements traceability defects; fewer Design completeness and consistency defects; fewer defects introduced in fixing defects.
Code (0.90) : Fewer Coding defects due to requirements and design shortfalls (missing guidelines, ambiguities).
Documentation (0.83) : Fewer Documentation defects due to requirements and design shortfalls.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements understanding defects; more Requirements completeness and consistency defects.
Design : More Requirements traceability defects; more Design completeness and consistency defects; more defects introduced in fixing defects.
Code : More Coding defects due to requirements and design shortfalls (missing guidelines, ambiguities).
Documentation : More Documentation defects due to requirements and design shortfalls.

The values assigned to ACAP for each type of artifact are graphically represented in Figure 3. Only the differences in defect introduction for the extreme ratings are tabulated; the intermediate ratings (for example, Low and High for ACAP) can be interpreted based on the explanations provided for Very Low and Very High. The values for the Low and High ratings are obtained by geometric interpolation, e.g.,

DRM(High) = sqrt(DRM(Very High)), or the cube root of DRM(Extra High) for attributes with an Extra High rating.
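This geometric interpolation rule can be sketched the same way (assuming, per the formula above, that a rating k steps from Nominal takes the k/n-th power of the extreme multiplier n steps away):

    def interpolated_drm(extreme_drm, steps_from_nominal, extreme_steps):
        # E.g., High is 1 step from Nominal: with a Very High extreme (2 steps),
        # DRM(High) = DRM(VH) ** (1/2); with an Extra High extreme (3 steps),
        # DRM(High) = DRM(XH) ** (1/3).
        return extreme_drm ** (steps_from_nominal / extreme_steps)

    print(interpolated_drm(0.75, 1, 2))   # High, given a VH DRM of 0.75 -> approx. 0.87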

Figure 3 : Analyst Capability (ACAP) Multiplier for each Type of Artifact.

Programmer Capability (PCAP)

Table 6 represents the ratings for the level of capability of the software programmers on the development team, and Table 7 explains the differences in defect introduction.

Table 6 : Programmer Capability (PCAP) Defect Rate Multipliers
PCAP level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 7 : Programmer Capability (PCAP) Differences in Defect Introduction

VH :
Design (0.85) : Fewer Design defects due to easy interaction with analysts; fewer defects introduced in fixing defects.
Code (0.76) : Fewer Coding defects due to fewer detailed design reworks, conceptual misunderstandings and coding mistakes.
Documentation (0.88) : Fewer Documentation defects due to fewer defects in code and fewer documentation reworks, conceptual misunderstandings and coding mistakes.

Nominal : Nominal level of defect introduction.

VL :
Design : More Design defects due to less easy interaction with analysts; more defects introduced in fixing defects.
Code : More Coding defects due to more detailed design reworks, conceptual misunderstandings and coding mistakes.
Documentation : More Documentation defects due to more defects in code and more documentation reworks, conceptual misunderstandings and coding mistakes.

The values assigned to PCAP for each type of artifact are graphically represented in Figure 4.

Figure 4 : Programmer Capability (PCAP) Multiplier for each Type of Artifact.

Applications Experience (AEXP)

Table 8 represents the ratings for the level of applications experience of the development team, and Table 9 explains the differences in defect introduction.

Table 8 : Applications Experience (AEXP) Defect Rate Multipliers
AEXP level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 9 : Applications Experience (AEXP) Differences in Defect Introduction

VH :
Requirements : Fewer Requirements defects due to less learning and fewer false starts; fewer Requirements understanding defects.
Design : Fewer Design defects due to less learning and fewer false starts; fewer Requirements traceability defects; fewer defects introduced in fixing requirements and preliminary design.
Code (0.88) : Fewer Coding defects due to less learning; fewer Coding defects due to requirements and design shortfalls.
Documentation (0.82) : Fewer Documentation defects due to requirements and design shortfalls and domain misunderstandings.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements defects due to extensive learning and more false starts; more Requirements understanding defects.
Design : More Design defects due to extensive learning and more false starts; more Requirements traceability defects; more defects introduced in fixing requirements and preliminary design.
Code : More Coding defects due to extensive learning; more Coding defects due to requirements and design shortfalls.
Documentation : More Documentation defects due to requirements and design shortfalls and domain misunderstandings.

The values assigned to AEXP for each type of artifact are graphically represented in Figure 5.

Figure 5 : Applications Experience (AEXP) Multiplier for each Type of Artifact.

Platform Experience (PEXP)

Table 10 represents the ratings for the level of platform experience of the development team, and Table 11 explains the differences in defect introduction.

Table 10 : Platform Experience (PEXP) Defect Rate Multipliers
PEXP level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 11 : Platform Experience (PEXP) Differences in Defect Introduction

VH :
Requirements (0.90) : Fewer Requirements defects due to fewer application/platform 5 interface analysis misunderstandings.
Design (0.86) : Fewer Design defects due to fewer application/platform interface design misunderstandings.
Code (0.86) : Fewer Coding defects due to fewer application/platform interface coding misunderstandings.
Documentation (0.92) : Fewer Documentation defects due to requirements and design shortfalls; fewer application/platform interface design misunderstandings.

Nominal : Nominal level of defect introduction.

VL :
Requirements (1.11) : More Requirements defects due to more application/platform interface, database and networking analysis misunderstandings.
Design : More Design defects due to more application/platform interface design misunderstandings.
Code : More Coding defects due to more application/platform interface coding misunderstandings.
Documentation : More Documentation defects due to requirements and design shortfalls; more application/platform interface design misunderstandings.

The values assigned to PEXP for each type of artifact are graphically represented in Figure 6.

5 Platform refers to the hardware and software infrastructure used by the system being developed. It can include office automation, database, and user-interface support packages; distributed middleware; operating systems; networking support; and hardware.

Figure 6 : Platform Experience (PEXP) Multiplier for each Type of Artifact.

Language and Tool Experience (LTEX)

Table 12 represents the ratings for the level of language and tool experience of the development team, and Table 13 explains the differences in defect introduction.

Table 12 : Language and Tool Experience (LTEX) Defect Rate Multipliers
LTEX level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 13 : Language and Tool Experience (LTEX) Differences in Defect Introduction

VH :
Requirements (0.93) : Fewer Requirements defects, as defects are easier to find and fix via tools.
Design (0.88) : Fewer Design defects due to fewer design-versus-language mismatches, and because defects are easier to find and fix via tools.
Code (0.82) : Fewer Coding defects due to fewer language misunderstandings, and because defects are easier to find and fix via tools.
Documentation (0.92) : Fewer Documentation defects due to requirements, design and code shortfalls.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements defects, as defects are more difficult to find and fix via tools.
Design (1.13) : More Design defects due to more design-versus-language mismatches, and because defects are more difficult to find and fix via tools.
Code (1.22) : More Coding defects due to more language misunderstandings, and because defects are more difficult to find and fix via tools.
Documentation : More Documentation defects due to requirements, design and code shortfalls.

The values assigned to LTEX for each type of artifact are graphically represented in Figure 7.

Figure 7 : Language and Tool Experience (LTEX) Multiplier for each Type of Artifact.

Personnel Continuity (PCON)

Table 14 represents the ratings for the level of the project's annual personnel turnover, and Table 15 explains the differences in defect introduction.

Table 14 : Personnel Continuity (PCON) Defect Rate Multipliers
PCON level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 15 : Personnel Continuity (PCON) Differences in Defect Introduction

VH :
Requirements (0.82) : Fewer Requirements understanding defects; fewer Requirements completeness and consistency defects; fewer Requirements false starts.
Design (0.80) : Fewer Requirements traceability defects; fewer Design completeness and consistency defects; fewer defects introduced in fixing defects.
Code (0.77) : Fewer Coding defects due to requirements and design shortfalls and misunderstandings (missing guidelines, context, ambiguities); fewer defects introduced in fixing defects.
Documentation (0.82) : Fewer Documentation defects due to requirements and design shortfalls and misunderstandings; fewer defects introduced in fixing defects.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements understanding defects; more Requirements completeness and consistency defects; more Requirements false starts.
Design : More Requirements traceability defects; more Design completeness and consistency defects; more defects introduced in fixing defects.
Code : More Coding defects due to requirements and design shortfalls and misunderstandings (missing guidelines, context, ambiguities); more defects introduced in fixing defects.
Documentation : More Documentation defects due to requirements and design shortfalls and misunderstandings; more defects introduced in fixing defects.

The values assigned to PCON for each type of artifact are graphically represented in Figure 8.

Figure 8 : Personnel Continuity (PCON) Multiplier for each Type of Artifact.

5.1B Project Factors

This section discusses the project factors (Use of Software Tools, Multisite Development, Required Development Schedule, Disciplined Methods) that have a significant impact on the quality of a software product.

Use of Software Tools (TOOL)

Table 16 represents the ratings for the level of use of software tools, and Table 17 explains the differences in defect introduction.

Table 16 : Use of Software Tools (TOOL) Defect Rate Multipliers
TOOL level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 17 : Use of Software Tools (TOOL) Differences in Defect Introduction

VH :
Requirements : Fewer Requirements defects, as defects are easier to find and fix via tools.
Design : Fewer Design defects, as defects are easier to find and fix via tools.
Code (0.80) : Fewer Coding defects, as defects are easier to find and fix via tools; fewer defects due to automated translation of detailed design into code.
Documentation (0.85) : Fewer Documentation defects due to requirements, design and code shortfalls; automated documentation generation.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements defects, as defects are harder to find and fix via tools.
Design : More Design defects, as defects are harder to find and fix via tools.
Code : More Coding defects, as defects are harder to find and fix via tools; more defects due to manual translation of detailed design into code.
Documentation : More Documentation defects due to requirements, design and code shortfalls; manual documentation generation.

The values assigned to TOOL for each type of artifact are graphically represented in Figure 9.

Figure 9 : Use of Software Tools (TOOL) Multiplier for each Type of Artifact.

Multisite Development (SITE)

Table 18 represents the ratings for the level of multisite development, and Table 19 explains the differences in defect introduction.

Table 18 : Multisite Development (SITE) Defect Rate Multipliers
SITE level / Type of Artifact : Extra High, Very High, High, Nominal, Low, Very Low

Table 19 : Multisite Development (SITE) Differences in Defect Introduction

XH :
Requirements (0.83) : Fewer Requirements understanding, completeness and consistency defects due to shared context and good communication support.
Design (0.83) : Fewer Requirements traceability defects and Design completeness and consistency defects due to shared context and good communication support; fewer defects introduced in fixing defects.
Code (0.85) : Fewer Coding defects due to requirements and design shortfalls and misunderstandings; shared coding context; fewer defects introduced in fixing defects.
Documentation (0.88) : Fewer Documentation defects due to requirements and design shortfalls and misunderstandings; shared context; fewer defects introduced in fixing defects.

Nominal : Nominal level of defect introduction.

VL :
Requirements (1.20) : More Requirements understanding, completeness and consistency defects due to lack of shared context and poor communication support.
Design (1.20) : More Requirements traceability defects and Design completeness and consistency defects due to lack of shared context and poor communication support; more defects introduced in fixing defects.
Code : More Coding defects due to requirements and design shortfalls and misunderstandings; lack of shared context; more defects introduced in fixing defects.
Documentation : More Documentation defects due to requirements and design shortfalls and misunderstandings; lack of shared context; more defects introduced in fixing defects.

The values assigned to SITE for each type of artifact are graphically represented in Figure 10.

Figure 10 : Multisite Development (SITE) Multiplier for each Type of Artifact.

Required Development Schedule (SCED)

Table 20 represents the ratings for the level of schedule constraint imposed on the development team, and Table 21 explains the differences in defect introduction.

Table 20 : Required Development Schedule (SCED) Defect Rate Multipliers
SCED level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 21 : Required Development Schedule (SCED) Differences in Defect Introduction

VH :
Requirements (0.85) : Fewer Requirements defects due to higher likelihood of correctly interpreting specs; fewer defects due to more thorough planning, specs and validation.
Design (0.84) : Fewer Design defects due to higher likelihood of correctly interpreting specs; fewer Design defects due to fewer defects in fixes and fewer specification defects to fix; fewer defects due to more thorough planning, specs and validation.
Code (0.84) : Fewer Coding defects due to higher likelihood of correctly interpreting specs; fewer Coding defects due to requirements and design shortfalls; fewer defects introduced in fixing defects.
Documentation (0.85) : Fewer Documentation defects due to higher likelihood of correctly interpreting specs; fewer Documentation defects due to requirements and design shortfalls.

Nominal : Nominal level of defect introduction.

VL :
Requirements (1.18) : More Requirements defects: more interface problems (more people working in parallel), more TBDs in specs and plans, less time for validation.
Design : More Design defects due to earlier TBDs, more interface problems and less time for V&V; more defects in fixes and more specification defects to fix.
Code : More Coding defects due to requirements and design shortfalls and less time for V&V; more defects introduced in fixing defects.
Documentation : More Documentation defects due to requirements and design shortfalls; more defects introduced in fixing defects.

The values assigned to SCED for each type of artifact are graphically represented in Figure 11.

Figure 11 : Required Development Schedule (SCED) Multiplier for each Type of Artifact.

Disciplined Methods (DISC)

Table 22 represents the ratings for the level of use of disciplined methods, such as Cleanroom development and the PSP, by the development team, and Table 23 explains the differences in defect introduction.

Table 22 : Disciplined Methods (DISC) Defect Rate Multipliers
DISC level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 23 : Disciplined Methods (DISC) Differences in Defect Introduction

VH :
Requirements (0.63) : Fewer Requirements defects: more thorough specs, more emphasis on defect prevention.
Design (0.60) : Fewer Design defects: more thorough specs, more emphasis on defect prevention.
Code (0.54) : Fewer Coding defects: extensive self-review, more emphasis on defect prevention, extensive defect checklists, defensive programming, independent unit testing.
Documentation (0.75) : Fewer Documentation defects: extensive self-review, more emphasis on defect prevention.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements defects: less thorough specs, less emphasis on defect prevention.
Design : More Design defects: less thorough specs, less emphasis on defect prevention.
Code : More Coding defects: minimal self-review, less emphasis on defect prevention, minimal defect checklists, lack of defensive programming, lack of independent unit testing.
Documentation : More Documentation defects: minimal self-review, less emphasis on defect prevention.

The values assigned to DISC for each type of artifact are graphically represented in Figure 12.

Figure 12 : Disciplined Methods (DISC) Multiplier for each Type of Artifact.

5.1C Platform Factors

This section discusses the platform factors (Execution Time Constraint, Main Storage Constraint, Platform Volatility) that have a significant impact on the quality of a software product.

Execution Time Constraint (TIME)

Table 24 represents the ratings for the percentage of available execution time expected to be used by the system or subsystem consuming the execution time resource, and Table 25 explains the differences in defect introduction.

Table 24 : Execution Time Constraint (TIME) Defect Rate Multipliers
TIME level / Type of Artifact : Extra High, Very High, High, Nominal

Table 25 : Execution Time Constraint (TIME) Differences in Defect Introduction

XH :
Requirements : More Requirements defects due to trickier analysis, complex interface design, test plans and more planning.
Design (1.2) : More Design defects due to trickier analysis, complex interface design, test plans and more planning.
Code (1.2) : More Coding defects, since code and data are trickier to debug.

Nominal : Nominal level of defect introduction.

The values assigned to TIME for each type of artifact are graphically represented in Figure 13.

Figure 13 : Execution Time Constraint (TIME) Multiplier for each Type of Artifact.

Main Storage Constraint (STOR)

Table 26 represents the ratings for the degree of main storage constraint imposed on a software system or subsystem, and Table 27 explains the differences in defect introduction.

Table 26 : Main Storage Constraint (STOR) Defect Rate Multipliers
STOR level / Type of Artifact : Extra High, Very High, High, Nominal

Table 27 : Main Storage Constraint (STOR) Differences in Defect Introduction

XH :
Requirements : More Requirements understanding defects due to trickier analysis.
Design (1.18) : More Design defects due to trickier analysis.
Code (1.15) : More Coding defects, since code and data are trickier to debug.

Nominal : Nominal level of defect introduction.

The values assigned to STOR for each type of artifact are graphically represented in Figure 14.

Figure 14 : Main Storage Constraint (STOR) Multiplier for each Type of Artifact.

Platform Volatility (PVOL)

Table 28 represents the ratings for the level of platform volatility, where the platform can include office automation, database, and user-interface support packages; distributed middleware; operating systems; networking support; and hardware. Table 29 explains the differences in defect introduction.

Table 28 : Platform Volatility (PVOL) Defect Rate Multipliers
PVOL level / Type of Artifact : Very High, High, Nominal, Low

Table 29 : Platform Volatility (PVOL) Differences in Defect Introduction

VH :
Requirements (1.16) : More Requirements defects due to changes in platform characteristics.
Design (1.20) : More Design defects due to changes in platform characteristics.
Code (1.22) : More Coding defects due to requirements and design shortfalls; more Coding defects due to changes in platform characteristics.
Documentation (1.12) : More Documentation defects due to changes in platform characteristics.

Nominal : Nominal level of defect introduction.

L :
Requirements : Fewer Requirements defects due to fewer changes in platform characteristics.
Design : Fewer Design defects due to fewer changes in platform characteristics.
Code : Fewer Coding defects due to fewer requirements and design shortfalls; fewer Coding defects due to fewer changes in platform characteristics; fewer defects introduced in fixing defects.
Documentation : Fewer Documentation defects due to fewer changes in platform characteristics.

The values assigned to PVOL for each type of artifact are graphically represented in Figure 15.

Figure 15 : Platform Volatility (PVOL) Multiplier for each Type of Artifact.

5.1D Product Factors

This section discusses the product factors (Required Software Reliability, Data Base Size, Required Reusability, Documentation Match to Life-Cycle Needs, Product Complexity) that have a significant impact on the quality of a software product.

Required Software Reliability (RELY)

Table 30 represents the ratings for the extent to which the software must perform its intended function over a period of time, and Table 31 explains the differences in defect introduction.

Table 30 : Required Software Reliability (RELY) Defect Rate Multipliers
RELY level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 31 : Required Software Reliability (RELY) Differences in Defect Introduction

VH :
Requirements (0.70) : Fewer Requirements completeness and consistency defects due to detailed verification, QA, CM, standards, SSR, documentation, IV&V interface, test plans and procedures.
Design (0.69) : Fewer Design defects due to detailed verification, QA, CM, standards, PDR, documentation, IV&V interface, design inspections, test plans and procedures.
Code (0.69) : Fewer Coding defects due to detailed verification, QA, CM, standards, documentation, IV&V interface, code inspections, test plans and procedures.
Documentation (0.82) : Fewer Documentation defects due to requirements and design shortfalls.

Nominal : Nominal level of defect introduction.

VL :
Requirements (1.43) : More Requirements completeness and consistency defects due to minimal verification, QA, CM, standards, SSR, documentation, IV&V interface, test plans and procedures.
Design : More Design defects due to minimal verification, QA, CM, standards, PDR, documentation, IV&V interface, design inspections, test plans and procedures.
Code : More Coding defects due to minimal verification, QA, CM, standards, documentation, IV&V interface, code inspections, test plans and procedures.
Documentation : More Documentation defects due to requirements and design shortfalls.

The values assigned to RELY for each type of artifact are graphically represented in Figure 16.

Figure 16 : Required Software Reliability (RELY) Multiplier for each Type of Artifact.

Data Base Size (DATA)

Table 32 represents the values assigned to the different ratings of data base size, and Table 33 explains the differences in defect introduction.

Table 32 : Data Base Size (DATA) Defect Rate Multipliers
DATA level / Type of Artifact : Very High, High, Nominal, Low

Table 33 : Data Base Size (DATA) Differences in Defect Introduction

VH :
Requirements : More Requirements defects due to complex database design and validation, and a complex HW/SW storage interface.
Design (1.10) : More Design defects due to complex database design and validation, a complex HW/SW storage interface, and more data checking in the program.
Code (1.10) : More Coding defects due to complex database development, a complex HW/SW storage interface, and more data checking in the program.
Documentation : More Documentation defects due to more requirements and design defects.

Nominal : Nominal level of defect introduction.

L :
Requirements (0.93) : Fewer Requirements defects due to simple database design and validation, and a simple HW/SW storage interface.
Design (0.91) : Fewer Design defects due to simple database design and validation, a simple HW/SW storage interface, and simple data checking in the program.
Code (0.91) : Fewer Coding defects due to simple database development, a simple HW/SW storage interface, and simple data checking in the program.
Documentation : Fewer Documentation defects due to fewer requirements and design defects.

The values assigned to DATA for each type of artifact are graphically represented in Figure 17.

Figure 17 : Data Base Size (DATA) Multiplier for each Type of Artifact.

Required Reusability (RUSE)

Table 34 represents the values assigned to the several levels of required reusability, and Table 35 explains the differences in defect introduction.

Table 34 : Required Reusability (RUSE) Defect Rate Multipliers
RUSE level / Type of Artifact : Extra High, Very High, High, Nominal, Low

Table 35 : Required Reusability (RUSE) Differences in Defect Introduction

XH :
Requirements : More Requirements defects due to higher complexity in design, validation and interfaces; fewer Requirements defects due to more thorough interface analysis.
Design : More Design defects due to higher complexity in design, validation and interfaces; fewer Design defects due to more thorough interface analysis.
Code : More Coding defects due to higher complexity in design, validation and interfaces; fewer Coding defects due to more thorough interface definitions.
Documentation : More Documentation defects due to the need to distinguish between general and specific cases.

Nominal : Nominal level of defect introduction.

L :
Requirements : Fewer Requirements defects due to lower complexity in design, validation and interfaces.
Design : Fewer Design defects due to lower complexity in design, validation, test plans and interfaces.
Code : Fewer Coding defects due to lower complexity in design, validation, test plans and interfaces.
Documentation : Fewer Documentation defects due to the lower complexity of the product.

The values assigned to RUSE for each type of artifact are graphically represented in Figure 18.

Figure 18 : Required Reusability (RUSE) Multiplier for each Type of Artifact.

Documentation Match to Life-Cycle Needs (DOCU)

Table 36 represents the values assigned to the different levels of required documentation, and Table 37 explains the differences in defect introduction.

Table 36 : Documentation Match to Life-Cycle Needs (DOCU) Defect Rate Multipliers
DOCU level / Type of Artifact : Very High, High, Nominal, Low, Very Low

Table 37 : Documentation Match to Life-Cycle Needs (DOCU) Differences in Defect Introduction

VH :
Requirements (0.86) : Fewer Requirements defects due to good quality, detailed documentation of the requirements analysis.
Design (0.85) : Fewer Design defects due to good quality, detailed documentation of the requirements analysis and product design.
Code (0.85) : Fewer Coding defects due to good quality, detailed documentation of the requirements analysis and product design, and fewer defects in requirements and design.
Documentation (1.14) : Fewer Documentation defects due to fewer requirements and design defects; more Documentation defects due to more documentation and consistency problems.

Nominal : Nominal level of defect introduction.

VL :
Requirements : More Requirements defects due to minimal documentation of the requirements analysis (which may not be of good quality).
Design (1.18) : More Design defects due to minimal documentation of the requirements analysis and product design (which may not be of good quality).
Code (1.18) : More Coding defects due to minimal documentation of the requirements analysis and product design (which may not be of good quality), and more defects in requirements and design.
Documentation : More Documentation defects due to more requirements and design defects; fewer Documentation defects due to fewer documentation and consistency problems.

The values assigned to DOCU for each type of artifact are graphically represented in Figure 19.


LADOT SCANNING. Team 8. Team members Primary Role Secondary Role. Aditya Kumar Feasibility Analyst Project Manager Life Cycle Plan (LCP) LADOT SCANNING Team 8 Team members Role Role Aditya Kumar Feasibility Analyst Project Manager Anirudh Govil Project Manager Lifecycle Planner Corey Painter IIV&V Shaper Jeffrey Colvin

More information

COCOMO I1 Status and Plans

COCOMO I1 Status and Plans - A University of Southern California c I S IE I Center for Software Engineering COCOMO I1 Status and Plans Brad Clark, Barry Boehm USC-CSE Annual Research Review March 10, 1997 University of Southern

More information

Fundamental estimation questions. Software cost estimation. Costing and pricing. Software cost components. Software pricing factors

Fundamental estimation questions. Software cost estimation. Costing and pricing. Software cost components. Software pricing factors Fundamental estimation questions Software cost estimation How much effort is required to complete an activity? How much calendar time is needed to complete an activity? What is the total cost of an activity?

More information

SENG Software Reliability and Software Quality Project Assignments

SENG Software Reliability and Software Quality Project Assignments The University of Calgary Department of Electrical and Computer Engineering SENG 521 - Software Reliability and Software Quality Project Assignments Behrouz Far Fall 2012 (Revision 1.01) 1 Assignment no.

More information

According to the Software Capability Maturity Model (SW-

According to the Software Capability Maturity Model (SW- Data Collection Four areas generally influence software development effort: product factors, project factors, platform factors, and personnel facfocus estimation Quantifying the Effects of Process Improvement

More information

Calibrating Software Cost Models Using Bayesian Analysis

Calibrating Software Cost Models Using Bayesian Analysis 1 Abstract Calibrating Software Cost Models Using Bayesian Analysis Sunita Chulani*, Barry Boehm, Bert Steece University of Southern California Henry Salvatori, Room 330 Los Angeles, CA 90089-0781 *sdevnani@sunset.usc.edu

More information

Calibration Approach and Results of the COCOMO II Post- Architecture Model

Calibration Approach and Results of the COCOMO II Post- Architecture Model Calibration Approach and Results of the COCOMO II Post- Architecture Model Sunita Chulani*, Brad Clark, Barry Boehm (USC-Center for Software Engineering) Bert Steece (USC-Marshall School of Business) ISPA

More information

Project Plan. For KDD- Service based Numerical Entity Searcher (KSNES) Version 1.1

Project Plan. For KDD- Service based Numerical Entity Searcher (KSNES) Version 1.1 Project Plan For KDD- Service based Numerical Entity Searcher (KSNES) Version 1.1 Submitted in partial fulfillment of the Masters of Software Engineering degree. Naga Sowjanya Karumuri CIS 895 MSE Project

More information

Lecture 10 Effort and time estimation

Lecture 10 Effort and time estimation 1 Lecture 10 Effort and time estimation Week Lecture Exercise 10.3 Quality in general; Patterns Quality management systems 17.3 Dependable and safety-critical systems ISO9001 24.3 Work planning; effort

More information

Project Plan: MSE Portfolio Project Construction Phase

Project Plan: MSE Portfolio Project Construction Phase Project Plan: MSE Portfolio Project Construction Phase Plans are nothing; planning is everything. Dwight D. Eisenhower September 17, 2010 Prepared by Doug Smith Version 2.0 1 of 7 09/26/2010 8:42 PM Table

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) Mental Math Team - 7 Isha Agarwal Prototyper, Life Cycle Planner, JingXing Cheng Kajal Taneja Operational Concept Engineer, UML Modeler, Kiranmai Ponakala, Life Cycle Planner, IIV

More information

MTAT Software Economics

MTAT Software Economics MTAT.03.244 Software Economics Product Management (3) Dietmar Pfahl Fall 2016 email: dietmar.pfahl@ut.ee Topics Today Q&A on Assignment 3 Product Sizing: Function Point Analysis (FPA) Parametric Cost Estimation:

More information

A Cost Model for Ontology Engineering

A Cost Model for Ontology Engineering A Cost Model for Ontology Engineering Elena Paslaru Bontas, Malgorzata Mochol AG Netzbasierte Informationssysteme paslaru@inf.fu-berlin.de mochol@inf.fu-berlin.de August 3, 2005 Technical Report B-05-03

More information

CORADMO and COSSEMO Driver Value Determination Worksheet

CORADMO and COSSEMO Driver Value Determination Worksheet 1. COCOMO Stage Schedule and Effort MODEL (COSSEMO) COSSEMO is based on the lifecycle anchoring concepts discussed by Boehm 3. The anchor points are defined as Life Cycle Objectives (LCO), Life Cycle Architecture

More information

Lecture 7 Project planning part 2 Effort and time estimation

Lecture 7 Project planning part 2 Effort and time estimation 1 Lecture 7 Project planning part 2 Effort and time estimation 22.2.2015 Lecture schedule 12.01 Introduction to the course and software engineering 19.01 Life-cycle/process models Project planning (part

More information

A Comparative study of Traditional and Component based software engineering approach using models

A Comparative study of Traditional and Component based software engineering approach using models A Comparative study of Traditional and Component based software engineering approach using models Anshula Verma 1, Dr. Gundeep Tanwar 2 1, 2 Department of Computer Science BRCM college of Engineering and

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) City of Los Angeles Public Safety Applicant Resource Center Team No. 09 Team members and roles: Vaibhav Mathur Project Manager Preethi Ramesh Feasibility Analyst Arijit Dey Requirements

More information

Software cost estimation

Software cost estimation Software cost estimation Objectives To introduce the fundamentals of software costing and pricing To describe three metrics for software productivity assessment To explain why different techniques should

More information

Testing 2. Testing: Agenda. for Systems Validation. Testing for Systems Validation CONCEPT HEIDELBERG

Testing 2. Testing: Agenda. for Systems Validation. Testing for Systems Validation CONCEPT HEIDELBERG CONCEPT HEIDELBERG GMP Compliance for January 16-17, 2003 at Istanbul, Turkey Testing for Systems Validation Dr.-Ing. Guenter Generlich guenter@generlich.de Testing 1 Testing: Agenda Techniques Principles

More information

ETSF01: Software Engineering Process Economy and Quality. Chapter Five. Software effort estimation. Software Project Management

ETSF01: Software Engineering Process Economy and Quality. Chapter Five. Software effort estimation. Software Project Management Software Project Management ETSF01: Software Engineering Process Economy and Quality Dietmar Pfahl Lund University Chapter Five Software effort estimation What makes a successful project? Cost/Schedule

More information

Project Plan. CivicPlus Activity Metrics Tool. Version 1.0. Keith Wyss CIS 895 MSE Project Kansas State University

Project Plan. CivicPlus Activity Metrics Tool. Version 1.0. Keith Wyss CIS 895 MSE Project Kansas State University Project Plan CivicPlus Activity Metrics Tool Version 1.0 Keith Wyss CIS 895 MSE Project Kansas State University Table of Contents 1. INTRODUCTION... 5 1.1. REFERENCES... 5 2. WORK BREAKDOWN STRUCTURE...

More information

Software Estimation Experiences at Xerox

Software Estimation Experiences at Xerox 15th International Forum on COCOMO and Software Cost Modeling Software Estimation Experiences at Xerox Dr. Peter Hantos Office Systems Group, Xerox 1 Theme Is making bad estimates a crime? No, but it is

More information

Experience with Empirical Studies in Industry: Building Parametric Models

Experience with Empirical Studies in Industry: Building Parametric Models Experience with Empirical Studies in Industry: Building Parametric Models Barry Boehm, USC boehm@usc.edu CESI 2013 May 20, 2013 5/20/13 USC-CSSE 1 Outline Types of empirical studies with Industry Types,

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) Mission Science Information and Data Management System 3.0 Team 03 Fei Yu: Project Manager, Life Cycle Planner Yinlin Zhou: Prototyper, Operational Concept Engineer Yunpeng Chen:

More information

COSOSIMO Parameter Definitions Jo Ann Lane University of Southern California Center for Software Engineering

COSOSIMO Parameter Definitions Jo Ann Lane University of Southern California Center for Software Engineering Constructive System-of-Systems Integration Cost Model COSOSIMO Parameter Definitions Jo Ann Lane University of Southern California Center for Software Engineering jolane@usc.edu Introduction The Constructive

More information

Software Engineering

Software Engineering Software Engineering (CS550) Estimation w/ COCOMOII Jongmoon Baik WHO SANG COCOMO? The Beach Boys [1988] KoKoMo Aruba, Jamaica,ooo I wanna take you To Bermuda, Bahama,come on, pretty mama Key Largo, Montego,

More information

SOFTWARE ENGINEERING

SOFTWARE ENGINEERING SOFTWARE ENGINEERING Project planning Once a project is found to be feasible, software project managers undertake project planning. Project planning is undertaken and completed even before any development

More information

Test Workflow. Michael Fourman Cs2 Software Engineering

Test Workflow. Michael Fourman Cs2 Software Engineering Test Workflow Michael Fourman Introduction Verify the result from implementation by testing each build Plan the tests in each iteration Integration tests for every build within the iteration System tests

More information

Figure 1 Function Point items and project category weightings

Figure 1 Function Point items and project category weightings Software measurement There are two significant approaches to measurement that project managers need to be familiar with. These are Function Point Analysis (Albrecht, 1979) and COCOMO (Boehm, 1981). 1.

More information

COCOMO 1 II and COQUALMO 2 Data Collection Questionnaire

COCOMO 1 II and COQUALMO 2 Data Collection Questionnaire COCOMO 1 II and COQUALMO 2 Data Collection Questionnaire 1. Introduction The Center for Software Engineering at the University of Southern California is conducting research to update the software development

More information

Chapter 3. Information Systems Development. McGraw-Hill/Irwin. Copyright 2007 by The McGraw-Hill Companies, Inc. All rights reserved.

Chapter 3. Information Systems Development. McGraw-Hill/Irwin. Copyright 2007 by The McGraw-Hill Companies, Inc. All rights reserved. Chapter 3 Information Systems Development McGraw-Hill/Irwin Copyright 2007 by The McGraw-Hill Companies, Inc. All rights reserved. Objectives 3-2 Describe the motivation for a system development process

More information

CSCI 510 Final Exam, Fall 2017 v10 of solution & rubric Monday, December 11, questions, 300 points

CSCI 510 Final Exam, Fall 2017 v10 of solution & rubric Monday, December 11, questions, 300 points CSCI 510 Final Exam, Fall 2017 v10 of solution & rubric Monday, December 11, 2017 4 questions, 300 points If registered DEN student, please circle: Yes Last Name: First Name: USC ID: Question 1 (48) Question

More information

presents the COnstructive COst MOdel (COCOMO): the most advanced, thoroughly calibrated software cost estimation model available today.

presents the COnstructive COst MOdel (COCOMO): the most advanced, thoroughly calibrated software cost estimation model available today. SOFTWARE ENGINEERING ECONOMICS BARRY W. BOEHM SUMMARY The primary learning objectives of Software Engineering Economies by Barry W. Boehm are to: identify the factors most strongly influencing software

More information

Software Engineering Economics (CS656)

Software Engineering Economics (CS656) Software Engineering Economics (CS656) Software Cost Estimation w/ COCOMO II Jongmoon Baik Software Cost Estimation 2 You can not control what you can not see - Tom Demarco - 3 Why Estimate Software? 30%

More information

COSYSMO: Constructive Systems Engineering Cost Model

COSYSMO: Constructive Systems Engineering Cost Model COSYSMO: Constructive Systems Engineering Cost Model Barry Boehm, USC CSE Annual Research Review February 6, 2001 Outline Background Scope Proposed Approach Strawman Model Size & complexity Cost & schedule

More information

Reasoning about the Value of Dependability: idave Model

Reasoning about the Value of Dependability: idave Model Reasoning about the Value of Dependability: idave Model LiGuo Huang, Barry Boehm, Apurva Jain liguohua, boehm, apurvaja@usc.edu 10/22/2003 USC-CSE 1 idave Model Objectives idave: Information Dependability

More information

Security Factors in Effort Estimation of Software Projects

Security Factors in Effort Estimation of Software Projects Security Factors in Effort Estimation of Software Projects Jana Sedláčková Department of Information Systems Faculty of Information Technology Brno University of Technology Božetěchova 2, 612 66 Brno,

More information

Factors Influencing System-of-Systems Architecting and Integration Costs

Factors Influencing System-of-Systems Architecting and Integration Costs Paper # (unknown) Factors Influencing System-of-Systems Architecting and Integration Costs Jo Ann Lane University of Southern California Center for Software Engineering 941 W. 37th Place, SAL Room 328

More information

DRAFT. Effort = A * Size B * EM. (1) Effort in person-months A - calibrated constant B - scale factor EM - effort multiplier from cost factors

DRAFT. Effort = A * Size B * EM. (1) Effort in person-months A - calibrated constant B - scale factor EM - effort multiplier from cost factors 1.1. Cost Estimation Models Parametric cost models used in avionics, space, ground, and shipboard platforms by the services are generally based on the common effort formula shown in Equation 1. Size of

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) LCP_FCP_F14a_T07_V2.0 Version 2.0 Life Cycle Plan (LCP) Mission Science irobots Team 07 Ashwini Ramesha Chen Li Farica Mascarenhas Jiashuo Li Ritika Khurana Siddhesh Rumde Sowmya Sampath Yun Shao OCE,

More information

XOMO: Understanding Development Options for Autonomy

XOMO: Understanding Development Options for Autonomy XOMO: Understanding Development Options for Autonomy Tim Menzies Computer Science, Portland State University tim@timmenzies.net Julian Richardson RIACS/USRA, NASA Ames Research Center julianr@email.arc.nasa.gov

More information

Bayesian Analysis of Empirical Software Engineering Cost Models

Bayesian Analysis of Empirical Software Engineering Cost Models Bayesian Analysis of Empirical Software Engineering Cost Models 1. Abstract Sunita Chulani, Barry Boehm, Bert Steece University of Southern California Henry Salvatori, Room 330 Los Angeles, CA 90089-0781

More information

Resource Model Studies

Resource Model Studies Resource Model Studies MODELING AND MEASURING RESOURCES Model Validation Study Walston and Felix build a model of resource estimation for the set of projects at the IBM Federal Systems Division. They did

More information

Ilities Tradespace and Affordability Program (itap)

Ilities Tradespace and Affordability Program (itap) Ilities Tradespace and Affordability Program (itap) By Barry Boehm, USC Russell Peak, GTRI 6 th Annual SERC Sponsor Research Review December 4, 2014 Georgetown University School of Continuing Studies 640

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) BlackProfessionals.net Team #6 Tian Xiang Tan Jhih-Sheng Cai Aril Alok Jain Pablo Ochoa Jeng-Tsung Tsai Sadeem Alsudais Po-Hsuan Yang Project Manager System/Software Architect Requirements

More information

Amanullah Dept. Computing and Technology Absayn University Peshawar Abdus Salam

Amanullah Dept. Computing and Technology Absayn University Peshawar Abdus Salam A Comparative Study for Software Cost Estimation Using COCOMO-II and Walston-Felix models Amanullah Dept. Computing and Technology Absayn University Peshawar scholar.amankhan@gmail.com Abdus Salam Dept.

More information

An Empirical Study of the Efficacy of COCOMO II Cost Drivers in Predicting a Project s Elaboration Profile

An Empirical Study of the Efficacy of COCOMO II Cost Drivers in Predicting a Project s Elaboration Profile An Empirical Study of the Efficacy of COCOMO II Cost Drivers in Predicting a Project s Elaboration Profile Ali Afzal Malik, Barry W. Boehm Center for Systems and Software Engineering University of Southern

More information

Analysis of System ility Synergies and Conflicts

Analysis of System ility Synergies and Conflicts Analysis of System ility Synergies and Conflicts Barry Boehm, USC NDIA SE Conference October 30, 2014 10-30-2014 1 Ilities Tradespace and Affordability Analysis Critical nature of the ilities Or non-functional

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) Version 1.0 Life Cycle Plan (LCP) Software Quality Analysis as a Service (SQAaaS) Team No.1 Kavneet Kaur Requirement Engineer George Llames IIV & V Aleksandr Chernousov Life Cycle

More information

SOFTWARE ENGINEERING. Topics covered 1/20/2015. Chapter 3 - Project Management. Risk management Managing people Project cost Project plan & schedule

SOFTWARE ENGINEERING. Topics covered 1/20/2015. Chapter 3 - Project Management. Risk management Managing people Project cost Project plan & schedule SOFTWARE ENGINEERING Chapter 3 - Project Management Sep 2013 Chapter 2. Project Management 2 Topics covered Risk management Managing people Project cost Project plan & schedule 1 Sep 2013 Chapter 2. Project

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) BlackProfessionals.net Team #6 Tian Xiang Tan Jhih-Sheng Cai Aril Alok Jain Pablo Ochoa Jeng-Tsung Tsai Sadeem Alsudais Po-Hsuan Yang Project Manager System/Software Architect Requirements

More information

Predicting Defect Types in Software Projects

Predicting Defect Types in Software Projects dr Łukasz Radliński Institute of Information Technology in Management Faculty of Economics and Management University of Szczecin Predicting Defect Types in Software Projects Abstract Predicting software

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) Swim Meet Sign-Up Team 03 Member Archan Dutta Swasti Sharma Rasleen Sahni Deepanshu Suneja Vibhanshu Sharma Jenny Greer Role Project Manager, Life Cycle Planner, Tester Operational

More information

Goals of course. Themes: What can you do to evaluate a new technique? How do you measure what you are doing?

Goals of course. Themes: What can you do to evaluate a new technique? How do you measure what you are doing? MSWE 607: Software Life Cycle methods and Techniques Instructor: Professor Marvin V. Zelkowitz Office: 4121 AV Williams Phone: 405-2690 or 403-8935 (Fraunhofer Center) Email (Best way to contact) mvz@cs.umd.edu

More information

Management and MDD. Peter Dolog dolog [at] cs [dot] aau [dot] dk E2-201 Information Systems March 6, 2007

Management and MDD. Peter Dolog dolog [at] cs [dot] aau [dot] dk E2-201 Information Systems March 6, 2007 Management and MDD Peter Dolog dolog [at] cs [dot] aau [dot] dk E2-201 Information Systems March 6, 2007 2 Management Software Engineering Management 3 Req. Design Const. Test Iterations Management 4 5

More information

Software Economics Homework I

Software Economics Homework I Software Economics Homework I Function Point Analysis and Effort Estimation Martin Vels Raido Seene Rauno Kiss Tartu 2012 2 Table of Contents TABLE OF CONTENTS... 3 OVERVIEW... 5 SCOPE... 5 DOMAIN MODEL...

More information

Introduction to Software Metrics

Introduction to Software Metrics Introduction to Software Metrics Outline Today we begin looking at measurement of software quality using software metrics We ll look at: What are software quality metrics? Some basic measurement theory

More information

Early Phase Software Effort Estimation Model: A Review

Early Phase Software Effort Estimation Model: A Review Early Phase Software Effort Estimation Model: A Review Priya Agrawal, Shraddha Kumar CSE Department, Sushila Devi Bansal College of Technology, Indore(M.P.), India Abstract Software effort estimation is

More information

Life Cycle Plan (LCP)

Life Cycle Plan (LCP) Life Cycle Plan (LCP) City of Los Angeles Public Safety Applicant Resource Center Team No. 09 Team members and roles: Vaibhav Mathur Project Manager Preethi Ramesh Feasibility Analyst Arijit Dey Requirements

More information

Software Architecture Challenges for Complex Systems of Systems

Software Architecture Challenges for Complex Systems of Systems Software Architecture Challenges for Complex Systems of Systems Barry Boehm, USC-CSE CSE Annual Research Review March 6, 2003 (boehm@sunset.usc.edu) (http://sunset.usc.edu) 3/18/03 USC-CSE 1 Complex Systems

More information

Quantitative Control of Process Changes Through Modeling, Simulation and Benchmarking

Quantitative Control of Process Changes Through Modeling, Simulation and Benchmarking Quantitative Control of Process Changes Through Modeling, Simulation and Benchmarking Nancy Eickelmann, Animesh Anant, Sang Hyun, Jongmoon Baik SSERL, Motorola, 1303 E. Algonquin Rd. Schaumburg, L, 601

More information

COCOMO Models 26/12/2016 1

COCOMO Models 26/12/2016 1 COCOMO Models 26/12/2016 1 Project Management and Mr. Murphy 1. Logic is a systematic method of coming to the wrong conclusion with confidence. 2. Technology is dominated by those who manage what they

More information