An Acquirer's Guide to Navigating Contractor Data


1 An Acquirer's Guide to Navigating Contractor Data
Jeannine Siviy, Wolf Goethert, Robert Ferguson
Software Engineering Measurement & Analysis Initiative
Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA
Sponsored by the U.S. Department of Defense
© 2004 by Carnegie Mellon University, Version 1.0

2 Report Documentation Page (Standard Form 298, Rev. 8-98, prescribed by ANSI Std Z39-18)
Title and subtitle: An Acquirer's Guide to Navigating Contractor Data
Performing organization: Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA
Distribution/availability statement: Approved for public release; distribution unlimited
Security classification (report, abstract, this page): unclassified; limitation of abstract: same as report (SAR)
Number of pages: 51

3 Trademarks and Service Marks
Capability Maturity Model, Capability Maturity Modeling, CERT, CERT Coordination Center, CMM, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University. Architecture Tradeoff Analysis Method; ATAM; CMM Integration; CURE; IDEAL; Interim Profile; OCTAVE; Operationally Critical Threat, Asset, and Vulnerability Evaluation; Personal Software Process; PSP; SCAMPI; SCAMPI Lead Assessor; SCAMPI Lead Appraiser; SCE; SEI; SEPG; Team Software Process; and TSP are service marks of Carnegie Mellon University.

4 Objectives
- Establish a view of the acquirer and supplier/contractor roles and responsibilities.
- Show how measurement and analysis skills for internal development can be recast for acquisition and contracting environments.
- Address a prevalent question in the acquisition community: How can we conduct causal analysis when we no longer control the collection processes and/or data?

5 Outline
Acquisition roles & responsibilities
Measurement & analysis methods
Illustration
- background
- monitoring and oversight: progress analysis
Summary
References

6 Responsibility and Authority
Measuring project and product success is the same whether the project is internal or contracted: on schedule, at cost, with required functionality, without defects.
The acquiring program manager's circle of influence and circle of control is different from the development project manager's:
- the development project manager addresses project execution
- the acquisition program manager executes a new set of processes
- the acquisition program manager should leverage development knowledge to manage the contract methodically, rationally, and knowledgeably

7 Roles and Information Exchange
(Diagram) The acquirer's pre-award activities (RFP preparation, contract award) and post-award activities (monitor project progress, evaluate deliverables) connect across the contractual handshake to the supplier/developer, who develops, customizes, and integrates systems, software, and COTS. Exchanged across the handshake: functional & quality requirements, status information, interim documents and tangibles, directions and corrections, and deliverables. The diagram also shows subcontractors and the SEPG/SAPG groups on each side.

8 Measuring Project, Product, Process
(Diagram) Across the contractual handshake, the acquirer (pre-award and post-award activities) and the supplier (develop, customize, integrate systems, software, COTS) exchange indicators and information for tracking, monitoring, direction, etc. Measured dimensions:
- Project: schedule (status, projection, trend); cost (status, projection, trend); requirements satisfaction
- Relationship: roles (changes); invoicing (payment)
- Products: supplier-produced quality (amount of rework); acquisition-organization-produced quality (amount of rework)

9 Responsibilities After Contract Award
Contractor: develop the system. Deliverables:
- Documents: SRD, SDP, Measurement Plan, SDD
- Status reports: schedule, cost, testing
- Final product
Acquirer responsibilities (post-contract award):
- Evaluate quality of deliverables
- Monitoring and oversight: schedule & progress, resources & costs, developer's processes

10 Monitoring & Oversight
Status information (schedule progress, budget status, test results, process results such as inspections, process compliance) feeds the acquirer's analysis and review process, which applies the acquirer's evaluation criteria to produce indicators. Measurable results (examples):
- contractor effort, actual vs. plan
- contractor schedule, actual vs. plan
- defects reported: description, severity, class, type
- size and complexity of the work product

11 Evaluating Quality of Deliverables
Documents to review (SRD, SDP, Measurement Plan, SDD) and final deliverables pass through the acquirer's inspection or review process, which applies the acquirer's evaluation criteria to produce indicators. Measurable results (examples):
- Products: defects discovered (description, severity, class, type); size of the work product
- Process: effort invested in the inspection process; time spent during the inspection activities

12 Outline
Acquisition roles & responsibilities
Measurement & analysis methods
Illustration
- background
- monitoring and oversight: progress analysis
Summary
References

13 Sources for Measures
Goal-Driven (Software) Measurement (GDM), using the GQIM method: Goals -> Questions -> Indicators -> Measures. The user defines the indicators and measures based on what is needed to manage the user's goals and on the decisions and decision criteria related to managing those goals (a small sketch follows).
Practical Software & Systems Measurement (PSM): the common issue areas, measurement categories, and measures are predefined.
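To make the GDM/GQIM flow concrete, here is a minimal sketch of goal-to-measure traceability; the goal, questions, indicators, and measures are hypothetical examples patterned on this tutorial's illustration, not content taken from it:

```python
# Minimal GQIM traceability sketch (hypothetical content, for illustration only):
# a goal is refined into questions, each question is answered by an indicator,
# and each indicator is built from base measures.
gqim = {
    "goal": "Deliver Build 2 on the contracted schedule",
    "questions": [
        {
            "question": "Is assembly & test progressing as planned?",
            "indicator": "Components completing A&T, planned vs. actual",
            "measures": ["planned components per month", "actual components per month"],
        },
        {
            "question": "Is earned value tracking planned value?",
            "indicator": "EV vs. PV trend with schedule variance",
            "measures": ["planned value", "earned value", "actual cost"],
        },
    ],
}

for q in gqim["questions"]:
    print(f"Goal: {gqim['goal']}")
    print(f"  Q: {q['question']}")
    print(f"  I: {q['indicator']}  (measures: {', '.join(q['measures'])})")
```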

14 Data Analysis Dynamics
Getting started: identify the goals; take a black-box process view. Is the data right? Do I have the right data? Decision point: if the data is not perfect, do I move forward or obtain better data?
Initial evaluation: What should the data look like? What does the data look like? Can I characterize the process, product, or problem? Decision point: can I address my goals right now, or is additional analysis necessary, at the same or a deeper level of detail? Can I move forward?
Moving forward: further evaluation; decompose the data and the process. Decision point: do I take action? What action do I take? Repeat until the root cause is found and the process is at target with the desired variation. [DAD 03]

15 Performance Analysis Model
(Diagram) The model relates the PSM issue areas: Schedule and Progress, Resources and Cost, Growth and Stability, Product Quality, Development Performance, Technical Adequacy, and Customer Satisfaction. [PSM 00]

16 Performance Analysis Checklist 1
Single-indicator issues (a small sketch follows below):
- Do actual trends correspond to planned trends, such as progress, growth, and expenditures?
- How big is the variance? Does the variance appear to be gradually growing each month?
- Are actual values exceeding planned limits, such as open defects, changes, and resource utilization? [PSM 00]
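A minimal sketch of how these single-indicator checks might be automated, assuming one hypothetical planned/actual series and an assumed limit on open defects (none of the numbers come from the tutorial):

```python
# Hypothetical monthly planned vs. actual values for one indicator (e.g., cumulative progress).
planned = [10, 20, 30, 40, 50, 60]
actual  = [ 9, 18, 26, 33, 39, 44]
open_defect_limit = 150                    # assumed planned limit
open_defects      = [80, 110, 135, 155]    # assumed actual counts

# Variance per month, and whether it is gradually growing month over month.
variance = [p - a for p, a in zip(planned, actual)]
growing = all(later >= earlier for earlier, later in zip(variance, variance[1:]))

print("monthly variance:", variance)
print("variance gradually growing:", growing)

# Are actual values exceeding a planned limit?
exceeded = [i for i, v in enumerate(open_defects, start=1) if v > open_defect_limit]
print("months exceeding open-defect limit:", exceeded or "none")
```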

17 Performance Analysis Checklist 2
Integrated-indicator issues:
- Is the source of the problem evident? (change in functionality, unplanned rework, etc.)
- Are growing problems in one area a leading indicator of other problems later in the project? (requirements creep impact on schedule)
- Do multiple indicators lead to similar conclusions? (lack of progress correlates with low staffing)
- Does other project information contradict performance results? (milestones being met but open defect counts are increasing) [PSM 00]

18 Outline
Acquisition roles & responsibilities
Measurement & analysis methods
Illustration
- background
- monitoring and oversight: progress analysis
Summary
References

19 Illustration
This illustration is based on an organization that is maintaining an existing product (a blend of COTS and internally developed code) and pursuing the acquisition of a replacement product. The acquisition includes two contracts:
- Contract 1: requirements development
- Contract 2: product design, code, and test
This illustration focuses on analyzing project execution data (Contract 2).

20 Monitoring & Oversight
Contract 2 has been awarded, and the supplier is developing the product in two builds. The contractor has just notified you that the project has both cost and schedule slippage. What do you do?

21 Contractor Information
The contractor:
- provides data per your contractual agreement
- also provides additional data if you ask*
- uses measurement data to help monitor development
- has defined processes and monitors compliance
- analyzes software trouble reports to identify process improvements
- has average software process maturity
*Typically, if the RFP does not require the data, the contractor is not obligated to provide it. In this case study, the acquirer and contractor have a good working relationship, and the contractor is willing to share data beyond what the RFP specifies.

22 Action: Management Review
Meet with the development contractor to:
- Find out what is happening and why it is happening.
- Determine what decisions/actions could be taken, based upon the data presented, to correct the slippage and to prevent further slippage.
- Postulate the future consequences of these decisions.
- Identify actions that could have been taken earlier to prevent the slip.

23 Performance Analysis Model
Use the model to guide the analysis. Step 1: confirm the problem (cost & schedule slippage). (The diagram repeats the model's issue areas: Technical Adequacy, Development Performance, Growth and Stability, Resources and Cost, Schedule and Progress, Customer Satisfaction, Product Quality.)

24 Schedule & Progress Indicators
(Charts, as of Jul 03)
- Earned value chart: cost over time for PV (planned value), AC (actual cost), and EV (earned value), where Schedule Variance = EV - PV and Cost Variance = EV - AC (see the sketch below).
- Components completing A&T (assembly and test): planned vs. actual components completing A&T for Build 1 and Build 2.
- Gantt chart of life-cycle activities (est. vs. actual): requirements, architecture, Build 1 (design, assembly & test, integration & verification), Build 2 (design, assembly & test, integration & verification), product integration, formal qualification test.
Tool tips: the top two charts were made in Excel and manually manipulated; the Gantt chart can be generated using any scheduling software.
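A minimal sketch of the earned-value arithmetic behind the top chart, using invented monthly figures (the underlying numbers are not reproduced in the transcription); the SPI and CPI ratios are standard earned-value indices added for context, not taken from the slide:

```python
# Hypothetical cumulative monthly values, in arbitrary cost units.
pv = [100, 200, 300, 400, 500]   # planned value
ev = [ 90, 170, 250, 320, 380]   # earned value
ac = [110, 210, 320, 430, 540]   # actual cost

for month, (p, e, a) in enumerate(zip(pv, ev, ac), start=1):
    sv = e - p     # schedule variance: negative means behind schedule
    cv = e - a     # cost variance: negative means over cost
    spi = e / p    # schedule performance index (standard EVM ratio)
    cpi = e / a    # cost performance index (standard EVM ratio)
    print(f"month {month}: SV={sv:+} CV={cv:+} SPI={spi:.2f} CPI={cpi:.2f}")
```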

25 What We Learned From Schedule and Progress Indicators
- cost and schedule slippage (EV chart)
- activities taking longer than planned (Gantt chart)
- assembly and test behind schedule (components completion chart)
What does this mean? It confirms we have a problem.

26 Resources and Cost Indicators
Analysis/probing questions:
- Is the staff allocation contributing to the problem (too many, too few, wrong time frame)?
- What is the rate of staff turnover?
- How does actual staff compare to the planned staff allocation?
(The performance analysis model diagram is repeated, highlighting Resources and Cost.)

27 Resources and Cost Indicators
(Charts) Number of staff over time, planned vs. actual, with turnover shown for key personnel and others, and separate plan vs. actual breakouts for programmers and testers (see the sketch below). Tool tip: this chart was made in Excel and manually manipulated.
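A small sketch of the plan-vs.-actual staffing comparison and a simple monthly turnover calculation, using invented headcounts (the contractor's actual staffing data is not reproduced here):

```python
# Hypothetical monthly headcounts.
planned_staff = [10, 14, 18, 20, 20, 18]
actual_staff  = [16, 17, 15, 14, 13, 12]
departures    = [ 0,  2,  3,  2,  3,  2]   # staff who left each month

for month, (plan, act, left) in enumerate(zip(planned_staff, actual_staff, departures), start=1):
    delta = act - plan              # positive = overstaffed, negative = understaffed
    turnover = left / act * 100     # monthly turnover as % of current staff
    print(f"month {month}: plan={plan} actual={act} delta={delta:+} turnover={turnover:.0f}%")
```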

28 What We Learned From Resources and Cost Indicators
Staffing did not follow the planned level:
- too many staff at the beginning of the project
- testers and programmers used to fill in for analysts and designers => high re-training costs
- high turnover rate => training and getting-up-to-speed costs
What does this mean? The cost overrun is due partly to staffing problems.

29 Growth and Stability Indicators
Analysis/probing questions:
- Are the requirements stable?
- What is the code growth?
- Is functionality being transferred from Build 1 to Build 2? If so, how does this affect the delivery date?
(The performance analysis model diagram is repeated, highlighting Growth and Stability.)

30 Requirement Changes Information
(Chart, as of Jul 03) Number of requirements over time for Build 1 and Build 2, showing added requirements and total requirements (see the sketch below). Tool tip: this chart can be generated in Excel followed by manual editing using the drawing toolbar.
(Table) Requirement changes by month, with columns for the number of requirement changes, their complexity (rated Low for every month), and the resources in staff-days.
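A small sketch of how monthly added-requirement counts might be rolled into a growth indicator; the baseline and monthly counts are invented, since the slide's values are not reproduced in the transcription:

```python
# Hypothetical requirement counts for one build.
baseline_reqs = 400
added_per_month = [5, 8, 3, 12, 7, 9, 4]   # invented monthly additions

total = baseline_reqs
cumulative_added = 0
for month, added in enumerate(added_per_month, start=1):
    cumulative_added += added
    total += added
    growth = cumulative_added / baseline_reqs * 100   # % growth over the baseline
    print(f"month {month}: total reqs={total}, cumulative added={cumulative_added}, "
          f"growth={growth:.1f}%")
```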

31 Growth and Stability Indicators
(Charts, as of Jul 03)
- Size growth: lines of code (in 100K) from Jan 02 through Sep 03, plan vs. actuals, for Build 1 and Build 2. Tool tip: this chart was made in Excel and manually manipulated.
- Requirements per build: number of requirements for Build 1 and Build 2, comparing the original plan, Replan 1, actual, and not done.
Contractor's explanation: functions were deferred to the later build because of unanticipated complexity.

32 What We Learned From Growth and Stability Indicators
- requirement changes are of low complexity but will have some ripple effect
- code production is below the planned value
- functionality is being deferred from Build 1 to Build 2, attributed by the contractor to unanticipated complexity
What does this mean?
- expect further cost and schedule growth due to low code production and the increased number of functions to be implemented in Build 2
- expect an impact on the completion date due to functions deferred to Build 2
- expect the possibility of a Build 3 proposal

33 Product Quality Indicators
Analysis/probing questions:
- Are the defined processes being followed?
- What is the rate of closure for trouble reports?
- What types of trouble reports are being detected? In what phase?
(The performance analysis model diagram is repeated, highlighting Product Quality.)

34 Product Quality Indicators
(Charts, as of Jul 03)
- Cumulative STR status, open vs. closed: number of STRs over time for Build 1 and Build 2, with open, closed, and delta series (see the sketch below).
- Open and closed STRs by priority: Priority 1 (showstopper) through Priority 5 (nit); tallies on Jul 03 were 200 open and 400 closed.
Tool tip: this chart was made in Excel and manually manipulated.
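A minimal sketch of the open/closed/delta bookkeeping behind the cumulative STR chart, using invented monthly counts:

```python
# Hypothetical STRs opened and closed each month.
opened_per_month = [30, 45, 50, 60, 70, 80]
closed_per_month = [20, 30, 35, 40, 45, 50]

cum_opened = cum_closed = 0
for month, (o, c) in enumerate(zip(opened_per_month, closed_per_month), start=1):
    cum_opened += o                      # cumulative STRs opened
    cum_closed += c                      # cumulative STRs closed
    backlog = cum_opened - cum_closed    # the "delta" series on the chart
    closure_rate = cum_closed / cum_opened
    print(f"month {month}: opened={cum_opened} closed={cum_closed} "
          f"backlog={backlog} closure rate={closure_rate:.0%}")
```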

35 Classifying Trouble Report Defects
(Chart) Percent of defects by defect type: environment, system, function, data, checking, interface, assignment, build/package, syntax, documentation. Several of these types are ones that code inspections would have been expected to catch.

36 What We Learned From Product Quality Indicators
- STRs are being opened faster than they're being closed
- code inspections should have found many of the reported defect types
What does this mean? The code inspection process allowed a large number of defects to slip through.

37 Development Performance Indicators
Analysis/probing questions:
- Are the defined processes being followed?
- Are any defined processes being skipped?
(The performance analysis model diagram is repeated, highlighting Development Performance.)

38 Development Performance Indicators
(Charts, as of Jul 03)
- Process compliance percentage over time (Jan 02 through Dec 03) for Build 1 and Build 2.
- Process compliance (% completion) by process: requirements analysis, preliminary design, detailed design, code & unit testing, functional integration & testing, CSCI integration & testing, SQA, inspections, configuration management, documentation, rework (see the sketch below).
Tool tip: this chart was made in Excel and manually manipulated.
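A rough sketch of how a per-process compliance percentage could be tallied from audit records; the counts are hypothetical and the 80% threshold is an assumption, not a figure from the tutorial:

```python
# Hypothetical audit tallies: (activities required, activities actually performed) per process.
audits = {
    "Req. analysis":       (20, 19),
    "Detailed design":     (25, 22),
    "Code & unit testing": (40, 30),
    "Inspections":         (30,  6),   # inspections largely skipped
}

for process, (required, performed) in audits.items():
    compliance = performed / required * 100
    flag = "  <-- below 80% threshold" if compliance < 80 else ""   # assumed threshold
    print(f"{process:<20} {compliance:5.1f}% compliant{flag}")
```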

39 What We Learned From Development Performance Indicators
- adherence to the defined process decreased over time
- the contractor stopped doing inspections
What does this mean? Defects usually detected during code inspections were allowed to slip through, with an impact on cost and schedule due to rework.

40 Reasons for Slippage
Staffing problems:
- too many staff at the beginning of the project
- staffing below the planned level during most of development (noting that productivity increased dramatically)
- high turnover rate
Process compliance:
- stopped doing inspections, which allowed errors to leak to later phases
Requirements changes after Build 2 code and unit test.
Conclusion: expect further cost and schedule growth due to low code production and the increased number of functions to be implemented in Build 2.

41 Possible Actions
Developer actions:
- replan based on current performance
- get staffing under control: verify the skills balance of resources; do not decrease staffing to conform to planned staffing, particularly if that would decrease the number of programmers
- restart inspections of code and test cases
Acquirer decision options:
- use contract labor (additional costs)
- deliver a smaller size (less functionality)
- accept the schedule slip

42 What if Data Is Not Available?
Data may not be available because:
- the contractor does not collect this level of data
- the contractor is not required by contract to report it
Using process compliance data as an example, how might missing this data affect conclusions, actions, and project results?
- corrective action might have adjusted staffing
- it would not have addressed the skipped inspections, which allowed errors to leak to later phases, resulting in increased cost and schedule
A possible action to infer process compliance: check data on the results of code inspections (if that data is specified in the contract).
Lesson learned: specify in the contract what types of data are to be reported in status reports.

43 Prevention
(Diagram, overlaid on the performance analysis model)
- Visible indicators of underlying problems: cost & schedule slippage, developer's corrective actions, functionality being deferred, decreased process compliance (skipped inspections)
- Direct causes of problems: unanticipated complexity, inability to process STRs
- Root causes of problems: staffing issues, requirement changes
The performance analysis model is a guide to root cause. Understanding root cause leads to prevention.

44 Outline
Acquisition roles & responsibilities
Measurement & analysis methods
Illustration
- background
- monitoring and oversight: progress analysis
Summary
References

45 Roles and Information Exchange
(Diagram repeated from earlier) The acquirer's pre-award activities (RFP preparation, contract award) and post-award activities (monitor project progress, evaluate deliverables) connect across the contractual handshake to the supplier/developer, who develops, customizes, and integrates systems, software, and COTS; the exchange carries functional & quality requirements, status information, interim documents and tangibles, directions and corrections, and deliverables.

46 Measuring Project, Product, Process
(Diagram repeated from earlier) Across the contractual handshake, the acquirer and the supplier exchange indicators and information for tracking, monitoring, direction, etc. Measured dimensions:
- Project: schedule (status, projection, trend); cost (status, projection, trend); requirements satisfaction
- Relationship: roles (changes); invoicing (payment)
- Products: supplier-produced quality (amount of rework); acquisition-organization-produced quality (amount of rework)

47 Summary
Key acquisition responsibilities (after contract award):
- monitoring and oversight
- inspecting, reviewing, and understanding documents and other work products
Post-contract-award success depends on pre-contract-award activities:
- building measurement expectations into contracts
- establishing good partnerships and working relationships with contractors
Measures and indicators across the landscape are interrelated:
- use the Performance Analysis Model as your navigation guide
- always use multiple indicators
Measure products, processes, projects, and relationships.

48 Contact Information
Wolf Goethert, Measurement & Analysis Initiative
Jeannine Siviy, Measurement & Analysis Initiative
Robert Ferguson, Measurement & Analysis Initiative

49 References 1
Note: URLs valid as of tutorial delivery date.
[DAD 03] Siviy, Jeannine and William Florac. Data Analysis Dynamics. Half-day tutorial delivered at SEPG 2003, Boston, MA.
[GQIM 96] Goal-Driven Software Measurement: A Guidebook (CMU/SEI-96-HB-002).
[PSM 00] Practical Software and Systems Measurement: A Foundation for Objective Project Management, Guidebook, version 4.0b. Practical Software and Systems Measurement Support Center, U.S. TACOM-ARDEC, AMSTA-AR-QA-A, Picatinny Arsenal, NJ, October 2000.

50 Reading & Resources 1
Note: URLs valid as of tutorial delivery date.
Practical Software and Systems Measurement (PSM):
- reference for the Performance Analysis Model
- reference lists of measures to consider
Goal-Driven Measurement (GDM) and Goal-Question-Indicator-Metric (GQIM):
- front end for selecting the most relevant PSM measures
- used for developing context-specific indicators, particularly success indicators
Goal-Driven Software Measurement: A Guidebook, /96.reports/96.hb.002.html

51 Reading & Resources 2
Note: URLs valid as of tutorial delivery date.
Defense Acquisition University (DAU) Deskbook: provides information about regulatory references, mandatory and discretionary references by service branch, and several knowledge repositories.
Guidelines for Successful Acquisition and Management of Software-Intensive Systems.
Acquisition Centers of Excellence: Air Force (for instance, ESC Hanscom) and Navy.

52 Reading & Resources 3
Note: URLs valid as of tutorial delivery date.
Project Management Body of Knowledge (PMBOK), Project Management Institute: proven, traditional project management practices and innovative, advanced practices with more limited use.
Guide to the PMBOK: contains the generally accepted subset of knowledge and practices that are applicable to most projects most of the time.