WE THOUGHT BUT NOW WE KNOW


1 Pittsburgh, PA
CMMI V1.3 High Maturity Panel
WE THOUGHT BUT NOW WE KNOW
M. Lynn Penn, Lockheed Martin Information Systems & Global Services
Rawdon Young, Software Engineering Institute
Alice Parry, Raytheon

2 Presentation Agenda
TWO PARTS
1. WE THOUGHT: Background, how we got here
2. WE KNOW: CMMI V1.3 High Maturity Process Areas

3 The Challenge: High Maturity CMMI V1.3

4 High Maturity Starting Point
The High Maturity Team is composed of veterans of the V1.2A workshop; note that the majority are from high maturity organizations in industry.
- Lynn Penn (LMCO), Lead
- Dan Bennett (AF)
- Will Hayes (SEI)
- Rick Hefner (NG)
- Jim Kubeck (LMCO)
- Alice Parry (Raytheon)
- Kathy Smith (HP)
- Rusty Young (SEI)
Use of the high maturity redline of CMMI for Development, and the change requests submitted against it, has added another layer of complexity to the analysis of change requests.
- Assumption: CMMI Dev V1.2 is the baseline
- Assumption: CMMI Dev V1.2A (redlines) is a CR

5 High Maturity Issues
- Terminology Confusion
- Requirements: implied versus explicit
- Explanations not central or consistent
  - Model / Audit Criteria / Presentations (Healthy Ingredients) / UCHMP
- Perceptions
  - Customers: ML 5 is expensive, no better than ML 3
  - Industry: ML 5 is NOT RIGHT for every business
- High Maturity in ALL constellations
  - Examples are focused on Development

6 Terminology Confusion
- Common Cause
- Statistical versus Quantitative Techniques
- Process Models and Process Modeling
- Business Objectives
- Subprocesses

7 Requirements Explicit / Explanations not Central
- Two issues, really one
- Goal:
  - Sunset OTHER explanations
  - Incorporate Healthy Ingredients as appropriate goals/practices
- Audit Criteria
  - Audit to Model and MDD; include what is necessary

8 Perceptions
- Individual based; however, HIGH MATURITY RESTRUCTURING was recommended
  - Insufficient link between process improvement, business objectives, and performance
  - Clarify the distinction between ML4 and ML5
  - Eliminate GG4 and GG5
- CMMI V1.3 Webinar suggestions (not covered elsewhere)
  - Clarify the roles of OID and CAR
  - Make CAR more relevant / clarify its role

9 High Maturity Pre-Review
High Maturity Review at SEPG
- Invited individuals, including HMLAs and potential HM pilot companies, to a special session at the NA SEPG
- Rusty and Lynn presented the HM Team intentions and an introduction to the HM PAs (including the new OPM PA)
- Provided these same individuals with the redlines for HM, including a high-level OPM (GLOP)
- Asked for the following actions:
  - Within 1 week, provide the HMT with a heads up or down on changes
  - Within 2 weeks, provide the HMT with redlines to the redlines

10 High Maturity SEPG Results
- Approximately 150 individuals were invited
- Approximately 45 individuals attended
- Feedback received at the conference was positive on the intentions and the direction we were pursuing
- Received heads up/down feedback from 6 individuals
  - One of these was not at the presentation, so it took many emails to explain
  - The feedback was more questions, not in keeping with the initial positive SEPG feedback
  - Four of these individuals have also provided redlines

11 High Maturity Team Consensus
- Team members that attended SEPG were positive and very upbeat about the questions and follow-up
- Observations after feedback
  - Discouraged that so few responded
  - A few comments showed a misunderstanding of existing HM and an inability to grasp anything different
- Team Position
  - Clean up existing PAs (some relevant comments)
  - Complete OPM generation
  - Proceed to Open Team review

12 Combined OPM/OID: 1 ML5 PA, Organizational Performance Management
[Diagram: information flows among Organizational Performance Management, Causal Analysis and Resolution, Quantitative Project Management, Organizational Process Performance, the organization, and the customer: improvements, improvement proposals, performance issues, selected outcomes, root causes/solutions, quality and process performance objectives, progress toward achieving those objectives, and measures, baselines, and models (including updates from actual performance).]

13 High Maturity PAs Relationships 1
Organizational Process Performance (OPP)
- Clarified the relationship between OPP and the rest of the high maturity process areas
- Clarified that process performance baselines and models can be created and used at all levels, not just the organizational level
Quantitative Project Management (QPM)
- Restructured around "prepare for quantitative management" and "quantitatively manage the project or work"
- Emphasized the use of statistical and other quantitative techniques
- Emphasized that quantitative management covers managing subprocesses through the project levels (from the micro through the macro levels)

14 High Maturity PAs Relationships 1
quantitative management: Managing a project or work group using statistical and other quantitative techniques to build an understanding of the performance or predicted performance of processes in comparison to the project's or work group's quality and process performance objectives, and identifying corrective action that may need to be taken. (See also "statistical techniques.") Statistical techniques used in quantitative management include analysis, creation, or use of process performance models; analysis, creation, or use of process performance baselines; use of control charts; analysis of variance; regression analysis; and use of confidence intervals or prediction intervals, sensitivity analysis, simulations, and tests of hypotheses.
Organizational Process Performance (OPP)
- Clarified the relationship between OPP and the rest of the high maturity process areas
- Clarified that process performance baselines and models can be created and used at all levels, not just the organizational level
Quantitative Project Management (QPM)
- Restructured around "prepare for quantitative management" and "quantitatively manage the project or work"
- Emphasized that quantitative management covers managing subprocesses through the project levels (from the micro through the macro levels)
- Emphasized the use of statistical and other quantitative techniques
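To ground the statistical techniques listed in the definition above, here is a minimal Python sketch (the subprocess measure, data, and thresholds are all invented for illustration): it computes individuals (XmR) control chart limits for a subprocess measure and flags points outside the limits, one common way a project might quantitatively compare observed performance against expectations. It is illustrative only, not a method the model prescribes.

```python
from statistics import mean

# Hypothetical subprocess measure: peer review effectiveness (% of defects found),
# one observation per review. Data are invented for illustration only.
review_effectiveness = [72, 68, 75, 70, 66, 74, 69, 71, 58, 73]

# Individuals (XmR) chart: center line is the mean; control limits are
# +/- 2.66 * average moving range (standard constant for moving ranges of 2).
center = mean(review_effectiveness)
moving_ranges = [abs(b - a) for a, b in zip(review_effectiveness, review_effectiveness[1:])]
avg_mr = mean(moving_ranges)
ucl = center + 2.66 * avg_mr
lcl = center - 2.66 * avg_mr

print(f"Center line: {center:.1f}  UCL: {ucl:.1f}  LCL: {lcl:.1f}")

# Flag points suggesting the subprocess is not stable (special-cause variation).
for i, x in enumerate(review_effectiveness, start=1):
    if x > ucl or x < lcl:
        print(f"Review {i}: {x} is outside the control limits -- investigate")
```

A point outside the limits would typically trigger causal analysis (CAR) rather than an automatic change to the process.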

15 High Maturity PAs Relationships 1
statistical and other quantitative techniques: Analytic techniques that enable accomplishing an activity by quantifying parameters of the task (e.g., inputs, size, effort, and performance). (See also "statistical techniques" and "quantitative management.") This term is used in the high maturity process areas where the use of statistical and other quantitative techniques to improve understanding of project, work, and organizational processes is described. Examples of non-statistical quantitative techniques include trend analysis, run charts, Pareto analysis, bar charts, radar charts, and data averaging. The reason for using the compound term "statistical and other quantitative techniques" in CMMI is to acknowledge that while statistical techniques are expected, other quantitative techniques can also be used effectively.
Organizational Process Performance (OPP)
- Clarified the relationship between OPP and the rest of the high maturity process areas
- Clarified that process performance baselines and models can be created and used at all levels, not just the organizational level
Quantitative Project Management (QPM)
- Restructured around "prepare for quantitative management" and "quantitatively manage the project or work"
- Emphasized that quantitative management covers managing subprocesses through the project levels (from the micro through the macro levels)
- Emphasized the use of statistical and other quantitative techniques
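As an example of one of the non-statistical quantitative techniques named above, the following sketch performs a simple Pareto analysis over hypothetical defect categories (category names and counts are invented); a project might use something like this to pick the "vital few" outcomes to carry into causal analysis.

```python
from collections import Counter

# Hypothetical defect log entries tagged by category (invented data).
defects = (["requirements"] * 34 + ["coding"] * 21 + ["interface"] * 9 +
           ["documentation"] * 5 + ["build"] * 3)

counts = Counter(defects)
total = sum(counts.values())

# Pareto table: categories sorted by frequency with cumulative percentage.
cumulative = 0
print(f"{'Category':<15}{'Count':>6}{'Cum %':>8}")
for category, count in counts.most_common():
    cumulative += count
    print(f"{category:<15}{count:>6}{100 * cumulative / total:>7.1f}%")

# Identify the 'vital few': the smallest set of categories covering >= 80% of defects.
running, vital_few = 0, []
for category, count in counts.most_common():
    vital_few.append(category)
    running += count
    if running / total >= 0.8:
        break
print("Vital few categories:", vital_few)
```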

16 High Maturity PA Relationships 2
Causal Analysis and Resolution (CAR)
- Made it clearer when to use statistical and other quantitative techniques.
- Clarified use by projects AND organizations to perform causal analysis and resolution on selected outcomes.
- Updated to include positive and negative outcomes, not just defects.
- Modified the outputs from CAR to include Improvement Proposals that feed process improvements into the organization's set of standard processes (OPM).
Organizational Performance Management (OPM)
- Focused the Process Area on managing business performance to achieve quality and process performance objectives.
- Added a Specific Goal that requires organizations to use measures, process performance baselines, and models from OPP to understand process performance, target areas for continuing improvement, and evaluate the impact of proposed improvements.
- Made it clear that statistical and other quantitative techniques are used to evaluate and select improvement proposals and to evaluate whether an improvement achieved the expected performance improvement.

17 Organizational Process Performance
Changes:
- Restructured OPP, moving "Establish Quality and Process Performance Objectives" to SP 1.1 for emphasis.
- Revised SP 1.4 to include process performance analysis and assessment of subprocess stability.
- Revised SP 1.5 to clarify that process performance models are used throughout the development lifecycle toward achieving quality and process performance objectives.
- Clarified that not all process performance baselines and models must be created at the organization level. Projects can follow OPP practices to create process performance baselines and models, when appropriate (see the sketch after this slide).
- Clarified the relationship of OPP to other high maturity process areas.
- Emphasized traceability to business objectives through modifications to SP 1.1 and SP 1.2.
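To make the idea of a process performance baseline concrete, here is a small hypothetical sketch (the measure and data are invented): it summarizes historical subprocess data as a mean, standard deviation, and approximate expected range that a project could compare its own performance against. It is a simplification; a real baseline would also consider stability, stratification, and data quality.

```python
from statistics import mean, stdev

# Hypothetical historical data: design review preparation rate (pages/hour)
# collected from completed projects. Values are invented for illustration.
prep_rate = [4.2, 3.8, 4.5, 4.0, 3.6, 4.4, 4.1, 3.9, 4.3, 4.0]

baseline_mean = mean(prep_rate)
baseline_sd = stdev(prep_rate)

# Rough "expected range" of about +/- 2 standard deviations around the mean.
low, high = baseline_mean - 2 * baseline_sd, baseline_mean + 2 * baseline_sd
print(f"Baseline: mean={baseline_mean:.2f}, sd={baseline_sd:.2f}, "
      f"expected range ~[{low:.2f}, {high:.2f}] pages/hour")

# A project compares its observed rate against the baseline range.
observed = 3.1
if not (low <= observed <= high):
    print("Observed rate falls outside the baseline range -- analyze before relying on it")
```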

18 Quantitative Project Management
Changes:
- Restructured QPM so that SG1 focuses on preparation and SG2 focuses on managing the project.
- Broadened the focus on using statistical techniques from individual selected subprocesses to cover multiple levels, from individual subprocesses to the entire project.
- Added guidance about using process performance baselines and process performance models (see the sketch after this slide).
- Defined quantitative management in the glossary to include statistical techniques, and used that definition consistently throughout QPM.
- Modified the practice to remove the emphasis on applying statistical methods to understand variation, reducing the overemphasis on control charts.
- Added new practices about managing performance and performing root cause analysis.
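For the guidance on process performance models mentioned above, the following is a hypothetical sketch (all data and coefficients are invented): it fits a simple least-squares line relating peer review coverage to delivered defect density and uses it to check a predicted outcome against a quality and process performance objective. Real models are calibrated from organizational data and validated statistically before use.

```python
from statistics import mean

# Hypothetical historical data (invented): peer review coverage (% of code reviewed)
# versus delivered defect density (defects/KSLOC) for past projects.
coverage = [55, 60, 65, 70, 75, 80, 85, 90]
defect_density = [4.1, 3.8, 3.4, 3.1, 2.7, 2.5, 2.2, 1.9]

# Fit a simple least-squares line: defect_density ~ a + b * coverage.
x_bar, y_bar = mean(coverage), mean(defect_density)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(coverage, defect_density)) / \
    sum((x - x_bar) ** 2 for x in coverage)
a = y_bar - b * x_bar

# Use the fitted line as a process performance model: predict the outcome for the
# project's planned coverage and compare against its quality objective.
planned_coverage = 65
objective = 3.0  # quality and process performance objective (defects/KSLOC)
predicted = a + b * planned_coverage
print(f"Predicted defect density at {planned_coverage}% coverage: {predicted:.2f}")
if predicted > objective:
    print("Prediction exceeds the objective -- consider corrective action (e.g., more review)")
```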

19 Causal Analysis and Resolution
Changes:
- Used "outcomes" to include positive outcomes, instead of only defects and problems.
- Added examples for service organizations and for selecting outcomes for analysis.
- Added subpractices in SP 1.1 for defining the problem, and in SP 2.2 for following up when expected results were not realized.
- Added more information about how PPMs can be used.
- Added informative material addressing more proactive defect prevention.

20 Organizational Performance Management
Changes:
- Expanded the former OID PA to include performance management and renamed it Organizational Performance Management (OPM) to emphasize the focus on performance of the organizational processes as they relate to business objectives.
- Defined a new goal about managing business performance using statistical and other quantitative techniques (see the sketch after this slide).
- Clarified that improvements selected for possible implementation can be validated in different ways; piloting is not the only option.
- More explicitly described the use of process performance models.
- Provided more information about how improvements can be selected for deployment.
- Changed references from "process and technology improvements" to "improvements," with an explanation that improvements include both.
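The new goal on managing business performance with statistical and other quantitative techniques, together with slide 16's point about evaluating whether an improvement achieved the expected gain, can be illustrated with the hypothetical sketch below: a two-sample Welch t-test on invented before/after cycle-time data, using the third-party SciPy library. It is one possible analysis, not a technique the model requires.

```python
from statistics import mean
from scipy import stats  # third-party library; assumed available for this sketch

# Hypothetical cycle times (days) before and after deploying an improvement.
# All values are invented for illustration.
before = [22, 25, 21, 24, 26, 23, 25, 22, 24, 23]
after = [20, 21, 19, 22, 20, 21, 18, 20, 21, 19]

# Welch's t-test: did mean cycle time change beyond ordinary variation?
result = stats.ttest_ind(before, after, equal_var=False)
print(f"Mean before: {mean(before):.1f}  Mean after: {mean(after):.1f}")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")

if result.pvalue < 0.05:
    print("Difference is statistically significant -- improvement appears effective")
else:
    print("No significant difference -- revisit the improvement proposal")
```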

21 Pittsburgh, PA
Questions?