CMMI SM Model: Measurement and Analysis


Carnegie Mellon University, Software Engineering Institute

CMMI SM Model (CMMI SM is a Service Mark of Carnegie Mellon University)

CMMI Staged Representation

Level 5 Optimizing - Focus: Continuous Process Improvement. Process Areas: Organizational Innovation and Deployment; Causal Analysis and Resolution.

Level 4 Quantitatively Managed - Focus: Quantitative Management. Process Areas: Organizational Process Performance; Quantitative Project Management.

Level 3 Defined - Focus: Process Standardization. Process Areas: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management for IPPD; Risk Management; Integrated Teaming; Integrated Supplier Management; Decision Analysis and Resolution; Organizational Environment for Integration.

Level 2 Managed - Focus: Basic Project Management. Process Areas: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management.

Level 1 Initial.

The figure also relates increasing maturity to quality, productivity, risk, and rework.

Source: Software Engineering Institute, Carnegie Mellon University

Measurement and Analysis

Purpose: To develop and sustain a measurement capability that is used to support management information needs.

Figure: the measurement cycle - define objectives, collect data, store data and reports, analyze and publish the data.
Source: Software Engineering Institute, Carnegie Mellon University

Introductory Notes

The process area involves the following:
1. Specifying the objectives of measurement and analysis such that they are aligned with identified information needs and objectives
2. Specifying the measures, data collection and storage mechanisms, analysis techniques, and reporting and feedback mechanisms
3. Implementing the collection, storage, analysis, and reporting of the data
4. Providing objective results that can be used in making informed decisions and taking appropriate corrective actions
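The four activities above can be read as a small data pipeline. The sketch below is a minimal illustration, assuming a hypothetical in-memory repository and a single made-up measure ("open defects per build"); none of the names or numbers come from the CMMI text.

from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, List

@dataclass
class MeasurementRepository:
    """Hypothetical store for collected data and analysis reports."""
    data: Dict[str, List[float]] = field(default_factory=dict)
    reports: List[str] = field(default_factory=list)

    def store(self, measure: str, values: List[float]) -> None:
        self.data.setdefault(measure, []).extend(values)

# 1. Specify the objective and the measure that addresses it.
objective = "Track open defects per build to support release decisions"
measure = "open_defects_per_build"

# 2./3. Collect and store the data (hard-coded sample values for illustration).
repo = MeasurementRepository()
repo.store(measure, [12, 9, 7, 11, 6])

# 3. Analyze: a simple average and a check of the latest value against it.
values = repo.data[measure]
report = (f"Objective: {objective}\n"
          f"Mean {measure} = {mean(values):.1f}; "
          f"latest = {values[-1]} ({'improving' if values[-1] < mean(values) else 'watch'})")

# 4. Communicate the result to stakeholders (here, simply print it).
repo.reports.append(report)
print(report)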

Figure: overview of the Measurement and Analysis process area - SG 1 (Establish Measurement Objectives, Specify Measures, Specify Data Collection and Storage Procedures, Specify Analysis Procedures) and SG 2 (Collect Data, Analyze Data, Store Data & Results, Communicate Results, Provide Results), with supporting elements: objectives, indicators, personnel, a measurement repository, and procedures and tools.

SG 1 Align Measurement and Analysis Activities

SP 1.1 Establish Measurement Objectives
Establish and maintain measurement objectives that are derived from identified information needs and objectives.

Subpractices
1. Document information needs and objectives, to allow traceability to subsequent measurement and analysis activities.
2. Prioritize information needs and objectives.
3. Document, review, and update measurement objectives. The measurement objectives are documented, reviewed by management and other relevant stakeholders, and updated as necessary.
4. Provide feedback for refining and clarifying information needs and objectives as necessary.
5. Maintain traceability of the measurement objectives to the identified information needs and objectives.

SG 1 Align Measurement and Analysis Activities

Sources of information needs and objectives may include the following: project plans, monitoring of project performance, interviews with managers and others who have information needs, established management objectives, strategic plans, business plans, etc.

Example measurement objectives include the following:
- Reduce time to delivery
- Reduce total lifecycle cost
- Deliver specified functionality completely
- Improve prior levels of quality
- Improve prior customer satisfaction ratings
- Maintain and improve the acquirer/supplier relationships

SP 1.2 Specify Measures
Specify measures to address the measurement objectives. Measurement objectives are refined into precise, quantifiable measures. Measures may be either base or derived.
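As an illustration of SP 1.2, the sketch below refines one of the example objectives ("Reduce time to delivery") into a precise, quantifiable measure. The record fields and the definition text are illustrative assumptions, not part of the model.

from dataclasses import dataclass

@dataclass
class MeasureSpecification:
    """Illustrative specification tying a measure back to its objective."""
    objective: str   # measurement objective the measure addresses
    name: str        # precise name of the measure
    unit: str        # unit of measurement
    kind: str        # "base" or "derived"
    definition: str  # how the value is obtained or computed

delivery_time = MeasureSpecification(
    objective="Reduce time to delivery",
    name="cycle_time",
    unit="calendar days",
    kind="base",
    definition="Days from approved change request to delivery into production",
)

print(f"{delivery_time.name} ({delivery_time.unit}) addresses: {delivery_time.objective}")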

Types of Measures

Measures may be either base or derived.

Data for base measures are obtained by direct measurement. Examples:
- Estimates and actual measures of work product size (e.g., number of pages)
- Estimates and actual measures of effort and cost (e.g., number of person hours)
- Quality measures (e.g., number of defects)

Data for derived measures come from other data, typically by combining two or more base measures. Examples:
- Defect density
- Peer review coverage
- Test or verification coverage
- Reliability measures (e.g., mean time to failure)
- Quality measures (e.g., number of defects by severity / total number of defects)

SP 1.3 Specify Data Collection and Storage Procedures
Specify how measurement data will be obtained and stored. Explicit specification of collection methods helps ensure that the right data are collected properly. It may also aid in further clarifying information needs and measurement objectives. Proper attention to storage and retrieval procedures helps ensure that data are available and accessible for future use.

SP 1.4 Specify Analysis Procedures
Specify how measurement data will be analyzed and reported. Specifying the analysis procedures in advance ensures that appropriate analyses will be conducted and reported to address the documented measurement objectives (and thereby the information needs and objectives on which they are based). This approach also provides a check that the necessary data will in fact be collected.
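To make the base/derived distinction concrete, the sketch below computes a few of the derived measures named above from base measures. The sample numbers are made up for illustration only.

# Base measures: obtained by direct measurement (illustrative sample values).
defects_found = 18          # number of defects
size_kloc = 12.0            # work product size in thousands of lines of code
items_peer_reviewed = 45    # work products that went through peer review
items_total = 60            # all work products planned for peer review
uptime_hours = 4_000.0      # total operating time observed
failures = 5                # number of failures observed in that time

# Derived measures: combinations of two or more base measures.
defect_density = defects_found / size_kloc      # defects per KLOC
peer_review_coverage = items_peer_reviewed / items_total
mean_time_to_failure = uptime_hours / failures  # hours per failure

print(f"Defect density:        {defect_density:.2f} defects/KLOC")
print(f"Peer review coverage:  {peer_review_coverage:.0%}")
print(f"Mean time to failure:  {mean_time_to_failure:.0f} hours")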

SG 2 Provide Measurement Results
Measurement results that address identified information needs and objectives are provided.

SP 2.1 Collect Measurement Data
The data necessary for analysis are obtained and checked for completeness and integrity.

SP 2.2 Analyze Measurement Data

SP 2.3 Store Data and Results

SP 2.4 Communicate Results
The results of the measurement and analysis process are communicated to relevant stakeholders in a timely and usable fashion to support decision making and assist in taking corrective action. Relevant stakeholders include intended users, sponsors, data analysts, and data providers.

Generic Practices

GP 2.1 Establish an Organizational Policy
Establish and maintain an organizational policy for planning and performing the measurement and analysis process.
Elaboration: This policy establishes organizational expectations for aligning measurement objectives and activities with identified information needs and objectives and for providing measurement results.

GP 2.5 Train People
Train the people performing or supporting the measurement and analysis process as needed.
Elaboration: Examples of training topics include the following: statistical techniques; data collection, analysis, and reporting processes; development of goal-related measurements (e.g., Goal Question Metric (GQM)).
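Goal Question Metric (GQM), mentioned as a training topic above, derives measurements top-down: from a goal, through the questions that would answer it, to the metrics that answer each question. A minimal sketch of that derivation follows; the goal, questions, and metrics are made up purely for illustration.

# A hypothetical GQM tree: goal -> questions -> metrics.
gqm = {
    "goal": "Improve the reliability of the delivered product",
    "questions": [
        {
            "question": "How often does the product fail in operation?",
            "metrics": ["mean time to failure", "failures per 1,000 operating hours"],
        },
        {
            "question": "How many defects escape verification?",
            "metrics": ["post-release defects / total defects"],
        },
    ],
}

# Walk the tree to list which metrics to collect and why.
print(f"Goal: {gqm['goal']}")
for q in gqm["questions"]:
    print(f"  Q: {q['question']}")
    for m in q["metrics"]:
        print(f"     metric: {m}")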

References

Capability Maturity Model Integration (CMMI SM), Version 1.2, Software Engineering Institute.

Mary Beth Chrissis, Mike Konrad, Sandy Shrum, CMMI: Guidelines for Process Integration and Product Improvement, Second Edition, Addison-Wesley Professional.