USING PILOTS TO ASSESS THE VALUE AND APPROACH OF CMMI IMPLEMENTATION. Goddard Space Flight Center (GSFC)


USING PILOTS TO ASSESS THE VALUE AND APPROACH OF CMMI IMPLEMENTATION
Goddard Space Flight Center (GSFC)
Sally Godfrey, James Andary, Linda Rosenberg
SEPG 2003, 2/03

Agenda
- Background
  - NASA Improvement Initiatives
  - GSFC Improvement Plan
  - GSFC/Phase 1
  - Phase 1 Goals
  - Choice of Pilot Areas
- CMMI Pre-Appraisals
  - Goals/Scope
  - Pre-Appraisals
- Evaluation of Phase 1
  - Advantages/Disadvantages of Pre-Appraisal Approach
  - Lessons Learned

The NASA Software & Systems Engineering Initiatives

Software Initiative goal: Advance software engineering practices (development, assurance, and management) to effectively meet the scientific and technological objectives of NASA.

Four strategies:
- Process and Product Improvement
- Safety, Reliability, and Quality
- Infuse Research
- Skill of Workforce

Systems Engineering Initiative: "Define and pilot a methodology for assessment of the systems engineering capability, which addresses knowledge and skill of the workforce, processes, and tools and methodology..." (NASA Chief Engineer)

GSFC Software Development Process Improvement Plan

Developed a Software Plan to improve the processes and practices in use at GSFC, using the Capability Maturity Model Integration (CMMI) as a measure of progress.
- Focuses on mission-critical software
- Signed by the GSFC Director
- Working with Systems Engineering to help them pilot CMMI

Software long-term goals:
- Increase the percentage of projects that are on time and within cost by at least 10%
- Increase productivity by at least 5%
- Decrease cycle time by 10-20%
- Reduce the error rate after delivery by at least 20%

Implementation Phases in GSFC's Improvement Plan

Phase 1: Pilot Phase (FY02)
- Benchmark several representative GSFC areas
- Estimate effort and cost to improve identified gaps
- Evaluate implementation approach

Phase 2: Implementation Phase (FY03-FY07)
- Implement process improvement on all critical projects
- Begin by working with new projects to field improvements
- Eventual target: CMMI Level 3

Phase 3: Maintain Level and Continue Improvement (FY08 and beyond)
- Include other less critical areas? (e.g., science processing)

Phase 1 (FY02) Goals
- Benchmark several areas against the CMMI model (Where are we?)
- Learn what is involved in using CMMI as a model for improvement (How hard is it? Does it make sense?)
- Get a basis for estimating the cost of a process improvement program that achieves CMMI Level 3 (How expensive is it?)
- Assess our planned implementation approach (Are we doing this the right way?)

Pre-Appraisal Areas Selected for Phase 1

[Diagram: candidate areas spanned four projects (W, X, Y, Z), each with flight software, ground software, and instrument (Instr 1, Instr 2) components.]

Three pre-appraisals completed in FY02:
1. Flight Software (11/01)
2. Project level, focused on Systems Engineering & Acquisition (4/02)
3. Ground Software (9/02)

Note: The instrument-area appraisal was not done due to lack of available projects and time constraints.

CMMI Pre-Appraisals During Phase 1

Goals of the Pre-Appraisals
- How long does it take?
- How does CMMI apply at GSFC?
- SE & CMMI?
- How much preparation?
- Where are we?
- Can we do it?

Key Points for Pre-Appraisals
- EPG tried to minimize the time required from project participants
- Pre-appraisals were conducted less formally than SCAMPI:
  - More reliance on interviews
  - Less verification of information and document review
  - No maturity ratings determined
- The pre-appraisal methodology evolved during the course of the year
- Findings were the result of team consensus, supported by multiple data points from multiple sessions
- Pre-appraisal results were reported as findings of strengths and improvement opportunities in the CMMI Process Areas

Phase 1 Pre-Appraisals

Pre-Appraisal #1: Flight Software (2 projects)
- Both projects in-house, with integrated contractor/civil-servant teams
- One project complete, with all documentation in place
- Other project at PDR point; development started under the ISO system

Pre-Appraisal #2: Flight Projects (3 projects)
- Project 1: Started 2000, in Formulation, large budget, international with multiple spacecraft; core spacecraft will be developed in-house
- Project 2: Started 1991, in Implementation (CDR in '99, L-04), large budget, ~30 civil servants, multiple contractors
- Project 3: Part of a program with a 3-project series, several launches complete (turn-key); spacecraft budget about half that of the other two; mostly contractors, few civil servants

Pre-Appraisal #3: Ground Software (2 projects)
- Both projects in-house, with integrated contractor/civil-servant teams
- One project complete, with all documentation in place
- Other project in testing; development started under the ISO system

Differences in the Pre-Appraisals

                            #1                   #2                       #3
Level of Focus              Subsystem            Code 400 Project         Subsystem
Emphasis                    Software             Systems Engineering,     Software
                                                 Acquisition              Development
Mode                        Discovery:           Discovery:               Verification:
                            1/2 doc. review,     heavy emphasis           few interviews,
                            1/2 interviews       on interviews            doc. review
Use of PIIDs?               No                   No                       Yes
Draft Findings Briefing?    No                   Yes                      Yes
Days Spent                  6 days               9 days                   13 days
Interviewee Preparation     Minimal              Gave sample questions    Minimal
Interviewed Support Orgs.?  No                   Yes                      No

PIIDs (Process Implementation Indicator Documents): Requirements Management (RM) example

Practice       Direct Artifact         Indirect Artifact    Affirmation    Char.
RM SP 1.1-1    Requirements Doc        Req. Q & A           PM affirms     FI
RM SP 1.2-1    Signatures on Req.      Presentation Mat.                   FI
RM SP 1.3-1    Req. Change History     Slide 11 of CDR      PM affirms     FI
RM SP 1.4-1    Test Matrix (partial)                        PM affirms     LI
RM SP 1.5-1    Slide 14 of CDR         Done sometimes                      PI
RM GP 1.1      Req. Doc., DBs                               PM affirms     FI
RM GP 1.2      No org. policy                                              NI

Key: FI = Fully Implemented, LI = Largely Implemented, PI = Partially Implemented, NI = Not Implemented
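The PIID rows above can be thought of as records that pair each practice with its evidence and a characterization. The sketch below is a hypothetical illustration of that data model (not GSFC's actual tooling; the sample entries and roll-up rule are assumptions for the example):

```python
from dataclasses import dataclass, field

# Characterization levels used in the PIID table above
FI, LI, PI, NI = "FI", "LI", "PI", "NI"

@dataclass
class PIIDEntry:
    """One row of a Process Implementation Indicator Document."""
    practice: str                   # e.g. "RM SP 1.3-1"
    direct_artifacts: list = field(default_factory=list)
    indirect_artifacts: list = field(default_factory=list)
    affirmations: list = field(default_factory=list)
    characterization: str = NI      # FI / LI / PI / NI

# Sample entries mirroring a few rows of the slide's table
entries = [
    PIIDEntry("RM SP 1.1-1", ["Requirements Doc"], ["Req. Q & A"], ["PM affirms"], FI),
    PIIDEntry("RM SP 1.4-1", ["Test Matrix (partial)"], [], ["PM affirms"], LI),
    PIIDEntry("RM GP 1.2", [], [], [], NI),  # no organizational policy found
]

# Simple roll-up: flag practices that are not at least largely implemented
gaps = [e.practice for e in entries if e.characterization in (PI, NI)]
print(gaps)  # ['RM GP 1.2']
```

A structure like this makes it easy to sort findings by Specific Practice for later improvement work, which is what the team reports the PIIDs enabled.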

Appraisal Participants (Interviewees)

Role                           #1   #2   #3
Line Managers                       X
Project Managers/Instr. Mgrs.       X
Senior Systems Engineers            X
Software Managers              X    X    X
Requirements Developers        X    X    X
Software Developers            X         X
Testers                        X         X
QA Representatives             X         X
Configuration Managers         X    X    X
Schedulers                          X
Contracting Officers                X
Training Coordinators               X
EPG Members                    X         X

Appraisal Teams

Appraisal Team                         #1   #2   #3
SEI-Authorized Lead Appraisers          3    3    3
GSFC Appraisal Team Members (Total)     3    3    4

GSFC appraisal team characteristics:
- Experience: all were EPG members
- Background:                          #1   #2   #3
  Took Introduction to CMMI             3    3    4
  Took Intermediate CMMI                1    1    4
  Software Development                  3    1    4
  Systems Engineering                        1
  Quality Assurance                          1

Process Flow of Pre-Appraisal #1

Pre On-Site: Analyze requirements; develop appraisal plan; select and prepare team; obtain organizational information; select and prepare participants; prepare for data collection
Day 1: Lead assessor opening briefing; conduct interviews
Days 2-3: Conduct interviews; review documents
Day 4: Conduct interviews and review documents; consolidate information
Day 5: Conduct interviews and review documents; work to reach consensus; prepare final findings; deliver final findings
Post On-Site: Produce reports and support follow-on activities

Key points:
- Little advance preparation
- Discovery mode, with half interviews and half document review
- No draft findings

Process Flow of Pre-Appraisal #2

Pre On-Site: Analyze requirements; develop appraisal plan; select and prepare team; obtain organizational information; select and prepare participants; prepare for data collection
Day 1: CMMI overview training; GSFC and SE overview presentations; conduct interviews; consolidate information
Days 2-3: Conduct interviews; review documents; consolidate information
Day 4: Conduct interviews and review documents; consolidate information; deliver draft findings
Day 5: Prepare final findings; deliver final findings
Post On-Site: Produce reports and support follow-on activities

Key points:
- More advance preparation
- Discovery mode, with heavy reliance on interviews
- Draft findings delivered

Process Flow of Pre-Appraisal #3

Pre On-Site: Analyze requirements; develop appraisal plan; select team; conduct team training; obtain organizational information/documents; review documentation; fill in PIIDs; obtain additional documents
Day 1: Discussion of projects for appraisers; document review; fill in PIIDs; identify missing information
Days 2-3: Conduct interviews; add interview information to PIIDs; consolidate information; begin assessing gaps
Day 4: Review documents and complete PIIDs; consolidate information; deliver draft findings
Day 5: Prepare final findings; deliver final findings
Post On-Site: Produce reports and support follow-on activities

Key points:
- Heavy advance preparation
- Verification mode; interviews used to verify and complete PIIDs
- Draft findings delivered

Evaluation of Phase 1
- What did we learn?
- Would we choose the same approach again?

Advantages of the CMMI Pre-Appraisal Approach
- CMMI pre-appraisals provided a fairly accurate benchmark of the state of all three areas evaluated
- The pre-appraisal was a "quick look": it provided a wealth of information in a short period of time (1 week)
- Involvement of external appraisers helped facilitate cooperation from projects and provided credibility with senior managers
- The pre-appraisal was excellent training for the internal appraisers involved
- Future pre-appraisals and benchmarking could now be done by internal appraisers (an experience base now exists)

Disadvantages of the CMMI Pre-Appraisal Approach
- The whole pre-appraisal approach was very time-consuming
- The majority of our resources were expended on convincing projects to participate, on appraisal preparation, and on the appraisals themselves, leaving little time to actually support improvement activities with projects
- Estimating the costs of addressing weaknesses (doing actual improvements) was more difficult than anticipated
- It was difficult to show senior management that projects were better, because we were doing pre-appraisals, not process improvement (early wins are important!)

Lessons Learned on Pre-Appraisals
- It takes time to prepare:
  - Scheduling interviews is hard; allow lots of time
  - Assign internal appraisers to process areas
  - Gather documents and fill out PIIDs
  - Prepare interviewees
  - Set expectations for the pre-appraisal team
  - Brief the pre-appraisal team
- Choose projects in various phases:
  - Early phase: more opportunity to change
  - Mid-stream: probably typical of current processes
  - Late or done: all documentation in place

Lessons Learned (continued)
- Choose interviewees to cover all process areas
- Use of PIIDs captured more information on strengths and weaknesses by Specific Practice for later improvement work
- A process is needed for completing PIIDs:
  - Too time-intensive for projects to fill out themselves, but some EPG/project interaction is necessary
  - Projects didn't have the CMMI knowledge to complete them
- Conduct a draft findings briefing
- Knowledge of the organization's process structure is more important than CMMI knowledge

Next Steps
- Prioritize improvement opportunities based on the Goddard business direction
- Use the continuous representation of the CMMI:
  - Focus on improving a smaller part of the software organization
  - Expand using the assets developed as resources
- Continue working with the NASA Systems Engineering Working Group on the use of CMMI for evaluating systems engineering capability
  - Start a small pilot in the systems engineering area
- Cost estimates for next year will be based on a WBS developed to address gaps identified in the appraisals

Contact Information

Sara (Sally) Godfrey
Goddard Space Flight Center, Code 583
Greenbelt, MD 20771
301-286-5706
Sara.H.Godfrey.1@gsfc.nasa.gov

James Andary
Goddard Space Flight Center, Code 530
Greenbelt, MD 20771
301-286-2269
James.F.Andary.1@gsfc.nasa.gov

Dr. Linda Rosenberg
Goddard Space Flight Center, Code 100
Greenbelt, MD 20771
301-286-5710
Linda.H.Rosenberg.1@gsfc.nasa.gov

Back-up Slides

The NASA Software Engineering Initiative

Goal: Advance software engineering practices (development, assurance, and management) to effectively meet the scientific and technological objectives of NASA.

- Strategy 1: Implement a continuous software process and product improvement program across NASA and its contract community.
- Strategy 2: Improve safety, reliability, and quality of software through the integration of sound software engineering principles and standards.
- Strategy 3: Improve NASA's software engineering practices through research.
- Strategy 4: Improve software engineers' knowledge and skills, and attract and retain software engineers.

Goddard's Matrix Structure

[Diagram: GSFC directorates are Code 100 (Senior Management, Training), Code 200 (Procurement), Code 300 (Quality Assurance, IV&V), Code 400 (Projects), Code 500 (Applied Engineering), Code 600 (Space Science), and Code 900 (Earth Science). Projects 1-3 draw matrixed support across the directorates: contracting officers, QA, software, electrical, mechanical, science, and instruments.]

Example Process Area: Requirements Management

SG 1: Manage Requirements
- SP 1.1: Obtain an Understanding of the Requirements
- SP 1.2: Obtain Commitment to the Requirements
- SP 1.3: Manage Requirements Changes
- SP 1.4: Maintain Bi-directional Traceability of Requirements
- SP 1.5: Identify Inconsistencies between Project Work and Requirements

GG 2: Institutionalize a Managed Process
- GP 2.1: Establish an Organizational Policy
- GP 2.2: Plan the Process
- GP 2.3: Provide Resources
- GP 2.4: Assign Responsibility

Example Process Area: Requirements Management (continued)

GG 2: Institutionalize a Managed Process (continued)
- GP 2.5: Train People
- GP 2.6: Manage Configurations
- GP 2.7: Identify & Involve Relevant Stakeholders
- GP 2.8: Monitor and Control the Process
- GP 2.9: Objectively Evaluate Adherence
- GP 2.10: Review Status with Higher Level Management

GG 3: Institutionalize a Defined Process
- GP 3.1: Establish a Defined Process
- GP 3.2: Collect Improvement Information

Pre-Assessment Scope

CMMI components reviewed:
- Maturity Level 2 and 3 Process Areas
- Specific Goals: specific practices are evaluated to determine specific-goal coverage, based on evidence of weaknesses, improvement activities, strengths, and alternative practices

CMMI components NOT reviewed (Generic Goals):
- Actual documented process being used on projects (activities, process inputs & outputs, deliverables, roles & responsibilities, measurements, work instructions, templates, tailoring, why & when, etc.)
- Training for use of the process
- Use of and adherence to the process
- Planning and monitoring of the process
- Providing resources for the process
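The scoping rule above (specific practices evaluated to determine specific-goal coverage) can be sketched as a small roll-up. The threshold used here, that a goal counts as covered only when every mapped practice is at least largely implemented, is an illustrative assumption, not the appraisal method's official aggregation rule:

```python
# Characterizations that count toward goal coverage in this sketch
SATISFYING = {"FI", "LI"}  # Fully or Largely Implemented

def goal_covered(practice_ratings: dict) -> bool:
    """True when every practice mapped to the goal meets the threshold."""
    return all(rating in SATISFYING for rating in practice_ratings.values())

# Ratings for Requirements Management SG 1 (hypothetical sample values)
sg1_ratings = {
    "SP 1.1": "FI",  # Obtain an Understanding of the Requirements
    "SP 1.2": "FI",  # Obtain Commitment to the Requirements
    "SP 1.3": "FI",  # Manage Requirements Changes
    "SP 1.4": "LI",  # Maintain Bi-directional Traceability
    "SP 1.5": "PI",  # Identify Inconsistencies (only partially implemented)
}

print(goal_covered(sg1_ratings))  # False: SP 1.5 drags the goal down
```

The same function applies unchanged to any goal once its practice-to-goal mapping is known, which is how a findings briefing can report coverage per Specific Goal rather than per practice.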

Appraisal Goals for the Systems Engineering Pre-Assessment
- Determine the applicability of the CMMI model for evaluating systems engineering and acquisition activities at Goddard
- Baseline the systems engineering organization against the requirements in the model
- Gain experience in the use of the model as a baselining tool

Level 2 Process Areas
- Requirements Management
- Project Planning
- Project Monitoring & Control
- Supplier Agreement Management
- Measurement & Analysis
- Process & Product Quality Assurance
- Configuration Management

General Process Requirements for Each Process Area at Level 2

Document project-level processes so that all projects have a starting point for these activities.

Plan and manage these process activities, including:
- Institute an organizational policy
- Plan the process
- Provide resources
- Assign responsibility
- Train people
- Manage configurations
- Identify & involve relevant stakeholders
- Monitor & control the process
- Objectively evaluate adherence
- Review status with higher-level management

Level 3 Process Areas
- Requirements Development
- Technical Solution
- Product Integration
- Verification
- Validation
- Organizational Process Focus
- Organizational Process Definition
- Organizational Training
- Integrated Project Management
- Risk Management
- Integrated Teaming (not assessed)
- Integrated Supplier Management
- Decision Analysis and Resolution
- Organizational Environment for Integration (not assessed)

General Process Requirements for Each Process Area at Level 3

Document organization-level processes (and tailoring guidelines) so that all projects have a starting point for all process activities.

Plan and manage these process activities, including:
- Institute an organizational policy
- Plan the process
- Provide resources
- Assign responsibility
- Train people
- Manage configurations
- Identify & involve relevant stakeholders
- Monitor & control the process
- Objectively evaluate adherence
- Review status with higher-level management
- Collect information for process improvement

Authority

Directed by the NASA Chief Engineer: "the SEWG is expected to define and pilot a methodology for assessment of the systems engineering capability, which addresses knowledge and skill of the workforce, processes, and tools and methodology." (Deputy Chief Engineer for Systems Engineering, Nov. 1, 2000)

Promoted by the agency Software Working Group (SWG):
- Software Initiative being implemented across the agency
- CMM and CMMI-SW programs at all Centers

Studied by the agency Systems Engineering Working Group (SEWG):
- Assessment data from GSFC will be evaluated by the SEWG to determine if CMMI is appropriate for systems engineering implementation agency-wide

Infrastructure

[Diagram: the Management Oversight Group (MOG) provides institutional consensus and oversight; the Engineering Process Group (EPG) drafts processes for projects, receives their feedback, and produces the defined process and metrics; the Asset Management Group (AMG) provides supporting asset management.]

MOG
- Provide oversight and direction to the EPG and AMG, and assist in establishing priorities
- Work with the EPG in communicating process issues and industry practices to GSFC senior management
- Represent their constituent organizations in reaching consensus on GSFC institutional software policies and standards for both in-house and contractor-supplied software
- Review and concur on all GSFC software and system policies and guidelines prior to final publication

EPG

For the pilots and during the rollout to other GSFC entities, the EPG will:
- Lead the continuous definition, maintenance, and improvement of software process policies, procedures, and best practices, including the development and maintenance of the GSFC software development process improvement plan
- Facilitate software process assessments
- Arrange for and support training and continuing education related to process improvements for engineers, line managers, project management, and GSFC senior management
- Define and maintain metrics to track, monitor, and assess the status of focused improvement efforts and pilot studies
- Provide status information and evaluations of the improvement activities to all levels of management
- Lead the institutional response, where appropriate, to software/systems-related Nonconformance Reports
- Maintain a collaborative working relationship with practicing software/systems engineers to obtain, plan, and install new practices and technologies
- Provide software engineering consultation to development projects and management

AMG
- Develop and maintain the GSFC "Develop Software and Systems Products" web site, which includes the software development process improvement library
- Develop and maintain a database of GSFC software process and product metrics
- Act as the clearinghouse for software metrics reported to NASA HQ
- Develop insights into the metrics sources that will enhance the consistency and effectiveness of interpretation
- Maintain a database of GSFC software product characteristics in order to understand process metrics, encourage software reuse, and assist in identifying special expertise
- Establish and manage a service that provides software engineering tools to projects in cases where a single GSFC vendor interface and institutional supplier is appropriate

CMMI and ISO
- ISO is a standard; CMMI is a model.
- ISO is broad, focusing on more aspects of the business; it was initially aimed at manufacturing.
- CMMI is deep: it provides more in-depth guidance in more focused areas (Software/Systems Engineering/Software Acquisition, i.e., SW/SE/SA).
- Both tell you what to do, but not how to do it; CMMI, however, tells you what the expected practices are for a capable, mature organization.
- CMMI provides much more detailed guidance than ISO by including an extensive set of best practices developed in collaboration with industry, government, and the SEI.
- CMMI provides a much better measure of the quality of processes; ISO focuses more on having processes.
- CMMI puts more emphasis on continuous improvement.
- CMMI allows you to focus on one or a few process areas for improvement (it's a model, not a standard like ISO); you can rate just one area in CMMI.
- CMMI and ISO are not in conflict: ISO helps satisfy CMMI capabilities, and CMMI is more rigorous.

Capability Maturity Model Integration (CMMI): Staged Representation

Level 5 (Optimizing):
- Organizational Innovation and Deployment
- Causal Analysis and Resolution

Level 4 (Quantitatively Managed):
- Organizational Process Performance
- Quantitative Project Management

Level 3 (Defined):
- Requirements Development
- Technical Solution
- Product Integration
- Verification
- Validation
- Organizational Process Focus
- Organizational Process Definition
- Organizational Training
- Integrated Project Management
- Risk Management
- Decision Analysis and Resolution
- Integrated Supplier Management
- Integrated Teaming

Level 2 (Managed):
- Requirements Management
- Project Planning
- Project Monitoring and Control
- Configuration Management
- Supplier Agreement Management
- Measurement and Analysis
- Process & Product Quality Assurance

Level 1 (Initial): no process areas

[The original slide also mapped these process areas against the source models: SW-CMM, SA-CMM, and SE-CMM.]
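Because staged maturity levels are cumulative, a target level implies every process area at that level and below. The lookup below is a sketch of that structure using the process-area names from this slide (the helper function and its cumulative rule are illustrative, not SEI tooling):

```python
# Staged CMMI v1.1 maturity levels and their process areas, as listed above
STAGED_PROCESS_AREAS = {
    2: ["Requirements Management", "Project Planning",
        "Project Monitoring and Control", "Configuration Management",
        "Supplier Agreement Management", "Measurement and Analysis",
        "Process & Product Quality Assurance"],
    3: ["Requirements Development", "Technical Solution", "Product Integration",
        "Verification", "Validation", "Organizational Process Focus",
        "Organizational Process Definition", "Organizational Training",
        "Integrated Project Management", "Risk Management",
        "Decision Analysis and Resolution", "Integrated Supplier Management",
        "Integrated Teaming"],
    4: ["Organizational Process Performance", "Quantitative Project Management"],
    5: ["Organizational Innovation and Deployment",
        "Causal Analysis and Resolution"],
}

def required_process_areas(target_level: int) -> list:
    """All process areas required at or below the target maturity level."""
    return [pa for level in sorted(STAGED_PROCESS_AREAS)
            if level <= target_level
            for pa in STAGED_PROCESS_AREAS[level]]

print(len(required_process_areas(3)))  # 20 process areas for a Level 3 target
```

This is one reason the GSFC plan's eventual Level 3 target is substantial: it implies institutionalizing all 7 Level 2 areas plus the 13 Level 3 areas.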