Air Force Systems Engineering Assessment Model (AF SEAM)
Randy Bullard, AF Center for Systems Engineering
Randy.bullard@afit.edu


Outline
1. Background
   - AF SEAM Pedigree
   - AF SEAM Goals
2. Model Contents (What Is Included)
   - Process Areas (PAs)
   - Practices (Specific)
   - Practices (Generic)
   - References
   - Other Information/Elaboration
   - Typical Work Products
3. Methodology

AF SEAM Background
In 2006, the AFMC Engineering Council issued an action item to:
- Provide an AF-wide SE assessment model
- Involve AF Centers (product and logistics)
- Leverage current CMMI-based models in use at AF Centers
- Baseline process capability and usage
Definition of the AF Systems Engineering Assessment Model: a single AF-wide tool that can be used for the assessment and improvement of systems engineering processes in a program/project.

AF SEAM Goals
- Ensure a consistent understanding of SE
- Ensure core SE processes are in place and being practiced
- Document repeatable SE best practices across the AF
- Identify opportunities for continuous improvement
- Clarify roles and responsibilities
- Improve program performance and reduce technical risk

Why We Need SE Assessment
- Lack of disciplined systems engineering (SE) has been a major contributor to poor program performance
- Many problems have surfaced repeatedly with AF programs:
  - Missed or poorly validated requirements
  - Poor planning fundamentals
  - Lack of integrated risk management
  - Lack of rigorous process
  - Lack of process flow-down
- Restoring SE discipline in AF programs is key to improved performance and credibility

Benefits of Restoring Disciplined SE
- Clear definition of expectations, well aligned with policy
- Established assessment methods and tools
- Best-practices baseline driving improvement
- Movement toward a deeper understanding of SE processes
- More efficient programs

Why AF SEAM?
AF SEAM is a composite of industry and DoD SE best practices:
- Maps to CMMI-ACQ v1.2 and CMMI-DEV v1.2
- Consistent with industry and DoD guidance
Advantages of using AF SEAM:
- Streamlines CMMI process areas for AF programs
- AF-centric, with end-to-end life cycle coverage
- A more focused document requires less program overhead
- Does not require SEI-certified assessors
Impact on AF programs:
- Assures programs are achieving desired outcomes
- Ensures program teams have adequate resources: qualified people, process discipline, and tools/technology

AF SEAM Pedigree
- All AF product centers selected and tailored some version of the Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) to baseline process institutionalization
- SEI CMMI is the accepted defense-industry-wide method for process appraisal and improvement
- SEI CMMI incorporates principles and practices from recognized industry and US Government systems engineering and related standards, such as:
  - AFI 63-1201, Life Cycle Systems Engineering
  - Defense Acquisition Guidebook, Chapter 4
  - MIL-STD-499B, Systems Engineering
  - ANSI/EIA 632, Processes for Engineering a System
  - EIA 731, Systems Engineering Capability Model
  - ISO/IEC 15288, Systems Engineering - System Life Cycle Processes
  - INCOSE Systems Engineering Standard
  - IEEE 1220, Application and Management of the Systems Engineering Process

AF SEAM Content
Hierarchy, from the broadest level of specificity to the most detailed:
- Process Areas (PAs)
- Goals
- Practices
- Informative Material: Description, Typical Work Products, Reference Material, Other Considerations

Principles & Objectives Best Practices from Government & Industry People with Skills, Training & Motivation Tools & Technology the Means to Execute Baseline Practice of Systems Engineering Lean Assessment of Integrated Team Process & Procedures the Glue that Holds it Together Kaizen or Continuous Continuous Improvement Process Improvement 10

Why Focus on Process?
"If you can't describe what you are doing as a process, you don't know what you are doing." - W. Edwards Deming

AF SEAM Elements
- 10 Process Areas (PAs)
  - Based on the CMMI process area construct
  - Conforms with AFI 63-1201 and DAG Chapter 4
  - PAs: Configuration Mgmt (CM), Decision Analysis (DA), Design (D), Manufacturing (M), Project Planning (PP), Requirements (R), Risk Mgmt (RM), Sustainment (S), Tech Mgmt & Ctrl (TMC), Verification & Validation (V)
- 34 Goals, accomplished through the specific practices
- 120 Specific Practices
- 7 Generic Practices (applied to each Process Area)

AF SEAM Practices
Specific practices:
- Each applies to only one Process Area
- Each practice has informative material: description, references, typical work products, and other considerations
Generic practices:
- Must be accomplished for each Process Area
- Ensure specific practices are executed
- Involve stakeholders

AF SEAM Practices by Process Area

Process Area          Goals  Specific Practices  Generic Practices  Total Practices
Configuration Mgmt      3            8                   7                15
Decision Analysis       1            5                   7                12
Design                  3           14                   7                21
Manufacturing           4           12                   7                19
Project Planning        3           15                   7                22
Requirements            4           13                   7                20
Risk Mgmt               3            7                   7                14
Sustainment             4           15                   7                22
Tech Mgmt & Control     4           15                   7                22
V & V                   5           16                   7                23
Total                  34          120                  70               190
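
The table's arithmetic is uniform: each Process Area's total is its specific-practice count plus the same seven generic practices. A minimal Python sketch, restating the table as data (the names below are illustrative, not part of the model), that recomputes the row and column totals:

```python
# Practice counts per AF SEAM Process Area, copied from the table above:
# (goals, specific practices). Every PA also gets the same 7 generic practices.
PA_COUNTS = {
    "CM":  (3, 8),  "DA": (1, 5),  "D": (3, 14), "M": (4, 12), "PP": (3, 15),
    "R":   (4, 13), "RM": (3, 7),  "S": (4, 15), "TMC": (4, 15), "V": (5, 16),
}
GENERIC = 7  # generic practices, applied to each Process Area

for pa, (goals, specific) in PA_COUNTS.items():
    total = specific + GENERIC  # per-PA total practices
    print(f"{pa:4} goals={goals:2} specific={specific:2} total={total:2}")

# Model-wide totals: 34 goals, 120 specific, 70 generic, 190 practices
assert sum(g for g, _ in PA_COUNTS.values()) == 34
assert sum(s for _, s in PA_COUNTS.values()) == 120
assert GENERIC * len(PA_COUNTS) == 70
assert sum(s + GENERIC for _, s in PA_COUNTS.values()) == 190
```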

Sample Specific Practice: RMG1P1, Determine risk sources and categories
Description: Establish categories of risks and risk sources for the project initially, and refine the risk structure over time (e.g., schedule, cost, supplier execution, technology readiness, manufacturing readiness, product safety, and issues outside the control of the team), using Integrated Product Teams. Quantify risk probability and consequence in terms of cost and schedule.
Typical Work Products:
- Risk matrix
- Risk management plan
Reference Material: AFI 90-901, USAF Operational Risk Management
Other Considerations: Consider using Acquisition Center of Excellence Risk Management Workshops when needed. For manufacturing risks, consider the capability of planned production processes to meet anticipated design tolerances. Include the supplier's capacity and capabilities in the analysis.
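
Since every specific practice carries the same fields (ID, description, typical work products, references, other considerations), it can be modeled as a simple record. A hedged sketch of one such representation, populated with RMG1P1 from above; the PracticeRecord type and its field names are assumptions for illustration, not AF SEAM artifacts:

```python
from dataclasses import dataclass, field

@dataclass
class PracticeRecord:
    """Illustrative container for one AF SEAM specific practice."""
    practice_id: str
    title: str
    description: str
    typical_work_products: list[str] = field(default_factory=list)
    reference_material: list[str] = field(default_factory=list)
    other_considerations: list[str] = field(default_factory=list)

rmg1p1 = PracticeRecord(
    practice_id="RMG1P1",
    title="Determine risk sources and categories",
    description=("Establish categories of risks and risk sources for the "
                 "project initially and refine the risk structure over time."),
    typical_work_products=["Risk matrix", "Risk management plan"],
    reference_material=["AFI 90-901, USAF Operational Risk Management"],
    other_considerations=["Consider ACE risk management workshops when needed."],
)
print(rmg1p1.practice_id, "-", rmg1p1.title)
```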

Generic Practices
1. Establish and maintain the description of a defined process
2. Establish and maintain plans for performing the process
3. Provide adequate resources for performing the process, developing the work products, and providing the services of the process
4. Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process
5. Train the people performing or supporting the process as needed
6. Monitor and control the process against the process plan and take appropriate corrective action
7. Review the activities, status, and results of the process with higher-level management and resolve issues

Process Detail Outline
[Flowchart: self-assessment process flow]
- Preparation (leadership identifies area(s) of self assessment): build team; train team (roles/responsibilities, leadership, self assessment); arrange logistics support; set schedule
- Self assessment: describe the self assessment activity and what needs to be accomplished; capture data; present results
- Validation required? If yes, conduct validation: in-brief, interviews, analysis, presentation of results, feedback
- Post results
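
The outline's control flow reduces to a pipeline with one decision point: conduct validation only when required, then post results either way. A minimal sketch of that branch, with hypothetical function and field names:

```python
def run_self_assessment(program: str, validation_required: bool) -> dict:
    """Illustrative control flow for the self-assessment outline above."""
    results = {"program": program, "steps": []}
    # Preparation and self assessment: team built, trained, and scheduled;
    # data captured and results presented.
    results["steps"].append("self assessment complete")
    if validation_required:
        # Validation: in-brief, interviews, analysis, results, feedback.
        results["steps"].append("validation complete")
    results["steps"].append("results posted")  # happens on both branches
    return results

print(run_self_assessment("Program X", validation_required=True))
```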

Criteria for Methodology
- Facilitate self assessment
- Facilitate continuous improvement
- Provide insight into program/project processes and capability
- Objective, consistent assessment
- Near- and far-term approach
- Provide results that are meaningful for leadership: relevant to the PM/PEO/CC; simple, understandable, graphical
- Support multi-level measurement and reporting: program/project, squadron, group, wing, center
- Support resource allocation and SE process improvement

Defining the Methodology
Assessment continuum, from low (hands off) to high (hands on):
- Hands off: promulgate policy (directives, instructions, checklists, guidance) and expect compliance
- AF SEAM: collaborative and inclusive; leanest possible; best practices and "must dos" as clearly stated expectations; program team and assessor team training; self assessment of the program with a validation assessment
- Hands on: comprehensive continuous process improvement; highly detailed process bibles; training; validation assessment; deep dives
Assessment methods that balance time and effectiveness.

SE Assessment Activities
- Phase I, Planning: read-ahead package; logistics planning; training
- Phase II, Self-Assessment: self assessment training; project performs the self-assessment; self-assessment provided to the review team
- Phase III, Independent Validation: team in-brief; project brief; review of the self-assessment; collaborative interviews; document reviews
- Phase IV, Report Results: consolidate results; prepare final report/outbrief; deliver final results

Assessment Outputs
- Feedback: lessons learned from the assessment tool; collaborative review
- Findings: completed assessment tool; strengths; improvement opportunities; output metrics; recommendations
- Final outbrief

Specific Practices Summary

Generic Practices Summary

PA/GP  GP1  GP2  GP3  GP4  GP5  GP6  GP7  GP Overall
CM      1    1    1    1    1    1    1       7
DA      0    0    1    1    1    1    1       5
D       1    1    1    1    1    1    1       7
M       1    1    1    1    1    1    1       7
PP      0    1    1    0    1    0    1       4
R       1    1    1    1    1    1    1       7
RM      1    1    1    1    1    1    0       6
S       1    1    1    1    1    1    1       7
TMC     0    1    1    1    1    1    1       6
V       1    1    1    1    1    1    1       7
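
The "GP Overall" column is simply a row sum of the 0/1 entries, and the interesting output is where the zeros fall. A short sketch, with the matrix copied from the table above (variable names are illustrative), that recomputes the sums and lists each PA's gaps; for example, PP shows gaps at GP1, GP4, and GP6:

```python
GPS = ["GP1", "GP2", "GP3", "GP4", "GP5", "GP6", "GP7"]
# 1 = generic practice observed for the PA, 0 = gap (from the table above)
MATRIX = {
    "CM":  [1, 1, 1, 1, 1, 1, 1],
    "DA":  [0, 0, 1, 1, 1, 1, 1],
    "D":   [1, 1, 1, 1, 1, 1, 1],
    "M":   [1, 1, 1, 1, 1, 1, 1],
    "PP":  [0, 1, 1, 0, 1, 0, 1],
    "R":   [1, 1, 1, 1, 1, 1, 1],
    "RM":  [1, 1, 1, 1, 1, 1, 0],
    "S":   [1, 1, 1, 1, 1, 1, 1],
    "TMC": [0, 1, 1, 1, 1, 1, 1],
    "V":   [1, 1, 1, 1, 1, 1, 1],
}

for pa, row in MATRIX.items():
    overall = sum(row)  # the "GP Overall" column
    gaps = [gp for gp, hit in zip(GPS, row) if not hit]
    print(f"{pa:4} overall={overall}  gaps={', '.join(gaps) or 'none'}")
```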

Summary
- The goal is to continue to improve program performance; too many examples of program performance issues trace back to a lack of SE discipline
- Long-term goal: revitalize and institutionalize systems engineering
- Use SE best practices to:
  - Assist programs in achieving desired outcomes
  - Assist program teams in resource planning: qualified people, disciplined processes, tools/technology

Backup Slides

Team Members

Center    Members
AAC       Ian Talbot
AEDC      Neil Peery, Maj Mark Jenks
ASC       Gary Bailey
AF CSE    Rich Freeman, Randy Bullard
HQ AFMC   Caroline Buckey
ESC       Bob Swarz, Bruce Allgood
OC-ALC    Cal Underwood, Bill Raphael
OO-ALC    Jim Belford, Mahnaz Maung
SMC       Linda Taylor
WR-ALC    Jim Jeter, Ronnie Rogers

Spiral 2 Capability Enhancement
- Re-look at process areas for improvements
- Further refine the assessment methodology
- Strengthen inclusion of software
- Capture and promulgate best practices/lessons learned
- Review scoring
- Examine potential use for SE health assessment
- Migrate to a web-based platform
Resources:
- Funding
- People
- Computer-based training
Schedule:
- Estimated 1-year effort
- One member from each Center
- Working group meetings held approximately bi-monthly
- Lead POC/steering group
- Staff support
- Community of interest
- Model sustainment (continuous improvement)

Scoring Roll-Up: Specific Practice Assessment Results, XXX Center
[Bar chart: number of programs by process area (CM, DA, D, M, PP, R, RM, S, TMC, V); y-axis: 0-7 programs]

Scoring Roll-Up: Generic Practice Assessment Results, XXX Center
[Bar chart: number of programs by generic practice (GP1-GP7); y-axis: 0-7 programs]
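
Both roll-up charts aggregate the same underlying data: one result per program per practice area, counted into "number of programs" bar heights. A sketch of that aggregation under an assumed data shape (the program names, the pass/fail scoring, and all identifiers below are hypothetical):

```python
from collections import Counter

# Hypothetical per-program results: program -> {process area -> practices met?}
PROGRAM_RESULTS = {
    "Program A": {"CM": True,  "DA": True,  "RM": False},
    "Program B": {"CM": True,  "DA": False, "RM": False},
    "Program C": {"CM": False, "DA": True,  "RM": True},
}

# Count how many programs satisfied each process area -- the bar heights
rollup = Counter()
for scores in PROGRAM_RESULTS.values():
    for area, met in scores.items():
        rollup[area] += int(met)

for area in sorted(rollup):
    print(f"{area:3} {rollup[area]} of {len(PROGRAM_RESULTS)} programs")
```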

Implementation by Center (feedback as of 5 Aug 08)

AAC: "AAC began integrating AF SEAM in our established program assessment process in January 2008 and expects to complete this integration in FY09."
AEDC: "We will begin implementing AF SEAM in October."
ASC: "We are creating a plan to migrate from our current tool to SEAM, tailored with AFMC and ASC specific areas of interest."
ESC: "We have initiated tailoring efforts to implement AF SEAM by the end of the calendar year. We will be working closely with SMC, our acquisition partner, on the tailoring and implementation effort."
OC-ALC: "Strongly support, have plans in place, ready to go!"
OO-ALC: "We are implementing now."
SMC: "SMC plans to adopt AF SEAM and comply with related policies."
WR-ALC: "We'll begin implementation at Robins with pilot assessments in F-15 and Avionics."

The development process yielded 100% buy-in.