System Development Performance Measurement Working Group: In-process Status

System Development Performance Measurement Working Group: In-process Status
Sponsored by the NDIA Systems Engineering Division, Systems Engineering Effectiveness Committee (in conjunction with Practical Software and Systems Measurement)
July 14, 2011

Working Group Goal
Identify potential high-value measures, indicators, and methods for managing programs, particularly in support of making better technical decisions and providing better insight into technical risk at key program milestones during Technology Development and Engineering and Manufacturing Development, for both the acquirer and the supplier.

Drivers
- Weapon Systems Acquisition Reform Act of 2009 (WSARA): track and report on Major Defense Acquisition Program (MDAP) achievement of measurable performance criteria.
- USD(AT&L) memo, Better Buying Power (Sept. 14, 2010): conduct reviews that support major investment decisions or uncover and respond to significant program execution issues.
- NDIA Top Systems Engineering (SE) Issues (2010), Issue #4: decision makers do not have the right information at the right time to support informed and proactive decision making.
- SE Division brief by Mr. Welby, SEP Streamlining Methodology (Feb 16, 2011): define the set of data-driven data products required for successful program execution; focus on engineering tables and quantitative data rather than words.

Working Group Imperatives
- Build on what has already been done.
- Focus on technical measures that provide the insight needed at major decision points.
- Emphasize leading/predictive measures.
- Minimize the effort of data collection and analysis, but improve the usage of objective performance data.
- Understand the impacts of the right and wrong measures on program effectiveness.
- Measures of cost, time, and technical performance: only measure what provides genuine insight.

What the Working Group Looked For
- How should DoD and industry program executives use the measure(s) to gain insight into what they need to know?
  - Leading insight provided
  - Interpretation guidance
  - Typical decision criteria
- What do DoD and industry program executives need to look out for in order to use an indicator well?
  - Does its utility vary with where the program is in the lifecycle?
  - Assumptions
  - Implementation criteria or limitations
- Measures with the right characteristics: relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, accuracy.

Expected Working Group Outcomes
- Succinct guidance on the most beneficial quantitative measures for providing insight into technical risks and issues
- Informed decisions for DoD and industry program executives at key program milestones
- Improved alignment of industry practices with program plans
- Consideration of the impact of the life cycle on the measures utilized
- Initial recommendations on benchmarking and data repositories

Where We Are in the Process
Inputs: Top SE Issues (2010) guidance and findings; OSD/industry directives, policy, guidance, and initiatives.
Process flow:
- Identify and prioritize information needs (background, brainstorming, ICM analysis, group discussion) -> prioritized information needs
- Select and specify leading indicators and measure sources (detailed concepts and questions; insight into effects and root causes; consensus on candidate indicators) -> candidate indicators
- Refine workshop results and document results -> measures, supporting information per template, recommendations, task report
Timeline: Core Team; Workshop (March 22-23, 2011); post-workshop.

March 22-23 Workshop
- Thirty-five senior managers and engineers from industry (30) and government (5) participated.
- Identified eighteen information needs.
- Addressed the nine information needs considered most important by the workshop via three breakout teams.
- Breakout teams identified leading indicators and discussed possible measures and issues associated with the most important information needs.

Information Needs Identified
Considered most important, based on prioritization determined by workshop participants:
- Requirements
- Interfaces
- Staffing and Skills
- Technical Performance
- Technology Maturity
- Architecture
- Affordability
- Risk Management
- Manufacturability
Ranked lower in prioritization by workshop participants; not considered by breakout teams:
- Testability
- Requirements Verification and Validation
- Defects and Errors
- System Assurance
- Process Compliance
- Work Product Progress
- Facility and Equipment
- Change Backlog
- Review Action Item Closure

Workshop Results and Follow-on Analysis
- 43 leading indicators were initially identified across the nine information needs addressed.
- 10 indicators have been identified as highly important by the core team, using the following criteria:
  - Strongly addresses the information need
  - Feasible to produce (raw data exists and is easily processed)
  - Already frequently utilized
  - Leading or predictive
  - Applicable to TD and E&MD
- Manufacturability did not have any indicator rated highly important (the team plans to revisit this area in collaboration with the NDIA Manufacturing Committee).
- Many of the high-importance indicators have analogs in the Systems Engineering Leading Indicators Guide.

Emerging Leading Indicators
Each entry lists an information need, its specific leading indicator, and related source material (SELI: Systems Engineering Leading Indicators Guide).
- Requirements: Requirements Stability (SELI 3.1 Requirements Trends -- Volatility)
- Requirements: Stakeholder Needs Met (SELI 3.4 Validation Trends; SELI 3.5 Verification Trends)
- Affordability: Requirements Tradeoff Impact (SELI 3.16 System Affordability Trends)
- Interfaces: Interface Trends (SELI 3.3 Interface Trends)
- Architecture: Critical Success Factor and/or Quality Attribute Requirements Satisfied by the Architecture (SELI 3.17 Architecture Trends)
- Staffing and Skills: Staffing and Skills Trends (SELI 3.11 Staffing and Skills Trends)
- Risk Management: Risk Trends (SELI 3.9 Risk Exposure Trends; SELI 3.10 Risk Treatment Trends)
- Technical Performance: TPM Summary, all TPMs (SELI 3.13 Technical Measurement Trends)
- Technical Performance: TPM Trend, specific TPM (SELI 3.13 Technical Measurement Trends)
- Technology Maturity: Technology Readiness Level for each Critical Technology Element (SELI 3.8 Technology Maturity Trends)

Example: Requirements Stability
Measurable Concept: Is the SE effort driving toward stability in the system definition and size?
Leading Insight Provided:
- Indicates whether the system definition is maturing as expected.
- Indicates risks of change to, and quality of, architecture, design, implementation, verification, and validation.
- Indicates schedule and cost risks.
- May indicate a future need for a different level or type of resources/skills.
- Indicates potential lack of understanding of stakeholder requirements that may lead to operational or supportability deficiencies.
Base Measures:
- Total Requirements at the end of the previous reporting period
- Requirements Changed during the current reporting period (added, modified, deleted)
- Major milestone schedule
- Time profile for expected Requirements Stability
Derived Measures:
- Percent Requirements Changed = 100 * (total requirement changes / Total Requirements)
- Requirements Stability = 100 - Percent Requirements Changed
Decision Criteria: Investigate the need for corrective action if the Stability is 10 percent below the expected level and/or the Stability trend for the last three reporting periods is moving toward the threshold.
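To illustrate how the derived measures and decision criterion above could be applied, the following is a minimal Python sketch. The function names, the zero-baseline handling, and the specific reading of "moving toward the threshold" are assumptions for illustration only; the briefing does not prescribe an implementation.

```python
# Minimal sketch (not from the briefing) of the Requirements Stability
# derived measures and decision criterion described above. Names, the
# zero-baseline rule, and the trend test are illustrative assumptions.

def requirements_stability(total_requirements, added, modified, deleted):
    """Requirements Stability (%) for one reporting period.

    total_requirements: Total Requirements at the end of the previous period.
    added, modified, deleted: Requirements Changed during the current period.
    """
    if total_requirements == 0:
        return 100.0  # assumption: no baseline yet, report as fully stable
    percent_changed = 100.0 * (added + modified + deleted) / total_requirements
    return 100.0 - percent_changed


def investigate_corrective_action(stability_history, expected_stability):
    """Apply the decision criterion to a list of per-period stability values.

    Flag if current stability is 10 percentage points or more below the
    expected level, or if the last three periods are trending toward that
    threshold (one possible reading of "moving toward the threshold").
    """
    threshold = expected_stability - 10.0
    current = stability_history[-1]
    if current <= threshold:
        return True
    last_three = stability_history[-3:]
    if len(last_three) < 3:
        return False
    falling = last_three[0] > last_three[1] > last_three[2]
    # Trend flag: stability fell for three straight periods and the remaining
    # margin to the threshold is smaller than the drop over those periods.
    return falling and (last_three[2] - threshold) < (last_three[0] - last_three[2])


# Example: 400 requirements at the end of the last period; 12 added,
# 20 modified, and 8 deleted this period -> 10% changed, 90% stability.
current_stability = requirements_stability(400, added=12, modified=20, deleted=8)
print(f"Requirements Stability: {current_stability:.1f}%")
print(investigate_corrective_action([97.0, 94.0, current_stability],
                                    expected_stability=95.0))
```

In practice, the expected-stability value at each milestone would come from the time-profile base measure, and the trend test would be tailored to the program's reporting cadence.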

Next Steps
- Core Team completion of a focused summary refining the workshop results (late June)
- Review of the summary by the entire fifty-person working group (mid July)
- Incorporation of review comments and re-review of the summary (late July)
- Completion of the draft report (mid August)
- Review of the draft report by the entire working group (late August)
- Incorporation of comments and re-review by the working group (early September)
- Completion of the final report (late September)

In-progress Discussion with OASD(SE), 7 June 2011
Attendees: Stephen Welby, Jim Thompson, and Nic Torelli, with select Core Team members.
DoD comments:
- Very supportive of what we're doing.
- Finds intrinsic, direct, and objective work-product measures more appealing than actual-vs.-plan comparisons.
- Desirable to eventually develop thresholds or control limits based on historical data.
- Agreement on the need for interpretation guidance and decision criteria.
- Very interested in recommendations concerning manufacturability. Action: work with the NDIA Joint Committee for System Engineering and Manufacturing.
- Discussed the need for a program measurement plan aligned within the Systems Engineering Plan (SEP).

Points of Contact
For further information on the Working Group, please contact any of the following:
- Mr. Peter McLoone, NDIA SED, Working Group Industry Co-chair (peter.j.mcloone@lmco.com)
- Mr. Martin Meth, representative for OUSD/DDR&E/MPS and Working Group OSD Co-chair (mmeth@rsadvisors.com)
- Mr. Garry Roedler, NDIA SED, Working Group Industry Adviser (garry.j.roedler@lmco.com)
- Ms. Cheryl Jones, PSM, Working Group Collaboration Co-chair (cheryl.jones5@us.army.mil)
- Mr. Bob Rassa, NDIA Systems Engineering Division (SED) Chair (RCRassa@Raytheon.com)
- Mr. Alan Brown, NDIA SED Systems Engineering Effectiveness Committee (SEEC) Chair (alan.r.brown2@boeing.com)
- Mr. Geoff Draper, NDIA Systems Engineering Division (SED) Vice-chair (gdraper@harris.com)