The Business Case for Systems Engineering: Comparison of Defense-Domain and Non-Defense Projects
1 The Business Case for Systems Engineering: Comparison of Defense-Domain and Non-Defense Projects
Presenter: Joseph P. Elm, The Software Engineering Institute (SEI), a DoD Research FFRDC
2 Copyright 2014 Carnegie Mellon University
This material is based upon work funded and supported by the Department of Defense under Contract No. FA C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.
NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN AS-IS BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.
This material has been approved for public release and unlimited distribution except as restricted below. This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu.
Carnegie Mellon is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University. DM
3 Challenges in DoD Acquisition
GAO T, Actions Needed to Overcome Longstanding Challenges with Weapon Systems Acquisition and Service Contract Management:
- costs increased 26%, and development costs increased from first estimates
- programs failed to deliver capabilities when promised, often forcing warfighters to [maintain] legacy systems
- current programs experienced, on average, a 21-month delay in delivering initial capabilities to the warfighter
Although DoD is the largest acquirer in the world, acquisition troubles remain (2011)¹:
- MDAP RDT&E cost growth (mean): 84%
- MDAP Procurement cost growth (mean): 28%
- Effectiveness: 89%
- Suitability: 72%
- Nunn-McCurdy breach rate
1. Performance of the Defense Acquisition System 2013 Annual Report (Table 2-3, page 34)
4 Root Cause of Poor Program Performance: Inadequate Systems Engineering!
Finding from Performance of the Defense Acquisition System 2013 Annual Report¹ — PARCA root cause analysis of MDAP cost growth.
[Table: root-cause categories (poor management; systems engineering; contractual incentives; risk management; situational awareness; baseline cost and schedule estimates; framing assumptions; change in procurement quantity; immature technology, excessive manufacturing, or integration risk; unrealistic expectations; unanticipated design, engineering, manufacturing, or technology issues; funding inadequacy) with frequencies ranging from 10 of 18 (56%), 5 of 18 (28%), and 4 of 18 (22%) for the dominant causes, down to 1-2 of 18 for the infrequent ones, and none]
Finding from GAO T: managers rely heavily on assumptions about system requirements, technology, and design maturity, which are consistently too optimistic. These gaps are largely the result of a lack of a disciplined systems engineering analysis prior to beginning system development.
1. Performance of the Defense Acquisition System 2013 Annual Report (Table 2-3, page 34)
5 Why Do We Fail to Utilize Good SE Practices?
It's difficult to justify the costs of SE in terms that project managers and corporate managers can relate to.
The costs of SE are evident:
- Cost of resources
- Schedule time
The benefits are less obvious and less tangible:
- Cost avoidance (e.g., reduction of rework from interface mismatches)
- Risk avoidance (e.g., early risk identification and mitigation)
- Improved efficiency (e.g., clearer organizational boundaries and interfaces)
- Better products (e.g., better understanding and satisfaction of stakeholder needs)
We need to quantify the effectiveness and value of SE by examining its effect on project performance.
6 The 2012 SE Effectiveness Study
Purpose: Strengthen the business case for SE by relating project performance to the use of SE practices.
Method:
- Contact development projects using the resources of NDIA, AESS, and INCOSE.
- Survey projects to assess their SE activities, project performance, and degree of challenge.
- Process responses to identify statistical relationships between parameters.
Survey Tenets: All data is submitted anonymously and handled confidentially by the SEI. Only aggregated, non-attributable data is released.
7 The Bottom Line: SE = Performance
[Chart: distribution of project performance (lower/middle/higher) by Total Systems Engineering Capability (SEC) level]
Across ALL projects, 1/3 are at each performance level.
For Lower SEC projects, only 15% deliver higher performance.
For Middle SEC projects, 24% deliver higher performance.
For Higher SEC projects, 57% deliver higher performance.
Gamma = 0.49 represents a VERY STRONG relationship.
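The deck's key statistic is Goodman and Kruskal's Gamma, a measure of ordinal association between SEC level and performance level. As a minimal sketch of how Gamma is computed from an ordinal contingency table, consider the Python below; the 3x3 counts are purely illustrative and are NOT data from the survey.

```python
# Goodman-Kruskal Gamma for an ordinal contingency table.
# Gamma = (C - D) / (C + D), where C counts concordant pairs
# (both variables rank higher together) and D counts discordant pairs.
def gamma(table):
    C = D = 0
    rows, cols = len(table), len(table[0])
    for i in range(rows):
        for j in range(cols):
            for k in range(rows):
                for m in range(cols):
                    if k > i and m > j:      # concordant pair of cells
                        C += table[i][j] * table[k][m]
                    elif k > i and m < j:    # discordant pair of cells
                        D += table[i][j] * table[k][m]
    return (C - D) / (C + D)

# Hypothetical 3x3 table: rows = SEC level (Lower/Middle/Higher),
# columns = performance (Lower/Middle/Higher). Counts are illustrative
# only -- they are NOT the study's response data.
counts = [[10, 5, 2],
          [5, 8, 5],
          [2, 5, 10]]
print(round(gamma(counts), 2))  # -> 0.6
```

A Gamma near +1 means projects with higher SEC almost always show higher performance; near 0 means no ordinal association.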
8 For Challenging Projects, SE is even MORE important
[Charts: Performance vs. SEC_Total for low Project Challenge (Lower/Middle/Higher SEC, n=22/26/25; Gamma = 0.34) and high Project Challenge (Lower/Middle/Higher SEC, n=26/23/26; Gamma = 0.62)]
A STRONG relationship between Total SE and Project Performance for LOWER-CHALLENGE projects.
A VERY STRONG relationship between Total SE and Project Performance for HIGHER-CHALLENGE projects.
9 Study Participants
Participant Solicitation:
- Contacted key members of major defense contractors to promote study participation
- Contacted the memberships of NDIA SE Division, IEEE AESS, and INCOSE
- Collected 148 valid responses
[Chart: Which of these best describes your industry or service? Industrial Mfg & Svc: defense; Industrial Mfg & Svc: electronic; Industrial Mfg & Svc: other; transportation; energy; communications; consumer goods & svc; health care; other]
[Chart: Please enter the country in which most of the design and development engineering will be/was performed — USA (130), UK, South Africa, Australia, Canada, India, the Netherlands, Sweden, Finland]
10 SE Deployment and Performance
[Diagram: SYSTEMS ENGINEERING DEPLOYMENT measures (Total SEC, Early SE, Project Planning, Requirements Development & Management, Risk Management, Product Integration, Integrated Product Teams, Validation, Verification, Monitoring & Control, Trade Studies, Configuration Management, Product Architecture) related to PROJECT PERFORMANCE measures (Perf, PerfS, PerfT, PerfC) for defense and non-defense projects]
11 Total SE vs. Project Performance
[Charts: Project Performance vs. Total Systems Engineering Capability (SEC) for defense (Lower/Middle/Higher SEC, n=23/26/26; Gamma = 0.57) and non-defense (Lower/Middle/Higher SEC, n=12/8/11) projects]
A Very Strong relationship between applied SE and Project Performance for both defense and non-defense projects.
12 Architecture vs. Project Performance
[Charts: Perf vs. SEC_ARCH for defense (Lower/Middle/Higher SEC, n=19/33/23; Gamma = 0.38) and non-defense (Lower/Middle/Higher SEC, n=14/6/11) projects]
A Strong relationship between Architecture activities and Project Performance for defense projects; a Very Strong relationship for non-defense projects.
13 Requirements Dev't & Mg't vs. Performance
[Charts: Perf vs. SEC_REQ for defense (Lower/Middle/Higher SEC, n=22/26/27; Gamma = 0.46) and non-defense (Lower/Middle/Higher SEC, n=12/10/9) projects]
A Very Strong relationship between Requirements activities and Project Performance for both defense and non-defense projects.
14 Risk Management vs. Project Performance
[Charts: Perf vs. SEC_RSKM for defense (Lower/Middle/Higher SEC, n=25/28/22; Gamma = 0.28) and non-defense (Lower/Middle/Higher SEC, n=14/8/9) projects]
A Moderate relationship between Risk Management activities and Project Performance for defense projects; a Very Strong relationship for non-defense projects.
15 Trade Studies vs. Project Performance
[Charts: Perf vs. SEC_TRD for defense (Lower/Middle/Higher SEC, n=27/25/23; Gamma = 0.45) and non-defense (Lower/Middle/Higher SEC, n=7/15/9) projects]
A Very Strong relationship between Trade Study activities and Project Performance for defense projects; a Strong relationship for non-defense projects.
16 Summary of Relationships -1
[Chart: Performance vs. SE Capability — relationship strength (Weak / Moderate / Strong / Very Strong) for defense and non-defense projects across Total SEC, Early SE, Project Planning, Requirements Development & Management, Verification, Product Architecture, Configuration Management, Trade Studies, Monitoring & Control, Validation, Product Integration, Risk Management, Integrated Product Teams, Project Challenge, and Experience]
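The slides label each Gamma value with a qualitative strength. As an illustration, a small helper can map |Gamma| to those labels; the cutoffs below are assumptions inferred from how the deck describes its own values (0.1 as Weak, 0.28 as Moderate, 0.34 as Strong, 0.42 and above as Very Strong), not thresholds published by the SEI.

```python
# Map |Gamma| to the qualitative strength labels used on the slides.
# The cutoffs are ASSUMPTIONS inferred from the deck's descriptions,
# not thresholds published by the SEI.
def strength_label(g):
    g = abs(g)
    if g < 0.20:
        return "Weak"
    if g < 0.30:
        return "Moderate"
    if g < 0.40:
        return "Strong"
    return "Very Strong"

# Gamma values reported in the deck for several SE categories.
for g in (0.10, 0.28, 0.34, 0.49, 0.62):
    print(g, strength_label(g))
```

Negative values (as in the Project Challenge charts) are labeled by magnitude, with the sign read as the direction of the relationship.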
17 Summary of Relationships -2 Next Steps: Investigate the differences between SE deployment / effectiveness in defense and non-defense domains to find transplantable best practices 17
18 Questions for Further Study On non-defense projects, why are SE activities in Requirements, Architecture, Risk Management, and Verification more effective than those on defense-related projects? On defense projects, why are SE activities in Trade Studies, IPTs, and Project Monitoring and Control more effective than those on non-defense projects? Why is the relationship between Project Challenge and Project Performance stronger for non-defense projects? Why is the relationship between Prior Experience and Project Performance stronger for non-defense projects? 18
19 Next Steps
Download the 2012 SE Effectiveness reports from the SEI website:
- The Business Case for Systems Engineering Study: Results of the Systems Engineering Effectiveness Survey
- The Business Case for Systems Engineering Study: Detailed Response Data
- The Business Case for Systems Engineering Study: Assessing Project Performance from Sparse Data
- The Business Case for Systems Engineering: Comparison of Defense Domain and Non-Defense Projects
Search for ways to apply the findings within your own work and your own organization.
Contact the SEI with questions or to obtain assistance.
20 SEI: Your Resource for Software and Systems Engineering
For more information, contact:
Software Engineering Institute
4500 Fifth Avenue
Pittsburgh, PA
or Joseph P. Elm
21 BACK UP
22 References
Elm, J.; Goldenson, D.; El Emam, K.; Donatelli, N.; Neisa, A. A Survey of Systems Engineering Effectiveness: Initial Results. Carnegie Mellon University, Pittsburgh, PA.
Elm, J.; Goldenson, D. The Business Case for Systems Engineering Study: Results of the Systems Engineering Effectiveness Survey. Carnegie Mellon University, Pittsburgh, PA, 2012.
Elm, J.; Goldenson, D. The Business Case for Systems Engineering Study: Detailed Response Data. Carnegie Mellon University, Pittsburgh, PA, 2012.
Elm, J. The Business Case for Systems Engineering Study: Assessing Project Performance from Sparse Data. Carnegie Mellon University, Pittsburgh, PA, 2012.
Elm, J.; Goldenson, D. The Business Case for Systems Engineering: Comparison of Defense Domain and Non-Defense Projects. Carnegie Mellon University, Pittsburgh, PA, 2014.
23 IPT Utilization vs. Project Performance
[Charts: Perf vs. SEC_IPT for defense (Lower/Middle/Higher SEC, n=24/31/20; Gamma = 0.32) and non-defense (Lower/Middle/Higher SEC, n=11/8/12) projects]
A Strong relationship between IPT Utilization and Project Performance for defense projects; a Moderate relationship for non-defense projects.
24 Project Planning vs. Project Performance
[Charts: Perf vs. SEC_PP for defense (Lower/Middle/Higher SEC, n=22/28/25; Gamma = 0.57) and non-defense (Lower/Middle/Higher SEC, n=13/7/11) projects]
A Very Strong relationship between Project Planning activities and Project Performance for both defense and non-defense projects.
25 Verification vs. Project Performance
[Charts: Perf vs. SEC_VER for defense (Lower/Middle/Higher SEC, n=20/29/26; Gamma = 0.55) and non-defense (Lower/Middle/Higher SEC, n=11/10/10) projects]
A Very Strong relationship between Verification activities and Project Performance for both defense and non-defense projects.
26 Validation vs. Project Performance
[Charts: Perf vs. SEC_VAL for defense (Lower/Middle/Higher SEC, n=17/34/24; Gamma = 0.45) and non-defense (Lower/Middle/Higher SEC, n=5/21/5) projects]
A Very Strong relationship between Validation activities and Project Performance for both defense and non-defense projects.
27 Product Integration vs. Project Performance
[Charts: Perf vs. SEC_PI for defense (Lower/Middle/Higher SEC, n=12/48/15; Gamma = 0.45) and non-defense (Lower/Middle/Higher SEC, n=9/14/8) projects]
A Very Strong relationship between Product Integration activities and Project Performance for both defense and non-defense projects.
28 Configuration Mg't vs. Project Performance
[Charts: Perf vs. SEC_CM for defense (Lower/Middle/Higher SEC, n=27/29/19; Gamma = 0.42) and non-defense (Lower/Middle/Higher SEC, n=12/11/8) projects]
A Very Strong relationship between Configuration Management activities and Project Performance for both defense and non-defense projects.
29 Monitoring & Control vs. Project Performance
[Charts: Perf vs. SEC_PMC for defense (Lower/Middle/Higher SEC, n=22/27/26; Gamma = 0.48) and non-defense (Lower/Middle/Higher SEC, n=9/12/10) projects]
A Very Strong relationship between Project Monitoring and Control activities and Project Performance for defense projects; a Strong relationship for non-defense projects.
30 Prior Experience vs. Project Performance
[Charts: Perf vs. EXP for defense (Lower/Middle/Higher EXP, n=24/40/11; Gamma = 0.1) and non-defense (Lower/Middle/Higher EXP, n=12/12/7) projects]
A Weak relationship between Prior Experience and Project Performance for defense projects; a Strong relationship for non-defense projects.
31 Project Challenge vs. Project Performance
[Charts: Perf vs. PC for defense (Lower/Middle/Higher PC, n=20/29/26) and non-defense (Lower/Middle/Higher PC, n=12/8/11) projects]
A Weak Negative relationship between Project Challenge and Project Performance for defense projects; a Moderate Negative relationship for non-defense projects.
32 Early SE vs. Project Performance
Early SE comprises Project Planning, Requirements Development, Trade Studies, and Product Architecture.
[Charts: Perf vs. Early_SE for defense (Lower/Middle/Higher Early SE, n=24/28/23; Gamma = 0.61) and non-defense (Lower/Middle/Higher Early SE, n=12/9/10) projects]
A Very Strong relationship between Early SE activities and Project Performance for both defense and non-defense projects.
Army Strategic Software Improvement Program (ASSIP) Survey of Army Acquisition Program Management Results of the 2003 Survey of Army Acquisition Managers Mark Kasunic March 2004 TECHNICAL REPORT CMU/SEI-2004-TR-003
More informationCMMI for Services (CMMI -SVC) Process Areas
CMMI for Services (CMMI -SVC) Process Areas SES CMMI Training Series August27, 2009 Dial - 1-877-760-2042 Pass code - 147272 SM SEI and CMM Integration are service marks of Carnegie Mellon University CMM
More informationQuantifying Uncertainty in Early Lifecycle Cost Estimation for DOD MDAPs
Quantifying Uncertainty in Early Lifecycle Cost Estimation for DOD MDAPs Cost Estimation Research Team: Bob Ferguson (SEI - SEMA) Dennis Goldenson PhD (SEI - SEMA) Jim McCurley (SEI - SEMA) Bob Stoddard
More informationThis resource is associated with the following paper: Assessing the maturity of software testing services using CMMI-SVC: an industrial case study
RESOURCE: MATURITY LEVELS OF THE CUSTOMIZED CMMI-SVC FOR TESTING SERVICES AND THEIR PROCESS AREAS This resource is associated with the following paper: Assessing the maturity of software testing services
More informationCommon System and Software Testing Pitfalls
Common System and Software Testing Pitfalls Donald Firesmith Software Solutions Conference 2015 November 16 18, 2015 Copyright 2015 Carnegie Mellon University This material is based upon work funded and
More informationRethinking Risk Management
Rethinking Risk Management NDIA Systems Engineering Conference 2009 Audrey Dorofee Christopher Alberts Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213 Report Documentation
More informationGiven design trade studies, interpret the results to recommend the most cost-effective concept for development.
1 Summarize the Planning, Programming, Budgeting and Execution (PPBE) process. Analyze the financial structure and the movement of funds within the PPBE. 2 Given a scenario, develop an appropriate PMO
More informationSoftware Acquisition: A Comparison of DoD and Commercial Practices
Special Report CMU/SEI-94-SR-9 Software Acquisition: A Comparison of DoD and Commercial Practices Jack R. Ferguson Michael E. DeRiso October 1994 Special Report CMU/SEI-94-SR-9 October 1994 Software Acquisition:
More informationSGMM Model Definition A framework for smart grid transformation
SGMM Model Definition A framework for smart grid transformation Authors: The SGMM Team Version 1.2 September 2011 TECHNICAL REPORT CMU/SEI-2011-TR-025 ESC-TR-2011-025 CERT Program Research, Technology,
More informationOCTAVE -S Implementation Guide, Version 1.0. Volume 2: Preparation Guidance. Christoper Alberts Audrey Dorofee James Stevens Carol Woody.
OCTAVE -S Implementation Guide, Version 1.0 Volume 2: Preparation Guidance Christoper Alberts Audrey Dorofee James Stevens Carol Woody January 2005 HANDBOOK CMU/SEI-2003-HB-003 Pittsburgh, PA 15213-3890
More informationNDIA Life Cycle Affordability Project
NDIA Life Cycle Affordability Project 27 October 2010 Jerry Cothran J.S. Moorvitch Purpose To define and recommend an approach to institutionalize Operations and Support (O&S) cost reduction as an overarching
More informationGary Natwick Harris Corporation
Automated Monitoring of Compliance Gary Natwick Harris Corporation Gary Natwick - 1 Government Communications Systems Division DoD Programs Strategic Management and Business Development CMMI Technology
More informationWhat is the Systems Engineering Capability Model (SECM)?
What is the Systems Engineering Model (SECM)? EIA Interim Standard 731: SECM S E C A T 1 Can a Maturity Model Help You? Is your company successful at learning from past mistakes? Are you confident in your
More informationFormulation of a Production Strategy for a Software Product Line
Formulation of a Production Strategy for a Software Product Line Gary J. Chastek Patrick Donohoe John D. McGregor August 2009 TECHNICAL NOTE CMU/SEI-2009-TN-025 Research, Technology, and System Solutions
More informationBeyond Service Management: The Next Performance Advantage for All Disciplines
Beyond Service Management: The Next Performance Advantage for All Disciplines September 2012 QUATIC 2012: 8th International Conference on the Quality of Information and Communications Technology Topics
More informationAgenda Framing Assumptions (FAs) Lunch and Learn, March 8, Presenter: Brian Schultz. Moderator: Lt Col Doherty
Agenda Framing Assumptions (FAs) Lunch and Learn, March 8, 2017 Presenter: Brian Schultz Moderator: Lt Col Doherty Overview Background Definition and Guidance Why are they Important? Exercise How Do We
More informationProcess Maturity Profile
Carnegie Mellon University Process Maturity Profile Software CMM CBA IPI and SPA Appraisal Results 2002 Year End Update April 2003 We could not produce this report without the support of the organizations
More informationCombining Architecture-Centric Engineering with the Team Software Process
Combining Architecture-Centric Engineering with the Team Software Process Robert L. Nord, James McHale, Felix Bachmann December 2010 TECHNICAL REPORT CMU/SEI-2010-TR-031 ESC-TR-2010-031 Research, Technology,
More informationSection M: Evaluation Factors for Award HQ R LRDR Section M: Evaluation Factors for Award For HQ R-0002
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 LRDR Section M: Evaluation Factors for Award For 1 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 Table of
More informationSoftware Acquisition and Software Engineering Best Practices
SMC-TR-99-27 AEROSPACE REPORT NO. TR-2000(8550)-1 Software Acquisition and Software Engineering Best Practices A White Paper Addressing the Concerns of Senate Armed Services Committee Report 106-50 On
More informationSoftware Conformity Assessment Executive Summary
cpmplus Energy Manager Software Conformity Assessment Executive Summary Software Version: 4.3 of Assessment 15 November 2012 Final Report Filed on 30 January 2013 Exclusively for ABB, Inc. A client of:
More informationTop 10 Signs You're Ready (or Not)
Top 10 Signs You're Ready (or Not) For an Appraisal Gary Natwick Harris Corporation Gary Natwick - 1 Government Communications Systems Division DoD s Strategic and Business Development CMMI Technology
More informationProcess Quality Levels of ISO/IEC 15504, CMMI and K-model
Process Quality Levels of ISO/IEC 15504, CMMI and K-model Sun Myung Hwang Dept. of Computer Engineering Daejeon University, Korea sunhwang@dju.ac.kr 1. Introduction 1.1 Background The quality of a product
More informationCMMI Version 1.2 and Beyond
Pittsburgh, PA 15213-3890 CMMI Version 1.2 and Beyond SEPG 2007 Conference March 26, 2007 Mike Phillips Software Engineering Institute Carnegie Mellon University CMMI is registered in the U.S. Patent and
More informationRational Software White Paper TP 174
Reaching CMM Levels 2 and 3 with the Rational Unified Process Rational Software White Paper TP 174 Table of Contents Abstract... 1 Introduction... 1 Level 2, Repeatable... 2 Requirements Management...
More informationQuantitative Model Olympics
Quantitative Model Olympics Debra L Smith Kerry Trujillo November 5-8, 2012 Copyright 2012 Raytheon Company. All rights reserved. Customer Success Is Our Mission is a registered trademark of Raytheon Company.
More informationUpdate Observations of the Relationships between CMMI and ISO 9001:2000
Update Observations of the Relationships between CMMI and ISO 9001:2000 September September 14, 14, 2005 2005 ASQ Section 509 - ISO 9000 Users Group Page 1 This presentation summaries points made and topics
More informationSEI Webinar Series: The Next Generation of Process Evolution
SEI Webinar Series: The Next Generation of Process Evolution By Dr. Gene Miluk Today s Speaker Gene Miluk is currently a senior member of the technical staff at the Software Engineering Institute (SEI),
More informationMeasures and Risk Indicators for Early Insight Into Software Safety
Dr. Victor Basili University of Maryland and Fraunhofer Center - Maryland Measures and Risk Indicators for Early Insight Into Kathleen Dangle and Linda Esker Fraunhofer Center - Maryland Frank Marotta
More informationSOA Maturity Assessment Guiding the SOA Roadmap. White Paper July 2009
SOA Maturity Assessment Guiding the SOA Roadmap White Paper July 2009 SOA Maturity Assessment Guiding the SOA Roadmap Executive overview Service Oriented Architecture has moved beyond the hype. More and
More informationAn Introduction to Team Risk Management
Special Report CMU/SEI-94-SR-1 An Introduction to Team Risk Management (Version 1.0) Ronald P. Higuera David P. Gluch Audrey J. Dorofee Richard L. Murphy Julie A. Walker Ray C. Williams May 1994 This
More informationNATURAL S P I. Critical Path SCAMPI SM. Getting Real Business Results from Appraisals
Critical Path SCAMPI SM Getting Real Business Results from Appraisals NATURAL S P I 1 What You Will Get Level-Set: What Is and What Is Not a SCAMPI Level-Set: What a SCAMPI Can and Cannot Do Problems:
More informationAUTOMOTIVE SPICE v3.1 POCKET GUIDE
EXTENDED VDA SCOPE ASPICE v3.1 AUTOMOTIVE SPICE v3.1 POCKET GUIDE 4 5 6 7 8-9 10 11-13 14-15 16-19 20-43 44-49 50-51 52-69 70-93 94-103 104-105 106 Automotive SPICE at a glance Automotive SPICE application
More informationCapability Maturity Model Integration (CMMI) V1.3 and Architecture-Centric Engineering
Capability Maturity Model Integration (CMMI) V1.3 and Architecture-Centric SATURN Conference May 17, 2011 San Francisco, CA Dr. Lawrence G. Jones Dr. Michael Konrad Software Institute Carnegie Mellon University
More informationHealth & Safety Performance Indicators
An ISN Publication Health & Safety Performance Indicators Owner Client Peer Group Benchmarking, Canada 2012 Data Publication No. 1313 2013 ISN Copyright 2013 ISN Software Corporation. All rights reserved.
More informationDoD Template for Application of TLCSM and PBL In the Weapon System Life Cycle
DoD Template for Application of TLCSM and PBL In the Weapon System Life Cycle The purpose of this template is to provide program managers, their staff, and logistics participants in the acquisition process
More information