New Opportunities for System Architecture Measurement


New Opportunities for System Architecture Measurement
System Engineering Conference, October 2012
Paul Kohl, Lockheed Martin
Dr. Ronald S. Carson, Boeing

Background

The United States Government Accountability Office, the United States Department of Defense (Carter, 2010; Thompson, 2010), and industry (NDIA, 2010) have all made the case for better measurement in support of program success.

ISO/IEC/IEEE 15288 and the INCOSE SE Handbook define architecture
- Define elements of the architecture
- No guidance on how to measure

INCOSE Handbook
- Defines processes for developing an architecture

INCOSE Systems Engineering Leading Indicators (SELI)
- Defines base measures and an indicator
- Measures trends in architecture and related resources and processes
- Does not directly measure the quality of an architecture or its description

Previous Activity

Outgrowth of an NDIA/PSM study [1]
- Identify a set of leading indicators that provide insight into technical performance
- Build upon objective measures in common practice in industry, government, and accepted standards
- Select objective measures based on essential attributes (e.g., relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy)
- Measures should be commonly and readily available

Results published as the NDIA System Development Performance Measurement Report, December 2011
- Architecture was a high-priority area, but no indicators were identified that met the criteria

PSM Workshop [2], July 2012, held to obtain support for base measures and measurable concepts as the basis for indicators

[1] NDIA System Development Performance Measurement Report, December 2011
[2] System and Software Architecture Measurement, July 2012

What is an Architecture?

ISO/IEC/IEEE 42010:2011, Systems and software engineering - Architecture description:
Architecture (system): "fundamental concepts or properties of a system in its environment embodied in its elements, relationships, and in the principles of its design and evolution"

Elements
- Structure
- Behavior
- Data
- Procedures

Relationships
- Internal
- External

Principles
- Architecture rules and overarching guidance

These outcomes are all objectively measurable

Traditional Architecture Measurement

Traditionally, architecture quality was determined at milestone reviews and was therefore a lagging indicator
- Design was briefed
- Didn't always address requirements satisfaction
- Text documentation made it difficult to see the full picture or determine consistency (MIL-STD 2167A & 498B)
- Assessment was subjective, by experts

The INCOSE SELI indicator measures trends in architecture and related resources and processes, but doesn't directly measure the quality of the architecture or its description

PSM focus has been on the needs of the Program Manager (PM)

Model-based architecting brings new opportunities for measurement

Architecture Measurement Needs

Architecture measurement requires a set of measures that fully addresses the needs of Technical Leaders as well as the PM

Measures may be:
- Objective (quantitative), where discrete elements can be counted or otherwise measured, or
- Subjective (qualitative), where human judgment is needed to fully evaluate an aspect of the architecture

Measures should:
- Be based on common practice and standards
- Be readily obtainable
- Reflect essential attributes of the architecture

Quantitative measures are preferred, and are easier to obtain with model-based architecting

Quantitative Measurement

The goal is to measure whether an architecture is complete and consistent, and is the best at satisfying the requirements

Easier with model-based architecting
- Anticipated artifacts / completed artifacts
- Internal reports showing missing data and inconsistencies between artifacts (illustrated in the sketch below)
  - Empty data field counts
  - Other reports from the modeling tool database that address consistency
  - Requirements trace reports
- Supported by many of the architecture tools, but requires effort on the part of the program to create and customize
- Models help visualize heuristics as well
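To make the "internal report" idea concrete, here is a minimal Python sketch of the kind of missing-data and requirements-trace report described above. It assumes a hypothetical JSON export of the model; the element IDs, field names, and trace links are invented for illustration, not taken from any particular tool.

    import json

    # Hypothetical model export: a list of architecture elements and a list of
    # requirements with optional trace links. Field names are illustrative only.
    model = json.loads("""
    {
      "elements": [
        {"id": "SYS-1", "name": "Sensor Subsystem", "description": "Collects data", "owner": "IPT-A"},
        {"id": "SYS-2", "name": "C2 Subsystem", "description": "", "owner": null}
      ],
      "requirements": [
        {"id": "REQ-001", "traced_to": ["SYS-1"]},
        {"id": "REQ-002", "traced_to": []}
      ]
    }
    """)

    REQUIRED_FIELDS = ["description", "owner"]   # "required" is life-cycle dependent

    # Empty data field count: required fields that are still null or blank.
    empty_fields = sum(
        1 for e in model["elements"] for f in REQUIRED_FIELDS if not e.get(f)
    )

    # Requirements trace report: requirements not yet traced to any architecture element.
    untraced = [r["id"] for r in model["requirements"] if not r["traced_to"]]

    print(f"Empty required data fields: {empty_fields}")
    print(f"Untraced requirements: {len(untraced)} -> {untraced}")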

July 2012 PSM Workshop Results

Results:
- Achieved consensus that architecture is measurable
- Agreed on a set of measurable concepts
- A preferred set of measures was voted on and captured in ICM table (Information Category - Measurable Concept - Prospective Measures) format

Measurable Concepts
- Size
- Complexity
- Degree of completeness (more than work unit progress)
- Quality of solution
- Quality of representation
- Cost or effort of the architecture

Preferred Measures (from multi-voting)

Size
- Number of elements
  - Constituent parts to be bought or developed
- Number of relationships/interfaces (external)
  - Logical and physical interfaces, organizational relationships
- Number of requirements
- Number of mission / system scenarios / use cases
- Number of artifacts produced
- Number of data points
- Number of function points
- Number of use case points

Complexity
- Number of relationships/interfaces (internal & external)
- Number of interactions
  - Transaction types or messages, frequency
- Number of states
- Number of functions/methods

Preferred Measures

Completeness
- Requirements addressed
- Artifacts produced
- Artifacts total expected

Quality of Solution
- Degree of requirements satisfaction
- Degree of mission assurance (the "ilities")
- Number of defects in the baseline
- Degree of coupling
- Cost of the solution (at some confidence level)
- Number of open TBx

Quality of Representation
- Degree of consistency of representation
- Degree of standards compliance
- Number of defects

Post-Workshop Activities

Added additional enterprise perspectives to that of the PM
- Technical Leadership (e.g., Chief Architect)
- Cost / Engineering Effectiveness Analysts
- Enterprise Measurement Team

Added questions for these additional perspectives

Normalized the questions to determine common needs
- Validation of preferred measures
- Identify any missing measures

Merged workshop preferred measures into PSM ICM Tables

Normalized Questions (Information Needs)

Information Need | Viewpoints | PSM Information Category
Can we do the work better? | TL | Process Performance
Does the architecture contain all the data required? | TL | Product Quality
Have we removed all the defects? | TL | Product Quality
Does the architecture meet the requirements? Will we be successful? | PM, TL | Product Quality, Customer Satisfaction
What is/was the cost (effort) needed for the architecture? | PM, TL, CA, EM | Resources and Cost
Will the architecture be done on time? | PM, TL | Schedule and Progress
Are process changes providing a benefit? | EM | Process Performance
Are there trends across the business? (Defects, durations, success, size and complexity) | EM | Process Performance
Can we predict future costs? | EM | Resources and Cost
How big was it and can we compare it to other programs? | CA, EM | Product Size and Stability
How long did it take? | CA, EM | Schedule and Progress
How many defects were there? | CA | Product Quality

PM = Program Manager, TL = Technical Leadership, CA = Cost Analysts, EM = Enterprise Measurement Team

ICM Tables for Schedule & Progress

PSM Info Category | Measurable Concept | Questions Addressed | Prospective Indicators | Sample Measures
Schedule and Progress | Work Unit Progress, Milestone Completion, Degree of Completion | Will the architecture be done on time? | EVMS (SPI); artifacts produced versus the plan [1] | EVMS data; artifacts completed; artifacts planned; # of requirements addressed [2]
Schedule and Progress | Duration | How long did it take? | N/A - historical | Planned schedule; actual schedule

[1] To avoid subjectivity, measurement of produced artifacts must align with a verifiable event such as an inspection or baseline acceptance.
[2] Requirements addressed is measured by the number of requirements traced to an architecture element or artifact.

Example Progress Table/Chart

Columns: Estimated # of diagrams; Started Definition; TEM Complete; Drawn; Inspected; ERBed; % Complete
- System Behavior Diagrams: 26, 26, 26, 26, 26, 100%
- Subsystem Behavior Diagrams: 175, 175, 170, 160, 150, 86%
- Component Behavior Diagrams: 300, 25, 25, 20, 15, 5%
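As a worked check of the % Complete column, the following sketch recomputes the percentages from the estimated counts, assuming the last count in each row is the number of diagrams through the final review state (ERBed); an illustration, not a prescribed calculation.

    # Percent-complete check for the progress table above, assuming the last
    # count in each row is the number of diagrams through the final state (ERBed).
    rows = {
        "System Behavior Diagrams":    (26, 26),    # (estimated, through final state)
        "Subsystem Behavior Diagrams": (175, 150),
        "Component Behavior Diagrams": (300, 15),
    }

    for name, (estimated, done) in rows.items():
        print(f"{name}: {done / estimated:.0%} complete")
    # -> 100%, 86%, 5%, matching the % Complete column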

ICM Table for Product Quality

PSM Info Category | Measurable Concept | Questions Addressed | Prospective Indicators | Sample Measures
Product Quality (Solution) | Functional correctness, "ilities": Degree of Requirements Satisfaction, Degree of Mission Assurance | Does the architecture meet the requirements? Will we be successful (will it work)? | Multi-variate function against the driving requirements or TPMs [1]; multi-variate function against the "ilities" | Degree of requirements satisfaction (threshold, objective); # of requirements satisfied; # of defects traceable to the architecture
Product Quality (Representation) | Functional Correctness | Does the architecture contain all required data? Have we removed all the defects? How many defects were there? | Artifacts produced versus the plan; #/% of null data elements in the model; # of defects that reach the baseline | Artifacts completed; artifacts planned; null data elements [2]; defects, including inconsistencies

[1] Construct a radar chart (Kiviat) rather than defining an equation. "Requirements satisfied" is a true/false evaluation of each requirement, which must be traced to an architecture element.
[2] Null data elements must be life-cycle appropriate. Some elements may not be required until later in the life cycle.

Example Architecture Radar Chart / Table

Key attributes ("must haves") are evaluated as true/false. Examples: completeness of requirements coverage, threshold performance.

[Radar (Kiviat) chart of Attribute 1 through Attribute N omitted]

Attribute | Weight | Value | Weighted Value
Flexibility | 25% | 75% | 19%
Adaptability | 10% | 80% | 8%
Modular | 15% | 25% | 4%
Simplicity | 10% | 75% | 8%
Usability | 10% | 75% | 8%
Performance | 30% | 100% | 30%
Total | 100% | | 77%

The utility function for the architecture assessment is a simple weighted sum of the assessed attribute values; repeat for each candidate architecture.
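The weighted-sum utility can be reproduced directly from the table. A minimal sketch, using the slide's attribute weights and values and rounding each weighted value to a whole percent before summing (which is how the 77% total on the slide is obtained):

    # Weighted-sum utility for one candidate architecture (values from the slide).
    attributes = {          # name: (weight, assessed value)
        "Flexibility":  (0.25, 0.75),
        "Adaptability": (0.10, 0.80),
        "Modular":      (0.15, 0.25),
        "Simplicity":   (0.10, 0.75),
        "Usability":    (0.10, 0.75),
        "Performance":  (0.30, 1.00),
    }

    total = 0
    for name, (weight, value) in attributes.items():
        weighted = round(weight * value * 100)   # whole-percent rounding, as on the slide
        total += weighted
        print(f"{name}: {weighted}%")

    print(f"Total utility: {total}%")   # 77%; repeat for each candidate architecture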

ICM Table for Process & Performance

PSM Info Category | Measurable Concept | Questions Addressed | Prospective Indicators | Sample Measures
Process and Performance | Process Efficiency | Can we do the work better? | Hours per artifact and trends; defects at process steps [1] | Hours per artifact; # of defects
Process and Performance | Process Effectiveness | Are process changes providing benefits? [2] | Hours per artifact and trends; defects at process steps | Hours per artifact; # of defects
Process and Performance | Process Compliance | Are there trends across the business (defects, durations, success, size and complexity)? [3] | Trends of selected architecture measures on multiple programs | All architecture measures [4]

[1] Defects and trends should be captured at internal reviews (e.g., inspections or baseline approval reviews)
[2] This question must be measured against a known baseline or by comparing two programs
[3] This question must be measured across multiple programs and does not directly benefit the program
[4] All measures being collected across the enterprise are options. Measures should be chosen to provide value to the enterprise.

ICM Table for Resources & Cost

PSM Info Category | Measurable Concept | Questions Addressed | Prospective Indicators | Sample Measures
Resources and Cost | Personnel Effort | What is/was the cost (effort) needed to develop the architecture? | EVMS (CPI) | Labor hours; staff heads; ACWP; staff experience; budget; cost
Resources and Cost | Support environment resources | What is/was the cost (effort) needed to develop the architecture? [1] | Cost of development environment tools and ongoing maintenance | Dollars
Resources and Cost | Financial Performance | Can we predict future costs? | N/A - historical | Architecture development cost

[1] Useful to compare the cost of different tool suites and as part of business case analysis. See Process Effectiveness.

ICM Table for Product Size & Stability

PSM Info Category | Measurable Concept | Questions Addressed | Prospective Indicators | Sample Measures
Product Size and Stability | Functional Size and Stability (Size, Complexity) | How big is it? How hard is the job? | Element count; internal interface and transaction counts | # of system elements; # of external interfaces; # of internal interfaces; # of requirements; # of transactions / message types
Product Size and Stability | Functional Size and Stability | Is the design stable? | % of change at each architecture level [1] | # of objects in the model; # of changes to objects in a time frame
Product Size and Stability | Functional Size and Stability (Size) | How big was it? | N/A - historical | # of system elements; # of interfaces; # of requirements

[1] Must measure the right changes. Don't measure stability of the preliminary design at SRR.

Impact of Architecture Frameworks on Measurement

Architecture frameworks have defined stable sets of process activities (TOGAF) or viewpoints/models (DoDAF & FEAF). The latter provide items which may be measured.

When combined with the advances in modeling tools, we have a standard set of products which may be measured with relative ease:
- Size (number of elements and interfaces)
- % complete (artifacts/diagrams)
- Conformance to standards (diagram types and standard data elements)
- Adequacy of representation (right viewpoints & well represented)

Heuristics

Does it look right?
- Heuristics are experience-based
- Review of the model artifacts can sometimes indicate whether an architecture exhibits good or bad characteristics, such as low cohesion or high levels of coupling
- Not generally directly measurable using quantitative means

Internal metrics (see the sketch below)
- Number of internal interfaces
- Number of requirements per architecture element (can indicate an imbalance)
- Coupling counts

Heuristics must be applied within the architecture team to be effective
- Utilized as part of artifact/product inspections
- Required application prior to baselining of products

Otherwise heuristics become a lagging indicator
- Found at milestone reviews
- Become defects
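A minimal sketch of the internal metrics above, computed from hypothetical trace and interface data; the element IDs and links are invented for illustration only.

    from collections import Counter

    # Hypothetical trace and interface data, proxies for the internal metrics above.
    requirement_traces = [            # (requirement ID, architecture element)
        ("REQ-001", "Elem-A"), ("REQ-002", "Elem-A"), ("REQ-003", "Elem-A"),
        ("REQ-004", "Elem-B"), ("REQ-005", "Elem-C"),
    ]
    internal_interfaces = [           # (from element, to element)
        ("Elem-A", "Elem-B"), ("Elem-A", "Elem-C"), ("Elem-B", "Elem-C"),
    ]

    # Requirements per element: a strong imbalance may signal a partitioning problem.
    reqs_per_element = Counter(elem for _, elem in requirement_traces)
    print("Requirements per element:", dict(reqs_per_element))

    # Coupling count per element: internal interfaces touching each element.
    coupling = Counter()
    for a, b in internal_interfaces:
        coupling[a] += 1
        coupling[b] += 1
    print("Coupling counts:", dict(coupling))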

Heuristics Example

[Figure: two candidate partitionings, one with high external complexity and one with low external complexity]

Which partitioning is better? Why?

MEANS OF MEASURING

Measurement in a Model-Based Environment

Model-based architecting (or architecture modeling) makes the evaluation of completeness and consistency feasible as a leading indicator
- Architecture tools provide better insight into consistency and completeness via pre-defined reports or by directly accessing the underlying database
- Makes it easier to count artifacts and determine change dates
- Easier to determine missing information
- Easier to make consistency checks between architecture artifacts (parent-child, peer-to-peer)

Quantitative measures are now available

Measuring Size & Complexity

Size and complexity measures available from architecture tools:
- Number of elements (from model diagrams)
- Number of external interfaces (from context diagram)
- Number of requirements (from requirements tool)
- Number of objects on diagrams / artifacts
- Number of data elements / fields associated with artifacts and objects
- Number of artifacts by type
- Number of classes / objects
- Number of functions/methods
- Number of interactions
- Number of functional requirements traced to an architecture element or artifact (e.g., scenario)
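As an illustration of pulling such counts from a tool export, a minimal sketch assuming a hypothetical flat list of model objects; the type names and flags are invented, not a specific tool's schema.

    from collections import Counter

    # Hypothetical flat export of model objects; simple size/complexity counts
    # of the kind listed above.
    model_objects = [
        {"id": "B1", "type": "Block"}, {"id": "B2", "type": "Block"},
        {"id": "I1", "type": "InterfaceBlock", "external": True},
        {"id": "I2", "type": "InterfaceBlock", "external": False},
        {"id": "A1", "type": "Activity"}, {"id": "A2", "type": "Activity"},
        {"id": "S1", "type": "StateMachine"},
    ]

    counts_by_type = Counter(obj["type"] for obj in model_objects)
    external_interfaces = sum(
        1 for obj in model_objects
        if obj["type"] == "InterfaceBlock" and obj.get("external")
    )

    print("Objects by type:", dict(counts_by_type))
    print("External interfaces:", external_interfaces)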

Measuring Completeness

Degree-of-completeness measures available from architecture tools:
- Size measures
- Empty required data fields
- Number of {size measure} complete
- % of {size measure} complete or at a given approval state
- Quantity and trend (of closure) of empty required data fields (the definition of "required" will change by milestone)
- Number of functional requirements traced to an architecture element or artifact (e.g., scenario)
- % of functional requirements traced to an architecture element or artifact (e.g., scenario)
- Number & trend of closure of architecture TBx
- Number & trend of closure of requirement TBx
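A minimal sketch of tracking the trend of TBx closure, assuming hypothetical weekly snapshots of the number of items still open; the week labels and counts are invented for illustration.

    # Trend of closure of architecture TBx items (TBD/TBR/TBS), from hypothetical
    # weekly snapshot counts of items still open.
    open_tbx_by_week = {
        "2012-W30": 42,
        "2012-W32": 37,
        "2012-W34": 33,
        "2012-W36": 30,
    }

    weeks = list(open_tbx_by_week)
    for prev, curr in zip(weeks, weeks[1:]):
        closed = open_tbx_by_week[prev] - open_tbx_by_week[curr]
        print(f"{prev} -> {curr}: {closed} TBx closed, {open_tbx_by_week[curr]} still open")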

Measuring Quality of Representation

Captured / reported from architecture or process tools:
- # of defects in baselined artifacts
  - External standards compliance
  - Consistency of representation (i.e., adherence to APS&C)
- Quantity and trend (of closure) of empty required data fields (the definition of "required" will change by milestone)
- Stability of architecture artifacts (number of changes across time)

Measuring Solution Quality

Captured / reported from architecture or other tools:
- # of defects in baselined artifacts
  - Solution errors (e.g., doesn't work)
- Number of functional requirements traced to an architecture element or artifact (e.g., scenario)
- % of functional requirements traced to an architecture element or artifact (e.g., scenario)
- Number & trend of closure of architecture TBx
- Number & trend of closure of requirement TBx
- Degree of TPM satisfaction, based on modeling or other methods
- Degree of satisfaction of the "ilities" (could be based on checklists or other tools)
- Stability of architecture artifacts (number of changes across time)

Reviewer comments
- Design assessments before or at milestones

Measuring Cost or Effort

Captured / reported from architecture or process tools:
- Size measures
- Experienced productivity
- CPI/SPI
- Estimate at Completion (EAC)
- Control account charges
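For readers less familiar with the EVMS terms above, a short sketch of the standard CPI, SPI, and one common EAC calculation; the dollar figures are invented for illustration, and on a real program these values come directly from the EVMS.

    # Standard earned-value calculations behind the CPI/SPI and EAC bullets above.
    bcws = 500_000.0   # Budgeted Cost of Work Scheduled (planned value)
    bcwp = 450_000.0   # Budgeted Cost of Work Performed (earned value)
    acwp = 480_000.0   # Actual Cost of Work Performed
    bac = 2_000_000.0  # Budget at Completion

    cpi = bcwp / acwp                     # cost efficiency: value earned per dollar spent
    spi = bcwp / bcws                     # schedule efficiency: earned vs. planned
    eac = acwp + (bac - bcwp) / cpi       # one common EAC formula: remaining work at current CPI

    print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}, EAC = ${eac:,.0f}")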

Summary

Model-based architecting has made it possible to objectively measure architecture
- Easier to answer the information needs of the PM, technical leadership, and other stakeholders
- Potential measures and presentation methods are provided in this presentation
- Thresholds remain to be established for these measures

Quality of solution remains somewhat subjective, as each stakeholder in the architecture has a different perspective

Standardization of measurement can be achieved but requires top-down (customer) direction
- Supports definition of thresholds
- Frameworks support this standardization

QUESTIONS?

Author Info

Paul Kohl
- 610-531-6177 (Ofc)
- Lockheed Martin Integrated Systems & Solutions (IS&GS)
- Paul.v.kohl@lmco.com

Dr. Ronald S. Carson
- 253-657-0574 (Ofc)
- Boeing Defense, Space & Security
- Ronald.s.carson@boeing.com