Assuring Mission Success in Complex Settings


1 Assuring Mission Success in Complex Settings
Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA
Christopher Alberts and Audrey Dorofee
March 2007


3 Contents / Agenda
Background
SEI MOSAIC: Managing for Success
SEI MOSAIC Project
SEI MOSAIC Toolkit
Mission Diagnostic
MAAP
Conclusion
2

4 Managing Complexity Managers are responsible for overseeing increasingly complex projects, programs, and operational processes:
Multiple points of management control
Complex tasks
Complex, distributed support technologies
Multiple, detailed status reports
A variety of management techniques (project, security, financial, technology, etc.)
Requirements of multiple stakeholders
3

5 Need for a New Approach
Traditional analysis and management approaches are not designed for complex environments:
Cannot handle organizational and technological complexity
Do not easily scale to distributed environments
New methods, tools, and techniques are needed to:
Position projects, programs, and processes for success
Establish and maintain confidence in achieving objectives
4

6 Managing for Mission Success

Managing for mission success requires establishing and maintaining a reasonable degree of confidence that a mission's objectives will be successfully achieved.

[Workflow diagrams: "Detect, Triage, and Respond to Events" and "Declare, Respond to, and Close Incidents," with swim lanes for the Field, Call Center Tiers 1 and 2, Call Center Shift Leads, Watch Officers, Senior Watch Officers, Incident Response Team, IT Group, QA/Tech Writers, IRC Manager, Government Manager, and Other Groups. The workflows are a snapshot based on available information; communications to the field include bulletins, advisories, and alerts. Together these processes support the operational mission and produce Internet systems and infrastructures.] 5

7 SEI MOSAIC: Managing for Success 6

8 Overview SEI Mission-Oriented Success Analysis and Improvement Criteria (MOSAIC) is a structured decision-making approach that:
Establishes a reasonable degree of confidence in the potential for a successful mission
Helps ensure mission success in projects, programs, processes, and systems
7

9 Strategic Allocation of Resources People need a way to make appropriate tradeoffs among a broad range of factors. [Diagram: a broad tradeoff space spanning Performance, Interoperability, Safety, Security, Quality, and Dependability.] 8

10 SEI MOSAIC: A Lifecycle Approach Perform during any lifecycle phase. Supports most system lifecycle models. [Lifecycle diagram showing the phases: Strategy Evaluation, Design, Planning, Testing/Integration, Operations/Maintenance, Concept Exploration, Requirements Analysis, Development Activities, Release/Production.] 9

11 Managing the Outcome An outcome is the result achieved when executing a mission. A range of potential outcomes is possible:
Some outcomes are acceptable (success)
Some outcomes are unacceptable (failure)
SEI MOSAIC defines an approach for managing the expected outcome (What is the mission likely to achieve?) in relation to the desired outcome (What do I want the mission to achieve?).
10

12 Range of Potential Outcomes [Diagram: Current Conditions, Mission Activities, and Potential Events leading to a Range of Potential Outcomes.] 11

13 Positioning for Success A range of outcomes is possible for any given mission. Conditions and potential events affect mission execution and influence a mission's eventual outcome; they must be appropriately managed to position a mission for success. The objective is to drive the expected outcome toward acceptable states. [Diagram: Acceptable Outcomes vs. Unacceptable Outcomes.]

14 Unique Features of SEI MOSAIC

Traditional Risk Management:
Narrow scope (single project, system, or organization)
Linear view of risk (cause-effect pairs)
Threat-driven
Hazard avoidance
Playing not to lose

SEI MOSAIC:
Broad scope (distributed processes, systems of systems)
Interrelated view of risk
Outcome-driven
Opportunity seeking
Playing to win

13

15 SEI MOSAIC Project 14

16 Characteristics of Current Approaches
A prevalence of one-size-fits-all analysis and management methods
Complex solutions that are not easily tailored (especially to small organizations)
Tied to specific domains or problems
Locally optimized results:
Narrow tradeoff space
Subset of the lifecycle
Narrow scope (e.g., single project, system, or organization)
15

17 SEI MOSAIC Approach
Each SEI MOSAIC method is tailored to:
A given situation, problem space, or lifecycle phase
The domain or application area
The circumstances at hand
SEI MOSAIC is focused on global effectiveness and mission success:
Broad tradeoff space
Lifecycle focus (development and operations)
Broad scope (e.g., distributed processes, supply chains, systems of systems)
16

18 SEI MOSAIC Toolkit [Diagram: the SEI MOSAIC Toolkit applied to complex missions in the military and defense, financial, and medical sectors, across the lifecycle phases: Strategy Evaluation, Design, Planning, Testing/Integration, Operations/Maintenance, Concept Exploration, Requirements Analysis, Development Activities, Release/Production.] 17

19 SEI MOSAIC Methods Our current work is focused on developing a suite of analysis methods. Two methods so far: Mission Diagnostic is a basic approach that provides a quick, high-level evaluation. Mission Assurance Analysis Protocol (MAAP) is a comprehensive approach that provides an in-depth evaluation. 18

20 Mission Diagnostic
What: A time-efficient means of assessing the potential for success
Why: To determine whether conditions are favorable for a successful outcome
Key Results: An evaluation of key indicators and an estimate of the success potential
19

21 Key Indicators Evaluate a set of indicators representing key aspects of management, for example:
Realistic goals
Customer requirements
Staffing requirements
Technology feasibility
Plans and schedules
Example: Are customer requirements and needs well understood?
20

22 Evaluating Key Indicators Question 1: Are goals realistic and well articulated? Answer choices: No, Likely no, Equally likely, Likely yes, Yes (in this example, "Likely yes" is selected). Each indicator is evaluated based on the data that have been collected. Uncertainty is incorporated into the range of answers for each indicator. 21

23 Indicator Evaluation Criteria
Yes: The answer is almost certainly yes. Very little uncertainty exists.
Likely yes: The answer is most likely yes. However, a degree of uncertainty exists.
Equally likely: The answer is just as likely to be yes or no. A high degree of uncertainty exists.
Likely no: The answer is most likely no. However, a degree of uncertainty exists.
No: The answer is almost certainly no. Very little uncertainty exists.
22

24 Indicator Analysis A simple analysis provides insight into a mission's health. [Chart: Indicators 1 through 10 plotted on the answer scale from No to Yes, with an Uncertainty Line drawn at Equally likely.] 23
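The simple analysis above can be sketched in a few lines of code. This is an illustrative sketch only: the five-point answer scale comes from the Mission Diagnostic, but the numeric mapping, the rule of flagging indicators at or below the uncertainty line, and the example answers are assumptions, not part of the published method.

```python
# Illustrative sketch of the indicator analysis described above.
# The answer scale comes from the Mission Diagnostic; the numeric scores
# and the "uncertainty line" flagging rule are assumptions.
SCALE = {"No": 0, "Likely no": 1, "Equally likely": 2, "Likely yes": 3, "Yes": 4}
UNCERTAINTY_LINE = SCALE["Equally likely"]

def flag_indicators(answers):
    """Return the indicators whose answers fall at or below the uncertainty line."""
    return [name for name, answer in answers.items()
            if SCALE[answer] <= UNCERTAINTY_LINE]

# Hypothetical example answers for four indicators.
answers = {
    "Realistic goals": "Likely yes",
    "Customer requirements": "Equally likely",
    "Staffing": "Likely no",
    "Technology feasibility": "Yes",
}
print(flag_indicators(answers))  # indicators needing management attention
```

Indicators that plot at or below the uncertainty line are the ones to target when driving the mission toward its desired state of health.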

25 Managing the Potential for Success The goal is to improve a mission's current state of health. [Chart: a Mission Health scale from Critical through Poor, Borderline, and Good to Excellent, with a Success Threshold between the Current State and the Desired State.] 24

26 Indicators for Software Development Programs
Are goals realistic and well articulated?
Are communication and information sharing about mission activities effective?
Are customer requirements and needs well understood?
Are stakeholder politics or other external pressures minimal?
Does the process design support efficient and effective execution?
Are process control mechanisms effective?
Is task execution efficient and effective?
Are staffing and funding sufficient to execute all mission activities?
Are the technological and physical infrastructures adequate to support all mission activities?
Are changing circumstances and unpredictable events effectively managed?
25

27 Evaluating Indicators The following data are recorded for each indicator:
Indicator score
Rationale for indicator score
Analysis approach (for example, intuition, qualitative analysis, quantitative analysis, other)
Potential actions
Evaluators
Date
26
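The record above maps naturally onto a small data structure. In this minimal sketch the field names follow the slide, while the types and the example values are assumptions:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IndicatorEvaluation:
    """One evaluated indicator, with the data fields listed above.

    Field names follow the slide; types and defaults are assumptions."""
    indicator_score: str                  # e.g., "Likely yes"
    rationale: str                        # why that score was assigned
    analysis_approach: str                # intuition, qualitative, quantitative, other
    potential_actions: list = field(default_factory=list)
    evaluators: list = field(default_factory=list)
    evaluated_on: date = field(default_factory=date.today)

# Hypothetical example record.
record = IndicatorEvaluation(
    indicator_score="Equally likely",
    rationale="Customer requirements are still volatile",
    analysis_approach="qualitative analysis",
    potential_actions=["Schedule a requirements review"],
    evaluators=["Analyst A"],
)
print(record.indicator_score)
```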

28 Mission Diagnostic Exercise and Handout 27

29 Tailoring Questions The following questions can be used when tailoring or developing a set of indicators:
What constitutes a successful result for the project or process?
What constitutes an unsuccessful result, or failure, for the project or process?
What circumstances or conditions tend to produce a successful outcome when conducting the project or process?
What circumstances or conditions tend to produce an unsuccessful outcome, or failure, when conducting the project or process?
28

30 Mission Diagnostic Across the Lifecycle [Lifecycle diagram showing the phases: Strategy Evaluation, Design, Planning, Testing/Integration, Operations/Maintenance, Concept Exploration, Requirements Analysis, Development Activities, Release/Production.] How much uncertainty in these indicators can you tolerate at different points in the lifecycle? 29

31 MAAP
What: A systematic approach for thoroughly analyzing the potential for success
Why: To characterize the full range of drivers affecting the success potential, and to set management priorities to ensure the success potential is maintained within tolerance
Key Results: An operational model, customized analysis artifacts, a measure of the success potential, and strategies for keeping the success potential within tolerance
30

32 Operational Model of Mission Activities [Workflow diagram: the "Detect, Triage, and Respond to Events" incident-management workflow, with swim lanes for the Field, Call Center Tier 1, Call Center Tier 2, Call Center Shift Leads, Watch Officers, Senior Watch Officers, Incident Response Team, IT Group, QA/Tech Writers, IRC Manager, Government Manager, and Other Groups. The workflow is a snapshot based on available information; communications to the field include bulletins, advisories, and alerts.] 31

33 Drivers of Success and Failure A broad range of drivers must be considered when analyzing the potential for mission success. [Diagram: five driver categories (Mission, Design, Activity, Environment, Event) leading to Success or Failure.] 32

34 Mission A mission threat is a fundamental flaw, or weakness, in the purpose and scope of a work process. 33

35 Process Design A design threat is an inherent weakness in the layout of a work process. 34

36 Activity Management An activity threat is a flaw, or weakness, arising from the manner in which activities are managed and performed. 35

37 Operational Environment An environment threat is an inherent constraint, weakness, or flaw in the overarching operational environment in which a process is conducted. 36

38 Event Management An event threat is a set of circumstances triggered by an unpredictable occurrence that introduces unexpected change into a process. 37

39 Scenario-Based Analysis [Diagram: Scenario 1 evaluates expected operational conditions and the outcome they produce (the expected outcome); Scenario 2 evaluates the mission when stressed by Event 1 and the resulting outcome; Scenario 3 evaluates the mission when stressed by Event 2 and the resulting outcome.] 38
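The scenario structure above can be captured as simple records. In this hypothetical sketch, each scenario pairs the conditions evaluated with the outcome they produce; the acceptability flags are invented for illustration and would come from stakeholder judgment in practice:

```python
# Hypothetical sketch of scenario-based analysis: one record per scenario,
# pairing the conditions evaluated (expected operations, or a stressing
# event) with the outcome the mission is expected to produce under them.
scenarios = [
    {"conditions": "Expected operational conditions",
     "outcome": "Expected outcome", "acceptable": True},
    {"conditions": "Stressed by Event 1",
     "outcome": "Outcome resulting from Event 1", "acceptable": True},
    {"conditions": "Stressed by Event 2",
     "outcome": "Outcome resulting from Event 2", "acceptable": False},
]

def unacceptable(scenarios):
    """Scenarios whose expected outcome needs management attention."""
    return [s["conditions"] for s in scenarios if not s["acceptable"]]

print(unacceptable(scenarios))  # drivers behind these scenarios get priority
```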

40 Complex Risks [Risk-relationship diagram showing interrelated conditions and risks, including:
R4: Events could be escalated unnecessarily by the Call Center.
R8: False positives could be forwarded by the Watch Office.
R20: Understaffing could lead to quality and response time problems.
Contributing conditions include: all security events go to the IR team, which has too many tasks relative to the number of staff and is a bottleneck; qualified staff are difficult to find, with limited time and opportunity to stay current; an inadequate training program that is informal, mentoring-based, heavily reliant on on-the-job training and pre-existing KSAs, and lacking comprehensive, cross, and QA training; inadequate equipment for online training; insufficient tools to support IR tasks; IDS tools that inherently provide false positives, exacerbated by inadequate and inefficient tuning and limited backup capability for IDS tuning; uneven Watch Office skills for recognizing false positives; the best person for the job not always being selected; and sites that do not notify the CIRC when performing internal scans.] 39

41 Outcome Analysis The goal is to ensure that the expected outcome for each objective in all evaluated scenarios is acceptable to key stakeholders. [Chart: Product, Schedule, and Cost objectives plotted from Worst Outcome to Best Outcome, showing the success threshold for cost and the gap between the current and desired states for cost.] 40
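The threshold-and-gap analysis above can be illustrated numerically. SEI MOSAIC does not prescribe a numeric outcome scale; the 0-100 scores, the objectives' thresholds, and the expected values below are hypothetical, chosen only to show the computation:

```python
# Hypothetical numeric illustration of outcome analysis: for each objective,
# compare the expected outcome in a scenario against the success threshold
# set by key stakeholders. All scores and thresholds are invented.
def outcome_gaps(expected, thresholds):
    """Return, per objective, how far the expected outcome falls short of
    its success threshold (0 means the outcome is within tolerance)."""
    return {obj: max(0, thresholds[obj] - expected[obj]) for obj in thresholds}

expected = {"Product": 80, "Schedule": 55, "Cost": 40}     # expected outcomes
thresholds = {"Product": 70, "Schedule": 60, "Cost": 65}   # success thresholds
print(outcome_gaps(expected, thresholds))
```

In this invented example the cost objective shows the largest gap between current and desired states, so it would receive the highest management priority.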

42 Unique Features of SEI MOSAIC
Manages the potential for success
Can be applied to highly distributed programs and operational processes
Provides a global view of a mission
Analyzes issues that are too complex for other techniques
41

43 Potential Application Areas
Large, distributed software development programs
Organizations in dynamic, rapidly changing business environments
Organizations with strict reliability, security, and safety requirements
Large, distributed supply chains
Processes supporting critical infrastructures
Distributed information-technology (IT) processes
42

44 Future Research and Development
Refine the current SEI MOSAIC analysis protocols.
Define and pilot additional SEI MOSAIC analysis protocols.
Begin work on an approach for real-time monitoring and management of mission outcomes.
43

45 For Additional Information
Telephone: 412 /
Fax: 412 /
WWW:
U.S. mail: Customer Relations, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA
