Net-Ready Key Performance Parameter (NR-KPP) Implementation Guidebook


Net-Ready Key Performance Parameter (NR-KPP) Implementation Guidebook

Version October 2009

Assistant Secretary of the Navy (Research, Development, and Acquisition)
Chief Systems Engineer
Department of the Navy
Washington, D.C.

DISTRIBUTION STATEMENT D: Distribution authorized to the Department of Defense and U.S. DoD contractors only (07/22/09). Other requests shall be referred to ASN (RDA) CHSENG.


Table of Contents

1. Executive Summary
2. Introduction
   2.1. Background
   2.2. Approach
   2.3. Guidebook Contents
3. Net-Ready KPP Clarification
   3.1. Net-Ready Definition
   3.2. Net-Ready KPP Definition
   3.3. Net-Ready KPP Compliance Statement
      Net-Ready KPP Description
      Net-Ready KPP Effectiveness and Performance Measures
      Net-Ready KPP Compliance Measures
4. Four-Step Net-Ready KPP Process
   4.1. Mission Analysis
   4.2. Information Analysis
   4.3. Systems Engineering
   4.4. Documentation
5. Summary
Appendix A: Four-Step Net-Ready KPP Process Details
   Step 1: Mission Analysis
   Step 2: Information Analysis
   Step 3: Systems Engineering Process
   Step 4: Documentation
Appendix B: Net-Ready KPP Template
Appendix C: Net-Ready KPP In Gate Reviews
Appendix D: References
Appendix E: Acronyms
Appendix F: Definitions

List of Figures

Figure 1. NR-KPP Current and Desired States
Figure 2. Terms Used in the Refined NR-KPP Compliance Statement
Figure 3. Three Components of NR-KPP Compliance Statement
Figure 4. Refined NR-KPP Compliance Statement
Figure 5. Four-Step NR-KPP Process
Figure 6. Mission Analysis
Figure 7. Information Analysis
Figure 8. DAG Systems Engineering Process
Figure 9. NR-KPP Systems Engineering
Figure 10. Documenting the NR-KPP
Figure A1. Mission Analysis Process

Figure A2. Sample Mission Task for BMD
Figure A3. Sample AV-1 for BMD
Figure A4. Sample OV-1 for BMD
Figure A5. Sample Operational Task for BMD
Figure A6. Sample partial OV-5 for BMD
Figure A7. Sample OV-6c for BMD
Figure A8. NR-KPP Elements Specified During Mission Analysis
Figure A9. NR-KPP Elements Specified During Mission Analysis
Figure A10. Relationship between the OV-2, OV-3, and SV
Figure A11. Sample OV-2 for BMD
Figure A12. Sample OV-7 for BMD
Figure A13. DoDAF OV-3 Template
Figure A14. Sample OV-3 for BMD
Figure A15. Sample full OV-5 for BMD
Figure A16. Sample SV-5a for BMD
Figure A17. Sample SV-4a for BMD
Figure A18. Sample SV-7 for BMD
Figure A19. Sample SV-2 for BMD
Figure A20. SV-6 Template
Figure A21. Sample SV-6 for BMD
Figure A22. SV Product Traceability
Figure A23. Sample SV-11 for BMD
Figure A24. SV Product Traceability
Figure A25. Sample SV-1 for BMD
Figure A26. NR-KPP Traceability
Figure A27. Required NR-KPP Artifacts
Figure B1. Refined NR-KPP Compliance Statement
Figure B2. Three Components of the Refined NR-KPP Compliance Statement

List of Tables

Table 1. Three NR-KPP Attributes
Table C1. NR-KPP In Gate Reviews

1. Executive Summary

The Assistant Secretary of the Navy (ASN) (Research, Development, and Acquisition (RDA)) mandates that programs field systems in accordance with Joint Requirements Oversight Council (JROC) and/or Chief of Naval Operations requirements. This means that some programs must satisfy the Joint Capabilities Integration and Development System (JCIDS) requirement to field net-ready systems. [1] To ensure net-readiness, the JCIDS requires some programs to satisfy a Net-Ready Key Performance Parameter (NR-KPP). The Joint Staff (J68) uses the NR-KPP to provide an Interoperability and Supportability (I&S) certification. From this standpoint, the NR-KPP is not simply a collection of DoD Architecture Framework (DoDAF) products. The NR-KPP is an Operational Requirement. CJCSI 6212.01E articulates this NR-KPP in terms of an NR-KPP Compliance Statement that some programs must include in their JCIDS documentation.

Programs have generally had difficulty developing derived requirements from the NR-KPP Compliance Statement. This Guidebook provides Program Managers, Systems Engineers, and Test Engineers with a methodology for decomposing the NR-KPP Compliance Statement into measurable and testable derived requirements that they can address using their normal Systems Engineering Process. Readers should keep in mind that the approach described in this Guidebook represents one option for programs to use. Therefore, readers should evaluate the approach and use it where appropriate.

The Guidebook clarifies the definitions of net-readiness and the NR-KPP. It also describes a refined NR-KPP Compliance Statement that programs can use as a template for their derived NR-KPP requirements. The Guidebook then shows how programs can deal with the NR-KPP by implementing a Four-Step Process based on Mission Systems Engineering principles. The Four-Step Process includes the following activities:

1) A Mission Analysis (MA) to determine derived NR-KPP Operational Requirements in terms of missions, mission activities, and associated Mission Effectiveness and Operational Performance Measures.
2) An Information Analysis (InA) to determine the derived Operational Information Requirements in terms of required networks, mission thread Information Elements, and associated Operational Performance Measures.
3) Systems Engineering (SE) to decompose the derived requirements defined in the MA and InA into System Performance Requirements for use during System Design and Realization.
4) Documentation of the Four-Step Process according to engineering best practices and the Compliance Measures in the NR-KPP Compliance Statement.

At first glance, the Four-Step Process appears to describe things that programs already do as part of their normal acquisition process. However, each of the steps specifically targets items in the NR-KPP that Program Offices and Resource Sponsors do not address uniformly. By providing guidance on these areas, the Four-Step Process alleviates issues in dealing with the NR-KPP. Ultimately, the Four-Step Process outlines how programs may consistently develop measurable and testable derived NR-KPP requirements and incorporate the NR-KPP into System Design and System Realization. Using the Guidebook will also facilitate Milestone Reviews, Gate Reviews, and I&S certification.

[1] See paragraph 3.1 for a discussion of the meaning of net-ready.

2. Introduction

The Assistant Secretary of the Navy (ASN) (Research, Development, and Acquisition (RDA)) mandates that programs field systems in accordance with Joint Requirements Oversight Council (JROC) and/or Chief of Naval Operations requirements. This means that some programs must satisfy the Joint Capabilities Integration and Development System (JCIDS) requirement to field net-ready systems. [2] To ensure net-readiness, the JCIDS requires some programs to satisfy a Net-Ready Key Performance Parameter (NR-KPP). The Joint Staff (J68) uses the NR-KPP to provide an Interoperability and Supportability (I&S) certification. [3] From this standpoint, the NR-KPP is not simply a collection of DoD Architecture Framework (DoDAF) products. The NR-KPP is an Operational Requirement. CJCSI 6212.01E articulates the NR-KPP in terms of an NR-KPP Compliance Statement that some programs must include in their JCIDS documentation.

The NR-KPP differs from typical KPPs in that the instruction describing it contains process constraints instead of measurable and testable attributes. As with other KPPs, the Operational Requirements for the NR-KPP should result from analyses done as part of the JCIDS process. Because the NR-KPP Compliance Statement focuses on complying with a set of process constraints instead of measurable and testable performance attributes, the JCIDS rarely specifies the NR-KPP's Operational Requirements in sufficient detail to perform systems engineering. As a result, many programs don't understand the intent of the NR-KPP and its role in the Systems Engineering (SE) Process. These programs usually treat the NR-KPP differently than their other Key Performance Parameters (KPPs) and simply ensure they have satisfied the I&S Certification checklists instead of engineering I&S into their system. This results in NR-KPP artifacts that are minimally relevant to system design, verification, and validation. Ultimately this means that the Navy fields systems that are not net-ready, and warfighters do not get the net-centric capabilities needed for successful military operations.

[2] See paragraph 3.1 for a discussion of the meaning of net-ready.
[3] The Navy provides I&S certification for NSS and/or IT systems without an NR-KPP. The Information Support Plan is also part of I&S certification. For a discussion of that process (including waivers) see RDA CHSENG's ISP Template and Guidebook.

As shown in Figure 1, programs typically engineer and test their systems without well-defined NR-KPP Operational Requirements. Figure 1 also shows the desired end state, in which well-defined NR-KPP Operational Requirements precede system engineering and testing. By providing a method to develop derived NR-KPP requirements, the Four-Step Process ensures that programs have a well-defined starting point for engineering I&S into their systems using their normal SE Process.

Figure 1. NR-KPP Current and Desired States

2.1. Background

In March 2007, the Deputy Assistant Secretary of the Navy (DASN) (Command, Control, Communications, Computers, Intelligence, and Space (C4I and Space)) requested that the ASN (RDA) Chief Systems Engineer (CHSENG) conduct a Lean Six Sigma (LSS) project to "find potential process improvements for NR-KPP development, review, and use." [4] In this request, DASN (C4I and Space) expressed interest in identifying ways to adjust the NR-KPP process and documentation to better serve acquisition programs. [5] This project found that:

- The Joint Staff NR-KPP Instruction lacks clarity.
- Operational requirements documents do not specify measurable and testable NR-KPP requirements.
- Programs cannot adequately demonstrate they engineered interoperability into their system.
- The Operational Test and Evaluation (OT&E) community often finds a program's NR-KPP lacks traceability to other requirements.

[4] DASN (C4I and SPACE) memorandum, 30 March 2007, Review of Net Ready Key Performance Parameter (NR-KPP) Impact on Acquisition Programs.
[5] Ibid.

In response to these challenges, ASN (RDA) CHSENG developed this NR-KPP Guidebook. Readers should keep in mind that the approach described in this Guidebook represents one option for programs to use. Therefore, readers should evaluate the approach and use it where appropriate.

2.2. Approach

This Guidebook clarifies the definitions of net-readiness and the NR-KPP. Based on this clarification, the Guidebook identifies measurable and testable derived NR-KPP requirements. It also shows how the Compliance Measures in the NR-KPP Compliance Statement constrain the SE Process. The Guidebook then describes a refined NR-KPP Compliance Statement that programs can use as a template for their derived NR-KPP requirements. Using these refinements, the Guidebook proposes a Four-Step Process based on Mission Systems Engineering principles to derive and implement NR-KPP requirements. The Process develops the refined NR-KPP Compliance Statement and incorporates the measurable and testable metrics into system design. Ultimately, the Process allows programs to consistently establish derived NR-KPP requirements, decompose them into measurable and testable terms, and incorporate them into System Design and System Realization (which includes obtaining I&S certification). Furthermore, this process will help programs articulate their NR-KPP related efforts more effectively at Milestone and Gate Reviews.

At first glance, the Four-Step Process appears to describe things that programs already do as part of their normal acquisition process. However, each of the steps specifically targets items in the NR-KPP that Program Offices and Resource Sponsors do not address uniformly. By providing guidance on these areas, the Four-Step Process will alleviate the issues in dealing with the NR-KPP. The Four-Step Process includes the following activities:

1) Perform a Mission Analysis (MA) to determine derived NR-KPP Operational Requirements in terms of missions, mission activities, and associated Mission Effectiveness and Operational Performance Measures.
2) Perform an Information Analysis (InA) to determine derived Operational Information Requirements in terms of required networks, mission thread Information Elements, and associated Operational Performance Measures.
3) Use Systems Engineering (SE) to decompose the derived requirements from the MA and InA into System Performance Requirements for use during System Design and Realization.
4) Document the outcomes of the Four-Step Process according to engineering best practices and the Compliance Measures found in the NR-KPP Compliance Statement.

The Guidebook describes each step's purpose, what inputs it needs, any process constraints it has, and what outputs it should produce. The Guidebook discusses what programs should do during each step, but it does not direct programs how to execute each step. The Guidebook does include an Appendix with sample procedures on how to execute each step so that programs have a standard process to follow. The Guidebook also lists key points to keep in mind as programs execute the process. These points are highlighted in bold and will help ensure the program meets the intent of the NR-KPP.

ASN (RDA) CHSENG is currently helping pilot programs implement this Four-Step Process. As these programs satisfy their NR-KPP, ASN (RDA) CHSENG will develop case studies describing the process used by these programs and post those case studies on the ASN (RDA) CHSENG NR-KPP website.

2.3. Guidebook Contents

The remainder of the Guidebook describes the NR-KPP and the Four-Step Process as follows:

- Section 3 clarifies the definitions of net-readiness and the NR-KPP. It also presents a template for a refined NR-KPP Compliance Statement with measurable and testable attributes programs can use for their derived NR-KPP requirements.
- Section 4 discusses the Four-Step NR-KPP Implementation Process. It shows how the process develops the refined NR-KPP Compliance Statement and incorporates the measurable and testable metrics into system design.
- Section 5 summarizes the Guidebook.

3. Net-Ready KPP Clarification

This section clarifies the definitions of net-readiness and the NR-KPP. This clarification is intended to help programs understand the purpose of the NR-KPP so their system can satisfy this important KPP.

Figure 2 illustrates the relationship between key terms in the refined NR-KPP Compliance Statement. As shown in the figure, Warfighting Missions are composed of Operational Tasks. The Operational Tasks are then performed by people and supported by System Functions. [6] Finally, the System Functions are performed by systems. In terms of metrics, Warfighting Missions are measured in terms of Effectiveness Measures. The refined NR-KPP Compliance Statement captures these Effectiveness Measures in the NR-KPP Effectiveness Measures. Operational Tasks are measured in terms of Operational Performance Measures. The refined NR-KPP Compliance Statement captures these Performance Measures in the NR-KPP Performance Measures.

[6] As technology matures and more warfighting operations become automated, activities classified as Operational Tasks may turn into System Functions since systems could start performing things people used to do.
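These Figure 2 relationships amount to a simple containment hierarchy with a measure attached at each level. The sketch below is a minimal illustration in Python, not part of the guidebook: the class and field names are our own, and the BMD entries reuse examples discussed later in this section with placeholder values.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SystemFunction:
    """A function a system performs in support of an Operational Task."""
    name: str
    # System Performance Requirements, e.g. {"processing latency": "<= N ms"}
    performance_requirements: Dict[str, str] = field(default_factory=dict)

@dataclass
class OperationalTask:
    """A task performed by people and supported by System Functions."""
    name: str
    operational_performance_measures: Dict[str, str] = field(default_factory=dict)
    supporting_functions: List[SystemFunction] = field(default_factory=list)

@dataclass
class WarfightingMission:
    """A Warfighting Mission composed of Operational Tasks."""
    name: str
    effectiveness_measures: Dict[str, str] = field(default_factory=dict)
    tasks: List[OperationalTask] = field(default_factory=list)

# Hypothetical entries; the measure values would come from the analyses
# described in Section 4.
bmd = WarfightingMission(
    name="Ballistic Missile Defense",
    effectiveness_measures={"minutes warning provided to friendly assets": "TBD"},
    tasks=[OperationalTask(
        name="Provide Cueing",
        operational_performance_measures={
            "minutes to transmit updated cueing information": "TBD"},
        supporting_functions=[SystemFunction("Track Processing")])],
)
```

Because missions contain tasks and tasks rely on functions, the values assigned at each lower level must be consistent with the measures required at the level above, which is the flow-down the next paragraphs describe.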

It is important to note that, since Warfighting Missions are composed of Operational Tasks, the specified values for the Operational Performance Measures must ensure the Warfighting Missions will meet their required Effectiveness Measures. Lastly, System Functions are measured in terms of System Performance Requirements. Again, since System Functions support Operational Tasks, the specified values for the System Performance Requirements must ensure the Operational Tasks will meet their required Operational Performance Measures. The figure also shows the role of people and how they combine with systems to form DOTMLPF solutions, but a complete discussion of this aspect is outside the scope of this guidebook.

Figure 2. Terms Used in the Refined NR-KPP Compliance Statement

3.1. Net-Ready Definition

In effect, a net-ready system meets the requirements for both the technical exchange of information and the operational effectiveness of those exchanges. [7] These requirements include:

- Information needs
- Information timeliness

- Information Assurance (IA) accreditation (if applicable)
- Net-ready attributes

[7] The definition above summarizes the full definition of a net-ready system as stated on page GL-20 of CJCSI 6212.01E.

ASN (RDA) requires that certain programs satisfy an NR-KPP to ensure those programs field a net-ready system. In light of this intent, programs should continually evaluate whether or not their NR-KPP efforts help them develop a net-ready system.

3.2. Net-Ready KPP Definition

A number of policies discuss the NR-KPP. SECNAVINST 5000.2D (an acquisition instruction), the Manual for the Operation of the Joint Capabilities Integration and Development System (a requirements manual that replaced CJCSM 3170.01C), and DoDD 4630.05 (an interoperability directive) mandate which programs require an NR-KPP. Essentially, these policies state that a system requires an NR-KPP unless it does not communicate with external systems. These policies define an external system as any system not covered in the same acquisition program (i.e., not covered by the same JCIDS documents).

CJCSI 6212.01E (an interoperability certification instruction) defines the NR-KPP as a key parameter stating a system's operational requirements for information, the timeliness of that information, Information Assurance (IA), and net-ready attributes for both the technical exchange of information and the operational effectiveness of that exchange. CJCSI 6212.01E articulates this definition in terms of an NR-KPP Compliance Statement. To satisfy the NR-KPP, programs must show that they completely satisfy the capability's information needs in a timely and accurate manner. The Four-Step Process described in this guidebook helps programs do this by building on the process described on page E-21 of CJCSI 6212.01E.

3.3. Net-Ready KPP Compliance Statement

CJCSI 6212.01E mandates that all programs with an NR-KPP include the NR-KPP Compliance Statement in their Capability Development Document (CDD), Capability Production Document (CPD), and Information Support Plan (ISP), and for Services/Application J-6 I&S Certification. This Compliance Statement contains three distinct areas that programs must address. The Four-Step Process shows how programs can do this by developing derived requirements from the original definition of each area. Figure 3 uses colored boxes to highlight the three areas in the original NR-KPP Compliance Statement. These areas are:

1) NR-KPP Description
2) NR-KPP Effectiveness and Performance Measures (Figure 3 designates this area as "Potential" since the original statement is not a measure. It only becomes a measure after applying the Four-Step Process to develop the derived requirements in the refined NR-KPP Compliance Statement.)

3) NR-KPP Compliance Measures

Figure 3. Three Components of NR-KPP Compliance Statement

If Resource Sponsors include the NR-KPP Compliance Statement in a program's JCIDS documents, programs must ensure that they satisfy the NR-KPP Description as measured by the NR-KPP Effectiveness and Performance Measures and the NR-KPP Compliance Measures. Because programs typically focus on the NR-KPP Compliance Measures and ignore the NR-KPP Effectiveness and Performance Measures, programs may or may not satisfy all elements of the NR-KPP Description. Furthermore, in order to successfully complete Milestone and Gate Reviews, programs must be able to articulate how their system satisfies the NR-KPP Description in terms of the NR-KPP Effectiveness and Performance Measures as well as the NR-KPP Compliance Measures. The Four-Step Process proposed in this guidebook will help programs do these things by providing a mechanism for satisfying all three NR-KPP components. The following sections discuss each area of the NR-KPP Compliance Statement and suggest derived requirements that will help programs understand how to ensure they produce a net-ready system.

Net-Ready KPP Description

The NR-KPP Description contains the attributes used to determine whether a system is net-ready. As stated in the NR-KPP Description, a net-ready system must have three attributes. The system must:

1) Support Net-Centric Military Operations
2) Enter and Be Managed in the Network
3) Exchange Information

In the original NR-KPP Compliance Statement, these NR-KPP Attributes do not have measurable and testable metrics. This undoubtedly causes some of the difficulty programs experience when implementing the NR-KPP. The descriptions below will help programs associate measurable and testable metrics with these attributes so programs can manage the NR-KPP like their other KPPs. This Guidebook recommends programs develop derived NR-KPP requirements based on these descriptions.

Attribute 1: Support Net-Centric Military Operations

This NR-KPP Attribute has two associated parts. The first part should specify which military operations (e.g. missions or mission threads) a system supports, the effectiveness metrics used to measure mission success, and the conditions under which a mission will be executed. These items will form the basis of the NR-KPP Effectiveness Measures. The Joint Mission Essential Task List (JMETL) and Navy Mission Essential Task List (NMETL) frameworks described in CJCSM 3500.04E, the Universal Joint Task List (UJTL), and OPNAVINST 3500.38B, the Universal Naval Task List (UNTL), provide standardized mechanisms for specifying this element. For example, a program that supports Ballistic Missile Defense (BMD) can describe its mission in terms of the UJTL task Conduct Joint Operations Area Missile Defense. Programs can then describe mission success using the sample metrics provided in the UJTL. This task has metrics such as percent of attacking missiles that successfully penetrated friendly defenses, percent of launched air-to-surface missiles destroyed before impact, and minutes warning provided to friendly assets prior to threat arrival. The UJTL also provides a standardized list of conditions under which the mission will be executed.
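To make the UJTL example concrete, a program might record the mission, its sample effectiveness metrics, and its execution conditions in a structure like the one below. This is an illustrative sketch only: the task name and metric names are taken from the example above, while every comparison value and condition entry is a placeholder rather than actual UJTL content.

```python
# Metric names follow the UJTL example in the text; the comparison values
# ("X", "Y", "Z") and the condition entries are placeholders.
bmd_mission = {
    "ujtl_task": "Conduct Joint Operations Area Missile Defense",
    "effectiveness_measures": {
        "percent of attacking missiles that successfully penetrated friendly defenses": "<= X",
        "percent of launched air-to-surface missiles destroyed before impact": ">= Y",
        "minutes warning provided to friendly assets prior to threat arrival": ">= Z",
    },
    "conditions": ["placeholder condition 1", "placeholder condition 2"],
}
```

A record in this form carries everything the first part of Attribute 1 asks for: the military operation, the effectiveness metrics for mission success, and the conditions of execution.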

Metrics and conditions are critical to satisfying the requirements in the NR-KPP Compliance Statement. The sample metrics and conditions are why the UJTL and UNTL are so convenient for specifying military operations.

The second part of this NR-KPP Attribute should specify which of the military operation's required Operational Tasks the system supports, the operational performance metrics used to measure task performance, and the conditions under which the task will be performed. Note that since these tasks describe the NR-KPP's Operational Requirements, this attribute should only include Net-Ready Operational Tasks. In this case, an Operational Task is defined as net-ready if it produces information for an external system (including storing data on an external system) or consumes information from an external system. Again, the JMETL or NMETL frameworks described in the UJTL or UNTL provide standardized mechanisms for specifying this element. For example, the BMD mission described above may require the UNTL task Provide Cueing. Programs can then describe task performance using the sample metrics provided in the UNTL. This task has metrics such as minutes to transmit updated cueing information, minutes to respond to emergent tasking, and percent of time able to respond to search plan requirements. The UNTL also provides a standardized list of conditions under which the activities will be executed. These associated metrics and conditions are critical to the NR-KPP Compliance Statement and a major reason why the UJTL and UNTL provide a convenient framework for specifying the tasks required by military operations.

Although the training community created the UJTL and UNTL to describe training requirements for current operations, their structure and content provide a convenient and effective basis for developing derived requirements (Effectiveness and Operational Performance Measures) for the NR-KPP. The section on performing a Mission Analysis and the appendices discuss JMETLs, NMETLs, the UJTL, and the UNTL in more detail.

Attribute 2: Enter and Be Managed in the Network

This NR-KPP Attribute should specify which networks the system must connect to in order to support its net-centric military operations. Unlike the Support Net-Centric Military Operations attribute, this net-ready attribute does not have a standardized framework of terminology and metrics. Therefore, programs should ask a series of questions to develop the derived requirements of this attribute. These questions should be in the context of the missions and tasks a program supports and include:

- What types of networks will the system connect to? This includes more than just IP networks.
- What metrics do the required networks use to measure network entrance and management performance? This should include metrics used to measure the time from system start-up to when the system has connected to the network and is supporting military operations.
- Who will manage the system as it connects to various networks?
- How will the system be managed? Will management be distributed, centralized, local, remote, etc.?
- What configuration parameters does the network have?

Attribute 3: Effective Information Exchanges

This NR-KPP Attribute specifies the Information Elements produced and consumed by each mission and task identified in the Support Net-Centric Military Operations attribute. Since the NR-KPP focuses on a system's interactions with external systems, programs only need to identify two types of Information Elements: 1) Information Elements that their system produces, sends, or makes available to an external system; and 2) Information Elements that their system receives from an external system.

For each Information Element, the attribute should specify operational performance metrics used to measure the effectiveness of the Information Element's production or consumption. As stated in the NR-KPP Description, the performance metrics should describe the element's continuity, survivability, interoperability, security, and operational effectiveness. Programs should also consider how these metrics affect unanticipated users [8] of the Information Elements. The DoDAF OV-3 Operational Information Exchange Matrix and SV-6 System Data Exchange Matrix provide sample operational performance metrics for information production and consumption. Some sample metrics from the OV-3 include accuracy (e.g. Area of Uncertainty for a track), availability, timeliness, and periodicity. Programs should also consider other metrics such as the number of concurrent consumers of an Information Element. As an example, the Provide Cueing task in the BMD mission above might consume a Search Plan Information Element with a minimum amount of delay and produce a Target Track Information Element with a specified track accuracy.

Table 1 below summarizes the NR-KPP Attributes in terms of:

- Refined attributes and their associated metrics
- Standardized frameworks and data sources to leverage when specifying the attributes
- The component of the NR-KPP using the attribute

[8] Unanticipated users are those users who do not provide advance warning that they will use the data.

Table 1. Three NR-KPP Attributes

Net-Ready KPP Effectiveness and Performance Measures

The NR-KPP Effectiveness and Performance Measures consist of the parameter values for each attribute in the NR-KPP Description. As indicated in the Manual for the Operation of the Joint Capabilities Integration and Development System, these values should be specified in the context of an overall operational scenario that describes the frequency and number of missions occurring at any given time. Using the example above, BMD mission success might require the Provide Cueing task to have a parameter value of 3 minutes for the time it takes to transmit updated cueing information. Programs can determine these values in a number of ways. The Mission Analysis and Information Analysis sections of the guidebook discuss this in more detail.

The NR-KPP Effectiveness and Performance Measures must also address the threshold and objective NR-KPP requirements. The NR-KPP Compliance Statement makes this straightforward. Per the NR-KPP Compliance Statement, to fulfill its threshold requirement for the NR-KPP the system must satisfy all three NR-KPP Attributes for each joint critical operational activity the system supports. Similarly, to fulfill its objective requirement for the NR-KPP, the NR-KPP Compliance Statement mandates the system satisfy all three NR-KPP Attributes for all operational activities the system supports. Since all three attributes of the refined NR-KPP relate back to an operational activity, the threshold requirement simply includes all attributes from the refined NR-KPP related to a critical activity, and the objective requirement includes all attributes from the refined NR-KPP. [9]

[9] See Appendix A, Step 1.3, for further discussion of what constitutes a "joint critical operational activity."
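Expressed this way, the threshold/objective rule reduces to a simple selection over the derived attribute requirements. The Python sketch below illustrates the rule using the BMD example; the Information Element names, the criticality flags, and the accuracy value are assumptions made for the illustration (only the 3-minute cueing value comes from the text above).

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class AttributeRequirement:
    information_element: str    # e.g. an OV-3 entry
    operational_activity: str   # the Operational Task the exchange supports
    critical: bool              # tied to a joint critical operational activity?
    measures: Dict[str, float]  # Operational Performance Measure values

requirements: List[AttributeRequirement] = [
    AttributeRequirement("Updated Cueing Information", "Provide Cueing",
                         True, {"minutes_to_transmit": 3.0}),
    AttributeRequirement("Target Track", "Provide Cueing",
                         False, {"track_accuracy_meters": 50.0}),  # assumed
]

# Threshold: all three NR-KPP Attributes for each joint critical
# operational activity. Objective: the same for all operational activities.
threshold_set = [r for r in requirements if r.critical]
objective_set = requirements
```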

Note that Table 1 recommends data sources for two of the attributes from the refined NR-KPP. The data sources are recommended in part because they include measurable and testable metrics for each mission and Operational Task. Therefore, incorporating those sources into the derived NR-KPP requirements will turn the NR-KPP into a measurable and testable KPP.

Net-Ready KPP Compliance Measures

The NR-KPP Compliance Statement contains five elements that make up the NR-KPP Compliance Measures. These elements are: Solution Architectures, Net-Centric Data and Services Strategy, Global Information Grid Technical Guidance, Information Assurance, and Supportability. These measures constrain the SE Process used to procure the system. For example, requiring DoDAF views constrains how programs provide traceability to show that the system meets its requirements. While testing for interoperability certification, the Joint Interoperability Test Command (JITC) evaluates a system's ability to meet the threshold and objective levels of these compliance measures. The NR-KPP's threshold and objective requirements have the same Compliance Measures.

Unlike the NR-KPP Description and NR-KPP Effectiveness and Performance Measures, programs do not need to develop derived requirements for these NR-KPP Compliance Measures. Instead, programs should simply view the Compliance Measures as constraints on the SE step in the Four-Step Process. The section describing the Four-Step Process will discuss in more detail how these Compliance Measures constrain the SE Process as well as how JITC verifies these measures.

Using the descriptions above, programs can now develop the derived NR-KPP requirements that make up the refined NR-KPP Compliance Statement shown in Figure 4 below. The refined NR-KPP Compliance Statement describes these derived requirements in terms similar to those used by other KPPs, and as a result makes it easier for programs to ensure their system satisfies the NR-KPP. Appendix B includes a detailed view of the refined NR-KPP Compliance Statement. It should be emphasized that this refined NR-KPP Compliance Statement is simply a template programs can use to capture their derived NR-KPP requirements and provide traceability back to the original NR-KPP Compliance Statement. It is in no way meant to replace the original NR-KPP Compliance Statement, which CJCSI 6212.01E requires.

Figure 4. Refined NR-KPP Compliance Statement

4. Four-Step Net-Ready KPP Process

This section presents a Four-Step Process that Program Managers, Systems Engineers, and Test Engineers can use to address the NR-KPP. The Process uses Mission Systems Engineering principles to help programs consistently refine the original NR-KPP Compliance Statement to derive measurable and testable attributes they can incorporate into System Design and Realization. Furthermore, implementing this process will enable programs to effectively articulate their NR-KPP related efforts at Milestone and Gate Reviews. The Four-Step Process includes the following activities:

1) Mission Analysis (MA)
2) Information Analysis (InA)
3) Systems Engineering (SE)
4) Documentation

Figure 5 illustrates the relationship between the four steps and how they use the original NR-KPP Compliance Statement to develop the derived requirements in the refined NR-KPP Compliance Statement. It also shows the DoDAF views developed during each step. Figures 6, 7, 9, and 10 show the inputs and outputs needed for each step in the process. As shown in Figure 5, the MA and InA develop the refined NR-KPP Compliance Statement described in Section 3, while Step 3 decomposes the derived NR-KPP into System Performance Requirements for use during System Design and Realization. Step 4 ensures traceability and interoperability by capturing the Operational and System Performance requirements in a standard format.
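Viewed as a data flow, the four steps chain together exactly as Figure 5 suggests: each step consumes the outputs of the steps before it. The stub pipeline below sketches that ordering and those dependencies; the function names are invented for the illustration, and no step is actually implemented.

```python
def mission_analysis(jcids_inputs):
    """Step 1: derive missions, mission activities, and Effectiveness
    and Operational Performance Measures."""

def information_analysis(jcids_inputs, ma_results):
    """Step 2: derive required networks, Information Elements, and
    their Operational Performance Measures."""

def systems_engineering(ma_results, ina_results):
    """Step 3: decompose the derived requirements into System
    Performance Requirements for Design and Realization."""

def document(ma_results, ina_results, se_results):
    """Step 4: capture outcomes as DoDAF views, tracking sheets,
    compliance matrices, and traceability matrices."""

def four_step_process(jcids_inputs):
    # Steps 1 and 2 build the refined Compliance Statement; Step 3
    # decomposes it; Step 4 documents everything for certification.
    ma = mission_analysis(jcids_inputs)
    ina = information_analysis(jcids_inputs, ma)
    se = systems_engineering(ma, ina)
    return document(ma, ina, se)
```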

Figure 5. Four-Step NR-KPP Process

The MA and InA ideally take place during the Capabilities Based Analysis (CBA) portion of the JCIDS process. As stated in the Manual for the Operation of the Joint Capabilities Integration and Development System, the CBA should identify the Operational Tasks, conditions, and operational performance standards needed to achieve desired mission outcomes. However, the CBA often does not conduct the MA or InA in sufficient detail, and programs must develop derived requirements for the NR-KPP themselves. In other cases, Resource Sponsors delegate the MA and InA responsibilities to programs. This section provides guidance to programs for both of these situations.

The descriptions below explain the following aspects of each step in the Four-Step Process:

1) The step's purpose
2) The inputs needed for each step
3) The step's process constraints
4) What each step should accomplish (a sample of how to accomplish each step is included in the appendices)
5) The outcomes expected for each step
6) Key points to keep in mind when executing each step

4.1. Mission Analysis

The MA determines the derived NR-KPP Operational Requirements in terms of missions, mission activities, and their associated Effectiveness and Operational Performance Measures. Figure 6 depicts the MA.

Figure 6. Mission Analysis

Purpose

The MA derives the Support Net-Centric Military Operations attribute of the refined NR-KPP Compliance Statement described in Section 3. This refinement describes the NR-KPP in measurable and testable terms similar to those used by other KPPs. As a result, programs can address the NR-KPP using their normal SE Process.

Inputs

As stated in the section introduction, the MA ideally occurred during the CBA process. Alternatively, the DoD Architecture Registry System (DARS), Naval Architecture Repository System (NARS), or DoN Enterprise Architecture might contain the results of an applicable MA. Any MA results available from these three sources should serve as the primary input to this step. In most cases, the existing MAs will not provide the detail needed for the derived requirements in the refined NR-KPP Compliance Statement. Therefore, programs will have to use existing MAs to perform additional work and derive these requirements. When doing so, programs should leverage other sources of potential MAs and contextual data, including but not limited to:

- The program's JCIDS documentation
- JCIDS documentation from other relevant programs
- The Required Operational Capability/Projected Operating Environment (ROC/POE) for platforms that will field the system
- Training requirements captured in the NMETLs in the Navy Training Information Management System (NTIMS)

- Readiness reporting requirements captured in the JMETLs and NMETLs found in the Defense Readiness Reporting System (DRRS), which can provide a performance baseline
- Operational scenarios such as official operational plans (OPLANs) or contingency plans (CONPLANs) for near-term systems and Defense Planning Scenarios (DPS) for far-term systems

Constraints

When executing this step, programs must specify the missions and Operational Tasks in terms the Fleet uses. Furthermore, the associated effectiveness and operational performance metrics need to be measurable and testable. These are additional reasons why the JMETL and NMETL frameworks are convenient ways to articulate the NR-KPP requirements.

Process

Since parts (or all) of the MA occur during the JCIDS process, what a program does for this step depends on the level of Program Office involvement in the system's JCIDS process.

Programs should start this step by reviewing the CBA, DARS, and NARS to determine whether an MA exists with sufficient detail for the derived requirements in the refined NR-KPP Compliance Statement. If an MA with sufficient detail exists, programs simply need to extract the missions, mission tasks, and associated Effectiveness and Operational Performance Measures for the system and include them in the refined NR-KPP Compliance Statement.

If existing MAs are insufficient, the Program Office should collaborate with the Resource Sponsor to determine what level of derived requirements the system needs. The MA used to derive these requirements can vary in scope and required resources. At a minimum, Program Offices (in conjunction with their Resource Sponsor) should conduct an abbreviated MA that only develops the Operational Tasks their system will perform along with the associated Operational Performance Measures. Ideally, Program Offices (in conjunction with their Resource Sponsor) would conduct a thorough MA that develops the Operational Tasks, Operational Performance Measures, and Effectiveness Measures for the entire mission. The abbreviated MA differs from a thorough MA in that it only analyzes the portion of the mission that relates to a specific program. Although a thorough MA requires more resources, it results in a better understanding of the mission's derived requirements. This understanding will reduce the risk that the system does not meet its intended capabilities.

Program Offices should ensure they coordinate the MA with their Resource Sponsor and obtain Resource Sponsor review and approval of MA results. Since analyzing an entire mission thread is a Navy Enterprise issue, Program Offices should also attempt to have as much Community of Interest (COI) participation and review as possible. This will help ensure consistency of MAs across programs.

Program Offices (in conjunction with their Resource Sponsor) should follow a standardized process (e.g. the JMETL, NMETL, or Capability Package Assessment (CPA) processes) to analyze the mission and derive more detailed NR-KPP requirements in terms of missions, mission activities, and their associated Mission Effectiveness and Operational Performance Measures. Programs can use the sample process given in Appendix A if they do not already have a preferred method for conducting an MA.

Outcomes

Regardless of whether the program conducted a thorough or abbreviated MA, the outcomes should look similar. If the program conducted a thorough MA, the outcomes need to include:

- A list of critical and non-critical missions the system supports. [10]
- Effectiveness Measures for each mission thread.
- All critical and non-critical Operational Tasks required for each mission thread. [11] This differs from an abbreviated MA in that it includes more than just those tasks required by the system under consideration.
- Operational Performance Measures for each task.

If the program conducted an abbreviated MA, the outcomes need to include:

- Only the critical and non-critical Operational Tasks required for the system under consideration. [12]
- Operational Performance Measures for each task.

These results essentially specify a portion of the system's derived NR-KPP Operational Requirements for each mission. As mentioned in Section 3, the JMETL and NMETL constructs provide a convenient way to articulate these requirements. Programs should use the following DoDAF views to document these derived requirements:

- An AV-1 should provide the context and scope of the missions a system supports.
- An OV-1 (or multiple OV-1s) should display the missions a system supports.
- Although not required for the MA, an OV-4 can display the command hierarchy and external nodes needed for the mission. The NR-KPP requires this view, so programs should develop it during the MA.
- A partial OV-5 should display the activities in the mission. A full OV-5 will result from the InA and will show the information each activity produces and consumes.
- An OV-6c should display the sequence of activities in the mission.

Programs should also start developing a dictionary of terms that they use in these architecture products and use an AV-2 to display these terms.

[10] See Appendix A, Step 1.3, for further discussion of critical/non-critical.
[11] Ibid.
[12] Ibid.
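Because the Appendix B template includes only Net-Ready Operational Tasks (discussed next), programs need a consistent way to separate those tasks from the rest of the MA results. The sketch below applies the Section 3 definition (a task is net-ready if it produces information for, or consumes information from, an external system) using types invented for the illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InformationFlow:
    element: str
    external: bool  # the other end lies outside the program's own JCIDS documents

@dataclass
class MissionTask:
    name: str
    flows: List[InformationFlow] = field(default_factory=list)

def net_ready_tasks(tasks: List[MissionTask]) -> List[MissionTask]:
    """Keep only tasks that produce information for, or consume information
    from, an external system (including storing data on one)."""
    return [t for t in tasks if any(f.external for f in t.flows)]
```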

Programs can then insert these derived Operational Requirements into the template for the refined NR-KPP Compliance Statement included in Appendix B. The template in Appendix B only needs to include Net-Ready Operational Tasks. This focus on Net-Ready Operational Tasks makes the NR-KPP's MA different from a traditional MA.

Key Points

As programs execute this step, they should keep a number of things in mind:

1) Since the MA ideally occurred as part of the CBA or exists in DARS or NARS, the MA should have considered the mission from an enterprise level (vice a system or platform level). However, neither the Department of Defense (DoD) nor the Navy has a complete set of documented mission threads. Therefore, an enterprise-level MA often does not exist. Programs may find it difficult to conduct an MA that appropriately reflects the Navy at an enterprise level since they typically do not have the resources to engage the entire Navy enterprise.
2) Since a thorough MA examines elements outside the scope of an individual system, programs should work with the appropriate Community of Interest (COI) or Community of Practice (COP) to ensure coordination across systems and proper stakeholder involvement.
3) Finally, although DoDAF version 1.5 architecture views provide a way to display some of these derived Operational Requirements, they do not provide a mechanism to display Effectiveness Measures for the mission and Operational Performance Measures for the tasks. Therefore, programs should either augment existing DoDAF products (e.g. augment the OV-5 to associate Operational Performance Measures with each task) or ensure the development of integrated architecture products conforms to other commonly accepted standards (e.g., IEEE). The use of standards will facilitate the identification of the necessary performance metrics.

4.2. Information Analysis

The Information Analysis (InA) determines the derived NR-KPP Operational Information Requirements in terms of required networks, mission thread Information Elements, and their Operational Performance Measures. Figure 7 depicts the InA.

Figure 7. Information Analysis

Purpose

The InA derives the Enter and Be Managed in the Network and Effective Information Exchanges attributes of the refined NR-KPP Compliance Statement described in Section 3. This refinement describes the NR-KPP in measurable and testable terms similar to those used by other KPPs. As a result, programs can address the NR-KPP using their normal SE Process.

Inputs

The InA requires all the inputs used for the MA. In addition, the InA requires the outputs produced by the MA.

Constraints

When executing this step, programs must ensure that they use standardized lists (e.g. the Common Information Elements List (CIEL)) to describe mission thread Information Elements. [13]

Process

As with the MA, parts (or all) of the InA occur during the JCIDS process. Therefore, what a program does for this step depends on the level of Program Office involvement in the system's JCIDS process.

[13] The Naval Architecture Elements Reference Guide link in Appendix D contains a link to the CIEL.

Programs should start this step by reviewing the CBA, DARS, and NARS to determine whether an InA exists with sufficient detail for the derived requirements in the refined NR-KPP Compliance Statement. If an InA with sufficient detail exists, programs simply need to extract the networks, Information Elements, and associated Operational Performance Measures for the system and include them in the refined NR-KPP Compliance Statement.

If existing InAs are insufficient, the Program Office should collaborate with the Resource Sponsor to determine what level of derived requirements the system needs. The InA used to derive these requirements can vary in scope and required resources. At a minimum, Program Offices (in conjunction with their Resource Sponsor) should conduct an abbreviated InA that only develops the derived Operational Information Requirements for their system. Ideally, Program Offices (in conjunction with their Resource Sponsor) would conduct a thorough InA that develops the Operational Information Requirements for the entire mission. The abbreviated InA differs from a thorough InA in that it only analyzes the portion of the mission that relates to a specific program. Although a thorough InA requires more resources, it results in a better understanding of the mission's derived requirements. This understanding will reduce the risk that the system does not meet its intended capabilities.

Program Offices should ensure they coordinate the InA with their Resource Sponsor and obtain Resource Sponsor review and approval of InA results. Since analyzing an entire mission thread is a Navy Enterprise issue, Program Offices should also attempt to have as much Community of Interest (COI) participation and review as possible. This will help ensure consistency of InAs across programs. Programs should also participate in or stand up any interoperability meetings that address Information Elements produced or consumed by their system.

Outcomes

Regardless of how the program conducted the InA, the outcomes should look similar. If the program conducted a thorough InA, the outcomes need to include:

- Networks and Information Elements produced or consumed by all Net-Ready Operational Tasks required for each mission thread (not just those tasks required by the system under consideration).
- Operational performance metrics describing network entry and network management for networks to which the system under consideration connects.
- Operational performance metrics describing continuity, survivability, interoperability, security, and operational effectiveness for Information Elements produced or consumed by the system under consideration.

If the Program Office conducted an abbreviated InA, the outcomes need to include:

- Information Elements produced or consumed by Net-Ready Operational Tasks required for the system under consideration.

- Operational performance metrics describing network entry and network management for networks to which the system under consideration connects.
- Operational performance metrics describing continuity, survivability, interoperability, security, and operational effectiveness for Information Elements produced or consumed by the system under consideration.

These results essentially specify the system's derived Operational Information Requirements for each mission. Programs should use the following DoDAF views to document these requirements (as appropriate for the maturity of the system; see Figure A27 and Appendix C):

- An OV-2 should display a summary of the Information Elements produced and consumed, as annotations accompanying the needlines between operational nodes.
- An OV-3 should display the Information Elements produced and consumed along with their Operational Performance Measures.
- A full OV-5 should display each activity in the mission and the Information Elements produced and consumed by that activity.
- An OV-7 should display the Information Elements to start planning the program's data strategy.
- The required networks form the starting point for an SV-2.

Programs should also continue developing a dictionary of terms that they use in these architecture products and use an AV-2 to display these terms. Programs can also insert these derived Operational Information Requirements for the Net-Ready Operational Tasks into the template for the refined NR-KPP Compliance Statement included in Appendix B. The template in Appendix B only needs to include Information Requirements for Net-Ready Operational Tasks. This focus on Net-Ready Operational Tasks makes the NR-KPP's InA different from a traditional InA.

Key Points

The InA has the same key points as the MA.

4.3. Systems Engineering

The SE step decomposes the derived NR-KPP requirements from the MA and InA into System Performance Requirements for use during System Design and Realization. Figure 8 summarizes the SE Process from the Defense Acquisition Guidebook (DAG), and Figure 9 depicts the SE step for the NR-KPP.

[Figure 8 shows the DAG "V" process: Evaluate User Needs (Operational Construct) and Identify System Performance Requirements; Develop System Technical Requirements and Performance Specification and Measures; Derive System Allocated Baseline and Measures; Derive Sub-Functions and Measures; and Design and Build. Each design level is linked by verification and traceability to a corresponding test activity: Configuration Item Verification DT&E; Integrated DT&E and Early Operational Assessment; System DT&E, Operational Assessments, and Performance Verification; and a Combined Operational and Developmental Test Event demonstrating the system meets user needs.]

Figure 8. DAG Systems Engineering Process

Figure 9. NR-KPP Systems Engineering

Purpose

The SE Process translates the derived NR-KPP requirements from the MA and InA into the system's design and verifies the system satisfies those requirements.

Inputs

At this point in the Four-Step Process, the MA and InA should have produced derived requirements in a format similar to the requirements for the program's other KPPs. These derived requirements form the inputs to the SE step.

Constraints

When executing this step, programs need to ensure the system complies with the five NR-KPP Compliance Measures given in the NR-KPP Compliance Statement.

CJCSI 6212.01E gives detailed procedures for complying with all of the NR-KPP Compliance Measures. How these Compliance Measures constrain the SE step can be summarized as follows:

- Each of the DoDAF views should comply with DoDAF standards, DoD Information Enterprise Architecture (DIEA) business rules and principles, the Joint Common Systems Function List (JCSFL), and DISRonline policies. The DoDAF views should also be registered in DARS.
- Where applicable, the system functions in the SV-4a, system data exchanges in the SV-6, and data elements in the SV-11 should support the Net-Centric Data Strategy, Net-Centric Services Strategy, and DoD Information Enterprise Architecture. [14] Programs should use the Exposure Verification Tracking sheets shown in CJCSI 6212.01E to document this compliance.
- Where applicable, the logical interfaces in the SV-1, physical interfaces in the SV-2, and standards in the TV-1 should comply with the Global Information Grid (GIG) Enterprise Service Profiles (GESPs) (or GIG Key Interface Profiles (KIPs) until the GESPs are developed).
- Where applicable, the logical interfaces in the SV-1, physical interfaces in the SV-2, system functions in the SV-4a, and system data exchanges in the SV-6 should comply with IA requirements and specify the IA controls the system will use. These IA specifications can include Access Control, Availability, Confidentiality, Dissemination Control, Integrity, and Non-Repudiation Consumer/Producer.
- Each physical interface in the SV-2 or system function in the SV-4a that uses the electromagnetic spectrum should comply with spectrum and supportability requirements, to include Selective Availability Anti-Spoofing Module (SAASM), Spectrum, and Joint Tactical Radio System (JTRS) requirements.

Process

Unlike the MA and InA, the SE step does not occur during the JCIDS process. Programs should follow a standardized SE Process (e.g. the SE Process given in the DAG or IEEE Standard 1220) to perform System Design and System Realization. System Design includes Requirements Development, Logical Analysis, and Design Solution, and System Realization includes Implementation, Integration, Verification, Validation (to include I&S Certification), and Transition. Note that the Design portion of the SE Process ensures that derived NR-KPP requirements are decomposed into technical requirements for the system. The Realization portion of the SE Process ensures the system meets these technical requirements. Since the MA and InA that determined the NR-KPP's derived requirements focused on Net-Ready Operational Tasks, the SE Process will produce a system that enables those Tasks. This focus on derived Net-Ready Operational Tasks makes the NR-KPP's SE Process different from the traditional SE Process.

[14] CJCSI 6212.01E exempts tactical systems, control systems, and weapons systems with time-critical constraints from the requirement to demonstrate compliance with the data strategy.
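One way to keep these five Compliance Measures visible during the SE step is to track them as an explicit checklist of constraint artifacts. The sketch below shows one possible bookkeeping structure; the artifact descriptions paraphrase the bullets above, and the dictionary layout and status convention are assumptions of this sketch, not anything the instruction prescribes.

```python
# Five Compliance Measures mapped to the artifacts that evidence them.
compliance_measures = {
    "Solution Architectures": [
        "DoDAF views compliant with DoDAF/DIEA/JCSFL/DISRonline",
        "DoDAF views registered in DARS"],
    "Net-Centric Data and Services Strategy": [
        "Exposure Verification Tracking sheets completed"],
    "GIG Technical Guidance": [
        "GESP (or KIP) compliance matrices completed"],
    "Information Assurance": [
        "IA controls specified on SV-1/SV-2/SV-4a/SV-6"],
    "Supportability": [
        "SAASM, spectrum, and JTRS requirements addressed"],
}

def open_items(status):
    """Return artifacts not yet verified, e.g. ahead of a Gate Review."""
    return [artifact
            for artifacts in compliance_measures.values()
            for artifact in artifacts
            if not status.get(artifact, False)]

# Example: only the DARS registration has been completed so far.
remaining = open_items({"DoDAF views registered in DARS": True})
```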

31 The outcomes of the System Design portion of the SE step should include System Performance Requirements such as attributes, characteristics, functions, interfaces, information flows, and standards. Programs should display these technical requirements in the following DoDAF views (as appropriate for the maturity of the system. See Figure A27 and Appendix C): An SV-1 should display the system s logical interfaces (e.g. what interfaces support the applications and services that produce data for or consume data from external systems). An SV-2 should display the system s physical interfaces (e.g. what interfaces support the system s physical connections to other systems). An SV-4a should display the functions a system performs along with the data produced and consumed by each function. An SV-5a should display how the system s functionality supports the missions and Operational Tasks identified during the MA. An SV-6 should display the system data exchanges supporting the information flows identified during the InA and displayed in the OV-2, OV-3, and SV-4a. The SV-6 also displays performance requirements for each system data exchange. An SV-11 should display the content of the system data exchanges identified in the SV-6. Because the SV-6 maps to the outcomes of the InA, the SV-11 also displays the details of the Information Elements in the OV-7. A TV-1 should display the standards used by the system interfaces in the SV-1 and SV-2, the system functions in the SV-4, the system data exchanges in the SV- 6, and the system data in the SV-11. A TV-2 should display any expected changes in those standards. Programs should also continue developing a dictionary of terms that they use in these architecture products and use an AV-2 to display these terms. In addition to DoDAF products, the programs should display the outcomes of the SE Process using (as applicable): Exposure Verification Tracking sheets shown in Appendix A to Enclosure E of CJCSI E to document the data a system produces and the services the system provides. GIG Technical Guidance (GTG) or GIG Enterprise Service Profile (GESP) compliance matrices managed by DISA. The SE step should also result in derived IA and Supportability requirements. Programs should specify IA requirements used for each system function, interface, and system data exchange. These IA specifications can include Access Control, Availability, Confidentiality, Dissemination Control, Integrity, and Non-Repudiation Consumer/Producer. Programs should specify derived spectrum and supportability requirements for each physical interface that uses the electromagnetic spectrum. Since DoDAF v1.5 products do not provide a mechanism to display all of these requirements, programs may have to augment existing DoDAF products (e.g. augment the SV-2 to associate IA controls with each physical interface) or ensure the development of Page 25

integrated architectural products conform to other commonly accepted standards (e.g., IEEE). The use of such standards will facilitate identification of the necessary performance metrics. Ultimately, the SE Process will turn these outcomes into a net-ready system design. The System Realization portion of the SE Process verifies the system's net-readiness. System Realization should develop procedures to verify and validate the system's net-readiness during I&S Certification. Because programs should have used DoDAF products to display most of the Operational and System Performance Requirements, the DoDAF products should form the foundation of these test procedures.

Key Points

As programs execute this step, they should keep in mind that without a proper MA and InA, they will not have adequately derived NR-KPP requirements. As a result, they risk fielding a system that is not net-ready. As mentioned earlier in this guidebook, the JITC evaluates a system's ability to meet the threshold and objective levels of the NR-KPP Compliance Measures when testing the system for interoperability certification. The JITC is currently developing an NR-KPP Test Guidebook that describes the process for this evaluation. When available, programs should review that guidebook to ensure their implementation of the process constraints will satisfy the JITC certification requirements.

Documentation

The Documentation step captures the outcomes of the Four-Step Process according to engineering best practices and the Compliance Measures found in the NR-KPP Compliance Statement. Figure 10 depicts the Documentation step of the Four-Step Process.

Figure 10. Documenting the NR-KPP

Purpose

The Documentation step provides traceability between the Operational Requirements in the original NR-KPP Compliance Statement, the derived requirements in the refined NR-KPP Compliance Statement, and the system design. It also provides a standardized framework to document the system's net-ready aspects for inclusion in system specification, contract specification, verification, and validation.15

15 As long as the data in the framework is formalized, validated, and precise.

Inputs

The outcomes of the MA, InA, and SE form the inputs to the Documentation step.

Constraints

As indicated in the NR-KPP Compliance Measures, programs need to document the results in terms of DoDAF views, Exposure Verification Tracking Sheets, and GTG/GESP compliance matrices.

Process

As with the SE Process, the Documentation step does not occur in the JCIDS process. In this step, programs simply need to document the outcomes of the MA, InA, and SE in the standardized formats specified by the NR-KPP Compliance Measures. Programs should also develop textual descriptions for all the architecture products and ensure the textual descriptions are consistent with the products.

Outcomes

The outcomes of the Documentation step should capture the system's derived Operational Requirements and System Performance Requirements according to the NR-KPP Compliance Measures in the NR-KPP Compliance Statement. These outcomes need to include DoDAF products, Exposure Verification Tracking Sheets, GTG/GESP compliance matrices, and Information Assurance and Supportability enhancements to the DoDAF products. This step should also produce traceability matrices tying the Operational Requirements to System Design and Realization.
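Although the Guidebook does not mandate a format for these traceability matrices, programs may find it useful to keep them machine readable. The following minimal sketch (Python; the requirement text, artifact identifiers, test case names, and file name are all hypothetical placeholders, not a mandated format) shows the idea:

    # Notional NR-KPP traceability matrix written as a machine-readable table.
    # Every identifier below is a hypothetical placeholder.
    import csv

    rows = [
        # (Operational Requirement, Derived Requirement, Design Artifact, Verification)
        ("Support BMD mission thread",
         "Produce Target Quality Track Data for external consumers",
         "SV-6 data exchange DE-01", "I&S Certification test case TC-01"),
        ("Support BMD mission thread",
         "Enter the required network within the scenario time budget",
         "SV-2 physical interface IF-02", "Network-entry timing test TC-02"),
    ]

    with open("nr_kpp_traceability.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Operational Requirement", "Derived Requirement",
                         "Design Artifact", "Verification Procedure"])
        writer.writerows(rows)

Keeping the matrix in a machine-readable form makes it straightforward to re-check coverage (every Operational Requirement traced to at least one design artifact and one verification procedure) whenever the architecture products change.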

Finally, this step should produce verification and validation procedures. The MA, InA, and SE Process sections indicated how these products should capture the different requirements. Programs should publish these NR-KPP artifacts to DARS, NARS, and the DoD Information Technology Standards Registry (DISR), and ensure the program's requirements documents include links to these artifacts, as applicable.

Key Points

As programs execute this step, they should use DoDAF Operational Views to display the derived NR-KPP requirements of the system, and DoDAF System Views to display how the system meets those derived requirements. However, programs need to keep in mind that DoDAF version 1.5 does not provide a mechanism to display all of the Operational and System Performance Requirements. Therefore programs may have to augment existing DoDAF products (e.g., augment the OV-5 to associate operational performance metrics with each task) or ensure that their integrated architectural products conform to other commonly accepted standards. Programs should also keep in mind that architecture products alone will not meet all aspects of the NR-KPP. Therefore, programs should be prepared to answer questions that clarify aspects of the NR-KPP. Programs can reduce the number of such questions by writing thorough textual descriptions for all the architecture products.

5. Summary

To ensure the DoD fields net-ready systems, some programs must satisfy an NR-KPP. From this standpoint, the NR-KPP is not simply a collection of DoDAF products. The NR-KPP is an Operational Requirement.

This Guidebook described a Four-Step Process that provides Program Managers, Systems Engineers, and Test Engineers with a methodology for developing the derived requirements for the NR-KPP. Even though the process appears to describe things that programs already do as part of their normal system acquisition, the Guidebook discussed how each of the steps specifically targets items in the NR-KPP. Ultimately, the Four-Step Process provides a method for programs to consistently decompose NR-KPP requirements into measurable and testable derived requirements and to incorporate the NR-KPP into System Design and System Realization.

Appendix A: Four-Step Net-Ready KPP Process Details

This appendix builds on the information in Section 4.0 of the Guidebook by describing a sample methodology for executing each step of the Four-Step Process. It gives detailed procedures for executing each step so that programs have a standard process to follow. A number of policies and guidance documents address aspects of the Four-Step Process. To avoid duplication, the Guidebook references the relevant portions of these documents and describes how to tailor them to complete the four steps. The policies related to this section include CJCSM 3500.03B - Joint Training Manual for the Armed Forces of the United States, OPNAVINST 3500.38B - the Universal Naval Task List, DoDI 4630.8 - Procedures for Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS), ASN (RDA) CHSENG's Capability Package Assessment (CPA) Rule Set (the CPA Process is still under development by ASN (RDA) CHSENG, and a link will be provided as soon as the document is finished), and the Defense Acquisition Guidebook.

To ease understanding, the appendix uses a Ballistic Missile Defense (BMD) mission thread to illustrate the implementation of the process. Readers should keep in mind that the examples do not try to represent a complete set of DoDAF products that would satisfy a formal review, nor do they try to describe a detailed architecture development process. Instead, these products show portions of the views to highlight the key pieces of information developed as part of the Four-Step Process.

Step 1: Mission Analysis

As indicated in Section 4, the Mission Analysis (MA) determines the NR-KPP Operational Requirements in terms of missions, mission activities, and their associated Mission Effectiveness and Operational Performance Measures. Enclosure C of CJCSM 3500.03B states that Phase I of the Joint Training System (JTS) "identif[ies] the capabilities required to accomplish assigned missions." Since these two processes have almost identical goals, the Guidebook recommends using Phase I of the JTS as the starting point for the MA. As in Phase I of the JTS, this Guidebook recommends documenting the Operational Requirements in terms of a Joint Mission Essential Task List (JMETL) or Navy Mission Essential Task List (NMETL). (Note that Section 2 of OPNAVINST 3500.38B outlines a similar process for developing an NMETL.) Figure A1 below is taken from Enclosure C of CJCSM 3500.03B and shows the process for developing a JMETL.

Figure A1. Mission Analysis Process

Programs should note the following:

- CJCSM 3500.03B focuses on JMETL development from a Combatant Commander's (COCOM's) perspective. Therefore it describes higher-level missions that may not have enough detail for derived NR-KPP requirements. However, it also states that all levels of command need to develop JMETLs and NMETLs. Therefore programs can utilize this process for any of the mission threads they must develop.
- The training community wrote CJCSM 3500.03B, so the process may have elements that do not make sense for the acquisition community. For example, it refers to entering information into the Joint Training Information Management System (JTIMS), aligning JMETLs among different levels of command, and the relationship between JMETLs and the Joint Training Plan (JTP). Programs can ignore training-specific elements such as these.
- CJCSM 3500.03B uses the term Mission Analysis. Unfortunately, its definition is different from the one used in this Guidebook.

Step 1.1

Step 1.1 of the MA uses higher-level guidance to determine the system's assigned missions in terms of tasks from the UJTL or UNTL. Paragraph 4.a of Enclosure C to CJCSM 3500.03B describes this step and refers to these tasks as "mission tasks." Even though CJCSM 3500.03B describes the process from a commander's view, programs can utilize the process by applying it from a system point of view. Using the example from Section 3, analysis of higher-level guidance could reveal that a system will support BMD, which can be described by task OP from the UJTL. Figure A2 below shows the details of task OP from the UJTL.

Figure A2. Sample Mission Task for BMD

Programs will need to tailor the higher-level guidance they use for this step. In particular, programs should leverage any existing MAs to the greatest extent possible. Since CJCSM 3500.03B focuses on near-term training requirements, it recommends using near-term guidance (e.g., OPLANs, CONPLANs, etc.) for this part of the MA. This guidance is appropriate for near-term systems. However, far-term systems should instead rely on the Defense Planning Scenarios developed by OSD. COCOMs and JTF HQs

must use the Defense Readiness Reporting System (DRRS) to report their current capabilities to execute their assigned missions. The commanders construct their DRRS entries in terms of Mission Essential Task Lists (METLs) (e.g., they specify a mission task from the UJTL or UNTL to describe the mission, specify measures of effectiveness for each mission task, and identify the conditions under which they must perform the mission). Programs should also leverage this METL data to the greatest extent possible. The fact that this METL data already exists in DRRS is another reason why programs should use the METL construct when specifying Operational Requirements.

CJCSM 3500.03B directs commands to develop a Mission Statement for each of their assigned missions as the primary output of this step. Programs should use the Mission Statement to specify the missions in the refined NR-KPP Description. Programs should also use DoDAF products to display the information in the Mission Statements. Specifically, the AV-1 should include language from the Mission Statement to help provide the scope and context of the missions that the program's DoDAF products will describe. The OV-1 should represent the Mission Statement graphically, and the Mission Statement should serve as the textual description of the OV-1. Programs can create an AV-1 and OV-1 for each mission they support, or they can create one product that covers all missions. Analysis of the higher-level guidance should also give an indication of the command hierarchy within which the mission will execute and the external nodes required for mission success. Programs can use this information to start developing an OV-4 and an OV-2. Continuing with the BMD example, Figures A3 and A4 below show a sample AV-1 and OV-1 that could result from this step.

Figure A3. Sample AV-1 for BMD

Figure A4. Sample OV-1 for BMD

Step 1.2

Step 1.2 of the MA determines the Operational Tasks each mission requires in terms of tasks from the UJTL or UNTL. Note that the UJTL and UNTL form a comprehensive list of Navy and Marine Corps tasks. Therefore, even though the training community uses them to describe current military operations, programs can use them to describe the future Operational Requirements of their systems. Paragraph 4.b of Enclosure C to CJCSM 3500.03B describes this step. Using the example from Section 3, the BMD mission thread could require task NTA from the UNTL. Figure A5 below shows the details of task NTA from the UNTL.

Figure A5. Sample Operational Task for BMD

Again, programs need to tailor the inputs of this step to include any MA done as part of the JCIDS process or existing in DARS or NARS. If a sufficient MA does not exist, programs should keep in mind that CJCSM 3500.03B indicates this step requires significant Subject Matter Expert (SME) input and takes a non-trivial amount of time to complete. The level of SME input and time can make it difficult for a program to execute

this step, since programs typically do not have access to the breadth of SMEs required for an entire mission or extra time in their schedule to develop the required data. DRRS and the Navy Training Information Management System (NTIMS) can serve as extremely valuable resources for programs as they execute this step, and programs should leverage the data they contain as much as possible. DRRS and NTIMS contain METLs for COCOMs, Joint Task Force Commanders, Functional Component Commanders (e.g., JFMCC), warfare commanders (e.g., the Anti-Submarine Warfare (ASW) Commander), and Units (e.g., a CVN). Each organization's METL contains a list of tasks, organized by mission, that it must execute. METLs also contain the performance measures used to quantify task performance and the conditions under which each task must be executed. For each mission the system must support, programs should find the relevant organizations in DRRS or NTIMS and use their NMETLs as the starting point for their analysis of the mission.

Per CJCSM 3500.03B, this step should result in "the required functions, documented in mission tasks, of all echelons involved with accomplishing the mission: what must be done at each echelon of command, and in each functional area at each echelon of command, in order to accomplish the mission." From this overall list of Operational Tasks supporting the mission, programs should identify the Net-Ready Operational Tasks. As mentioned before, Net-Ready Operational Tasks are those tasks that produce or consume information (see the sketch below). Using the BMD example, task NTA shown in Figure A5 is most likely a Net-Ready Operational Task. However, task NTA (Track Contacts) may not be a Net-Ready Operational Task because it may not produce information for external systems. This focus on Operational Tasks makes the NR-KPP's MA different from a traditional MA. Programs should include these Net-Ready Operational Tasks in the refined NR-KPP Description. The list of all Net-Ready Operational Tasks will form the basis of the Objective values of the derived NR-KPP Effectiveness and Performance Measures. Programs should also use DoDAF products to display these tasks. Specifically, programs can develop a partial OV-5 that contains all tasks identified as part of this step. Programs should create a separate OV-5 for each mission. Figure A6 below shows a sample partial OV-5 for the BMD mission. The figure also shows how the OV-5 can be augmented to display the performance measures for each task.
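The produce/consume test used above to identify Net-Ready Operational Tasks lends itself to a mechanical check. The following minimal sketch (Python; the task numbers, names, and Information Element names are notional, patterned on the BMD example) illustrates the idea:

    # Sketch: a Net-Ready Operational Task is one that produces or consumes
    # information for or from external systems. All task data are notional.
    from dataclasses import dataclass, field

    @dataclass
    class OperationalTask:
        task_id: str                                  # UJTL/UNTL task number
        name: str
        produces: list = field(default_factory=list)  # Information Elements produced
        consumes: list = field(default_factory=list)  # Information Elements consumed

    tasks = [
        OperationalTask("NTA x.x", "Disseminate Tactical Data",
                        produces=["Target Quality Track Data"],
                        consumes=["Search Plan"]),
        OperationalTask("NTA x.x", "Track Contacts"),  # internal-only in this example
    ]

    net_ready = [t for t in tasks if t.produces or t.consumes]
    for t in net_ready:
        print(t.task_id, t.name)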

Figure A6. Sample partial OV-5 for BMD

Step 1.3

Step 1.3 of the MA determines which of the mission's Operational Tasks identified in Step 1.2 are joint mission critical tasks. The distinction between Operational Tasks and joint mission critical tasks forms the basis of the distinction between the Threshold and Objective NR-KPP Operational Requirements: the list of all joint mission critical Net-Ready Operational Tasks will form the basis of the Threshold values of the derived NR-KPP Effectiveness and Performance Measures. Paragraph 4.c of Enclosure C to CJCSM 3500.03B describes this step. Since DoDAF version 1.5 does not provide a mechanism for specifying mission critical tasks or the criteria used to distinguish a critical task, programs should use a separate matrix to document these things. This matrix will provide requirements traceability throughout the life of the program. The refined NR-KPP Compliance Statement shown in Appendix B provides a sample way to distinguish between the critical and non-critical tasks.

Programs should tailor this step by also using the mission's Operational Tasks from Step 1.2 to create mission threads for each mission. In this context, a mission thread is defined as a specific sequence of tasks to accomplish a mission in a given scenario. For example, the sequence of tasks needed to defeat a short-range ballistic missile using engage-on-remote in MCO-2 specifies a particular mission thread. Since each mission can have a number of different mission threads, programs should list all mission threads for their assigned missions and then choose the most stressing mission thread to define their derived NR-KPP Operational Requirements.
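As a notional illustration of this selection, the sketch below (Python) lists mission threads and scores them for stress; the thread names, task sequences, and the scoring criterion are all invented placeholders for the criteria a program would document in its matrix:

    # Sketch: documenting mission threads and picking the most stressing one.
    # The stress score here is purely notional.
    from dataclasses import dataclass

    @dataclass
    class MissionThread:
        name: str
        tasks: list                # ordered UJTL/UNTL tasks (the OV-6c sequence)
        simultaneous_targets: int  # notional stress driver
        timeline_s: float          # notional stress driver (shorter = harder)

    threads = [
        MissionThread("SRBM engage-on-remote, MCO-2", ["OP x.x", "NTA x.x"], 4, 300.0),
        MissionThread("SRBM organic engagement",      ["OP x.x", "NTA x.x"], 1, 600.0),
    ]

    # Notional scoring: more simultaneous targets in less time = more stressing.
    most_stressing = max(threads, key=lambda t: t.simultaneous_targets / t.timeline_s)
    print("Most stressing thread:", most_stressing.name)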

The CPA Process discusses this idea in more detail. As above, DoDAF version 1.5 does not provide a way to specify the various mission threads associated with each mission or the criteria used to determine the most stressing mission thread. Therefore programs should use a separate matrix to document how they determined the most stressing mission threads. This matrix will provide requirements traceability throughout the program. The CPA Process provides a sample matrix. The training community creates a JMETL as a result of this step. Programs should augment any OV-5 developed during Step 1.2 to identify the critical tasks. Programs should also start building an OV-6c for each mission thread. This OV-6c will display a time-sequenced list of the tasks needed for mission completion. As in the OV-5, programs should identify the critical tasks in the OV-6c. Later steps will complete the OV-6c by identifying the nodes responsible for each task in the mission thread.

Step 1.4

Step 1.4 of the MA determines the effectiveness metrics for each mission, the operational performance metrics for each task, the conditions under which each task must be executed, and the nodes that will execute each task. Paragraph 4.d of Enclosure C to CJCSM 3500.03B describes this step. As stated above, a sequence of Operational Tasks forms a mission thread. This means that each individual task's performance contributes to overall mission performance. Therefore programs should start with the Effectiveness Measure used to judge mission success and then determine the task performance requirements needed to achieve mission success (see the sketch below). Programs should specify the values for each of these metrics in the context of the overall scenario (i.e., OPLAN or DPS). This scenario will determine the level of performance needed for mission success (e.g., how many red ships must be destroyed in order to win the campaign), the number of simultaneous missions, the timing of missions, etc. Programs can determine the values for these metrics in a number of ways, including modeling and simulation, experimentation, and exercises. Programs should use the conditions from the UJTL and UNTL to capture as much campaign context as possible. For example, condition C (Target Density) describes how many targets must be engaged simultaneously. Programs should use the measures and conditions to specify the Threshold and Objective values for the derived Support Net-Centric Military Operations attribute in the refined NR-KPP Description. Programs should also refine the OV-6c views developed in Step 1.3 by identifying the nodes responsible for each task. Since DoDAF version 1.5 does not provide a mechanism for specifying the effectiveness measures for the missions and the Operational Performance Measures for the tasks, programs should use a separate matrix to document these things. This matrix will provide requirements traceability throughout the life of the program. Continuing with the BMD example, Figure A2 shows sample effectiveness measures for the mission, Figure A5 shows sample performance measures for an Operational Task, and Figure A7 below shows a sample OV-6c that could result from this step.
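As a simple illustration of the top-down flow-down from mission effectiveness to task performance, the sketch below (Python) allocates a notional mission-level timeline across the tasks of a thread; all numbers and weights are invented, and a real program would derive the allocation from modeling and simulation, experimentation, or exercises:

    # Sketch: allocating a mission-level timeliness requirement across the
    # tasks in a mission thread. All values are notional.
    mission_timeline_s = 300.0  # notional detect-to-intercept budget
    weights = {"Detect": 0.2, "Track": 0.3,
               "Disseminate Track Data": 0.1, "Engage": 0.4}

    budgets = {task: mission_timeline_s * share for task, share in weights.items()}
    for task, budget in budgets.items():
        print(f"{task}: {budget:.0f} s")
    assert abs(sum(budgets.values()) - mission_timeline_s) < 1e-6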

Figure A7. Sample OV-6c for BMD

Step 1.5

Step 1.5 of the MA ensures that the MA has identified all tasks and nodes associated with each mission. Paragraph 4.e of Enclosure C to CJCSM 3500.03B describes this step. CJCSM 3500.03B only includes this step because it is written from a commander's perspective; the step is meant to help commanders look at Operational Requirements outside of their command. Depending on how a program executed Step 1.3, this step may or may not be necessary. Ideally, if a program executed Step 1.3 from a system perspective, the elements of Step 1.5 were incorporated into Step 1.3. If programs perform this step, it will result in a completed OV-4 and OV-6c.

Step 1.6

Step 1.6 of the MA ensures the appropriate stakeholders have the opportunity to comment on the work conducted so far. Paragraph 4.f of Enclosure C to CJCSM 3500.03B describes this step. However, programs will have to modify this description to account for the different stakeholders for their system. This step will result in a portion of the refined NR-KPP Description and the NR-KPP Performance and Effectiveness Measures shown in Appendix B. Figure A8 below shows the elements of the refined NR-KPP Compliance Statement specified during this step, and Figure A9 shows an example for the BMD mission thread.

Figure A8. NR-KPP Elements Specified During Mission Analysis

Figure A9. Sample NR-KPP Elements for BMD

At the end of the MA, programs should ensure they can answer the following questions:

1. What are the mission(s) and the stressing mission thread(s) for the system?
2. What are the effectiveness measures for the mission(s)?
3. Who executes the tasks during the mission thread(s)?
4. What tasks are required to execute the mission thread?
5. What are the Operational Performance Measures for the mission thread tasks?
6. Under what conditions must the mission thread tasks be executed?

Step 2: Information Analysis

As indicated in Section 4, the Information Analysis (InA) determines the derived Operational Information Requirements in terms of required networks, mission thread Information Elements, and Operational Performance Measures. Unlike the MA, existing policies and guidance do not address the InA. Therefore, unless programs have a

preferred method for conducting an InA, this section should serve as the authoritative reference for programs.

Step 2.1

Step 2.1 of the InA determines the Information Elements for each mission thread task. To execute this step, programs should examine each task in their mission thread(s) and identify what information is needed to execute the task (i.e., what information is consumed by the task) and what information results from task execution (i.e., what information is produced by the task). Each piece of information produced or consumed forms an Information Element of the mission thread. Programs should use a standardized list such as the Common Information Element List (CIEL) to specify the Information Elements.16 Continuing the BMD example from above, task NTA might consume a Search Plan Information Element and produce a Target Quality Track Data Information Element. Programs should also specify the contents of each Information Element. For example, the Target Quality Track Data may include track type, area of uncertainty, speed, heading, and position. Programs should use these Information Elements for the refined NR-KPP Description.

16 The Naval Architecture Elements Reference Guide link in Appendix D contains a link to the CIEL.

Programs should also use DoDAF products to display this information. Programs can develop an OV-2 by showing the nodes that produce and consume each Information Element and displaying the Information Element as a needline between the nodes. Programs should then expand the details of the needlines into Information Exchange Requirements (IERs) and show those in an OV-3 (this will be a partial OV-3 until the performance metrics are added in Step 2.4). Figure A10 shows how a single needline from the OV-2 can map to multiple IERs in the OV-3, and how a single IER in the OV-3 can map to multiple system data exchanges in the SV-6. Finally, programs can display the contents of each IER as data entities, and the relationships between those entities, in an OV-7. Programs can use the OV-7 to start planning the program's Data Strategy. Figures A11 and A12 show samples of an OV-2 and OV-7.
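The mapping from per-task produce/consume declarations to OV-2 needlines is mechanical: a needline connects the node that produces an Information Element to each node that consumes it. A minimal sketch (Python; the node and element names are notional, patterned on the BMD example):

    # Sketch: deriving OV-2 needlines from producer/consumer declarations.
    producers = {"Target Quality Track Data": "Sensor Node",
                 "Search Plan": "C2 Node"}
    consumers = {"Target Quality Track Data": ["Weapon Node", "C2 Node"],
                 "Search Plan": ["Sensor Node"]}

    needlines = [(producers[element], consumer, element)
                 for element, nodes in consumers.items()
                 for consumer in nodes]
    for src, dst, element in needlines:
        print(f"{src} -> {dst}: {element}")

Each needline would then be expanded into one or more IERs for the OV-3.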

Figure A10. Relationship between the OV-2, OV-3, and SV-6

Figure A11. Sample OV-2 for BMD

Figure A12. Sample OV-7 for BMD (a complete OV-7 would also show the relationships between the data entities)

The explicit data requirements from CJCSI 6212.01E also imply a number of governance questions that programs should ask as they complete this step. These include:

- Which Community of Interest (COI) defines the shared vocabulary and metadata pertinent to the program? Is there more than one COI?
- Who is the Portfolio Manager? Who is the Functional Data Manager (FDM)?
- Are these data authoritative? (Does the FDM say so?)
- Are all or some of the data stored? If so, for how long?
- Do the data/metadata conform to standards explicitly affirmed or required by the Navy (e.g., UCore, DDMS)?
- Are the data available to other Military Services, government agencies, and mission partners?

Step 2.2

Step 2.2 of the InA determines the different networks required to transport the Information Elements identified in Step 2.1. To execute this step, programs simply need to examine the Information Elements and determine which networks (e.g., SIPRNET, NIPRNET, Link 16, etc.) will transport the data. Programs should use these networks for the refined NR-KPP Description. Programs can also document this information in the SV-2 developed during the SE Process. Figure A19 shows a sample SV-2.

Step 2.3

Step 2.3 of the InA determines the entry and management requirements for the networks the system will connect to. Since network entry and management do not have a standardized framework of terminology and metrics, programs should ask a series of questions when executing this step. These questions should include:

- What metrics do the required networks use to measure network entrance and management performance? These should include metrics that measure the time from system start-up to when the system has connected to the network and is supporting military operations.
- Who will manage the system as it connects to various networks?
- How will the system be managed? Will management be distributed, centralized, local, remote, etc.?
- What configuration parameters does the network have?

As in Step 1.4, programs should determine the performance metrics in the context of a scenario and determine the performance values required for mission success. For example, programs could determine the maximum time from a system cold start until the system has connected to the network while still achieving mission success. Programs should use this information in the refined NR-KPP Description and NR-KPP Effectiveness and Performance Measures.
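Because no standardized framework exists, programs may simply record their answers in a structured form of their own choosing. A minimal sketch (Python; every field name and value below is hypothetical, not a mandated format):

    # Sketch: capturing network entry and management answers per network.
    network_entry = {
        "network": "Link 16",
        "entry_metric": "time from cold start to exchanging track data (s)",
        "entry_threshold_s": None,  # value set by scenario analysis, as in Step 1.4
        "managed_by": "network control station (remote)",
        "config_parameters": ["network role", "time slot assignment"],
    }
    for key, value in network_entry.items():
        print(f"{key}: {value}")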

Step 2.4

Step 2.4 of the InA determines the performance requirements for each of the Information Elements. To execute this step, programs should first determine the metrics used to measure Information Element performance. Since Information Elements do not have a standardized list of performance metrics, programs should use the highlighted elements in the DoDAF OV-3 template shown in Figure A13 as a starting point. Figure A14 shows a sample OV-3 for BMD. Programs should also incorporate the Information Elements into the partial OV-5s developed in Step 1.3 by indicating the Information Elements each activity produces and consumes. This will produce a completed OV-5. Figure A15 shows a sample full OV-5 for BMD.

Figure A13. DoDAF OV-3 Template

Figure A14. Sample OV-3 for BMD
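A minimal sketch of an IER record carrying performance attributes follows (Python); the attribute set shown is illustrative only, and programs should take the actual columns from the highlighted elements of the DoDAF OV-3 template in Figure A13:

    # Sketch: one OV-3 row as a structured record. Attribute names and
    # values are illustrative placeholders.
    from dataclasses import dataclass

    @dataclass
    class IER:
        information_element: str
        sender: str
        receiver: str
        periodicity: str      # e.g., "on event", "1 Hz"
        timeliness_s: float   # maximum acceptable latency
        size_bytes: int
        classification: str

    ier = IER("Target Quality Track Data", "Sensor Node", "Weapon Node",
              "1 Hz", 2.0, 512, "SECRET")
    print(ier)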

Figure A15. Sample full OV-5 for BMD

As in Step 1.4, programs should determine the performance metrics in the context of a scenario and determine the performance values required for mission success. Programs should use this information in the refined NR-KPP Description and NR-KPP Effectiveness and Performance Measures. Programs should also combine this information with the products generated in Step 2.1 to produce the complete OV-3 that includes performance metrics. At this point, programs will have identified all elements of the refined NR-KPP Compliance Statement shown in Appendix B.

At the end of the InA, programs should ensure they can answer the following questions:

- What are the data requirements (inputs and outputs) for each operational activity or task?
- What are the sending and receiving nodes for all Information Elements produced or consumed by the Operational Tasks?
- What is the structure of the Information Elements produced or consumed by the Operational Tasks?
- What are the performance requirements for the Information Elements defined for the Operational Tasks?

Step 3: Systems Engineering Process

As indicated in Section 4, the Systems Engineering (SE) Process decomposes the derived NR-KPP requirements into System Performance Requirements for use during System Design and Realization. The Guidebook recommends using the Defense Acquisition Guidebook (DAG) as the authoritative reference for the SE Process. Since the DAG's SE Process does not require any modification for the NR-KPP, this appendix will not describe the steps of the SE Process. Instead, it will show how the NR-KPP Compliance Measures should constrain System Design and Realization.

System Design

According to the DAG, the System Design portion consists of Requirements Development, Logical Analysis, and Design Solution. Throughout these steps, programs should use DoDAF System Views to maintain traceability from the System Performance Requirements back to the derived Operational Requirements captured in the DoDAF Operational Views developed during the MA and InA. The following sequence represents the recommended approach for developing and documenting these System Performance Requirements:

Programs should examine each mission thread's OV-5 and OV-6c to determine which Operational Tasks their system will support. Programs can make this distinction in a number of ways. For example, programs can look at the nodes that will field their system and pick a subset of each node's tasks. Programs could also derive the supported tasks from higher-level guidance (e.g., from the Resource Sponsor). Once the program has identified the tasks its system will support, it should specify the system functions that will be used to accomplish those tasks. Programs can determine this by conducting new analyses or by assuming the new system will have the same functionality as the legacy system. Programs should use an SV-5a to display these system functions and their relationships to Operational Tasks. Figure A16 shows a sample SV-5a for the BMD example.

Figure A16. Sample SV-5a for BMD
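Conceptually, the SV-5a is a many-to-many mapping between Operational Tasks and system functions. A minimal sketch (Python; the task and function names are notional):

    # Sketch: SV-5a as a mapping from Operational Tasks to system functions.
    sv5a = {
        "NTA x.x Disseminate Tactical Data": ["Format Track Message",
                                              "Transmit Track Message"],
        "NTA x.x Track Contacts":            ["Correlate Sensor Returns"],
    }

    # Reverse view: which tasks does each system function support?
    by_function = {}
    for task, functions in sv5a.items():
        for fn in functions:
            by_function.setdefault(fn, []).append(task)
    print(by_function)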

Once programs have determined the system functions their system will perform, they should identify any dependencies between those functions, along with the data produced and consumed by each function. Programs should display this information in an SV-4a. The SV-4a is analogous to the OV-5, which shows the relationship between the Operational Tasks and the information produced and consumed by each task. Programs should try to use the function names identified in the Joint Common System Function List or the DoD Information Enterprise Architecture. If appropriate function names do not exist, programs should submit change requests to these documents. Figure A17 shows a sample SV-4a for the BMD example.

Figure A17. Sample SV-4a for BMD

Although not required, programs can use an SV-7 to display performance metrics for the system functions in the SV-4a. Programs can then use these System Performance Requirements throughout System Design and Realization to ensure the system performs as expected. If programs use an SV-7, they should perform an analysis as in Steps 1.4, 2.3, and 2.4 to determine how their system must perform to enable the specified task performance. This in turn will guarantee a specified level of mission performance and thereby provide traceability between the system and its operational effects. Figure A18 shows a sample SV-7 for the BMD example.

Figure A18. Sample SV-7 for BMD

Programs should examine the OV-3, OV-5, OV-6c, and SV-5a to determine which nodes the system needs to exchange information with, based on the Operational Tasks it supports. Programs should then identify the physical connections (e.g., Ethernet, SATCOM, Link 16, etc.) needed to support these information exchanges and display those physical interfaces in an SV-2. Figure A19 shows a sample SV-2 for the BMD example.

Figure A19. Sample SV-2 for BMD

If the program produces or consumes any of the information in its supported information exchanges, the program should identify the system data exchanges that will implement those information exchanges. Programs should display those system data exchanges in the SV-6. Programs should also perform an analysis as in Steps 1.4, 2.3, and 2.4 to determine how the system data exchanges must perform to enable the specified task performance. This in turn will guarantee a specified level of mission performance and thereby provide traceability between the system and its operational effect. Programs should use the elements in the DoDAF SV-6 template shown in Figure A20 as a starting point. Figure A21 shows a sample SV-6 for the BMD example. The DoDAF products developed thus far should provide traceability between the Information Elements specified and related in the OV-3, OV-5, and OV-6c and the system functions and data exchanges specified in the SV-4a, SV-5a, and SV-6. Also note that the SV-2 should display the physical links used for each and every system data exchange. Figure A22 below illustrates this traceability.
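A minimal sketch of the one-IER-to-many-data-exchanges expansion shown in Figure A10, with traceability preserved back to the IER (Python; the identifiers, message names, and performance values are notional):

    # Sketch: SV-6 system data exchanges implementing a single OV-3 IER.
    ier_id = "IER-01: Target Quality Track Data, Sensor Node -> Weapon Node"
    sv6_exchanges = [
        {"id": "DE-01", "implements": ier_id, "message": "tactical data link track",
         "interface": "Link 16", "latency_s": 2.0},
        {"id": "DE-02", "implements": ier_id, "message": "track update over IP",
         "interface": "SIPRNET", "latency_s": 5.0},
    ]
    # Traceability check: every system data exchange implements some IER.
    assert all(de["implements"] for de in sv6_exchanges)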

Figure A20. DoDAF SV-6 Template

Figure A21. Sample SV-6 for BMD

Figure A22. SV Product Traceability

Programs should use the SV-11 to record, define, and display the structure of each system data exchange in the SV-6. Figure A23 shows a sample SV-11 for the BMD example. In the textual description of the SV-11, programs should also answer questions related to the storage of the data and metadata. These might include:

- Are all or some of the data stored?
- Where are the data/metadata stored?
- How long are the data/metadata stored?
- How are the data/metadata made available, and to whom?
- How will the data's lifecycle be managed?

Again, the DoDAF products should provide traceability between the Information Elements specified and related in the OV-3, OV-5, and OV-7 and the system functions and data exchanges specified in the SV-4a, SV-5a, SV-6, and SV-11. Figure A24 below illustrates this traceability.

Figure A23. Sample SV-11 for BMD

Figure A24. SV Product Traceability

Although not required, if the system provides services that produce information for another node or consume information from another node, the program should use an SV-1 to display the logical interfaces connecting the nodes. This information facilitates modeling and simulation of the system and planning for the Navy's enterprise data strategy. Figure A25 shows a sample SV-1 for the BMD example.

Figure A25. Sample SV-1 for BMD

Finally, programs should use the TV-1 to display the standards used by the logical interfaces in the SV-1, the physical interfaces in the SV-2, the system functions in the SV-4a, and the data exchanges in the SV-6. A TV-2 should display any expected changes in those standards. As programs execute the steps above, they should keep in mind that the NR-KPP Compliance Measures will constrain the system solutions specified during each step.
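Conceptually, the TV-1 is a mapping from SV elements to the standards they cite, and the TV-2 lists the expected changes to those standards. A minimal sketch (Python; the element-to-standard pairings are notional examples):

    # Sketch: TV-1 as an element-to-standards mapping; TV-2 as expected changes.
    tv1 = {
        "SV-2: Link 16 physical interface": ["MIL-STD-6016"],
        "SV-6: DE-02 track update over IP": ["IPv4 (RFC 791)"],
    }
    tv2_expected_changes = [
        ("IPv4 (RFC 791)", "IPv6 (RFC 2460)", "transition date TBD"),
    ]
    for element, standards in tv1.items():
        print(element, "->", ", ".join(standards))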


More information

An Enhanced Analysis of Alternatives (AoA)

An Enhanced Analysis of Alternatives (AoA) An Enhanced Analysis of Alternatives (AoA) A Mission-Oriented, Evaluation-Based Framework for Defense Test Evaluation NDIA March 2-5. 2009 Vince Roske Institute for Defense Analyses vroske@ida.org 703

More information

Request for Solutions: High Energy Laser (HEL) Flexible Prototype. 11 December 2018

Request for Solutions: High Energy Laser (HEL) Flexible Prototype. 11 December 2018 Request for Solutions: High Energy Laser (HEL) Flexible Prototype 11 December 2018 1.0 Purpose The Air Force Life Cycle Management Center (AFLCMC) seeks a prototype groundbased Laser Weapon System (LWS)

More information

DEFENSE INFORMATION SYSTEMS AGENCY P. O. BOX 549 FORT MEADE, MARYLAND

DEFENSE INFORMATION SYSTEMS AGENCY P. O. BOX 549 FORT MEADE, MARYLAND DEFENSE INFORMATION SYSTEMS AGENCY P. O. BOX 549 FORT MEADE, MARYLAND 20755-0549 IN REPLY REFER TO: Chief Information Assurance Executive (CIAE) 3 May 2013 MEMORANDUM FOR DISTRIBUTION SUBJECT: Department

More information

DEPARTMENT OF DEFENSE Defense Contract Management Agency INSTRUCTION. Government Contract Quality Assurance (GCQA) Surveillance Planning

DEPARTMENT OF DEFENSE Defense Contract Management Agency INSTRUCTION. Government Contract Quality Assurance (GCQA) Surveillance Planning DEPARTMENT OF DEFENSE Defense Contract Management Agency INSTRUCTION Government Contract Quality Assurance (GCQA) Surveillance Planning Quality Assurance Technical Directorate DCMA-INST 309 OPR: DCMA-QA

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE COMMANDER AIR FORCE RESEARCH LABORATORY AIR FORCE RESEARCH LABORATORY INSTRUCTION 33-401 13 MARCH 2014 Communications and Information ENTERPRISE BUSINESS INFORMATION TECHNOLOGY REQUIREMENTS

More information

a. To provide policy and assign responsibilities for the management of depot-level repairables (DLR).

a. To provide policy and assign responsibilities for the management of depot-level repairables (DLR). DEPARTMENT OF THE NAVY OFFICE OF THE CHIEF OF NAVAL OPERATIONS 2000 NAVY PENTAGON WASHINGTON DC 20350-2000 N41 OPNAV INSTRUCTION 4400.9D From: Chief of Naval Operations Subj: DEPOT LEVEL REPAIRABLE ITEM

More information

New Acquisition Policy and Its Impact on Systems Engineering

New Acquisition Policy and Its Impact on Systems Engineering New Acquisition Policy and Its Impact on Systems Engineering NDIA 11 th Annual Systems Engineering Conference October 21, 2008 Sharon Vannucci Systems and Software Engineering/Enterprise Development Office

More information

SYSTEMS ENGINEERING REQUIREMENTS AND PRODUCTS

SYSTEMS ENGINEERING REQUIREMENTS AND PRODUCTS SMC Standard SMC-S-001 1 July 2013 ------------------------ Supersedes: SMC-S-001 (2010) Air Force Space Command SPACE AND MISSILE SYSTEMS CENTER STANDARD SYSTEMS ENGINEERING REQUIREMENTS AND PRODUCTS

More information

DEPARTMENT OF DEFENSE Defense Contract Management Agency INSTRUCTION. Government Contract Quality Assurance (GCQA) Surveillance Planning

DEPARTMENT OF DEFENSE Defense Contract Management Agency INSTRUCTION. Government Contract Quality Assurance (GCQA) Surveillance Planning DEPARTMENT OF DEFENSE Defense Contract Management Agency INSTRUCTION Government Contract Quality Assurance (GCQA) Surveillance Planning Quality Assurance Directorate DCMA-INST 309 OPR: DCMA-QA Administrative

More information

Department of Defense Independent Technical Risk Assessment Framework for Risk Categorization

Department of Defense Independent Technical Risk Assessment Framework for Risk Categorization Department of Defense Independent Technical Risk Assessment Framework for Risk Categorization June 08 Office of the Under Secretary of Defense Research and Engineering Washington, D.C. Department of Defense

More information

Systems Engineering for the Joint Capabilities Integration and Development System (JCIDS) Tutorial for the 9 th NDIA Systems Engineering Conference

Systems Engineering for the Joint Capabilities Integration and Development System (JCIDS) Tutorial for the 9 th NDIA Systems Engineering Conference Systems Engineering for the Joint Capabilities Integration and Development System (JCIDS) Tutorial for the 9 th NDIA Systems Engineering Conference Agenda and Presenters Introduction to JCIDS Chris Ryder

More information

APPENDIX C Configuration Change Management Verification and Validation Procedures

APPENDIX C Configuration Change Management Verification and Validation Procedures DCMA-INST 217 APPENDIX C Configuration Change Management Verification and Validation Procedures Table of Contents C1. Introduction... C-1 C2. Implementation... C-1 C3. Pre-Inspection: Verification... C-1

More information

A Capability- Focused T&E Framework. Steven Hutchison

A Capability- Focused T&E Framework. Steven Hutchison A Capability- Focused T&E Framework Steven Hutchison 14 Iam just going to say it: I don t like the terms developmental test (DT) or operational test (OT). For that matter, I don t like the term integrated

More information

SPAWAR Paperless Initiatives Partnering for Success

SPAWAR Paperless Initiatives Partnering for Success SPAWAR Paperless Initiatives Partnering for Success Reducing Administration Costs One Byte at a Time 06 March 2012 Presented to: Department of Defense Office of Small Business Programs Mentor-Protégé Conference

More information

Information Technology Independent Verification and Validation

Information Technology Independent Verification and Validation Florida Department of Management Services Information Technology Independent Verification and Validation RFP No. Work Plan and Methodology ; 2:30 PM EST 2150 River Plaza Drive Suite 380 Sacramento California

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE INSTRUCTION 63-145 30 SEPTEMBER 2016 Acquisition MANUFACTURING AND QUALITY MANAGEMENT COMPLIANCE WITH THIS PUBLICATION IS MANDATORY ACCESSIBILITY: Publications

More information

Incorporating Test and Evaluation into Department of Defense Acquisition Contracts

Incorporating Test and Evaluation into Department of Defense Acquisition Contracts DEPARTMENT OF DEFENSE UNITED STATES OF AMERICA Incorporating Test and Evaluation into Department of Defense Acquisition Contracts CLEARED For Open Publication OCT 24, 2011 Office of Security Review Department

More information

Report of the Reliability Improvement Working Group (RIWG) Volume II - Appendices

Report of the Reliability Improvement Working Group (RIWG) Volume II - Appendices Report of the Reliability Improvement Working Group (RIWG) Volume II - Appendices Appendix 1 Formulate Programs with a RAM Growth Program II-1 1.1 Reliability Improvement Policy II-3 1.2 Sample Reliability

More information

NASA Systems Engineering Processes and Requirements

NASA Systems Engineering Processes and Requirements NASA NPR 7123.1B Procedural Effective Date: April 18, 2013 Requirements Expiration Date: April 18, 2018 RESPONSIBLE OFFICE: Office of the Chief Engineer COMPLIANCE IS MANDATORY NASA Systems Engineering

More information

DoD Modeling & Simulation Verification, Validation & Accreditation (VV&A): The Acquisition Perspective

DoD Modeling & Simulation Verification, Validation & Accreditation (VV&A): The Acquisition Perspective DoD Modeling & Simulation Verification, Validation & Accreditation (VV&A): The Acquisition Perspective NDIA Presentation Number 8939 Mr. Michael Truelove ODDR&E/Systems Engineering/Mission Assurance October

More information

JUN MEMORANDUM FOR DISTRIBUTION

JUN MEMORANDUM FOR DISTRIBUTION DEPARTMENT OF THE NAVY OFFICE OF THE ASSISTANT SECRETARY (RESEARCH, DEVELOPMENT AND ACQUISITION) 1000 NAVY PENTAGON WASHINGTON DC 20350 1 000 JUN 0 8 2010 MEMORANDUM FOR DISTRIBUTION SUBJECT: Naval Acquisition

More information

This training matters, because the auditors are coming.

This training matters, because the auditors are coming. The Property & Equipment Policy Office is pleased to present the second of two training courses on the Military Equipment Valuation (or, MEV) initiative. This one is called Management Assertion for Military

More information

Subj: IMPLEMENTATION OF ITEM UNIQUE IDENTIFICATION WITHIN THE DEPARTMENT OF THE NAVY

Subj: IMPLEMENTATION OF ITEM UNIQUE IDENTIFICATION WITHIN THE DEPARTMENT OF THE NAVY D E PAR TME NT OF THE N A VY OFFICE OF T HE SECRET ARY 1000 NAVY PENT AGON WASHINGT ON D C 20350-1000 SECNAVINST 4440.34 ASN (RD&A) SECNAV INSTRUCTION 4440.34 From: Secretary of the Navy Subj: IMPLEMENTATION

More information

SOFTWARE DEVELOPMENT STANDARD

SOFTWARE DEVELOPMENT STANDARD SFTWARE DEVELPMENT STANDARD Mar. 23, 2016 Japan Aerospace Exploration Agency The official version of this standard is written in Japanese. This English version is issued for convenience of English speakers.

More information

DoD Environmental Information Technology Management (EITM) Program

DoD Environmental Information Technology Management (EITM) Program 2011 GreenGov Symposium Oct. 31 - Nov. 2, 2011 Washington Hilton Washington, DC DoD Environmental Information Technology Management () Program LTC Stephen Spellman Program Manager, U.S. Army Report Documentation

More information

Factors to Consider When Implementing Automated Software Testing

Factors to Consider When Implementing Automated Software Testing Factors to Consider When Implementing Automated Software Testing By Larry Yang, MBA, SSCP, Security+, Oracle DBA OCA, ASTQB CTFL, ITIL V3 ITM Testing is a major component of the Software Development Lifecycle

More information

Latest Reliability Growth Policies, Practices, and Theories for Improved Execution

Latest Reliability Growth Policies, Practices, and Theories for Improved Execution Latest Reliability Growth Policies, Practices, and Theories for Improved Execution Lou Gullo Raytheon Missile Systems Senior Principal Engineer March 14, 2012 Copyright 2012 Raytheon Company. All rights

More information

OFFICE OF THE SECRETARY OF DEFENSE 1700 DEFENSE PENTAGON WASHINGTON, DC

OFFICE OF THE SECRETARY OF DEFENSE 1700 DEFENSE PENTAGON WASHINGTON, DC OFFICE OF THE SECRETARY OF DEFENSE 1700 DEFENSE PENTAGON WASHINGTON, DC 20301-1700 OPERATIONAL TEST AND EVALUATION JUN 26 2013 MEMORANDUM FOR COMMANDER, OPERATIONAL TEST AND EVALUATION FORCE (COMOPTEVFOR)

More information

Transforming Logistics Through Performance-Based Logistics

Transforming Logistics Through Performance-Based Logistics Transforming Logistics Through Performance-Based Logistics AFEI EXPO 23 September 2003 Jerry Beck Office of Assistant Deputy Under Secretary of Defense (Logistics Plans & Programs) Amateurs talk about

More information

GAO ORGANIZATIONAL TRANSFORMATION. Military Departments Can Improve Their Enterprise Architecture Programs

GAO ORGANIZATIONAL TRANSFORMATION. Military Departments Can Improve Their Enterprise Architecture Programs GAO United States Government Accountability Office Report to the Committee on Armed Services, U.S. Senate September 2011 ORGANIZATIONAL TRANSFORMATION Military Departments Can Improve Their Enterprise

More information

WORK PLAN AND IV&V METHODOLOGY Information Technology - Independent Verification and Validation RFP No IVV-B

WORK PLAN AND IV&V METHODOLOGY Information Technology - Independent Verification and Validation RFP No IVV-B 1. Work Plan & IV&V Methodology 1.1 Compass Solutions IV&V Approach The Compass Solutions Independent Verification and Validation approach is based on the Enterprise Performance Life Cycle (EPLC) framework

More information

PMBOK Guide Fifth Edition Pre Release Version October 10, 2012

PMBOK Guide Fifth Edition Pre Release Version October 10, 2012 5.3.1 Define Scope: Inputs PMBOK Guide Fifth Edition 5.3.1.1 Scope Management Plan Described in Section 5.1.3.1.The scope management plan is a component of the project management plan that establishes

More information

Reliability and Maintainability (R&M) Engineering Update

Reliability and Maintainability (R&M) Engineering Update Reliability and Maintainability (R&M) Engineering Update Mr. Andrew Monje Office of the Deputy Assistant Secretary of Defense for Systems Engineering 15th Annual NDIA Systems Engineering Conference San

More information

Passit4Sure.OG Questions. TOGAF 9 Combined Part 1 and Part 2

Passit4Sure.OG Questions. TOGAF 9 Combined Part 1 and Part 2 Passit4Sure.OG0-093.221Questions Number: OG0-093 Passing Score: 800 Time Limit: 120 min File Version: 7.1 TOGAF 9 Combined Part 1 and Part 2 One of the great thing about pass4sure is that is saves our

More information

TOGAF Foundation Exam

TOGAF Foundation Exam TOGAF Foundation Exam TOGAF 9 Part 1 (ESL) Time Limit 90 minutes Number of questions 40 Pass-through 22 1. Which of the following best describes the meaning of "Initial Level of Risk" in Risk Management?

More information