Test and Evaluation Master Plan (TEMP)


A. Purpose

The Test and Evaluation Master Plan (TEMP) is the basic top-level planning document for Test and Evaluation (T&E) related activities for major acquisition programs. The TEMP describes the Developmental Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E) that need to be conducted to determine system technical performance, operational effectiveness/suitability, and limitations. The TEMP identifies all critical technical characteristics and operational issues and describes the objectives, responsibilities, resources, and schedules for all completed and planned T&E, including modeling and simulation tools used in the T&E process. The TEMP also describes all subordinate plans (e.g., DT&E and OT&E plans) and required reports (e.g., DT&E and OT&E reports), and assigns responsibility for preparing and approving these plans and reports. Most acquisition programs are required to have an approved TEMP and subordinate test plans prior to commencing associated T&E unless a specific waiver is granted by the DHS Science & Technology (S&T) Director, T&E and Standards Division. (The Under Secretary for Science & Technology has designated the S&T Director, Test & Evaluation and Standards Division, with responsibility for DHS T&E policy and procedures.) The TEMP guidance outlined in this document cancels and supersedes all previous DHS guidance for TEMPs.

B. Overview of TEMP

The Program Manager (PM) is responsible for coordinating the overall T&E strategy and approach for the program. The PM should prepare a TEMP in accordance with the TEMP template outlined in this document after approval of the initial Operational Requirements Document (ORD). The program's TEMP should be approved by the Component Acquisition Executive (CAE) and the DHS S&T Director, T&E and Standards Division prior to formal submission to the Acquisition Decision Authority (ADA) at Acquisition Decision Event (ADE) 2.

The TEMP is a living document that should accurately reflect major changes in program requirements, schedule, and funding. An approved TEMP should be reviewed and updated by the PM at each ADE or whenever a breach occurs in the program's Acquisition Program Baseline (APB). For example, the TEMP should be revised when the PM is unable to execute the TEMP as written, or when changes to the program's cost/funding, schedule, or performance make the existing TEMP obsolete. Revision of the TEMP should receive the same endorsements and approvals as the original document.

C. TEMP Preparation and Process

Planning is the first step for any test and evaluation program and should begin as early in the Analyze/Select phase of the DHS acquisition life cycle as possible. Early T&E planning should be documented in a working TEMP even if information for some of the sections is not available at that time. The initial working TEMP can then be used as a basis for completing the TEMP as program direction and details become known.

The PM is responsible for developing and maintaining the TEMP. Early in the acquisition process, the PM should form a test planning team, such as a T&E Integrated Process Team (IPT) or Test Management Oversight Team (TMOT), to start development of the draft TEMP. The PM will prepare the draft TEMP in consultation with the TMOT or T&E IPT, which is comprised of representatives from program leadership, stakeholders from other organizations, and appropriate Subject Matter Experts (SMEs) involved in the T&E activities of the program. TMOT/IPT members may include:

- The PM
- The program sponsor/user representative(s)
- Developmental and operational testers
- The independent evaluator
- The program logistician
- The program systems engineer/T&E engineer
- The program training developer
- Supporting test area manager (DHS T&E and Standards Division)
- Any organization or agency that has a role critical to the program's T&E activities

TMOT/IPT members are responsible for helping the PM develop a TEMP that is sufficient, realistic, reasonable, and executable. For example, the independent evaluator and operational testers serve as key members responsible for the development of Section D (Operational Test & Evaluation) of the TEMP.

Although this document is focused on providing guidance for development of the TEMP, there are some T&E concepts that may be helpful in understanding the T&E process. The fundamental purpose of T&E in an acquisition program is to verify attainment of technical performance (in Developmental Testing) and required operational effectiveness and suitability (in Operational Testing). As a system undergoes design, development, and integration, the emphasis in testing moves gradually from DT&E, which is concerned mainly with validating the contract requirements and the attainment of engineering design goals and manufacturing process objectives, to OT&E, which focuses on verifying operational effectiveness and suitability. To foster common understanding and prevent confusion, the use of consistent and accepted terminology is important when describing T&E activities. For ease of understanding, testing is generally categorized as either developmental or operational. Both developmental and operational test and evaluation are required for DHS acquisition programs.

Developmental Testing (DT) is any test used to assist the engineering design and development process and to verify the status of technical progress. DT is a tool to help identify and manage design risks, substantiate achievement of technical performance requirements, validate manufacturing process requirements and system maturity, and certify readiness for Operational Testing (OT). Developmental testing typically requires instrumentation and measurements, and is performed under known, controlled, and limited conditions that might or might not be representative of the complex operational environment. DT is controlled by the PM and consists of all testing performed by the contractor and/or Government during system development.

Operational Testing (OT) is the field test, under realistic conditions, of any system or component to determine that system's or component's overall effectiveness and suitability for use before deployment. OT provides information for the overall assessment of how well a system will provide the desired capability when operated by typical users in the expected environment. OT is usually conducted by an independent evaluator and is not controlled by the PM. While the PM retains responsibility for all aspects of his or her program, the system evaluation or system assessment for operational test and evaluation is prepared independently of both the PM and agency user organizations. This independence allows evaluators to present objective and unbiased conclusions regarding the system's or equipment's operational effectiveness and suitability. The independent operational evaluator, which can be referred to as the Operational Test Authority (OTA), is responsible for presenting his or her findings in the OT&E report, which is submitted to the PM, Component Acquisition Executive (CAE), DHS Director, Test & Evaluation and Standards, and Acquisition Decision Authority (ADA). The OTA must be prepared to present and defend those findings to the Component Acquisition Executive or the Acquisition Decision Authority at ADEs or other program reviews. Acquisition decision authorities will ultimately determine the degree to which they accept and factor the evaluator's findings and recommendations into programmatic decisions. However, they must make such determinations in view of the evaluator's objective and unbiased assessment.

Integrated testing (DT/OT, or DT/Operational Assessment (OA)) is a process under which developmental and operational test objectives are performed during the same phase or scenarios to make best use of time and available test resources. Integrated testing is more complex due to the dual nature of its objectives. If this approach is taken, developmental and operational T&E objectives remain separate and distinct from one another, even though resources, activities, and data may be shared. Care must be taken to ensure that the combined approach does not compromise either the developmental or operational T&E objectives. Rigorous planning and coordination are required for effective use of the integrated testing approach due to the added complexity of managing concurrent test events and activities.

Required Elements in the TEMP Document

The TEMP shall include the following information and comply with the following format.

Cover/Signature Page
Executive Summary
Revision Summary (if applicable)

Section A. Introduction
1. Background
2. Operational Performance Requirements
3. Critical Technical Parameters

Section B. Program Summary
1. Integrated Master Schedule
2. Management

Section C. Developmental Test and Evaluation Outline
1. Developmental Test and Evaluation Overview
2. Developmental Test and Evaluation to Date
3. Future Developmental Test and Evaluation
4. Developmental Test and Evaluation Plans and Reports

Section D. Operational Test and Evaluation Outline
1. Operational Test and Evaluation Overview
2. Critical Operational Issues
3. Operational Test and Evaluation to Date
4. Planned Operational Test and Evaluation
5. Operational Test and Evaluation Plans and Reports

Section E. Test and Evaluation Resource Summary
1. Summary
2. Resource Summary Updates

Appendices:
1. Bibliography
2. Acronyms
3. Points of Contact

Guidance on Required Elements

Cover/Signature Page

The TEMP cover page should, at a minimum, be signed and dated by: the Program Manager, the Component Head or CAE, the DHS Director, T&E and Standards (dependent upon program level), and the DHS ADA (dependent upon program level). See the TEMP template in this document.

Executive Summary (1 to 2 pages in length)

Provide an Executive Summary of the TEMP. The Executive Summary should be a brief discussion of the plan, highlighting the salient points of each chapter, including goals and objectives and expected outcomes. Briefly discuss the roles and responsibilities of key participants, the reports expected to be prepared, and how these reports will support program decisions.

Revision Summary (if applicable)

The Revision Summary should provide a high-level description of major changes, followed by a Table of Changes that describes specific changes, including references to the changed section/paragraph.

Section A. Introduction

1. Background

Briefly summarize the capability the system provides. Briefly describe the system design. Include key features and subsystems; describe unique characteristics of the system or unique support concepts which may result in special test and analysis requirements. Do not repeat detailed background information from other program documents. The focus of this section should be on T&E issues.

2. Operational Performance Requirements

List in matrix format (see Table L-1) the minimum acceptable operational performance requirements. Operational performance requirements should be traceable to the ORD and the Key Performance Parameters (KPPs) listed in the ORD. Table L-1 contains examples of operational performance requirements with their associated threshold and objective parameters. Thresholds, against which each of the effectiveness and suitability parameters will be measured, are normally quantitative. Thresholds should represent the level of system performance acceptable to the user to successfully provide the desired capability.

3. Critical Technical Parameters

List in matrix format (see Table L-2) the critical technical parameters of the system that have been or will be evaluated during the remaining phases of DT&E. For each technical parameter, list the appropriate technical threshold. Highlight critical technical issues that must be demonstrated before entering the next acquisition phase or before entering OT&E.

Table L-1: Example of Operational Performance Requirements

Operational Effectiveness (Requirement | Parameter | Threshold | Objective)
- Speed** | Minimum Top Speed | 25 Knots | 30 Knots
- Speed** | Continuous Speed (Sea State 2) | 20 Knots | 25 Knots
- Threat Detection | Probability of detection of unknown threat | 85% at 10 miles | 90% at 10 miles

Operational Suitability (Requirement | Parameter | Threshold | Objective)
- Reliability | Mean Time Between Maintenance Actions | 1000 Hours | 1200 Hours
- Reliability | Mean Time Between Failures | 2000 Hours | 2200 Hours
- Reliability | Mean Time Between Critical Failures | 5000 Hours | 5200 Hours
- Maintainability | Mean Time To Repair | 2.5 Hours | 2.0 Hours
- Operational Availability** | Percentage of Time Available to Support Operational Tasking | 80% | 85%

** Key Performance Parameter

Table L-2: Sample Critical Technical Parameters Matrix (<Project Title>)

Critical Technical Parameter | Test Event | Technical Threshold | Test Location | Test Schedule | Decision Supported
- Stability | Model Test | Self-right through 360° | Government Test Site | Development Test (DT) | Preliminary Design Completion
- Stability | Static Rollover | Self-right through 360° | — | Contractor DT | Preliminary Acceptance
- Minimum Top Speed | Model Test | 25 Knots | Government Test Facility | DT | Preliminary Design Completion

Additional examples of Critical Technical Parameters for various types of systems are included in Table L-3.
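The Operational Availability entry in Table L-1 is the one suitability parameter above that is usually computed rather than measured directly. As an illustration only (the governing definitions come from the program's ORD, and the mean downtime term below is an assumed quantity that does not appear in Table L-1), operational availability is commonly expressed as:

\[
A_o = \frac{\text{Uptime}}{\text{Uptime} + \text{Downtime}} \approx \frac{\text{MTBM}}{\text{MTBM} + \text{MDT}}
\]

where MTBM is the mean time between maintenance actions and MDT is the mean downtime per action (repair time plus any logistics and administrative delay). Under these assumptions, an MTBM of 1,000 hours with an assumed MDT of 250 hours gives A_o = 1000/1250 = 80%, the Table L-1 threshold. How uptime and downtime are counted should be stated in the TEMP so developmental and operational testers score the parameter the same way.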

Table L-3: Examples of Critical Technical Parameters

Boats: Length; Beam; Draft; Speed; Range; Human Factors; Safety/Environmental Health; Navigation

Information Technology: Enterprise Architecture Compliance; Speed of Calculation; Throughput Capability; Reliability; Software Maintainability; Security Controls; Human Factors

Aircraft: Speed; Maneuvering; Range; Maximum Gross Weight; Cargo Capacity; Personnel Capacity; Human Factors; Airworthiness

Radars: Range; Detection Limits; Jamming Protection; Reliability; Error Rate/Signal Processing; Human Factors

Section B. Program Summary

1. Integrated Master Schedule

Graphically display the integrated time sequencing of the critical T&E phases and events within the integrated master schedule. Figure L-1 is provided for illustrative purposes only. The PM may use any graphics technique that clearly shows the key T&E events and their sequential relationship with the acquisition life cycle phases. T&E events should be traceable to applicable test requirements (KPPs and Critical Operational Issues [COIs]) within the appropriate T&E documentation (e.g., ORD, test plans, test reports, traceability matrix, etc.). Include event dates related to the testing program, such as Acquisition Decision Events (ADEs), test article availability, appropriate phases of DT&E, and OT&E. Include all T&E planning documents (TEMP, DT&E Test Plan, OT&E Test Plan) and DT&E and OT&E Reports.
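As an illustrative sketch only (the guidebook does not prescribe any particular format or tooling, and the event and requirement identifiers below are hypothetical), a program might keep the traceability between planned T&E events and the KPPs/COIs they support in a simple machine-readable form so coverage gaps are visible before the schedule is locked:

```python
# Illustrative sketch only: a minimal traceability matrix linking planned
# T&E events to the (hypothetical) KPPs and COIs they support.
from dataclasses import dataclass, field

@dataclass
class TestEvent:
    name: str                    # e.g., "DT-1 Model Test"
    phase: str                   # "DT&E" or "OT&E"
    decision_supported: str      # e.g., "Preliminary Design Completion", "ADE 3"
    kpps: list = field(default_factory=list)  # KPP identifiers from the ORD
    cois: list = field(default_factory=list)  # COI identifiers

events = [
    TestEvent("DT-1 Model Test", "DT&E", "Preliminary Design Completion",
              kpps=["KPP-1 Speed"]),
    TestEvent("IOT&E", "OT&E", "ADE 3",
              kpps=["KPP-1 Speed", "KPP-2 Operational Availability"],
              cois=["COI-1 Effectiveness", "COI-2 Suitability"]),
]

def untraced(events, all_kpps):
    """Return KPPs that no planned T&E event currently covers."""
    covered = {kpp for event in events for kpp in event.kpps}
    return sorted(set(all_kpps) - covered)

print(untraced(events, ["KPP-1 Speed",
                        "KPP-2 Operational Availability",
                        "KPP-3 Threat Detection"]))
# -> ['KPP-3 Threat Detection']  (a coverage gap to resolve in the TEMP)
```

Whatever form the matrix takes, the point is the same as in the guidance above: every KPP and COI should map to at least one scheduled test event, and every test event should map back to the requirements it supports.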

Figure L-1: Sample Integrated Master Schedule (Fiscal Years)

2. Management

Identify all organizations that will be participating in the T&E program. Discuss in detail the roles and responsibilities of each of the identified stakeholders. Organizations which must be included in the T&E program include the PM, the Component, the Independent Evaluator or Operational Test Authority (OTA), and any organization conducting actual testing, including contractors and operational units.

Section C. Developmental Test & Evaluation Outline

1. Developmental Test & Evaluation Overview

Discuss the overall goals and objectives of the DT&E program. Explain how the planned (or accomplished) DT&E will verify the status of the engineering design and development progress, verify that design risks have been minimized, and substantiate the achievement of technical performance. Discuss specific DT&E exit criteria and system acceptance tests, qualification tests, and test readiness reviews if applicable. This section should also address: (a) technology which has not demonstrated its ability to contribute to system performance and ultimately provide the capability the program is designed to provide, and (b) the degree to which system hardware and software design has stabilized so as to reduce manufacturing and production decision uncertainties.

2. Developmental Test and Evaluation to Date

Describe all DT&E which has been conducted to date. Include all DT&E conducted by both contractors and the government. Briefly note the results of the testing and reference all reports completed or under preparation.

3. Planned Developmental Test and Evaluation

Discuss all remaining DT&E that is planned, beginning with the date of the current TEMP revision and extending through completion of production. Place emphasis on the testing events that will occur during the upcoming acquisition phase and the corresponding test resources and assets needed (e.g., test facilities and ranges, government furnished equipment, and specialized test equipment). For each segment of testing (e.g., modeling, laboratory tests, and in-plant tests), the following topics should be discussed:

(a) Configuration Description - Summarize the functional capability of the system configuration (model, mock-up, and prototype) and how it differs, if at all, from the planned production model.

(b) DT&E Objectives - State the test objectives for the phase in terms of the critical technical parameters (CTPs) to be confirmed (a simple confirmation sketch appears at the end of this section). Identify any specific technical parameters which an Acquisition Decision Memorandum (ADM) or legislative action has directed to be demonstrated during a particular phase of testing.

(c) DT&E Events, Scope of Testing, and Basic Scenarios - Summarize the test events, test scenarios, and the test design concept. Quantify the testing in terms of the number of test events planned, and discuss the information which will be expanded upon in the DT&E Plan. Discuss the environment in which testing will be conducted and how realistic that environment is. Describe and identify any models or simulations that will be used, the justification for their use, and relevant validation or certification requirements.

(d) Limitations - Discuss any test limitations that may significantly affect the evaluator's ability to draw conclusions and make recommendations concerning the critical technical parameters.

4. Developmental Test and Evaluation Plans and Reports

Describe all required DT&E plans and reports. Include information on the scope of each plan or report, who prepares it, who reviews it, who approves it, and when it is to be submitted.
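The following sketch illustrates item 3(b) above: confirming measured DT&E results against CTP technical thresholds. It is illustrative only; the parameter names, values, and the assumption that higher measured values are better are hypothetical and not taken from this guidebook.

```python
# Illustrative only: checking measured DT&E results against CTP thresholds.
# Parameter names and values are hypothetical; both parameters here are
# assumed to be "higher is better".

CTP_THRESHOLDS = {
    "minimum_top_speed_knots": 25.0,
    "mean_time_between_failures_hours": 2000.0,
}

measured_results = {
    "minimum_top_speed_knots": 26.3,
    "mean_time_between_failures_hours": 1850.0,
}

def confirm_ctps(thresholds, results):
    """Return a pass/fail status for each CTP against its threshold."""
    status = {}
    for ctp, threshold in thresholds.items():
        value = results.get(ctp)
        status[ctp] = value is not None and value >= threshold
    return status

for ctp, passed in confirm_ctps(CTP_THRESHOLDS, measured_results).items():
    print(f"{ctp}: {'meets threshold' if passed else 'below threshold'}")
# minimum_top_speed_knots: meets threshold
# mean_time_between_failures_hours: below threshold
```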

Section D. Operational Test & Evaluation Outline

1. Operational Test and Evaluation Overview

The OT&E outline will typically be written by the independent operational evaluator or OTA with inputs from the program and the user/operators. Discuss the overall goals and objectives of the OT&E program, including any combined DT&E/OT&E or integrated testing. Discuss the entry criteria that must be met prior to the commencement of OT&E. Discuss how OT&E is performed independently from the vendor and structured to ensure that an operationally effective and operationally suitable system is delivered to the Component and user/operators. Provide information to show how OT&E will evaluate the system in an environment as operationally realistic as possible (i.e., using typical operators, expected ranges of natural environmental conditions, and expected operational scenarios). Discuss the different phases of the planned OT&E, which may include Early Operational Assessment (EOA), Operational Assessment (OA), Initial Operational Test and Evaluation (IOT&E), and Follow-On Operational Test and Evaluation (FOT&E).

2. Critical Operational Issues

List the COIs that have been identified by the Sponsor, Component, or end user. COIs are the operational effectiveness and operational suitability issues (not characteristics, parameters, or thresholds) that must be examined in OT&E to evaluate/assess the system's capability to provide the desired capability. A COI is typically phrased as a question that must be answered in order to properly evaluate operational effectiveness (e.g., Will the system detect Home-Made Explosives at a level that will prevent a terrorist attack in airport terminal environments?) and operational suitability (e.g., Will the system be maintainable within the planned funding levels, personnel structure, and expertise level at airport terminal facilities?). The list of COIs should be thorough enough to ensure that, if every COI is resolved favorably, the system will be operationally effective and operationally suitable when employed in its intended environment by typical users. The list of COIs will normally consist of five to ten issues and should reflect only those that are truly critical in nature. If a COI cannot be favorably resolved during testing, the situation should be reviewed by the program office and user/operator to determine the actual operational impact, and recommendations for resolution should be presented to the ADA.

3. Operational Test and Evaluation to Date

Briefly describe all OT&E (including EOA and OA) that has been completed; if none has been conducted, so state. The descriptions should include the following:

(a) A description of the system actually tested and how its configuration relates to the system that will be fielded.

(b) A summary of the actual testing that occurred, including events, scenarios, resources used, test limitations, evaluations conducted, results achieved, and a reference to any test report detailing the results of such testing. Emphasis should be upon those COIs that were resolved, partially resolved, or unresolved at the completion of that portion of testing.

4. Planned Operational Test and Evaluation

(a) Configuration Description - Identify the system to be tested, and describe any differences between the tested system and the system that will be fielded. Include, where applicable, the extent of integration with other systems with which it must be interoperable or compatible.
Characterize the system (e.g., first article, production representative, or production configuration).

(b) Operational Test and Evaluation Objectives - State the test objectives, including the COIs to be addressed during remaining OT&E and the Acquisition Decision Events (ADEs) supported. OT&E which supports the Production and Deployment Decision (ADE 3) should have test objectives that examine all areas of operational effectiveness and suitability.

(c) Operational Test and Evaluation Events, Scope of Testing, and Scenarios - Summarize the scenarios and identify the events to be conducted. Indicate the type of resources to be used, the simulation(s) to be employed (if applicable), the type of representative personnel who will operate and maintain the system, the status of logistic support, the operational and maintenance documentation that will be used, and the environment under which the system is to be employed and supported during testing. This section should also identify sources of information (e.g., developmental testing, modeling, and simulations) that may be used by the operational testers to supplement this phase of OT&E. Whenever models and simulations are to be used, explain the rationale for their applicability and credible use.

(d) Limitations - Discuss the test limitations, including scenario realism, resource availability, limited operational environments, limited support environment, maturity of the tested system, safety, etc., that may impact the resolution of affected COIs. Indicate the COI(s) affected in parentheses after each limitation.

5. Operational Test and Evaluation Plans and Reports

Describe all required OT&E plans and reports, which include plans and reports for any scheduled EOA and OA. Include information on the scope of each plan or report, such as who reviews it, who approves it, and when it is to be submitted. OT&E Plans are prepared by the user/operators and approved for use by the OTA. The OT&E Plan is executed by the user/operator and overseen by the OTA. The OTA is then responsible for test evaluation and OT&E report generation. The Director, Operational Test and Evaluation (DOT&E) is responsible for briefing OT&E results and conclusions to the ADA at the appropriate ADEs.

Section E. Test and Evaluation Resource Summary

1. Summary

Provide a summary (preferably in a table or matrix format) of all key T&E resources, both Government and contractor, which will be used during the course of the acquisition program. Estimate, by Fiscal Year, the funding required to pay direct costs of planned testing (see Table L-4) and identify any major shortfalls.

Table L-4: Summary of T&E Funding Requirements

      | FY08 | FY09 | FY10 | FY11 | FY12 | FY13 | FY14 | FY15
DT&E  |      |      |      |      |      |      |      |
OT&E  |      |      |      |      |      |      |      |
Total |      |      |      |      |      |      |      |

Specifically, the TEMP shall identify the following test resources:

(a) Test Articles - Identify the actual number of and timing requirements for all test articles, including key support equipment and technical information required for testing in each phase of DT&E and OT&E. If key subsystems (components, assemblies, subassemblies, or software modules) are to be tested individually before being tested in the final system configuration, identify each subsystem in the TEMP and the quantity required. Specify if/when prototypes, development, pre-production, or production models will be used.

(b) Test Sites/Instrumentation/Airspace/Spectrum - Identify the specific facilities/test ranges to be used for each type of testing. Compare the requirements for facilities/test ranges dictated by the scope and content of planned testing with existing and programmed facility/test range capability, and highlight any major shortfalls. Identify instrumentation that must be acquired specifically to conduct the planned test program and any airspace or spectrum requirements needed to support testing.

(c) Test Support Equipment - Identify test support equipment that must be acquired specifically to conduct the test program. Identify unique or special calibration requirements associated with any such equipment.

(d) Threat Systems/Simulators - For those systems that require the use of threat systems/simulators, identify the type, model number, and availability requirements for all required threat systems/simulators. Compare the requirements for threat systems/simulators with available and projected assets and their capabilities. Identify any major shortfalls.

(e) Test Targets and Expendables - Identify the type, model number, and availability requirements for all test targets and expendables (e.g., targets, flares, chaff, smoke generators, acoustic countermeasures, etc.) that will be required for each phase of testing. Identify any major shortfalls.

(f) Operational Program Test Support - For each T&E phase, identify the type and timing of operational test support required (e.g., aircraft flying hours, vessel underway days, operating personnel hours, and other critical operating program support required).

(g) Simulations, Models, and Test Beds - For each T&E phase, identify the system simulations required, including computer-driven simulation models and hardware-in-the-loop test beds (a system representation consisting partially of actual hardware and/or software, and partially of computer models or prototype hardware and/or software). The rationale for their credible use or application must be explained in an approved TEMP before their use.

(h) T&E Administrative Support - For each test phase, identify all administrative and facilities support required. Identify the organization responsible for providing such support and the source and type of funding required. Such items as office space and equipment, pier or hangar space, and maintenance services should be discussed.

(i) Manpower and Training - Identify manpower and training requirements and limitations that affect test execution.

(j) Investment Requirements - Discuss investment requirements for any significant instrumentation capabilities and resources needed to support testing.

2. Resource Summary Updates

The initial TEMP should project the key resources necessary to accomplish DT&E and OT&E. As system acquisition progresses, test resource requirements shall be reassessed, and subsequent TEMP updates shall reflect any changed system concepts or requirements.

Appendices:

1. Bibliography

Cite all documents referred to in the TEMP. Also cite all reports documenting developmental and operational testing and evaluation of the system.

2. Acronyms

List and define all acronyms used in the TEMP.

3. Points of Contact

Provide a list of Points of Contact for all participating organizations (Project Manager, Component, Sponsor, Support Organization Representatives, testers, evaluators, etc.).

APPENDIX A
Acronyms

ADA - Acquisition Decision Authority
ADE - Acquisition Decision Event
APB - Acquisition Program Baseline

CAE - Component Acquisition Executive
COI - Critical Operational Issue
CTP - Critical Technical Parameter
DT&E - Developmental Test and Evaluation
EOA - Early Operational Assessment
FOT&E - Follow-On Operational Test and Evaluation
IPT - Integrated Process Team
IV&V - Independent Verification & Validation
KPP - Key Performance Parameter
M&S - Modeling and Simulation
MNS - Mission Needs Statement
OA - Operational Assessment
ORD - Operational Requirements Document
OTA - Operational Test Authority
OT&E - Operational Test & Evaluation
PM - Program Manager
S&T - Science and Technology
T&E - Test and Evaluation
TEMP - Test and Evaluation Master Plan
TMOT - Test Management Oversight Team

APPENDIX B
T&E Terms and Definitions

Acquisition Decision Authority: With respect to a major DHS acquisition program, the individual within the Department of Homeland Security designated with overall responsibility (including acquisition decision-making) for the program.

Conformity Assessment: An evaluation that determines whether the requirements for a specific system or equipment are fulfilled. It may include: sampling and testing; inspection; supplier's declaration of conformity; certification; and quality and environmental management system assessment and registration.

Contractor Test: Testing performed by the contractor or developing agency during the development of a product. This includes component testing, integration testing, and the final system-level test.

Critical Operational Issue (COI): COIs are the operational effectiveness and operational suitability issues (not characteristics, parameters, or thresholds) that must be examined in OT&E to evaluate/assess the system's capability to provide the desired capability.

Critical Technical Parameter (CTP): Measurable critical system characteristics that, when achieved, allow the attainment of desired operational performance capabilities. They are technical measures derived from desired user capabilities. Failure to achieve a critical technical parameter should be considered a reliable indicator that the system is behind in the planned development schedule or will likely not achieve an operational requirement. CTPs directly support Critical Operational Issues.

Demonstration: A limited exhibition of the operation, use, maturity, operational potential, or other characteristic of a device, process, product, technology, or system. Demonstrations are generally conducted to inform interested parties, in an explanatory but not conclusive manner, about some aspect of the item being demonstrated. Demonstrations typically do not provide the levels of statistical confidence, repeatability, closely controlled conditions, and other elements that characterize more formal and rigorous testing.

Developmental Test: Any testing used to assist in the development and maturation of products, product elements, or manufacturing or support processes. Also, any engineering-type test used to verify that design risks are minimized, substantiate achievement of contract technical performance, and certify readiness for Operational Testing.

Early Operational Assessment (EOA): An operational assessment conducted prior to, or in support of, prototype testing. EOAs assess the operational impact of candidate technical approaches and assist in selecting preferred alternative system concepts.

Experiment: A limited trial or tentative procedure conducted to test a principle, supposition, or hypothesis, for the purpose of understanding the behavior of a system or discovering something unknown. Experiments are typically directed toward increasing knowledge and understanding in the field of system research, development, and acquisition, supporting the establishment of long-term operational capabilities. Experiments, by themselves, generally do not provide a sufficient basis for conclusions on the operational effectiveness or suitability of an acquired system.

Follow-On Operational Test and Evaluation (FOT&E): Test and evaluation that may be necessary after system deployment to refine estimates made during operational test and evaluation, to evaluate production changes, and to re-evaluate the system to ensure that it continues to meet operational needs.

Independent Verification and Validation (IV&V): The verification and validation of a software product by an organization that is both technically and managerially separate from the organization responsible for developing the product.

Information Assurance: Activities that protect and defend information and information systems by ensuring their:
- Availability: Timely, reliable access to services.
- Integrity: Protection from unauthorized change.
- Authentication: Verification of originator.
- Confidentiality: Protection from unauthorized disclosure.
- Non-repudiation: Undeniable proof of participation.

Integrated Test and Evaluation Program (ITEP): The planning, execution, and reporting on the totality of test and evaluation events conducted on a system or equipment throughout system technology development and acquisition. The purpose of the ITEP is to ensure that the system and/or component are thoroughly tested, that redundant testing is minimized, and that associated costs and time are conserved.

Integration Testing: The process of testing the segments of a system as they are developed and integrated with other segments to ensure individual interfaces meet the design criteria. This usually follows unit testing, which tests each segment to ensure it meets the design criteria for that segment.

Key Performance Parameter (KPP): Those attributes or characteristics of a system/program/project that are considered critical or essential parts of an effective system/program/project capability.

Maintainability: The ability of a system to be retained in, or restored to, a specified condition when maintenance is performed by personnel having the specified skill levels, using prescribed procedures and resources, at each prescribed level of maintenance and repair. Maintainability is a characteristic of design and installation, expressed as the probability that an item will be retained in or restored to a specified condition within a given period of time, when the maintenance is performed in accordance with prescribed procedures and resources.
Measures of Effectiveness: Measures of Effectiveness (MOEs) are operational outcome measures that identify the most critical performance requirements needed to meet system-level capability objectives. Operational effectiveness is the overall degree of a system's ability to provide desired capability considering the total operational environment. For example, weapon system effectiveness would consider environmental factors such as operator organization, doctrine, and tactics; survivability; vulnerability; and threat characteristics.

Measures of Performance: Measures of Performance (MOPs) characterize physical or functional attributes relating to the execution of the system's function. They quantify a technical or performance requirement directly derived from MOEs and MOSs (Measures of Suitability). MOPs should relate to these measures such that a change in MOP can be related to a change in MOE or MOS. MOPs are used to derive, develop, support, and document the performance requirements that will be the basis for design activities and process development. They are also used to help identify the critical technical parameters.

Operational Assessment (OA): An evaluation of operational effectiveness and operational suitability made by an independent operational test activity, with user support as required, on other than production systems. OAs focus on developmental efforts, programmatic voids, risk areas, adequacy of requirements, and the ability of the program to support adequate Operational Testing (OT). OAs may be conducted at any time in a program's life cycle using technology demonstrators, prototypes, mock-ups, engineering development models, or simulators, but are typically done early in the concept or development phases. OAs will not substitute for the Operational Test and Evaluation necessary to support full-rate production and deployment decisions.

Operational Effectiveness: A measure of the overall ability of a system to provide desired capability when used by representative personnel in the environment planned or expected for operational employment of the system, considering organization, doctrine, tactics, supportability, survivability, vulnerability, and threat.

Operational Suitability: The degree to which a system can be placed and sustained satisfactorily in field use, with consideration given to availability, compatibility, transportability, interoperability, reliability, wartime usage rates, maintainability, safety, human factors, habitability, manpower, logistics supportability, natural environmental effects and impacts, documentation, and training requirements.

Operational Test: The field test, performed under realistic conditions and overseen and evaluated by an activity independent from the agency developer and user organizations, of any system or key component of a system for the purpose of determining the effectiveness and suitability of that system or component when used by typical users in the expected operating environment.

Regression Testing: Testing of computer software and/or systems to assure correct performance after changes were made to code that previously performed in a known manner. Regression testing seeks to uncover regression faults, which occur when software functionality that previously worked as desired stops working or no longer works in the way that was previously planned. Regression faults typically occur as an unintended consequence of program changes.

Reliability: The ability of a system to provide desired capability without failure, degradation, or demand on the support system, including the ability to perform required functions in routine, non-routine, and/or unexpected circumstances.

Test: A program or procedure designed to obtain, verify, or provide data for the evaluation of any of the following: 1) progress in accomplishing developmental objectives; 2) the performance, operational capability, and suitability of systems, subsystems, components, and equipment items; and 3) the vulnerability and/or lethality of systems, subsystems, components, and equipment items.

Validation: The process of evaluating a system or component (including software) during or at the end of the development process to determine whether it satisfies the specified user's needs. Validation answers the question: Is it the right product for the established need?

Verification: The process of confirming that a system or system element is designed and/or built as intended; in other words, that the system or element meets design-to or build-to specifications. Verification answers the question: Is the product as we expected it to be?

Test and Evaluation Master Plan (TEMP)
for (PROGRAM TITLE)

Submitted by: Program Manager / Date
Endorsed by: User/Operator / Date
Approved by: Operational Test Authority (see note) / Date
Approved by: Director, Operational Test and Evaluation (Section D) / Date
Approved by: Acquisition Decision Authority / Date

Note: The level of the OTA signature is dependent on the level of the ADA.

APPENDIX C

Executive Summary
Revision Summary (if applicable)
Table of Contents

Section A. Introduction
1. Background
2. Operational Performance Requirements
3. Critical Technical Parameters

Section B. Program Summary
1. Integrated Master Schedule
2. Management

Section C. Developmental Test and Evaluation Outline
1. Developmental Test and Evaluation Overview
2. Developmental Test and Evaluation to Date
3. Future Developmental Test and Evaluation
4. Developmental Test and Evaluation Plans and Reports

Section D. Operational Test and Evaluation Outline
1. Operational Test and Evaluation Overview
2. Critical Operational Issues
3. Operational Test and Evaluation to Date
4. Planned Operational Test and Evaluation
5. Operational Test and Evaluation Plans and Reports

Section E. Test and Evaluation Resource Summary
1. Summary
2. Resource Summary Updates

Appendices
1. Bibliography
2. Acronyms
3. Points of Contact

APPENDIX D
DHS TEMP Review Checklist

GENERAL

1. Does the TEMP accurately describe the comprehensive test and evaluation program, including all known or anticipated T&E activities and events over the program life cycle?
2. Is the TEMP compliant with the format and content of the DHS guidance?
3. Are all changes to the TEMP described in the Revision Summary (if applicable)?
4. For programs with multiple projects, does the TEMP address the required T&E throughout the program's life cycle, including activities for each project/discrete useful segment?
5. Do the operational requirements, Key Performance Parameters (KPPs), and Critical Technical Parameters (CTPs) identified in the TEMP trace to the Operational Requirements Document (ORD)?
6. Does the TEMP show that DT&E and OT&E begin early enough to assess performance, identify risks, and estimate operational potential prior to scheduled production and deployment decisions?
7. Are the required test reports scheduled to be available to support key milestone decisions?
8. Is OT&E overseen and evaluated by an organization independent of the developer and program office?
9. Has adequate time for data analysis and report development been factored into the schedule?
10. Have adequate numbers of test assets been identified for OT and DT?
11. Will production-representative units be available and dedicated to OT?
12. Have an adequate number of test hours been scheduled to resolve suitability COIs?


Risk-Based Testing: Analysis and Strategy. Presented at Quality Assurance Institute QUEST Conference Chicago, Ill., 2009 Risk-Based Testing: Analysis and Strategy Presented at Quality Assurance Institute QUEST Conference Chicago, Ill., 2009 Clyneice Chaney, CMQ/OE, PMP April 21, 2009 Workshop Outline Part I Risk Management

More information

SOFTWARE DEVELOPMENT FOR SPACE SYSTEMS

SOFTWARE DEVELOPMENT FOR SPACE SYSTEMS BY ORDER OF THE COMMANDER SMC Standard SMC-S-012 13 June 2008 ------------------------ Supersedes: New issue Air Force Space Command SPACE AND MISSILE SYSTEMS CENTER STANDARD SOFTWARE DEVELOPMENT FOR SPACE

More information

CMMI FOR SERVICES, THE PREFERRED CONSTELLATION WITHIN THE SOFTWARE TESTING FUNCTION OF A SOFTWARE ENGINEERING ORGANIZATION

CMMI FOR SERVICES, THE PREFERRED CONSTELLATION WITHIN THE SOFTWARE TESTING FUNCTION OF A SOFTWARE ENGINEERING ORGANIZATION CMMI FOR SERVICES, THE PREFERRED CONSTELLATION WITHIN THE SOFTWARE TESTING FUNCTION OF A SOFTWARE ENGINEERING ORGANIZATION NAME: Nestor K. Ovalle, PhD TITLE: Leadership & Corporate Change Consultant; CMMI

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY Template modified: 27 May 1997 14:30 BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE INSTRUCTION 63-501 31 MAY 1994 Certified Current 4 November 2009 Acquisition AIR FORCE ACQUISITION QUALITY PROGRAM

More information

JAA Administrative & Guidance Material Section Six: Synthetic Training Devices (STD/FSTD), Part Three: Temporary Guidance Leaflet

JAA Administrative & Guidance Material Section Six: Synthetic Training Devices (STD/FSTD), Part Three: Temporary Guidance Leaflet LEAFLET NO. 9 (rev. 1): ADDITIONAL GUIDANCE ON QUALITY SYSTEMS FOR OPERATORS OF SYNTHETIC TRAINING DEVICES NOTE: The material contained in this Leaflet has been issued in accordance with Chapter 9 of Administrative

More information

Top 5 Must Do IT Audits

Top 5 Must Do IT Audits Top 5 Must Do IT Audits Mike Fabrizius, Sharp HealthCare, VP, Internal Audit DJ Wilkins, KPMG, Partner, IT Advisory 2011 AHIA Annual Conference www.ahia.org Background on Sharp HealthCare Sharp s Co-sourcing

More information

Implementation of the Reliability & Maintainability (R&M) Engineering Body of Knowledge (BoK)

Implementation of the Reliability & Maintainability (R&M) Engineering Body of Knowledge (BoK) Implementation of the Reliability & Maintainability (R&M) Engineering Body of Knowledge (BoK) Andrew Monje Office of the Deputy Assistant Secretary of Defense for Systems Engineering 20th Annual NDIA Systems

More information

Request For Information (RFI) For a. Spectrum Management & Monitoring System (SMMS)

Request For Information (RFI) For a. Spectrum Management & Monitoring System (SMMS) Request For Information (RFI) For a Spectrum Management & Monitoring System (SMMS) Table of Contents 1 Introduction... 3 1.1 About Iraq CMC... 3 1.2 The Iraq CMC Objectives... 3 1.3 RFI Process... 4 2

More information

CHAPTER 2: IMPLEMENTATION PHASES AND OFFERINGS

CHAPTER 2: IMPLEMENTATION PHASES AND OFFERINGS CHAPTER 2: IMPLEMENTATION PHASES AND OFFERINGS Objectives Introduction The objectives are: Describe the purpose of the phase planning activity, preconditions, and deliverables in the implementation methodology.

More information

Internal Controls Required for Systems Involved in Procurement to Payment Processes

Internal Controls Required for Systems Involved in Procurement to Payment Processes Internal Controls Required for Systems Involved in Procurement to Payment Processes Definition: Procure to Pay encompasses all business functions necessary to obtain goods and services through contracting.

More information

ANNEX 2 Security Management Plan

ANNEX 2 Security Management Plan ANNEX 2 Page 1 of 24 The following pages define our draft security management plan (a complete and up to date shall be submitted to The Authority within 20 days of contract award as per Schedule 2.4, para

More information

Appendix B Maintenance Control Manual Template

Appendix B Maintenance Control Manual Template Appendix B Maintenance Control Manual Template MAINTENANCE CONTROL MANUAL TELATE OTAR PART 39 SUBPART E OPTION ONE AND TWO The purpose of this Maintenance Control Manual (MCM) Guidance Document is to assist

More information

Quality Manual. Specification No.: Q Revision 07 Page 1 of 14

Quality Manual. Specification No.: Q Revision 07 Page 1 of 14 Page 1 of 14 Quality Manual This Quality Manual provides the overall quality strategy and objectives of Pyramid Semiconductor s quality system. It is based on the requirements of ISO 9000. This manual

More information

Leveraging Your Service Quality Using ITIL V3, ISO and CMMI-SVC. Monday Half-Day Tutorial

Leveraging Your Service Quality Using ITIL V3, ISO and CMMI-SVC. Monday Half-Day Tutorial Leveraging Your Service Quality Using ITIL V3, ISO 20000 and CMMI-SVC Monday Half-Day Tutorial Definitions Service - Employment in duties or work for another The Challenge This situation where organization

More information

PRACTICE NO. PD-ED RELIABILITY February 1996 PRACTICES PAGE 1 OF 7 COMMON REVIEW METHODS FOR ENGINEERING PRODUCTS

PRACTICE NO. PD-ED RELIABILITY February 1996 PRACTICES PAGE 1 OF 7 COMMON REVIEW METHODS FOR ENGINEERING PRODUCTS PREFERRED PRACTICE NO. PD-ED-1215.4 RELIABILITY PRACTICES PAGE 1 OF 7 COMMON REVIEW METHODS FOR Practice: Conduct technical reviews to validate engineering designs using a common, consistent approach which

More information

Principles of Verification, Validation, Quality Assurance, and Certification of M&S Applications

Principles of Verification, Validation, Quality Assurance, and Certification of M&S Applications Introduction to Modeling and Simulation Principles of Verification, Validation, Quality Assurance, and Certification of M&S Applications OSMAN BALCI Professor Copyright Osman Balci Department of Computer

More information

VNF Lifecycle Management

VNF Lifecycle Management Case Study Lifecycle Customer Profile Customer: Cloud Service Provider (CSP) Industry: Telecommunications Employees: 22,500 (2016) Customers: 3+ Million The Challenge A CSP finds that rolling out new services,

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE MANUAL 63-119 20 JUNE 2008 SPACE AND MISSILE SYSTEMS CENTER Supplement 24 JUNE 2014 Acquisition CERTIFICATION OF SYSTEM READINESS FOR DEDICATED OPERATIONAL

More information

ITSM Process/Change Management

ITSM Process/Change Management ITSM Process/Change Management Process Documentation Revision Date: December 13, 2017 Version Number: 2.0 Document Ownership Document Owner Maury Collins Revision History ITSM Role, Department Service

More information

DRAFT ISO/IEC INTERNATIONAL STANDARD. Information technology Security techniques Information security management system implementation guidance

DRAFT ISO/IEC INTERNATIONAL STANDARD. Information technology Security techniques Information security management system implementation guidance INTERNATIONAL STANDARD ISO/IEC 27003 First edition 2010-02-01 Information technology Security techniques Information security management system implementation guidance Technologies de l'information Techniques

More information

UNCLASSIFIED. FY 2016 Base FY 2016 OCO

UNCLASSIFIED. FY 2016 Base FY 2016 OCO Exhibit R-2, RDT&E Budget Item Justification: PB 2016 Air Force Date: February 2015 3600: Research, Development, Test & Evaluation, Air Force / BA 6: RDT&E COST ($ in Millions) Prior Years FY 2014 FY 2015

More information

Sample Company. Risk Assessment and Mitigation Plan

Sample Company. Risk Assessment and Mitigation Plan Sample Company Revision History DATE VERSION DESCRIPTION AUTHOR 08-07-04 0.01 Initial Version F. Jack Anderson 09-07-04 0.02 Revised with changes from team reviews Michael Bennett 09-08-04 2.00 Initial

More information

MECHANICAL EQUIPMENT AND SUBSYSTEMS INTEGRITY PROGRAM

MECHANICAL EQUIPMENT AND SUBSYSTEMS INTEGRITY PROGRAM NOTE: MIL-STD-1798 (USAF) has been redesignated as a handbook, and is to be used for guidance purposes only. This document is no longer to be cited as a requirement. For administrative expediency, the

More information

Common Criteria Evaluation and Validation Scheme For. Information Technology Laboratory. Guidance to Validators of IT Security Evaluations

Common Criteria Evaluation and Validation Scheme For. Information Technology Laboratory. Guidance to Validators of IT Security Evaluations Common Criteria Evaluation and Validation Scheme For Information Technology Security Guidance to Validators of IT Security Evaluations Scheme Publication #3 Version 1.0 February 2002 National Institute

More information

3Z0Z 0? W7? TOTAL QUALITY MANAGEMENT MASTER PLAN* (S)l\f> v^n. 0 Department of Defense. AUG 2 ego, August O/ A./ \o V* TECHNICAL LIBRARY

3Z0Z 0? W7? TOTAL QUALITY MANAGEMENT MASTER PLAN* (S)l\f> v^n. 0 Department of Defense. AUG 2 ego, August O/ A./ \o V* TECHNICAL LIBRARY 3Z0Z O/ A./ \o V* ^V TIME y>- v^n 0? W7? TOTAL QUALITY MANAGEMENT MASTER PLAN* 0 Department of Defense August 1988 (S)l\f> TECHNICAL LIBRARY AUG 2 ego, u i ACCESSION NO AD DoD TOTAL QUALITY MANAGEMENT

More information

Requirements Analysis and Design Definition. Chapter Study Group Learning Materials

Requirements Analysis and Design Definition. Chapter Study Group Learning Materials Requirements Analysis and Design Definition Chapter Study Group Learning Materials 2015, International Institute of Business Analysis (IIBA ). Permission is granted to IIBA Chapters to use and modify this

More information

ISO /TS 29001:2010 SYSTEMKARAN ADVISER & INFORMATION CENTER SYSTEM KARAN ADVISER & INFORMATION CENTER

ISO /TS 29001:2010 SYSTEMKARAN ADVISER & INFORMATION CENTER SYSTEM KARAN ADVISER & INFORMATION CENTER SYSTEM KARAN ADVISER & INFORMATION CENTER PETROLEUM, PETROCHEMICAL AND NATURAL GAS INDUSTRIES -- SECTOR-SPECIFIC QUALITY MANAGEMENT SYSTEMS -- REQUIREMENTS FOR PRODUCT AND SERVICE SUPPLY ORGANIZATIONS

More information

Highlights of CMMI and SCAMPI 1.2 Changes

Highlights of CMMI and SCAMPI 1.2 Changes Highlights of CMMI and SCAMPI 1.2 Changes Presented By: Sandra Cepeda March 2007 Material adapted from CMMI Version 1.2 and Beyond by Mike Phillips, SEI and from Sampling Update to the CMMI Steering Group

More information

GMIG Courses and Course Descriptions

GMIG Courses and Course Descriptions FCN 190 FAR Fundamentals.... 2 Basic Simplified Acquisition Procedures.. 2 CON 100 Shaping Smart Business Arrangements... 2 CON 121 Contract Planning... 2 CON 124 Contract Execution. 3 CON 127 Contract

More information

Sawdey Solution Services

Sawdey Solution Services Sawdey Solution Services I n c o r p o r a t e d COMMERCIAL PRICE LIST 2016 1430 Oak Court, Suite 304 Beavercreek, OH 45430 Office Phone: (937) 490-4060 Fax: (937) 490-4086 www.sawdeysolut ionservices.com

More information

Alan F. Estevez. Principal Deputy Assistant Secretary of Defense for Logistics and Materiel Readiness

Alan F. Estevez. Principal Deputy Assistant Secretary of Defense for Logistics and Materiel Readiness FOREWORD The DoD Weapon Systems Acquisition Reform Product Support Assessment (WSAR-PSA) identified eight principle areas to improve product support effectiveness. One of those areas, Governance, included

More information

LIST OF TABLES. Table Applicable BSS RMF Documents...3. Table BSS Component Service Requirements... 13

LIST OF TABLES. Table Applicable BSS RMF Documents...3. Table BSS Component Service Requirements... 13 General Services Administration NS2020 Enterprise Infrastructure Solutions EIS RFP #QTA0015THA3003 Volume 2: Management BSS Risk Management Framework Plan LIST OF TABLES Table 8.2-1. Applicable BSS RMF

More information

Software System Safety

Software System Safety JOINT SERVICES SOFTWARE SAFETY AUTHORITIES (JS-SSA) Software System Implementation Process and Tasks Supporting MIL-STD-882E With Joint Software System Engineering Handbook References Developed by the

More information

NDIA Systems Engineering Conference 2013 M&S Applied to Improved Test Planning & Analysis Methods in the DT & OT Execution Phase

NDIA Systems Engineering Conference 2013 M&S Applied to Improved Test Planning & Analysis Methods in the DT & OT Execution Phase Joe Murphy Analytical Graphics, Inc. jmurphy@agi.com 610 457 5002 NDIA Systems Engineering Conference 2013 M&S Applied to Improved Test Planning & Analysis Methods in the DT & OT Execution Phase Oct 30,

More information

Software Engineering II - Exercise

Software Engineering II - Exercise Software Engineering II - Exercise April 29 th 2009 Software Project Management Plan Bernd Bruegge Helmut Naughton Applied Software Engineering Technische Universitaet Muenchen http://wwwbrugge.in.tum.de

More information

Qualification Management for Geological Storage of CO 2

Qualification Management for Geological Storage of CO 2 DNV SERVICE SPECIFICATION DNV-DSS-402 Qualification Management for Geological Storage of CO 2 JUNE 2012 This document has been amended since the main revision (June 2012), most recently in July 2013. See

More information

Summary of TL 9000 R4.0 Requirements Beyond ISO 9001:2000

Summary of TL 9000 R4.0 Requirements Beyond ISO 9001:2000 This summary identifies the additional TL 9000 Release 4.0 requirements beyond those stated in ISO 9001:2000. See the TL 9000 R4.0 Handbook for the actual TL 9000 R4.0 requirements. ISO 9001:2000 section

More information

9100 revision Changes presentation clause-by-clause. IAQG 9100 Team November 2016

9100 revision Changes presentation clause-by-clause. IAQG 9100 Team November 2016 Changes presentation clause-by-clause IAQG 9100 Team November 2016 INTRODUCTION In September 2016, a revision of the 9100 standard has been published by the IAQG (International Aerospace Quality Group)

More information

PART THREE: Work Plan and IV&V Methodology (RFP 5.3.3)

PART THREE: Work Plan and IV&V Methodology (RFP 5.3.3) PART THREE: Work Plan and IV&V Methodology (RFP 5.3.3) 3.1 IV&V Methodology and Work Plan 3.1.1 NTT DATA IV&V Framework We believe that successful IV&V is more than just verification that the processes

More information

ENERGY PERFORMANCE PROTOCOL QUALITY ASSURANCE SPECIFICATION

ENERGY PERFORMANCE PROTOCOL QUALITY ASSURANCE SPECIFICATION ENERGY PERFORMANCE PROTOCOL QUALITY ASSURANCE SPECIFICATION Version 1.0 April 2015 Table of Contents 1.0 INVESTOR CONFIDENCE PROJECT 1.1 ENERGY EFFICIENCY PERFORMANCE QUALITY ASSURANCE SPECIFICATION 1.2

More information

Digital Twin Digital Thread in Aerospace David Riemer

Digital Twin Digital Thread in Aerospace David Riemer Digital Twin Digital Thread in Aerospace David Riemer Unrestricted Siemens AG 20XX Realize innovation. Siemens Focus is to Enable Excellent Performance on Every Program Program Execution Excellence Fully

More information

DEFENSE ACQUISITION UNIVERSITY ISA 101 BASIC INFORMATION SYSTEM ACQUISITION

DEFENSE ACQUISITION UNIVERSITY ISA 101 BASIC INFORMATION SYSTEM ACQUISITION 1 Identify applicable United States laws, federal regulations and DoD directives that govern management of IT/SW systems. Identify key statutes related to the acquisition of Information Technology (IT)

More information

UNCLASSIFIED. UNCLASSIFIED Army Page 1 of 12 R-1 Line #136

UNCLASSIFIED. UNCLASSIFIED Army Page 1 of 12 R-1 Line #136 Exhibit R2, RDT&E Budget Item Justification: PB 2015 Army Date: March 2014 2040: Research, Development, Test & Evaluation, Army / BA 6: RDT&E Management Support COST ($ in Millions) Prior Years FY 2013

More information

University of Tennessee, Knoxville

University of Tennessee, Knoxville University of Tennessee, Knoxville Trace: Tennessee Research and Creative Exchange Masters Theses Graduate School 8-2004 A Model for the Application of Test and Evaluation Concepts by the Air Element of

More information

Washington Headquarters Services ADMINISTRATIVE INSTRUCTION

Washington Headquarters Services ADMINISTRATIVE INSTRUCTION Washington Headquarters Services ADMINISTRATIVE INSTRUCTION NUMBER 94 October 19, 2007 Incorporating Change 2, July 17, 2017 DFD FSD SUBJECT: Personal Property Management and Accountability References:

More information

PRIIA Section 305 Executive Committee 10/20/2010. Version 3

PRIIA Section 305 Executive Committee 10/20/2010. Version 3 Requirements Document Diesel Locomotive Version 3 Introduction The Next-Generation Equipment Committee (NGEC) was established under the provisions of section 305 of the Passenger Rail Investment and Improvement

More information

Jetstream Certification and Testing

Jetstream Certification and Testing PJM Interconnection 02/01/2016 This page is intentionally left blank. PJM 2016 www.pjm.com 2 P age Introduction Document Description PJM maintains and publishes a list of certified devices, software packages

More information

Requirements for a Successful Replatform Project

Requirements for a Successful Replatform Project Requirements for a Successful Replatform Project Replatform Preparation A successful Replatform Project begins with assessing and validating the current system, applications, jobs and associated application

More information

NATIONAL INCIDENT MANAGEMENT SYSTEM CAPABILITY ASSESSMENT SUPPORT TOOL (NIMCAST) SELF-ASSESSMENT INSTRUMENT 6

NATIONAL INCIDENT MANAGEMENT SYSTEM CAPABILITY ASSESSMENT SUPPORT TOOL (NIMCAST) SELF-ASSESSMENT INSTRUMENT 6 NATIONAL INCIDENT MANAGEMENT SYSTEM CAPABILITY ASSESSMENT SUPPORT TOOL (NIMCAST) TABLE OF CONTENTS Page TABLE OF CONTENTS 1 NATIONAL INCIDENT MANAGEMENT SYSTEM CAPABILITY ASSESSMENT SUPPORT TOOL (NIMCAST)

More information

October 1, 2016 September 30, 2017 CONTRACTOR PERFORMANCE EVALUATION AND MEASUREMENT PLAN

October 1, 2016 September 30, 2017 CONTRACTOR PERFORMANCE EVALUATION AND MEASUREMENT PLAN September 28, 2016 October 1, 2016 September 30, 2017 CONTRACTOR PERFORMANCE EVALUATION AND MEASUREMENT PLAN Management and Operations of the Thomas Jefferson National Accelerator Facility (TJNAF) U.S.

More information

Using Measurement to Assure Operational Safety, Suitability, and Effectiveness (OSS&E) Compliance for the C 2 Product Line

Using Measurement to Assure Operational Safety, Suitability, and Effectiveness (OSS&E) Compliance for the C 2 Product Line Using Measurement to Assure Operational Safety, Suitability, and Effectiveness (OSS&E) Compliance for the C 2 Product Line David L. Smith The MITRE Corporation 115 Academy Park Loop, Suite 212 Colorado

More information

QUALITY MANUAL DISTRIBUTION. Your Logo Here. Page 1 1 of 57. Rev.: A

QUALITY MANUAL DISTRIBUTION. Your Logo Here. Page 1 1 of 57. Rev.: A Page 1 1 of 57 President Design Engineering Production Production Engineering Materials Control Purchasing Service Marketing Sales Contracts Human Resources Quality Assurance Quality Control Production

More information

Agenda Framing Assumptions (FAs) Lunch and Learn, March 8, Presenter: Brian Schultz. Moderator: Lt Col Doherty

Agenda Framing Assumptions (FAs) Lunch and Learn, March 8, Presenter: Brian Schultz. Moderator: Lt Col Doherty Agenda Framing Assumptions (FAs) Lunch and Learn, March 8, 2017 Presenter: Brian Schultz Moderator: Lt Col Doherty Overview Background Definition and Guidance Why are they Important? Exercise How Do We

More information