Test and Evaluation Master Plan (TEMP)

A. Purpose

The Test and Evaluation Master Plan (TEMP) is the basic top-level planning document for Test and Evaluation (T&E) related activities for major acquisition programs. The TEMP describes the Developmental Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E) that must be conducted to determine system technical performance, operational effectiveness and suitability, and limitations. The TEMP identifies all critical technical characteristics and operational issues and describes the objectives, responsibilities, resources, and schedules for all completed and planned T&E, including modeling and simulation tools used in the T&E process. The TEMP also describes all subordinate plans (e.g., DT&E and OT&E plans) and required reports (e.g., DT&E and OT&E reports), and assigns responsibility for preparing and approving those plans and reports. Most acquisition programs are required to have an approved TEMP and subordinate test plans prior to commencing associated T&E unless a specific waiver is granted by the DHS Science & Technology (S&T) Director, T&E and Standards Division.[1] The TEMP guidance outlined and described in this document cancels and supersedes all previous DHS guidance for TEMPs.

[1] The Under Secretary for Science & Technology has designated the S&T Director, Test & Evaluation and Standards Division, with responsibility for DHS T&E policy and procedures.

B. Overview of TEMP

The Program Manager (PM) is responsible for coordinating the overall T&E strategy and approach for the program. The PM should prepare a TEMP in accordance with the TEMP template outlined in this document after approval of the initial Operational Requirements Document (ORD). The program's TEMP should be approved by the Component Acquisition Executive (CAE) and the DHS S&T Director, T&E and Standards Division prior to formal submission to the Acquisition Decision Authority (ADA) at Acquisition Decision Event (ADE) 2.

The TEMP is a living document that should accurately reflect major changes in program requirements, schedule, and funding. An approved TEMP should be reviewed and updated by the PM at each ADE or whenever a breach occurs in the program's Acquisition Program Baseline (APB). For example, the TEMP should be revised when the PM is unable to execute the TEMP as written, or when changes to the program's cost/funding, schedule, or performance make the existing TEMP obsolete. Revision of the TEMP should receive the same endorsements and approvals as the original document.

C. TEMP Preparation and Process

Planning is the first step for any test and evaluation program, and should begin as early in the Analyze/Select phase of the DHS acquisition life cycle as possible. Early T&E planning should be documented in a working TEMP even if information for some of the sections is not yet available. The initial working TEMP can then be used as a basis for completing the TEMP as program direction and details become known.

The PM is responsible for developing and maintaining the TEMP. Early in the acquisition process, the PM should form a test planning team, such as a T&E Integrated Process Team (IPT) or Test Management Oversight Team (TMOT), to start development of the draft TEMP. The PM prepares the draft TEMP in consultation with the TMOT or T&E IPT, which is comprised of representatives from program leadership, stakeholders from other organizations, and appropriate Subject Matter Experts (SMEs) involved in the T&E activities of the program. TMOT/IPT members may include:
- The PM
- The program sponsor/user representative(s)
- Developmental and operational testers
- The independent evaluator
- The program logistician
- The program systems engineer/T&E engineer
- The program training developer
- The supporting test area manager (DHS T&E and Standards Division)
- Any organization or agency that has a role critical to the program's T&E activities

TMOT/IPT members are responsible for helping the PM develop a TEMP that is sufficient, realistic, reasonable, and executable. For example, the independent evaluator and operational testers serve as key members responsible for the development of Section D (Operational Test & Evaluation) of the TEMP.

Although this document is focused on providing guidance for development of the TEMP, some T&E concepts are worth reviewing to help the user understand the T&E process. The fundamental purpose of T&E in an acquisition program is to verify attainment of technical performance (in Developmental Testing) and of required operational effectiveness and suitability (in Operational Testing). As a system undergoes design, development, and integration, the emphasis in testing moves gradually from DT&E, which is concerned mainly with validating the contract requirements and the attainment of engineering design goals and manufacturing process objectives, to OT&E, which focuses on verifying operational effectiveness and suitability.

To foster common understanding and prevent confusion, the use of consistent and accepted terminology is important when describing T&E activities. For ease of understanding, testing is generally categorized as either developmental or operational. Both developmental and operational test and evaluation are required for DHS acquisition programs.

Developmental Testing (DT) is any test used to assist the engineering design and development process and to verify the status of technical progress. DT is a tool to help identify and manage design risks, substantiate achievement of technical performance requirements, validate manufacturing process requirements and system maturity, and certify readiness for Operational Testing (OT). Developmental testing typically requires instrumentation and measurements, and is performed under known, controlled, and limited conditions that might or might not be representative of the complex operational environment. DT is controlled by the PM and consists of all testing performed by the contractor and/or Government during system development.

Operational Testing (OT) is the field test, under realistic conditions, of any system or component, for determining that system or component's overall effectiveness and suitability for use before deployment of the system or component. OT provides information for the overall assessment of how well a system will provide the desired capability when operated by typical users in the expected environment. OT is usually conducted by an independent evaluator and is not controlled by the PM. While the PM retains responsibility for all aspects of his or her program, the system evaluation or system assessment for operational test and evaluation is prepared independently of both the PM and agency user organizations. This independence allows evaluators to present objective and unbiased conclusions regarding the system's or equipment's operational effectiveness and suitability. The independent operational evaluator, which can be referred to as the Operational Test Authority (OTA), is responsible for presenting his or her findings in the OT&E report, which is submitted to the PM, Component Acquisition Executive (CAE), DHS Director, Test & Evaluation and Standards, and Acquisition Decision Authority (ADA). The OTA must be prepared to present and defend those findings to the Component Acquisition Executive or the Acquisition Decision Authority at ADEs or other program reviews. Acquisition decision authorities ultimately determine the degree to which they accept and factor the evaluator's findings and recommendations into programmatic decisions. However, they must make such determinations in view of the evaluator's objective and unbiased assessment.

Integrated testing (DT/OT, or DT/Operational Assessment (OA)) is a process under which developmental and operational test objectives are pursued during the same test phases or scenarios to make best use of time and available test resources. Integrated testing is more complex due to the dual nature of its objectives. If this approach is taken, developmental and operational T&E objectives remain separate and distinct from one another, even though resources, activities, and data may be shared. Care must be taken to ensure that the combined approach does not compromise either the developmental or operational T&E objectives. Rigorous planning and coordination are required for effective use of the integrated testing approach due to the added complexity of managing concurrent test events and activities.

Required Elements in the TEMP Document

The TEMP shall include the following information and comply with the following format.

Cover/Signature Page
Executive Summary
Revision Summary (if applicable)
Section A. Introduction
  1. Background
  2. Operational Performance Requirements
  3. Critical Technical Parameters
Section B. Program Summary
  1. Integrated Master Schedule
  2. Management
Section C. Developmental Test and Evaluation Outline
  1. Developmental Test and Evaluation Overview
  2. Developmental Test and Evaluation to Date
  3. Future Developmental Test and Evaluation
  4. Developmental Test and Evaluation Plans and Reports
Section D. Operational Test and Evaluation Outline
  1. Operational Test and Evaluation Overview
  2. Critical Operational Issues
  3. Operational Test and Evaluation to Date
  4. Planned Operational Test and Evaluation
  5. Operational Test and Evaluation Plans and Reports
Section E. Test and Evaluation Resource Summary
  1. Summary
  2. Resource Summary Updates
Appendices:
  1. Bibliography
  2. Acronyms
  3. Points of Contact

Guidance on Required Elements

Cover/Signature Page

The TEMP cover page should, at a minimum, be signed and dated by: the Program Manager; the Component Head or CAE; the DHS Director, T&E and Standards (dependent upon program level); and the DHS ADA (dependent upon program level). See the TEMP template in this document.

Executive Summary (1 to 2 pages in length)

Provide an Executive Summary of the TEMP. The Executive Summary should be a brief discussion of the plan, highlighting the salient points of each chapter, including goals, objectives, and expected outcomes. Briefly discuss the roles and responsibilities of key participants, the reports expected to be prepared, and how those reports will support program decisions.

Revision Summary (if applicable)

The Revision Summary should provide a high-level description of major changes, followed by a Table of Changes that describes specific changes, including references to the changed section/paragraph.

Section A. Introduction

1. Background

Briefly summarize the capability the system provides and briefly describe the system design. Include key features and subsystems; describe unique characteristics of the system or unique support concepts that may result in special test and analysis requirements. Do not repeat detailed background information from other program documents. The focus of this section should be on T&E issues.

2. Operational Performance Requirements

List in matrix format (see Table L-1) the minimum acceptable operational performance requirements. Operational performance requirements should be traceable to the ORD and the Key Performance Parameters (KPPs) listed in the ORD. Table L-1 contains examples of operational performance requirements with their associated threshold and objective parameters. Thresholds, against which each of the effectiveness and suitability parameters will be measured, are normally quantitative. Thresholds should represent the level of system performance acceptable to the user to successfully provide the desired capability.

3. Critical Technical Parameters

List in matrix format (see Table L-2) the critical technical parameters of the system that have been evaluated or will be evaluated during the remaining phases of DT&E. For each technical parameter, list the appropriate technical threshold. Highlight critical technical issues that must be demonstrated before entering the next acquisition phase or before entering OT&E.

Table L-1: Example of Operational Performance Requirements

Operational Effectiveness
| Requirement | Parameter | Threshold | Objective |
| Speed** | Minimum Top Speed | 25 Knots | 30 Knots |
| Speed** | Continuous Speed (Sea State 2) | 20 Knots | 25 Knots |
| Threat Detection | Probability of detection of unknown threat | 85% at 10 miles | 90% at 10 miles |

Operational Suitability
| Requirement | Parameter | Threshold | Objective |
| Reliability | Mean Time Between Maintenance Actions | 1000 Hours | 1200 Hours |
| Reliability | Mean Time Between Failures | 2000 Hours | 2200 Hours |
| Reliability | Mean Time Between Critical Failures | 5000 Hours | 5200 Hours |
| Maintainability | Mean Time To Repair | 2.5 Hours | 2.0 Hours |
| Operational Availability** | Percentage of Time Available to Support Operational Tasking | 80% | 85% |

** Key Performance Parameter

Table L-2: Sample Critical Technical Parameters Matrix (<Project Title>)

| Critical Technical Parameter | Test Event | Technical Threshold | Test Location | Test Schedule | Decision Supported |
| Stability | Model Test | Self-right through 360 degrees | Government Test Site | Development Test (DT) | Preliminary Design Completion |
| Stability | Static Rollover | Self-right through 360 degrees | Government Test Site | Contractor DT | Preliminary Acceptance |
| Minimum Top Speed | Model Test | 25 Knots | Government Test Facility | DT | Preliminary Design Completion |

Additional examples of Critical Technical Parameters for various types of systems are included in Table L-3.
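Programs often track threshold and objective compliance in a structured form so that test results can be scored consistently across DT&E and OT&E events. The sketch below is a minimal illustration of that bookkeeping, not DHS-prescribed tooling; the parameter names and measured values are hypothetical, patterned loosely on Table L-1, and each parameter is flagged as "higher is better" or "lower is better" (e.g., Mean Time To Repair improves as it decreases).

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    parameter: str
    threshold: float
    objective: float
    higher_is_better: bool = True  # e.g., False for Mean Time To Repair

    def score(self, measured: float) -> str:
        """Classify a measured value against the threshold/objective pair."""
        if self.higher_is_better:
            meets_obj, meets_thr = measured >= self.objective, measured >= self.threshold
        else:
            meets_obj, meets_thr = measured <= self.objective, measured <= self.threshold
        if meets_obj:
            return "meets objective"
        return "meets threshold" if meets_thr else "below threshold"

# Hypothetical entries patterned on Table L-1 (units: knots, hours, percent).
results = [
    (Requirement("Minimum Top Speed (kt)", 25, 30), 27),
    (Requirement("Mean Time Between Failures (hr)", 2000, 2200), 1850),
    (Requirement("Mean Time To Repair (hr)", 2.5, 2.0, higher_is_better=False), 2.2),
    (Requirement("Operational Availability (%)", 80, 85), 86),
]

for req, measured in results:
    print(f"{req.parameter}: measured {measured} -> {req.score(measured)}")
```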

Table L-3: Examples of Critical Technical Parameters

Boats: Length; Beam; Draft; Speed; Range; Human Factors; Safety/Environmental Health; Navigation
Information Technology: Enterprise Architecture Compliance; Speed of Calculation; Throughput Capability; Reliability; Software Maintainability; Security Controls; Human Factors
Aircraft: Speed; Maneuvering; Range; Maximum Gross Weight; Cargo Capacity; Personnel Capacity; Human Factors; Airworthiness
Radars: Range; Detection Limits; Jamming Protection; Reliability; Error Rate/Signal Processing; Human Factors

Section B. Program Summary

1. Integrated Master Schedule

Graphically display the integrated time sequencing of the critical T&E phases and events within the integrated master schedule. Figure L-1 is provided for illustrative purposes only. The PM may use any graphics technique that clearly shows the key T&E events and their sequential relationship with the acquisition life cycle phases. T&E events should be traceable to applicable test requirements (KPPs and Critical Operational Issues [COIs]) within the appropriate T&E documentation (e.g., ORD, test plans, test reports, traceability matrix). Include event dates related to the testing program, such as Acquisition Decision Events (ADEs), test article availability, and appropriate phases of DT&E and OT&E. Include all T&E planning documents (TEMP, DT&E Test Plan, OT&E Test Plan) and DT&E and OT&E Reports.
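Because the schedule and the requirements documents evolve on different timelines, many programs maintain the KPP/COI-to-test-event traceability called for above as a simple, machine-checkable mapping. The sketch below is one illustrative way to flag requirements with no planned test coverage; the requirement IDs and event names are hypothetical and are not drawn from any DHS program.

```python
# Hypothetical traceability check: each KPP/COI should map to at least one
# planned T&E event; anything unmapped is flagged for the T&E IPT/TMOT.
requirements = ["KPP-1 Minimum Top Speed", "KPP-2 Operational Availability",
                "COI-1 Threat Detection", "COI-2 Maintainability"]

test_events = {
    "DT-1 Model Test":             ["KPP-1 Minimum Top Speed"],
    "DT-2 Reliability Growth":     ["KPP-2 Operational Availability", "COI-2 Maintainability"],
    "OT-1 Operational Assessment": ["COI-1 Threat Detection", "KPP-1 Minimum Top Speed"],
}

covered = {req for reqs in test_events.values() for req in reqs}
untraced = [req for req in requirements if req not in covered]

for event, reqs in test_events.items():
    print(f"{event}: {', '.join(reqs)}")
print("Untraced requirements:", untraced or "none")
```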

Figure L-1: Sample Integrated Master Schedule (Fiscal Years)

2. Management

Identify all organizations that will participate in the T&E program and discuss in detail the roles and responsibilities of each identified stakeholder. Organizations that must be included in the T&E program include the PM, the Component, the Independent Evaluator or Operational Test Authority (OTA), and any organization conducting actual testing, including contractors and operational units.

Section C. Developmental Test & Evaluation Outline

1. Developmental Test & Evaluation Overview

Discuss the overall goals and objectives of the DT&E program. Explain how the planned (or accomplished) DT&E will verify the status of the engineering design and development progress, verify that design risks have been minimized, and substantiate the achievement of technical performance. Discuss specific DT&E exit criteria and, if applicable, system acceptance tests, qualification tests, and test readiness reviews. This section should also address:
(a) Technology that has not yet demonstrated its ability to contribute to system performance and ultimately provide the capability the program is intended to provide, and
(b) The degree to which the system hardware and software design has stabilized so as to reduce manufacturing and production decision uncertainties.

2. Developmental Test and Evaluation to Date

Describe all DT&E that has been conducted to date, including DT&E conducted by both contractors and the Government. Briefly note the results of the testing and reference all reports completed or under preparation.

3. Planned Developmental Test and Evaluation

Discuss all remaining DT&E that is planned, beginning with the date of the current TEMP revision and extending through completion of production. Place emphasis on the testing events that will occur during the upcoming acquisition phase and the corresponding test resources and assets needed (e.g., test facilities and ranges, Government-furnished equipment, and specialized test equipment). For each segment of testing (e.g., modeling, laboratory tests, and in-plant tests), discuss the following topics:

(a) Configuration Description - Summarize the functional capability of the system configuration (model, mock-up, or prototype) and how, if at all, it differs from the planned production model.

(b) DT&E Objectives - State the test objectives for the phase in terms of the critical technical parameters (CTPs) to be confirmed. Identify any specific technical parameters that an Acquisition Decision Memorandum (ADM) or legislative action has directed to be demonstrated during a particular phase of testing.

(c) DT&E Events, Scope of Testing, and Basic Scenarios - Summarize the test events, test scenarios, and the test design concept. Quantify the testing in terms of the number of test events planned, and discuss the information that will be expanded upon in the DT&E Plan. Discuss the environment in which testing will be conducted and how realistic that environment is. Identify and describe any models or simulations that will be used, the justification for their use, and relevant validation or certification requirements.

(d) Limitations - Discuss any test limitations that may significantly affect the evaluator's ability to draw conclusions and make recommendations concerning the critical technical parameters.

4. Developmental Test and Evaluation Plans and Reports

Describe all required DT&E plans and reports. Include information on the scope of each plan or report, who prepares it, who reviews it, who approves it, and when it is to be submitted.

Section D. Operational Test & Evaluation Outline

1. Operational Test and Evaluation Overview

The OT&E outline will typically be written by the independent operational evaluator or OTA with inputs from the program and the user/operators. Discuss the overall goals and objectives of the OT&E program, including any combined DT&E/OT&E or integrated testing. Discuss the entry criteria that must be met prior to the commencement of OT&E. Discuss how OT&E is performed independently from the vendor and structured to ensure that an operationally effective and operationally suitable system is delivered to the Component and user/operators. Provide information to show how OT&E will evaluate the system in an environment as operationally realistic as possible (i.e., using typical operators, expected ranges of natural environmental conditions, and expected operational scenarios). Discuss the different phases of the planned OT&E, which may include Early Operational Assessment (EOA), Operational Assessment (OA), Initial Operational Test and Evaluation (IOT&E), and Follow-On Operational Test and Evaluation (FOT&E).

2. Critical Operational Issues

List the COIs that have been identified by the Sponsor, Component, or end user. COIs are the operational effectiveness and operational suitability issues (not characteristics, parameters, or thresholds) that must be examined in OT&E to evaluate/assess the system's capability to provide the desired capability. A COI is typically phrased as a question that must be answered in order to properly evaluate operational effectiveness (e.g., "Will the system detect Home-Made Explosives at a level that will prevent a terrorist attack in airport terminal environments?") or operational suitability (e.g., "Will the system be maintainable within the planned funding levels, personnel structure, and expertise level at airport terminal facilities?"). The list of COIs should be thorough enough to ensure that, if every COI is resolved favorably, the system will be operationally effective and operationally suitable when employed in its intended environment by typical users. The list of COIs will normally consist of five to ten issues and should reflect only those that are truly critical in nature. If a COI cannot be favorably resolved during testing, the program office and user/operator should review the actual operational impact, and recommendations for resolution should be presented to the ADA.

3. Operational Test and Evaluation to Date

Briefly describe all OT&E (including EOA and OA) that has been completed; if none has been conducted, so state. The descriptions should include the following:
(a) A description of the system actually tested and how its configuration relates to the system that will be fielded.
(b) A summary of the actual testing that occurred, including events, scenarios, resources used, test limitations, evaluations conducted, results achieved, and a reference to any test report detailing the results of such testing. Emphasis should be upon those COIs that were resolved, partially resolved, or unresolved at the completion of that portion of testing.

4. Planned Operational Test and Evaluation

(a) Configuration Description - Identify the system to be tested, and describe any differences between the tested system and the system that will be fielded. Include, where applicable, the extent of integration with other systems with which it must be interoperable or compatible. Characterize the system (e.g., first article, production representative, or production configuration).
(b) Operational Test and Evaluation Objectives - State the test objectives, including the COIs to be addressed during remaining OT&E and the Acquisition Decision Events (ADEs) supported. OT&E that supports the Production and Deployment decision (ADE 3) should have test objectives that examine all areas of operational effectiveness and suitability.

(c) Operational Test and Evaluation Events, Scope of Testing, and Scenarios - Summarize the scenarios and identify the events to be conducted. Indicate the type of resources to be used, the simulation(s) to be employed (if applicable), the type of representative personnel who will operate and maintain the system, the status of logistic support, the operational and maintenance documentation that will be used, and the environment under which the system is to be employed and supported during testing. This section should also identify sources of information (e.g., developmental testing, modeling, and simulations) that may be used by the operational testers to supplement this phase of OT&E. Whenever models and simulations are to be used, explain the rationale for their applicability and credible use.

(d) Limitations - Discuss the test limitations, including scenario realism, resource availability, limited operational environments, limited support environment, maturity of the tested system, and safety, that may impact the resolution of affected COIs. Indicate the COI(s) affected in parentheses after each limitation.

5. Operational Test and Evaluation Plans and Reports

Describe all required OT&E plans and reports, including plans and reports for any scheduled EOA and OA. Include information on the scope of each plan or report, such as who reviews it, who approves it, and when it is to be submitted. OT&E Plans are prepared by the user/operators and approved for use by the OTA. The OT&E Plan is executed by the user/operator and overseen by the OTA. The OTA is then responsible for test evaluation and OT&E report generation. The Director, Operational Test and Evaluation (DOT&E) is responsible for briefing OT&E results and conclusions to the ADA at the appropriate ADEs.

Section E. Test and Evaluation Resource Summary

1. Summary

Provide a summary (preferably in a table or matrix format) of all key T&E resources, both Government and contractor, that will be used during the course of the acquisition program. Estimate, by fiscal year, the funding required to pay the direct costs of planned testing (see Table L-4) and identify any major shortfalls.

Table L-4: Summary of T&E Funding Requirements

|       | FY08 | FY09 | FY10 | FY11 | FY12 | FY13 | FY14 | FY15 |
| DT&E  |  50  |  75  | 125  | 175  |      |      |      |      |
| OT&E  |   5  |  10  |      | 375  | 375  |  55  |  35  |  15  |
| Total |  55  |  85  | 125  | 550  | 375  |  55  |  35  |  15  |

Specifically, the TEMP shall identify the following test resources:

(a) Test Articles - Identify the actual number of, and timing requirements for, all test articles, including key support equipment and technical information required for testing in each phase of DT&E and OT&E. If key subsystems (components, assemblies, subassemblies, or software modules) are to be tested individually before being tested in the final system configuration, identify each subsystem in the TEMP and the quantity required. Specify if/when prototypes, development/pre-production models, or production models will be used.

(b) Test Sites/Instrumentation/Airspace/Spectrum - Identify the specific facilities/test ranges to be used for each type of testing. Compare the requirements for facilities/test ranges dictated by the scope and content of planned testing with existing and programmed facility/test range capability, and highlight any major shortfalls. Identify instrumentation that must be acquired specifically to conduct the planned test program, and any airspace or spectrum requirements needed to support testing.

(c) Test Support Equipment - Identify test support equipment that must be acquired specifically to conduct the test program. Identify unique or special calibration requirements associated with any such equipment.

(d) Threat Systems/Simulators - For systems that require the use of threat systems/simulators, identify the type, model number, and availability requirements for all required threat systems/simulators. Compare the requirements for threat systems/simulators with available and projected assets and their capabilities. Identify any major shortfalls.

(e) Test Targets and Expendables - Identify the type, model number, and availability requirements for all test targets and expendables (e.g., targets, flares, chaff, smoke generators, and acoustic countermeasures) that will be required for each phase of testing. Identify any major shortfalls.

(f) Operational Program Test Support - For each T&E phase, identify the type and timing of operational test support required (e.g., aircraft flying hours, vessel underway days, operating personnel hours, and other critical operating program support).

(g) Simulations, Models, and Test Beds - For each T&E phase, identify the system simulations required, including computer-driven simulation models and hardware-in-the-loop test beds (a system representation consisting partially of actual hardware and/or software and partially of computer models or prototype hardware and/or software). The rationale for their credible use or application must be explained in an approved TEMP before their use.

(h) T&E Administrative Support - For each test phase, identify all administrative and facilities support required. Identify the organization responsible for providing such support and the source and type of funding required. Items such as office space and equipment, pier or hangar space, and maintenance services should be discussed.

(i) Manpower and Training - Identify manpower and training requirements and limitations that affect test execution.

(j) Investment Requirements - Discuss investment requirements for any significant instrumentation capabilities and resources needed to support testing.

2. Resource Summary Updates

The initial TEMP should project the key resources necessary to accomplish DT&E and OT&E. As system acquisition progresses, test resource requirements shall be reassessed, and subsequent TEMP updates shall reflect any changed system concepts or requirements.

Appendices:

1. Bibliography - Cite all documents referred to in the TEMP. Also cite all reports documenting developmental and operational testing and evaluation of the system.

2. Acronyms - List and define all acronyms used in the TEMP.

3. Points of Contact - Provide a list of points of contact for all participating organizations (Program Manager, Component, Sponsor, support organization representatives, testers, evaluators, etc.).

APPENDIX A
Acronyms

ADA    Acquisition Decision Authority
ADE    Acquisition Decision Event
APB    Acquisition Program Baseline

CAE    Component Acquisition Executive
COI    Critical Operational Issue
CTP    Critical Technical Parameter
DT&E   Developmental Test and Evaluation
EOA    Early Operational Assessment
FOT&E  Follow-On Operational Test and Evaluation
IPT    Integrated Process Team
IV&V   Independent Verification & Validation
KPP    Key Performance Parameter
M&S    Modeling and Simulation
MNS    Mission Needs Statement
OA     Operational Assessment
ORD    Operational Requirements Document
OTA    Operational Test Authority
OT&E   Operational Test & Evaluation
PM     Program Manager
S&T    Science and Technology
T&E    Test and Evaluation
TEMP   Test and Evaluation Master Plan
TMOT   Test Management Oversight Team

APPENDIX B
T&E Terms and Definitions

Acquisition Decision Authority: With respect to a major DHS acquisition program, the individual within the Department of Homeland Security designated with overall responsibility (including acquisition decision-making) for the program.

Conformity Assessment: An evaluation that determines whether the requirements for a specific system or equipment are fulfilled. It may include sampling and testing; inspection; supplier's declaration of conformity; certification; and quality and environmental management system assessment and registration.

Contractor Test: Testing performed by the contractor or developing agency during the development of a product. This includes component testing, integration testing, and the final system-level test.

Critical Operational Issue (COI): COIs are the operational effectiveness and operational suitability issues (not characteristics, parameters, or thresholds) that must be examined in OT&E to evaluate/assess the system's capability to provide the desired capability.

Critical Technical Parameter (CTP): Measurable critical system characteristics that, when achieved, allow the attainment of desired operational performance capabilities. They are technical measures derived from desired user capabilities. Failure to achieve a critical technical parameter should be considered a reliable indicator that the system is behind in the planned development schedule or will likely not achieve an operational requirement. CTPs directly support Critical Operational Issues.

Demonstration: A limited exhibition of the operation, use, maturity, operational potential, or other characteristic of a device, process, product, technology, or system. Demonstrations are generally conducted to inform interested parties in an explanatory, but not conclusive, manner regarding some aspect of interest of the thing being demonstrated. Demonstrations typically do not provide the levels of statistical confidence, repeatability, closely controlled conditions, and other elements that characterize more formal and rigorous testing.

Developmental Test: Any testing used to assist in the development and maturation of products, product elements, or manufacturing or support processes. Also, any engineering-type test used to verify that design risks are minimized, substantiate achievement of contract technical performance, and certify readiness for Operational Testing.

Early Operational Assessment (EOA): An operational assessment conducted prior to, or in support of, prototype testing. EOAs assess the operational impact of candidate technical approaches and assist in selecting preferred alternative system concepts.

Experiment: A limited trial or tentative procedure conducted to test a principle, supposition, or hypothesis, for the purpose of understanding the behavior of a system or discovering something unknown. Experiments are typically directed toward increasing knowledge and understanding in the field of system research, development, and acquisition supporting the establishment of long-term operational capabilities. Experiments, by themselves, generally do not provide a sufficient basis for conclusions on the operational effectiveness or suitability of an acquired system.

Follow-On Operational Test and Evaluation (FOT&E): Test and evaluation that may be necessary after system deployment to refine estimates made during operational test and evaluation, to evaluate production changes, and to re-evaluate the system to ensure that it continues to meet operational needs.

Independent Verification and Validation (IV&V): The verification and validation of a software product by an organization that is both technically and managerially separate from the organization responsible for developing the product.

Information Assurance: Activities that protect and defend information and information systems by ensuring their:
- Availability: timely, reliable access to services.
- Integrity: protection from unauthorized change.
- Authentication: verification of originator.
- Confidentiality: protection from unauthorized disclosure.
- Non-repudiation: undeniable proof of participation.

Integrated Test and Evaluation Program (ITEP): The planning, execution, and reporting on the totality of test and evaluation events conducted on a system or equipment throughout system technology development and acquisition. The purpose of the ITEP is to ensure that the system and/or component is thoroughly tested, that redundant testing is minimized, and that associated costs and time are conserved.

Integration Testing: The process of testing the segments of a system as they are developed and integrated with other segments to ensure individual interfaces meet the design criteria. This usually follows unit testing, which tests each segment to ensure it meets the design criteria for that segment.

Key Performance Parameter (KPP): Those attributes or characteristics of a system/program/project that are considered critical or essential parts of an effective system/program/project capability.

Maintainability: The ability of a system to be retained in, or restored to, a specified condition when maintenance is performed by personnel having the specified skill levels, using prescribed procedures and resources, at each prescribed level of maintenance and repair. Maintainability is a characteristic of design and installation, expressed as the probability that an item will be retained in or restored to a specified condition within a given period of time, when the maintenance is performed in accordance with prescribed procedures and resources.

Measures of Effectiveness: Measures of Effectiveness (MOEs) are operational outcome measures that identify the most critical performance requirements needed to meet system-level capability objectives.
Operational effectiveness is the overall degree of a system's ability to provide desired capability considering the total operational environment. For example, weapon system effectiveness would consider environmental factors such as operator organization, doctrine, and tactics; survivability; vulnerability; and threat characteristics.

Measure of Performance: Measures of Performance (MOPs) characterize physical or functional attributes relating to the execution of the system's function. They quantify a technical or performance requirement directly derived from MOEs and Measures of Suitability (MOSs). MOPs should relate to these measures such that a change in MOP can be related to a change in MOE or MOS. MOPs are used to derive, develop, support, and document the performance requirements that will be the basis for design activities and process development. They are also used to help identify the critical technical parameters.

Operational Assessment (OA): An evaluation of operational effectiveness and operational suitability made by an independent operational test activity, with user support as required, on other than production systems. OAs focus on developmental efforts, programmatic voids, risk areas, adequacy of requirements, and the ability of the program to support adequate Operational Testing (OT). OAs may be conducted at any time in a program's life cycle using technology demonstrators, prototypes, mock-ups, engineering development models, or simulators, but are typically done early in the concept or development phases. OAs will not substitute for the Operational Test and Evaluation necessary to support full-rate production and deployment decisions.

Operational Effectiveness: Measure of the overall ability of a system to provide desired capability when used by representative personnel in the environment planned or expected for operational employment of the system, considering organization, doctrine, tactics, supportability, survivability, vulnerability, and threat.

Operational Suitability: The degree to which a system can be placed and sustained satisfactorily in field use, with consideration given to availability, compatibility, transportability, interoperability, reliability, wartime usage rates, maintainability, safety, human factors, habitability, manpower, logistics supportability, natural environmental effects and impacts, documentation, and training requirements.

Operational Test: The field test, performed under realistic conditions and overseen and evaluated by an activity independent from the agency developer and user organizations, of any system or key component of a system, for the purposes of determining the effectiveness and suitability of that system or component when used by typical users in the expected operating environment.

Regression Testing: Testing of computer software and/or systems to assure correct performance after changes were made to code that previously performed in a known manner. Regression testing seeks to uncover regression faults, which occur when software functionality that previously worked as desired stops working or no longer works in the way that was previously planned. Regression faults typically occur as an unintended consequence of program changes.

Reliability: The ability of a system to provide desired capability without failure, degradation, or demand on the support system, including the ability to perform required functions in routine and non-routine and/or unexpected circumstances.

Test: A program or procedure designed to obtain, verify, or provide data for the evaluation of any of the following: 1) progress in accomplishing developmental objectives; 2) the performance, operational capability, and suitability of systems, subsystems, components, and equipment items; and 3) the vulnerability and/or lethality of systems, subsystems, components, and equipment items.

Validation: The process of evaluating a system or component (including software) during or at the end of the development process to determine whether it satisfies the specified user's needs. Validation answers the question: Is it the right product for the established need?

Verification: The process of confirming that a system or system element is designed and/or built as intended; in other words, that the system or element meets design-to or build-to specifications. Verification answers the question: Is the product as we expected it to be?
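The COI, MOE/MOS, and MOP definitions above form a decomposition hierarchy: each COI is examined through one or more MOEs or MOSs, which are in turn quantified by MOPs. A minimal sketch of that hierarchy is shown below; the entries are hypothetical examples for illustration only and are not taken from any DHS program.

```python
# Hypothetical COI -> MOE/MOS -> MOP decomposition, illustrating the hierarchy
# described in the definitions above (entries are examples, not DHS data).
coi_tree = {
    "COI-1: Will the system detect threats in the operational environment?": {
        "MOE: Probability of detection of unknown threat": [
            "MOP: Detection range (miles)",
            "MOP: False-alarm rate (per hour)",
        ],
    },
    "COI-2: Can the system be maintained within planned resources?": {
        "MOS: Operational availability": [
            "MOP: Mean Time Between Failures (hours)",
            "MOP: Mean Time To Repair (hours)",
        ],
    },
}

for coi, measures in coi_tree.items():
    print(coi)
    for measure, mops in measures.items():
        print("  " + measure)
        for mop in mops:
            print("    " + mop)
```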

Test and Evaluation Master Plan (TEMP)
for
(PROGRAM TITLE)

Submitted by: Program Manager                                          Date
Endorsed by: User/Operator                                             Date
Approved by: Operational Test Authority (2)                            Date
Approved by: Director, Operational Test and Evaluation (Section D)     Date
Approved by: Acquisition Decision Authority                            Date

(2) The level of the OTA signature is dependent on the level of the ADA.

APPENDIX C

Executive Summary
Revision Summary (if applicable)
Table of Contents
Section A. Introduction
  1. Background
  2. Operational Performance Requirements
  3. Critical Technical Parameters
Section B. Program Summary
  1. Integrated Master Schedule
  2. Management
Section C. Developmental Test and Evaluation Outline
  1. Developmental Test and Evaluation Overview
  2. Developmental Test and Evaluation to Date
  3. Future Developmental Test and Evaluation
  4. Developmental Test and Evaluation Plans and Reports
Section D. Operational Test and Evaluation Outline
  1. Operational Test and Evaluation Overview
  2. Critical Operational Issues
  3. Operational Test and Evaluation to Date
  4. Planned Operational Test and Evaluation
  5. Operational Test and Evaluation Plans and Reports
Section E. Test and Evaluation Resource Summary
  1. Summary
  2. Resource Summary Updates
Appendices
  1. Bibliography
  2. Acronyms
  3. Points of Contact

APPENDIX D
DHS TEMP Review Checklist

GENERAL

1. Does the TEMP accurately describe the comprehensive test and evaluation program, including all known or anticipated T&E activities and events over the program life cycle?
2. Is the TEMP compliant with the format and content of the DHS guidance?
3. Are all changes to the TEMP described in the Revision Summary (if applicable)?
4. For programs with multiple projects, does the TEMP address the required T&E throughout the program's life cycle, including activities for each project/discrete useful segment?
5. Do the operational requirements, Key Performance Parameters (KPPs), and Critical Technical Parameters (CTPs) identified in the TEMP trace to the Operational Requirements Document (ORD)?
6. Does the TEMP show that DT&E and OT&E begin early enough to assess performance, identify risks, and estimate operational potential prior to scheduled production and deployment decisions?
7. Are the required test reports scheduled to be available to support key milestone decisions?
8. Is OT&E overseen and evaluated by an organization independent of the developer and program office?
9. Has adequate time for data analysis and report development been factored into the schedule?
10. Have adequate numbers of test assets been identified for OT and DT?
11. Will production-representative units be available and dedicated to OT?
12. Have an adequate number of test hours been scheduled to resolve suitability COIs?
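Checklist item 12 asks whether enough test hours are planned to resolve suitability COIs. One common way to sanity-check this, which is not prescribed by DHS guidance, is a fixed-duration reliability demonstration calculation that assumes exponentially distributed times between failures: the test time needed to demonstrate a required MTBF at a given confidence level, allowing a given number of failures, follows from the chi-squared distribution. The sketch below uses hypothetical values (the 1000-hour MTBF threshold from Table L-1, 80 percent confidence, and zero to two allowed failures).

```python
# Hedged sketch: required test hours to demonstrate an MTBF at a given
# confidence, assuming exponential times between failures (time-terminated
# test). T >= MTBF_required * chi2.ppf(confidence, 2r + 2) / 2, where r is
# the number of failures allowed during the test.
from scipy.stats import chi2

def required_test_hours(mtbf_required: float, confidence: float, failures_allowed: int) -> float:
    return mtbf_required * chi2.ppf(confidence, 2 * failures_allowed + 2) / 2.0

mtbf_threshold = 1000.0   # hours, hypothetical (Table L-1 reliability threshold)
confidence = 0.80         # illustrative confidence level, not a DHS requirement

for r in range(3):
    hours = required_test_hours(mtbf_threshold, confidence, r)
    print(f"Allowing {r} failure(s): plan roughly {hours:,.0f} test hours")
```

With zero allowed failures the expression reduces to T = -MTBF_required * ln(1 - confidence), about 1,609 hours in this example; allowing more failures increases the hours required, which is why scheduled test time often dominates suitability planning.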