International SEMATECH Semiconductor Industry Standards Conformance Guidelines: Assessment Criteria and Processes, Rev. 2



International SEMATECH and the International SEMATECH logo are registered service marks of International SEMATECH, Inc., a wholly-owned subsidiary of SEMATECH, Inc. Product names and company names used in this publication are for identification purposes only and may be trademarks or service marks of their respective companies. © 2003 International SEMATECH, Inc.

Semiconductor Industry Standards Conformance Guidelines: Assessment Criteria and Processes, Rev. 2

July 31, 2003

Abstract: This document from the MFGM017 project provides direction in applying consistent processes for assuring competent, independent assessment of the conformance of a product to an industry standard. Intended audiences are IC makers who prefer that suppliers provide third-party assessment of conformance or certification, suppliers who are requested to provide third-party assessment or certification, and test service providers who perform assessments. The guideline includes processes for assessments that may be applied to conformance assessment of an implementation with respect to standards relevant to the semiconductor industry. The purpose of this document is to provide generic guidelines and processes for the development of assessment criteria. Additionally, guidance is provided in the appendix for 300 mm equipment communication and automation standards to provide direction for industry-accepted certification assessment criteria and methods that reflect current industry usage. This document complements Technology Transfers # A-ENG, A-ENG, and A-ENG.

NOTICE: International SEMATECH makes no warranties or representations as to the suitability or application of this guideline. The application of this guideline, in part or in whole, is the sole responsibility of the user of this guideline. Users of this guideline are cautioned to consider any manufacturer's instructions, regulatory requirements, contractual agreements, and other relevant materials in the application of the guidance provided herein. This document may undergo further development and is subject to change at the discretion of International SEMATECH. This document is intended as a reference only. Guidance provided herein is not intended to represent or imply industry standard practices or specifications.
Keywords: Procedures, Standards, Test Methods, Standards Conformance

Authors: Jackie Ferrell, Lorn Christal

Approvals: Lorn Christal, Author; Jackie Ferrell, Program Manager, Guidelines & Standards; Randy Goodall, Assoc. Director, Fab Productivity; Laurie Modrey, Technical Information Transfer Team Leader


Table of Contents

1 INTRODUCTION
  1.1 Background
  1.2 Purpose
  1.3 Benefit
  1.4 Scope
  1.5 Limitations
  1.6 Assessment Process
2 GENERATING A KNOWN GOOD TEST (KGT)
  2.1 Essential Element: Identifying Standard Requirements
  2.2 Essential Element: Standards Interpretation and Application
  2.3 Essential Element: Mapping Requirements to Test Cases
  2.4 Essential Element: Generating the Test Plan
  2.5 Essential Element: Test Verification and Validation (V&V)
  2.6 Essential Element: Document Control
3 ASSESSMENT DOCUMENTATION
  3.1 Essential Element: Assessment Plan
  3.2 Essential Element: Test Design Specification
  3.3 Essential Element: Test Case Specification
  3.4 Essential Element: Test Procedure Specification
  3.5 Essential Element: Test Log
  3.6 Essential Element: Test Incident Report
  3.7 Essential Element: Assessment Report
4 PERFORMING AN ASSESSMENT
  4.1 Essential Element: Preparation for Testing
  4.2 Essential Element: Test Sessions
5 REPORTING RESULTS
  5.1 Essential Element: Test Case Evaluation
  5.2 Essential Element: Summarizing Results
6 ASSESSMENT FREQUENCY
APPENDIX A ESSENTIAL ELEMENTS AND REQUIREMENTS
APPENDIX B EQUIPMENT COMMUNICATION AND CONTROL STANDARDS ASSESSMENT CRITERIA AND PROCESSES
  B.1 Test Configuration and Documentation
  B.2 Test Execution
  B.3 Test Algorithms
  B.4 Essential Elements and Requirements
  B.5 Standards Requirements Example
APPENDIX C TERMINOLOGY
APPENDIX D REFERENCES
  D.1 ISMT Documents
  D.2 Industry Standards
  D.3 Additional Information

List of Figures

Figure 1 Flowchart of the Test Development and Deployment Process
Figure 2 Standards Interpretation and Application
Figure 3 Model of a Requirement
Figure 4 Process of Building a Test
Figure 5 Test Equipment Verification and Validation Process Flow Chart
Figure 6 Test Process and Associated Documentation
Figure 7 Results Evaluation Flow Chart

List of Tables

Table 1 Conformance Assessment Metrics
Table 2 Guidelines on the Amount of Testing to be Done on a Certified Product That is Revised
Table 3 Essential Elements and Requirements
Table 4 Equipment Communication and Control Table of Essential Elements and Requirements

Acknowledgments

Key Contributors: Norman Beasley, Consultant; Lorn Christal, ISMT; Trevor Claybrough, Intercentury Software; George Collins, Rudolph Technologies; Blaine Crandell, TI; Guy Davis, Brooks; Cris DeWitt, Agile TCP; Jackie Ferrell, ISMT; Steve Fulton, ISMT; Tom Hogel, PEER Group; Richard Oeschner, Fraunhofer; Jessee Ring, Software Quality First; Glenn Stefanski, IBM; and David Walsh, Brooks

Participants: Timothy Aanerud, Objective Solutions; Jonathan Chang, TSMC; Peter Cross, Intel; Stephan Gramlich, AMD; Les Marshall, AMD; Debora Ortiz, TRW; Keith Peden, Brooks; Dave Reis, Applied Materials; Max Tu, TSMC

Revision History and Schedule

Date         Revision  Description            Comments
Jun 18, 02   Rev       Working Draft
Jun 26, 02   Rev       Working Draft
July 08, 02  Rev       Working Draft
July 19, 02  Rev 0.01  Preliminary Draft      Available on ISMT website for industry review and feedback
Oct 11, 02   Rev 0.1   Industry Review Draft  Available on ISMT website for industry review and feedback
Dec 02, 02   Rev 1.0   Publish                Technology Transfer # A-ENG
July 03, 03  Rev 2.0   Publish


1 INTRODUCTION

1.1 Background

This document provides direction in applying consistent processes for assuring competent, independent assessment of the conformance of a product to an industry standard. Intended audiences are IC makers who prefer suppliers to provide third-party assessment of conformance or certification, suppliers who are requested to provide third-party assessment or certification, and test service providers (TSPs) who perform assessments. The guideline includes processes for assessments that may be applied to conformance assessment of an implementation with respect to standards relevant to the semiconductor industry.

This document provides generic guidelines and processes for the development of assessment criteria and serves as a reference to suppliers and customers in applying best practices to conformance assessments. This document contains Essential Elements and REQUIREMENTS for fulfilling the essential elements (see the appendix for a table of all essential elements and requirements). RECOMMENDED PRACTICES are also included in the guideline. Some criteria for 300 mm communication software assessment are also in the appendix. An example of applying the mapping of standards requirements to test requirements is in the Example of Standards Requirements Mapped to Test Requirements (SEMI E37.1).

1.2 Purpose

The purpose of this document is to provide generic guidelines and processes for the development of test criteria and methods for third-party certification and supplier self-assessment of standards conformance. In this revision, these generic processes have been applied to the 300 mm equipment communication and automation standards to develop industry-accepted certification test criteria and methods that reflect current industry usage.
1.3 Benefit

The expected benefits of this guideline are
- A well-defined, industry-approved assessment methodology that takes into account current industry standards usage
- Basic principles in test criteria identification that can be applied to any standard

1.4 Scope

This version 2.0 of the guideline covers generic processes. Specifically, generic processes are those that can be applied to any requirements documentation (e.g., SEMI standard or guideline) to generate standardized test criteria for that specific requirement set. Generic processes in this guideline include the following:
- A process for deriving test requirements from industry standards and end user requirements (i.e., industry usage)
- Processes and criteria for generating a known good test (KGT), test preparation, execution, and reporting results
- A process that can be applied to testing requirements for new and emerging standards

Additionally, revision 2.0 of the application-specific guidelines includes the following:
- A process for identifying a minimum set of unit and scenario functionality requirements for SEMI standards that form a baseline for certification testing, including SEMI E30, E37, E39, E40, E84, E87, E90, E94, and E109,¹ using E37.1 as an example.

1.5 Limitations

1.6 Assessment Process

Successful implementation of this standardized assessment methodology depends upon industry support. Every effort has been made to include interested parties in the development of these processes and guidelines. Guideline developers include representatives from device manufacturers, equipment suppliers, industry software suppliers, and assessors. Figure 1 illustrates the assessment development process described in this report.

Figure 1 Flowchart of the Test Development and Deployment Process
[Flowchart steps: Start, Types of Standards, Interpretation and Application of Standards, Processing of a Requirement, Process of Building a Test, Test Verification and Validation, Test Plan Generation and Deployment, Conformance Levels, Results Evaluation, End]

¹ This guideline does not require the use of a specific software testing application. Any appropriate testing application may be used to perform the testing requirements and procedures defined in this guideline.

2 GENERATING A KNOWN GOOD TEST (KGT)

Summary: The following subsections define the process by which requirements are identified, test cases are generated, and standardized test plans are created and controlled. Once defined, the assessor uses the test plans during the assessment process.

2.1 Essential Element: Identifying Standard Requirements

Summary: This section defines the types of requirements that can be found in a standard or requirements document. Certification applies only to requirements defined in the target standard. An example of a table that extracts standard requirements for testing is in the ISMT Example of Standards Requirements Mapped to Test Requirements (SEMI E37.1).

REQUIREMENT: The results of identifying each requirement in a standard are documented. The document should identify the standard and its revision level. The documentation should include the following information for each requirement identified in the target standard:
a) Standard section number (to the lowest level possible), paragraph number, and line or item number
b) A unique identifier for each requirement
c) Requirement text
d) Requirement type (fundamental, additional, or optional)
e) Category of function or capability for each requirement

RECOMMENDED PRACTICE: The first step in creating a KGT is to identify the standard requirements. Standards documents present information in various formats: paragraphs, tables, and bulleted lists. Individual requirements can be found in any of these formats. Generally, text found in a standard can be described as one of the following types:
- Descriptive text: Found throughout a standard and generally used to put information into context.
- Requirement text: Found throughout a standard and generally identified by looking for the word "shall" in the sentence. Requirements are also found in tables or lists. Often a single complex sentence in a standard contains several requirements.
- Clarifying text (i.e., Note): Found throughout a standard and generally identified by the word "note" at the beginning of the sentence. Clarifying text is included in a standard to help the user understand a requirement's concept. It is not an official part of the requirement.

To identify individual requirements in a target standard, the standard text must be parsed and categorized as defined above. Current standards contain the following types of requirements:
- Fundamental: These must be implemented to conform to the standard. They constitute the foundation of the standard.
- Additional: These provide functionality or specifications required for some types of factory automation or functionality applicable to specific types of equipment.

- Optional: These provide functionality or specifications that must be implemented by the supplier but may or may not be activated by the user.

Note that requirements defined as additional or optional in a standard may be required by a given purchaser of the equipment. Thus, all requirements must be tested if applicable to the equipment under test. Optional requirements should be configurable at the discretion of the end user.

2.2 Essential Element: Standards Interpretation and Application

Summary: This section addresses the other factors that impact the creation of a KGT. When developing a KGT, it is important that additional factors be considered for the final test plan to be usable by the industry. Factors such as common industry usage, interoperability, and duration of assessment all impact the final usefulness of any KGT. This section identifies these factors and provides guidance on how to evaluate and apply them.

Figure 2 Standards Interpretation and Application
[Diagram: standards requirements (e.g., E15.1, S8, E40, E62, E84, E87) and industry usage feed test development (validated test cases and operational scenarios), which feeds the assessment and the reporting of results.]

REQUIREMENTS:
1. Evaluate each requirement for testability.
2. Evaluate each requirement for industry usage.
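The documentation fields called out in Section 2.1, together with the testability evaluation required above, lend themselves to a simple structured record. The following Python sketch is illustrative only; the class and field names (Requirement, req_id, and so on) are assumptions, not part of this guideline.

```python
from dataclasses import dataclass
from enum import Enum


class ReqType(Enum):
    """Requirement types defined in Section 2.1."""
    FUNDAMENTAL = "fundamental"
    ADDITIONAL = "additional"
    OPTIONAL = "optional"


@dataclass
class Requirement:
    """One requirement extracted from a target standard."""
    req_id: str        # unique identifier, e.g. a hypothetical "E37.1-R042"
    section: str       # section/paragraph/line reference in the standard
    text: str          # the requirement ("shall ...") text
    req_type: ReqType  # fundamental, additional, or optional
    category: str      # category of function or capability
    testable: bool = True  # False for requirements verified by other means


def kgt_candidates(requirements):
    """Requirements eligible for the KGT; untestable requirements are
    excluded but remain documented for verification by inspection,
    analysis, or observation."""
    return [r for r in requirements if r.testable]
```

A record like this makes the later mapping of requirements to test cases (Section 2.3) traceable back to the standard section each test covers.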

RECOMMENDED PRACTICE: The next step in creating a KGT is to consider any external influences that can impact the usefulness of the final KGT. Requirements must be evaluated on an individual basis and as an operational whole. Each requirement identified in the standard must be evaluated for testability. Not all requirements in today's standards are testable. An assessor must be able to set up a controlled testing environment that will focus the testing process on the individual requirement. The assessor must be able to clearly assign a pass or fail outcome based on whether a tool can produce a predefined result in the controlled testing environment.

During the development of the KGT, the analyst must also evaluate the requirements of a standard from an operational perspective. The following factors should be considered when evaluating a standard's full set of requirements:
- Interoperability and Operational Scenarios: Most requirements defined in a standard are designed to interact with other requirements defined in the same or other standards. These interactions must be identified and their use within the user community understood. Any KGT for a standard should include test scenarios that are based on common usage (or expected usage) within the user community. Common scenarios based on interactions between two or more standards should be included either in that standard's KGT or in a separate operational KGT. Scenarios provide information on interactions and expose sequencing errors.
- Untestable Requirements: Requirements that are deemed untestable cannot be included in the KGT but should be noted in the documentation. If possible, the requirement should be verified by other means such as inspection, analysis, or observation, and the results should be documented.

2.3 Essential Element: Mapping Requirements to Test Cases

Summary: This section describes the process by which test cases are derived from requirements.
Test cases are the building blocks of the larger test plan. It is important to properly identify each requirement and develop one or more test cases to confirm proper implementation. This section defines how to develop one or more test cases to confirm a requirement's functionality.

REQUIREMENT:
1. Derive test cases from requirements.

RECOMMENDED PRACTICE:

Conditions, Responses, and Functions
This section identifies the component structure of a requirement's test case. These components are identified for each requirement and used during the creation of a test case.

Figure 3 Model of a Requirement
[Diagram: Conditions and an Event drive Functions, which produce a Response.]

Process of Test Case Creation
This section defines the test case creation process. Using information defined in previous sections, a process is presented whereby the user applies that information during the generation of one or more test cases with conditions and responses. The development of a test should consider the requirements text presented in a fundamental, additional, or optional standard requirement. Functions, conditions, and responses should be related to the requirements text (see section 2.3.1). An example of a test requirement could be "the light must turn GREEN when the SWITCH is moved from the OFF position to the ON position." In this example, the function is the movement of the SWITCH, the conditions are ON and OFF, and the response is GREEN.

Test Building Principles
- Focus on simple (atomic level) functionality
- Attempt to reduce the number of overall tests
- Progress from simple to more complex
- Design tests to be comprehensive
- Self-document (software specific; may not be applicable in most hardware test situations unless recording devices are used)
- Refrain from varying more than one key attribute at a time
- Tests must be reproducible and repeatable
- The test result must be traceable to the test that was performed
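The SWITCH/GREEN example above can be expressed as an atomic test case in code: conditions establish the starting state, the function is the single attribute varied, and the response is compared against the predefined result. This Python sketch uses a hypothetical simulated device; none of these names come from the guideline.

```python
class SimulatedDevice:
    """Hypothetical stand-in for the equipment under test."""
    def __init__(self):
        self.switch = "OFF"

    def set_switch(self, position):
        self.switch = position

    def light_color(self):
        # simulated behavior: the light is GREEN only while the switch is ON
        return "GREEN" if self.switch == "ON" else "OFF"


def switch_test_case(device):
    """Atomic test case for: 'the light must turn GREEN when the SWITCH
    is moved from the OFF position to the ON position'."""
    device.set_switch("OFF")   # condition: establish the starting state
    device.set_switch("ON")    # function: vary exactly one key attribute
    observed = device.light_color()
    # response: a clear pass/fail outcome against the predefined result
    return "Pass" if observed == "GREEN" else "Fail"
```

Because only one attribute is varied and the expected response is fixed in advance, the case is reproducible and its result is traceable to the test that was performed, in line with the principles above.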

Figure 4 Process of Building a Test

2.4 Essential Element: Generating the Test Plan

REQUIREMENTS:
1. Generate an industry-usable KGT by combining previously generated test cases into a repeatable, validated testing structure.
2. Document a test plan.

2.5 Essential Element: Test Verification and Validation (V&V)

Figure 5 Test Equipment Verification and Validation Process Flow Chart

REQUIREMENT:
1. The test process and its associated equipment must be verified and validated.

RECOMMENDED PRACTICE: An accrediting body or standards organization should identify a KGT. In other cases, a TSP may need to develop a test or associated test equipment. If so, the TSP should follow best practices in verifying and validating the test or test equipment. Industry standards are available for verification and validation; for example, the IEEE Standard for Software Verification and Validation. The TSP should be prepared to provide evidence of verification and validation of a candidate test and associated test equipment. Figure 5 is a high-level flow chart of the verification and validation that may be required.

2.6 Essential Element: Document Control

Summary: This section describes the configuration management requirements that should be in place during the KGT creation process. For additional guidance on configuration management best practices, see Technology Transfer # A-ENG.

REQUIREMENTS:
1. The following test documents must be controlled:
a) Requirements identification document
b) Test case specifications
c) Test designs
d) Test plans

RECOMMENDED PRACTICE: All controlled documents should be given a version number that is incremented with each revision of the document. The initial release of a document should be version 1.0. Draft versions for review before initial release should be numbered 0.X. Each controlled document should have its version number indicated on every page of the document. The title page of a document should contain a revision history showing all versions of the document:

Version | Date | Summary of Changes | Author | Approved By

A document library housing soft copies of all documents should be in use and accessible by those who have a need. The document library may contain both controlled and uncontrolled documents. The latest version of documents should be in the library. Old versions do not have to be in the library.
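The version-numbering convention above (0.X drafts, 1.0 at initial release, incremented thereafter) can be sketched as a small helper. This is an illustration under the assumption that each post-release revision bumps the minor number; the guideline itself only requires that the number be incremented with each revision.

```python
def next_version(current: str, release: bool = False) -> str:
    """Return the next version number for a controlled document.
    Drafts before initial release are 0.X; the initial release is 1.0;
    afterwards each revision increments the minor number (assumption)."""
    major, minor = (int(part) for part in current.split("."))
    if release and major == 0:
        return "1.0"               # initial release of a 0.X draft
    return f"{major}.{minor + 1}"  # next draft or post-release revision
```

Each new number would be recorded as a row in the revision history table shown above.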

3 ASSESSMENT DOCUMENTATION

Summary: Assessment documentation is the documentation required to prepare for the assessment and to report its results. Figure 6 is an overview of the testing process and the associated documentation.

Figure 6 Test Process and Associated Documentation

3.1 Essential Element: Assessment Plan

REQUIREMENTS:
1. The assessment plan must contain the following required components:
a) Assessment scope
b) Assessment approach
c) Resources needed to perform the assessment
d) Assessment schedule

e) Items/features to be tested and associated product documentation
f) Testing tasks to be performed
g) Test deliverables
2. The plan must communicate the above components of the testing program to those who will be directly involved.

RECOMMENDED PRACTICE: An assessment planning template is available to help develop the assessment plan. Assessment plans should include the following:
- Document identifier
- Introduction
- Items to be tested
- Features to be tested
- Features not to be tested
- Assessment approach
- Pass/fail criteria for each test
- Criteria for suspending and resuming tests
- Testing tasks
- Test deliverables
- Test environmental and equipment needs
- Assessment staffing needs
- Assessment schedule
- Responsibilities of individuals participating in the assessment
- Risks and contingencies
- Required approvals

3.2 Essential Element: Test Design Specification

REQUIREMENTS:
1. Each test design specification must include the following:
a) Document identifier
b) Features to be tested
c) Approach refinements
d) Test (case) identification
e) Pass/fail criteria for features being tested

RECOMMENDED PRACTICE: The test design specifications should identify the test cases or groups of test cases that will be conducted. Depending on the size and complexity of the software under test and the amount of testing to be performed, there may be either a single test design or several test designs. The test design should provide organization and structure for the test activities by grouping the test cases

logically. The test design should identify the test cases to be run, but will neither define specific inputs and outputs for each test case nor the step-by-step procedures to be followed in conducting the test case. The test design should specify the types of test cases to be run, their logical grouping, their quantity, and the intent of each specific test case.

3.3 Essential Element: Test Case Specification

REQUIREMENTS:
1. Test case specifications must include the following:
a) Document identifier
b) Items to be tested
c) Input specifications
d) Output specifications
e) Environmental needs
f) Special procedural requirements
g) Intercase dependencies

RECOMMENDED PRACTICE: The test case specification should define the input conditions for a specific test case and its expected outcomes (or outputs). Test case specifications should be organized so that they are logically grouped with their associated test design specifications.

3.4 Essential Element: Test Procedure Specification

REQUIREMENTS:
1. Test procedure specifications must include the following:
a) Document identification
b) Purpose
c) Step-by-step procedures
d) Special requirements

RECOMMENDED PRACTICE: Test procedure specifications should define the step-by-step actions that will be carried out in conducting each test case. There should be only one test procedure per test case. Test procedures should be logically grouped with their associated test design specifications.

3.5 Essential Element: Test Log

REQUIREMENTS:
1. A test log must include the following:
a) Document identifier
b) Activity and event entries
c) Description of significant events that occurred during testing
2. The test log must be the only official record of the details of the testing activities and significant events that occurred during testing.

RECOMMENDED PRACTICE: All significant actions taken by the testers while running the test cases should be recorded in the test log. The test log should provide a chronological record of relevant details about the actual execution of a set of test cases. Log files should be provided in a format that can be read by a standard text editor.

3.6 Essential Element: Test Incident Report

REQUIREMENTS:
1. The test incident log must document all events that occur during the assessment activities and that require investigation.
2. A test incident report must include the following:
a) Document identifier
b) Summary
c) Incident description
d) Impact

RECOMMENDED PRACTICE: The assessment incident report draws attention to any issues or abnormal events that occurred during the testing activities. The incident report is intended to trigger action responding to or resolving the incident.

3.7 Essential Element: Assessment Report

REQUIREMENTS:
1. The assessment report must document the assessment activities, evaluate the results, and communicate these to the customer.
2. The assessment report must include the following:
a) Document identifier
b) Summary of testing
c) Summary of results
d) Comprehensive analysis of results
e) Variances
f) Evaluation and summary of activities
g) Approvals
h) Exception handling

RECOMMENDED PRACTICE: The assessment report is the culmination of the process that was started with the test plan and is the only document that officially represents the outcomes of the assessment. The summary report should contain references to the test incident report, test log, and other related assessment results.

4 PERFORMING AN ASSESSMENT

4.1 Essential Element: Preparation for Testing

REQUIREMENTS:
1. Obtain the appropriate documentation from the supplier for the tool being tested to allow for the proper configuration of the test tool.
2. Ensure all documentation delivered to the TSP for tool certification represents an as-delivered and final configuration of the tool.
3. Understand and review information about the equipment manufacturer, model number, and current software versions as well as current software and hardware manuals, communications interface manuals, supplier log files, and input from the equipment supplier (see Assessment Planning Template).
4. Review the equipment supplier's implementation of the standards and agree on the appropriate test methodologies.

4.2 Essential Element: Test Sessions

REQUIREMENTS:
1. The entire test must be performed on one tool and one software or hardware version or one subsystem.
2. For communications and automation tests, create the appropriate test sessions.
3. Note any calibration or tool configuration settings that are set specifically for tool testing so that they can be documented and the tool returned to its normal state once testing is completed.
4. For communications and automation tests, back up any equipment files before beginning the test session.
5. For communications and automation tests, enter the data into the test tool using the equipment supplier's documentation as a reference.
6. Perform unit testing of each of the target standards, followed by operational scenario (specific to communications and automation tests) testing to evaluate how the standards work together.
7. Ensure the client provides the appropriate technical personnel to support and run the test article tool during the certification test procedure.
8.
Ensure the test article tool remains under the control of the tool owner/operator at all times. TSP personnel must not operate or control the tool under test at any time.

RECOMMENDED PRACTICE: The unit testing will evaluate every capability described in the standard using individual test cases. This establishes the basis for executing a reduced set of operational scenarios as a representative sample of the functionality being tested. The TSP should understand and review information about the equipment to determine an appropriate test methodology.

To improve the effectiveness of the assessment, the supplier needs to prepare. The assessment planning template covers the documentation, equipment, and personnel that need to be available

for the test. The supplier should prepare for a pre-assessment dry run in which the supplier uses its own resources to simulate the assessment, discovering problems before the TSP assessment. A dry run done before the TSP visit gives the supplier the opportunity to fix discrepancies before the actual test. Performing a dry run is strongly recommended.

The supplier should provide accurate and complete test article documentation. If not, it is reasonable to expect that the time allocated for the testing will be increased. Generic manuals or documentation should not be accepted for certification. If appropriate and adequate tool documentation is not available for use by the TSP, the test article may still be tested, but the missing documentation should be noted as a failure item to be corrected before final certification can be granted.

On the first day of the test activities, the TSP should conduct a kick-off meeting to introduce the test participants and to review the planned test activities and resource requirements before the testing begins. This facilitates good communication between test team members and increases the likelihood of a successful test session. During this meeting, the TSP should ensure that support will be provided by the equipment supplier for the test. In addition, the team should discuss possible roadblocks and any contingency plans for issues that may arise.

The test activity evaluates the current level and extent of conformance to the standards. The TSP is NOT responsible for investigating or correcting any deficiencies in the tool software or hardware discovered during the test. If issues arise that the TSP cannot resolve during testing, the test should be suspended until the equipment supplier is contacted and the problem is addressed. During the test, the TSP should reserve the right, with the supplier's participation, to append, change, or modify the test plan to accommodate testing.
Once testing has started, changes to the tool software or hardware, or running portions of a test session on a different tool, are not permitted. Ensure that none of the test data is lost or corrupted as a result of suspending a test. The following steps should be taken for any non-emergency stoppage:
- Export or save a copy of the test session
- Exit the test tool
- Disconnect the network connection, if applicable
- Remove the test equipment from the test area

At the conclusion of the testing, a debriefing between the TSP and the supplier participants is recommended. The discussion should focus on issues encountered during the assessment and a brief summary of any deficiencies in the assessment process. A more detailed view of the assessment will be provided to the equipment supplier once the test report is completed.

Following the certification test procedure, it is highly recommended that the test article tool be requalified as being ready for production by the responsible tool owner/operator. Requalification of the test article tool for production is not the TSP's responsibility.

If the TSP finds a significant number of test failures or concludes that it is not possible to complete a valid test, retesting may be needed. It is important for the supplier to completely understand the reasons for test failure so that these failures can be remedied to ensure a successful retest. In almost all cases requiring retesting, a full retest must be done to ensure the final test results are valid.

The TSP should have an action plan to handle invalid test setups, incomplete test cases, test cases left incomplete as a result of previous test failures, TSP error, and other events that may prevent completion of a test. Annotations in the test report should reflect the nature of the testing issues and the corrective actions.

5 REPORTING RESULTS

5.1 Essential Element: Test Case Evaluation

REQUIREMENTS:
1. Following assessment, confirm that all required test conditions were met during the performance of the test and assure that the test was properly executed.
2. Analyze the test results of each unit test.
3. Analyze the test results of the operational scenarios.
4. Determine the root standard function or standard interaction responsible for the non-conformance.
5. Document the objective results and analysis of non-compliant functional areas.
6. Document all exception circumstances in the Test Incident Report.
7. Evaluate the results output of the test case.
8. Document all test case pass/fail results in the test report.
9. Document the failed response criteria for test cases marked "Fail" in the test report.

RECOMMENDED PRACTICE:
Understand what functions are not properly supported and how they may interact with other standards. Investigate the test results of the operational scenarios to determine the root standard function or standard interaction responsible for the non-conformance.

If any of the required test conditions were not met, the test case result should be Exception. In some cases, an exception test case may be retested. If all pass criteria have been met, the test case result should be Pass. If any of the pass criteria are not met, the test case result should be Fail. Document the failed response criteria for test cases marked Fail in the test report.

Following the assessment, evaluate the results and log files of each test case to confirm that all required test conditions were met during the test. This review assures that the test was properly executed.
Archive all test data for future reference.
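As a minimal sketch, the Pass/Fail/Exception decision described above can be expressed as a small function. The names here are hypothetical illustrations, not part of the guideline:

```python
def evaluate_test_case(conditions_met: bool, criteria_results: list) -> str:
    """Classify a test case per the evaluation rules above.

    conditions_met   -- True if all required test conditions were met
    criteria_results -- pass (True) / fail (False) outcome of each pass criterion
    """
    if not conditions_met:
        return "Exception"  # may be eligible for retest
    if all(criteria_results):
        return "Pass"
    return "Fail"  # failed criteria must be documented in the test report

print(evaluate_test_case(True, [True, True]))    # Pass
print(evaluate_test_case(True, [True, False]))   # Fail
print(evaluate_test_case(False, [True, True]))   # Exception
```

Note that the Exception branch is checked first: a test case whose required conditions were not met is neither a Pass nor a Fail, regardless of its criteria results.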

Figure 7 Results Evaluation Flow Chart

5.2 Essential Element: Summarizing Results

REQUIREMENT:
1. Generate results indicators for each full standards-based assessment.

RECOMMENDED PRACTICE:
In the case of a full certification, the equipment should conform 100% with the standard requirements that apply to the equipment under test. The TSP should note in the test report any standard requirements that were not tested or deemed not applicable to the equipment under test.

In a conformance assessment, results indicators can provide a guide to the test coverage, pass/fails for tests completed, and overall results. These indicators do not weight the value of an individual test. It is beyond the scope of this guideline to assess the relative value an individual end user will place on any given requirement. In addition to the test metrics, the test reports should be reviewed for specific failures. Table 1 provides example metrics.

Table 1 Conformance Assessment Metrics
- Coverage % = # Cases Tested / # Standardized Requirements: the extent to which the assessor is able to test the requirements (defined in the Standardized Test)
- Test Result % = # Cases Passed / # Cases Tested: goodness of performance of the cases tested
- Overall Conformance % = Coverage x Test Result: the extent to which the equipment meets the consensus requirements (defined in the Standardized Test)

6 ASSESSMENT FREQUENCY

RECOMMENDED PRACTICE:
Once a product becomes certified, there is no need to retest as long as the product does not change. When the product changes (due to upgrades or enhancements), retest is appropriate. The new features or changes need to be tested, but some of the unchanged features may also need to be retested because of possible unwanted side effects (i.e., regression testing). This is typically handled by evaluating the degree of change and then deciding on an appropriate level of regression testing.
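The three indicators in Table 1 can be computed directly from the case counts. The following is a hypothetical helper, not from the guideline, showing the arithmetic:

```python
def conformance_metrics(n_requirements: int, n_tested: int, n_passed: int) -> dict:
    """Compute the Table 1 indicators as percentages."""
    coverage = n_tested / n_requirements   # Coverage
    test_result = n_passed / n_tested      # Test Result
    return {
        "coverage_pct": 100 * coverage,
        "test_result_pct": 100 * test_result,
        # Overall Conformance is the product of the two ratios
        "overall_conformance_pct": 100 * coverage * test_result,
    }

m = conformance_metrics(n_requirements=200, n_tested=180, n_passed=171)
print({k: round(v, 1) for k, v in m.items()})
```

For example, testing 180 of 200 standardized requirements with 171 passes gives 90% coverage, a 95% test result, and 85.5% overall conformance.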
Whenever any changes are made to a certified product, the supplier should submit a report to the TSP that tested and certified the product. This report identifies each change that was made to the product. Based upon the number and significance of the changes, the TSP will analyze them and determine how much testing of the revised product needs to be done. This could range from a complete retest, to a partial retest, to no retest.

Table 2 Guidelines on the Amount of Testing to be Done on a Certified Product That is Revised

  Number of Changes   Impact: Minimal    Impact: Mid-level   Impact: Severe
  Large               Partial (med)      Complete            Complete
  Medium              Partial (low)      Partial (high)      Complete
  Small               None               Partial (medium)    Partial (high)
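The decision matrix in Table 2 can be encoded as a simple lookup. This is an illustrative sketch with hypothetical names, not an implementation prescribed by the guideline:

```python
# Retest level per Table 2: rows are number of changes, columns are impact.
RETEST_MATRIX = {
    "large":  {"minimal": "partial (med)", "mid-level": "complete",         "severe": "complete"},
    "medium": {"minimal": "partial (low)", "mid-level": "partial (high)",   "severe": "complete"},
    "small":  {"minimal": "none",          "mid-level": "partial (medium)", "severe": "partial (high)"},
}

def retest_level(number_of_changes: str, impact: str) -> str:
    """Look up the recommended amount of retesting for a revised product."""
    return RETEST_MATRIX[number_of_changes.lower()][impact.lower()]

print(retest_level("Medium", "Severe"))   # complete
print(retest_level("Small", "Minimal"))   # none
```

A real TSP would additionally weigh factors the table abstracts away, such as which functional areas the changes touch; the table is only a starting guideline.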


APPENDIX A ESSENTIAL ELEMENTS AND REQUIREMENTS

Table 3 Essential Elements and Requirements

2.1 Identifying Standard's Requirements
1. Each requirement defined in the target standard is identified.
2. Each requirement has a unique identifier.
3. The section(s), paragraph, and lines that contain the requirement text are identified.
4. Each requirement is identified as one of the following: a) Fundamental, b) Additional, c) Optional.
5. The category of functionality or capability for each requirement is identified.

2.2 Standards Interpretation and Application
1. Evaluate each requirement for testability.
2. Evaluate each requirement for industry usage.

2.3 Mapping Requirements to Test Cases
1. Derive test cases from requirements.

2.4 Generating the Test Plan
1. Generate an industry-usable KGT by combining previously generated test cases into a repeatable, validated testing structure.
2. Document a test plan.

2.5 Test Verification and Validation (V&V)
1. The test process and its associated equipment must be verified and validated.

2.6 Document Control
1. The following test documents must be controlled: a) Requirements identification document, b) Test case specifications, c) Test designs, d) Test plans.

3.1 Assessment Plan
1. The assessment plan must contain the following required components: a) Assessment scope, b) Assessment approach, c) Resources needed to perform the assessment, d) Assessment schedule, e) Items/features to be tested and associated product documentation, f) Testing tasks to be performed, g) Test deliverables.
2. The plan must communicate the above components of the testing program to those who will be directly involved.
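Purely as an illustration of the Section 2.1 identification requirements (unique identifier, source location, Fundamental/Additional/Optional classification, functional category, and the Section 2.3 mapping to test cases), a requirement record might be sketched as follows. All names and values here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class StandardRequirement:
    req_id: str            # unique identifier (requirement 2)
    section: str           # section containing the requirement text (requirement 3)
    paragraph: str
    lines: str
    classification: str    # "Fundamental", "Additional", or "Optional" (requirement 4)
    category: str          # functionality/capability category (requirement 5)
    test_case_ids: list = field(default_factory=list)  # mapping to test cases (2.3)

r = StandardRequirement(
    req_id="E30-001", section="4.2", paragraph="2", lines="10-14",
    classification="Fundamental", category="Communications",
)
r.test_case_ids.append("TC-E30-001a")
print(r.req_id, r.classification)  # E30-001 Fundamental
```

Keeping the source location alongside the identifier lets a reviewer trace any test case back to the exact standard text it covers.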

3.2 Test Design Specification
1. Each test design specification must include the following: a) Document identifier, b) Features to be tested, c) Approach refinements, d) Test (case) identification, e) Pass/fail criteria for features being tested.

3.3 Test Case Specification
1. Test case specifications must include the following: a) Document identifier, b) Items to be tested, c) Input specifications, d) Output specifications, e) Environmental needs, f) Special procedural requirements, g) Intercase dependencies.

3.4 Test Procedure Specification
1. Test procedure specifications must include the following: a) Document identification, b) Purpose, c) Step-by-step procedures, d) Special requirements.

3.5 Test Log
1. A test log must include the following: a) Document identifier, b) Activity and event entries, c) Description of significant events that occurred during testing.
2. The test log must be the only official record of the details of the testing activities and significant events that occurred during testing.

3.6 Test Incident Report
1. The test incident log must document all events that occur during the assessment activities and that require investigation.
2. A test incident report must include the following: a) Document identifier, b) Summary, c) Incident description, d) Impact.

3.7 Assessment Report
1. The assessment report must document the assessment activities, evaluate the results, and communicate these to the customer.
2. The assessment report must include the following: a) Document identifier, b) Summary of testing, c) Summary of results, d) Comprehensive assessment, e) Variances, f) Evaluation/summary of activities, g) Approvals, h) Exception handling.

4.1 Preparation for Testing
1. Obtain the appropriate documentation from the supplier for the tool being tested to allow for the proper configuration of the test tool.
2. All documentation delivered to the TSP for tool certification must represent an as-delivered and final configuration of the tool.
3. Understand and review information about the equipment manufacturer, model number, and current software versions, as well as current software and hardware manuals, communications interface manuals, supplier log files, and input from the equipment supplier (see Assessment Planning Template).
4. Review the equipment supplier's implementation of the standards and agree on the appropriate test methodologies.

4.2 Test Sessions
1. The entire test must be performed on one tool and one software or hardware version or one subsystem.
2. For communications and automation tests, create the appropriate test sessions.
3. Note any calibration or tool configuration settings that are set specifically for tool testing so that they can be documented and the tool returned to its normal state once testing is completed.
4. For communications and automation tests, back up any equipment files before beginning the test session.
5. For communications and automation tests, enter the data into the test tool using the equipment supplier's documentation as a reference.
6. Perform unit testing of each of the target standards, followed by operational scenario testing (specific to communications and automation tests) to evaluate how the standards work together.
7. Ensure the client provides the appropriate technical personnel to support and run the test article tool during the certification test procedure.
8. Ensure the test article tool remains under the control of the tool owner/operator at all times. At no time must TSP personnel operate or control the tool under test.

5.1 Test Case Evaluation
1. Following assessment, confirm that all required test conditions were met during the performance of the test and assure that the test was properly executed.
2. Analyze the test results of each unit test.
3. Analyze the test results of the operational scenarios.
4. Determine the root standard function or standard interaction responsible for the non-conformance.
5. Document the objective results and analysis of non-compliant functional areas.
6. Document all exception circumstances in the Test Incident Report.
7. Evaluate the results output of the test case.
8. Document all test case pass/fail results in the test report.
9. Document the failed response criteria for test cases marked "Fail" in the test report.

5.2 Summarizing Results
1. Generate results indicators for each full standards-based assessment.

APPENDIX B EQUIPMENT COMMUNICATION AND CONTROL STANDARDS ASSESSMENT CRITERIA AND PROCESSES

The following guidance applies to assessments of conformance with 300 mm communications and control software standards that use SECS-II protocols. This represents a subset of the criteria and processes that will be developed in future revisions.

B.1 Test Configuration and Documentation

REQUIREMENT:
1. When testing automation software (e.g., E30, E87), the TSP must configure the test tool with the following information:
   a) Collection Events
   b) Status Variables
   c) Equipment Constants
   d) Data Variables

RECOMMENDED PRACTICE:
Unit testing evaluates every capability described in the standard using individual test cases. This establishes the basis for executing a reduced set of operational scenarios as a representative sample of the functionality being tested. The implementation of optional functions or services defined in the standards should take into account the preferences and usage of the end user. The TSP should understand and review information about the equipment to determine an appropriate test methodology.

To improve the effectiveness of the assessment, the supplier needs to prepare. The Assessment Planning Template covers the documentation, equipment, and personnel that need to be available for the test. The template also helps the supplier prepare for a pre-assessment dry run in which the supplier uses its own resources to simulate the assessment, discovering problems before the TSP assessment. When done before the TSP visit, the supplier has the opportunity to fix discrepancies before the actual test. Performing a dry run is strongly recommended.

The supplier should provide accurate and complete test article documentation. If not, it is reasonable to expect that the duration of the testing will be increased. Generic manuals or documentation should not be accepted for certification.
If appropriate and adequate tool documentation is not available for use by the TSP, the test article may still be tested, but it should be noted as a failure item to be corrected before final certification can be granted. The test article tool's software (GEM) manual should be submitted to the TSP at least two weeks before the scheduled test date to allow adequate time to review and prepare for the test procedure, as well as to make the best use of available tool time by preconfiguring the testing software.

The test article's documentation must list all valid Alarms and briefly describe their individual meaning. The alarm data must include all valid ALIDs. Each unique ALID (alarm identifier) requires ALTX (alarm text), ALCD (alarm status), and two unique CEIDs, one for set and one for cleared.

The test article's documentation must list all valid Data Items and briefly describe their individual meaning. This data must include all valid VIDs, SVIDs, and ECIDs. For each Data Item, the format, default value, minimum and maximum values, the unit, and a description are required. The test article's manual must also document when data values (DVVALs) are valid.

The test article's documentation must list all valid Collection Events. This data must include all valid CEIDs and briefly describe their individual meaning.

Before a certification test is begun, the TSP should confirm that the current tool software, including setup parameters and equipment constants, has been backed up to ensure that the tool can be returned to production in a known good configuration. Changes to the tool software or hardware, or moving portions of a test session to a different tool once the testing has started, are not permitted.

B.2 Test Execution

A portion of SEMI E30 testing requires the successful testing of both DELETE PROCESS RECIPE and DELETE ALL PROCESS RECIPES functionality. If this testing is to be performed on a production tool, extra diligence in the preservation of process recipes is highly recommended. If it is neither possible nor practical to execute these specific tests on the test article, which is also a production tool, the TSP must be notified before the start of testing not to test for this functionality. Conditional certification will note this situation as a failure and record that it was at the request of the test article submitter because the test article was in a production environment.

Ensure that none of the test data is lost or corrupted as a result of suspending a test.
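The alarm documentation rule above (each ALID needs an ALTX, an ALCD, and two distinct CEIDs for set and cleared) lends itself to a mechanical completeness check. The following is a hedged sketch with hypothetical field names, not a tool prescribed by the guideline:

```python
def alarm_doc_complete(alarm: dict) -> bool:
    """Check one documented alarm for the fields required above:
    ALTX (alarm text), ALCD (alarm status), and two unique CEIDs,
    one for the alarm set and one for the alarm cleared."""
    ceids = alarm.get("ceids", {})
    return (
        bool(alarm.get("altx"))                      # alarm text present
        and "alcd" in alarm                          # alarm status present
        and {"set", "cleared"} <= ceids.keys()       # both CEIDs documented
        and ceids["set"] != ceids["cleared"]         # and they are unique
    )

ok = {"alid": 1001, "altx": "Chamber over temperature",
      "alcd": 1, "ceids": {"set": 5001, "cleared": 5002}}
bad = {"alid": 1002, "altx": "Door open",
       "alcd": 1, "ceids": {"set": 6001, "cleared": 6001}}  # CEIDs not unique
print(alarm_doc_complete(ok))   # True
print(alarm_doc_complete(bad))  # False
```

Running such a check over the supplier-provided alarm list before the test session would surface documentation gaps early, when they are cheapest to fix.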
The following steps should be taken for any non-emergency stoppage:
- Export or save a copy of the test session
- Exit the test tool
- Disconnect the network connection
- Remove the test equipment from the test area

The test article tool must remain under the control of the tool owner/operator at all times. At no time must TSP personnel operate or control the tool under test.

The following list is an example of what might be needed to define a supplier's process run. The steps will vary depending on the type of test to be performed. For example, testing for automation level and functional behavior of the tool would include the following:
- Initialization
- Material placed/removed events
- Material move-in, move-out scenarios
- OSS services (e.g., create job)
- Control and process management commands (e.g., CJ start, CJ stop/abort, PR start, PR stop)
- Other remote commands

B.3 Test Algorithms

REQUIREMENT:
1. If the test case has an associated automated test script, a brief description of the test case and the algorithm used in the test script should be documented. An example is provided below.

RECOMMENDED PRACTICE:
The following is an example of a test algorithm. Individual test specifications are typically incorporated into a test case and test plan. An algorithm in pseudocode is, by itself, insufficient to describe a test case; descriptions, categories, and even sub-categories as described below may be needed. Test specifications with very tight algorithms may run differently depending on the data used for the test. Test setup will also affect the outcome. The test specification needs to map clearly to the standard requirements being tested.

Description: A brief description of each test case.

Category: Each test case is assigned to a functional category or capability. This category is used to associate test cases within various reports.

Data: This data is used by the automated test script for setup or, in some cases, expected results. This data is unique to the equipment being tested.

Algorithm: If the test case has an associated automated test script, a brief description of the script's algorithm is provided here. There are many models to choose from, but to avoid interpretation issues it is best to itemize all expected and unexpected results as well as a default condition.

Steps: Instructions that provide the details on how the test case should be executed. It is important that these steps be followed for each test case since they provide setup and analysis instructions.

Requirements: The requirements (if assigned) that this test case verifies or, in conjunction with other test cases, helps to verify. The listing includes summary titles of each requirement. The ID field is assigned as a unique identifier for the requirement.
This ID can then be used to look up the actual text in the requirements specification. An asterisk (*) next to an ID value indicates that the requirement is considered a primary requirement for this test case; in other words, one of the main intentions of the test case is to verify that requirement in particular. Requirements without an asterisk are still verified by the test case, but typically as a byproduct of verifying the primary requirement(s).
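The test specification fields described above (Description, Category, Data, Algorithm, Steps, Requirements with primary/secondary markers) could be captured, purely as an illustrative sketch with hypothetical names, in a structure like this:

```python
from dataclasses import dataclass, field

@dataclass
class TestCaseSpec:
    description: str                    # brief description of the test case
    category: str                       # functional category or capability
    data: dict = field(default_factory=dict)    # equipment-specific setup/expected data
    algorithm: str = ""                 # summary of the automated test script
    steps: list = field(default_factory=list)   # execution/setup/analysis instructions
    # requirement ID -> True if primary (the asterisk), False if verified as a byproduct
    requirements: dict = field(default_factory=dict)

tc = TestCaseSpec(
    description="Verify alarm set/cleared event reporting",
    category="Alarm Management",
    requirements={"E30-017": True, "E30-002": False},
)
primary = [rid for rid, is_primary in tc.requirements.items() if is_primary]
print(primary)  # ['E30-017']
```

Modeling the asterisk as a boolean per requirement ID makes it straightforward to report, for any requirement, which test cases target it directly and which cover it incidentally.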


More information

REQUIREMENTS FOR SAFETY RELATED SOFTWARE IN DEFENCE EQUIPMENT PART 1: REQUIREMENTS

REQUIREMENTS FOR SAFETY RELATED SOFTWARE IN DEFENCE EQUIPMENT PART 1: REQUIREMENTS Ministry of Defence Defence Standard 00-55(PART 1)/Issue 2 1 August 1997 REQUIREMENTS FOR SAFETY RELATED SOFTWARE IN DEFENCE EQUIPMENT PART 1: REQUIREMENTS This Part 1 of Def Stan 00-55 supersedes INTERIM

More information

Curtis Doss Deshraj Singh Rick Scott. Spansion APM Group. Building the stronger manufacturing systems software

Curtis Doss Deshraj Singh Rick Scott. Spansion APM Group. Building the stronger manufacturing systems software Curtis Doss Deshraj Singh Rick Scott Spansion APM Group Building the stronger manufacturing systems software Dec 4 th 2007 ISMI e-manufacturing Workshop Tokyo, Japan 2 Overview Spansion history and global

More information

Oracle Fusion Financial Reporting Compliance Cloud. What s New in Release 10

Oracle Fusion Financial Reporting Compliance Cloud. What s New in Release 10 Oracle Fusion Financial Reporting Compliance Cloud What s New in Release 10 July 2015 TABLE OF CONTENTS OVERVIEW... 3 GIve Us Feedback... 3 RELEASE FEATURE SUMMARY... 3 ORACLE FUSION FINANCIAL REPORTING

More information

VNF Lifecycle Management

VNF Lifecycle Management Case Study Lifecycle Customer Profile Customer: Cloud Service Provider (CSP) Industry: Telecommunications Employees: 22,500 (2016) Customers: 3+ Million The Challenge A CSP finds that rolling out new services,

More information

VC SOFTWARE PROJECT MANAGEMENT PLAN

VC SOFTWARE PROJECT MANAGEMENT PLAN VC SOFTWARE PROJECT MANAGEMENT PLAN Supporting Process Plan This part will contain plans for the supporting processes that span the duration of the software project. Team #4 Members: Yazeed Al-Swailem

More information

Demand Management User Guide. Release

Demand Management User Guide. Release Demand Management User Guide Release 14.2.00 This Documentation, which includes embedded help systems and electronically distributed materials (hereinafter referred to as the Documentation ), is for your

More information

Quality Manual. Specification No.: Q Revision 07 Page 1 of 14

Quality Manual. Specification No.: Q Revision 07 Page 1 of 14 Page 1 of 14 Quality Manual This Quality Manual provides the overall quality strategy and objectives of Pyramid Semiconductor s quality system. It is based on the requirements of ISO 9000. This manual

More information

Test Management Test Planning - Test Plan is a document that is the point of reference based on which testing is carried out within the QA team.

Test Management Test Planning - Test Plan is a document that is the point of reference based on which testing is carried out within the QA team. Test Management Test Planning - Test Plan is a document that is the point of reference based on which testing is carried out within the QA team. - It is also a document we share with the Business Analysts,

More information

UPGRADE CONSIDERATIONS Appian Platform

UPGRADE CONSIDERATIONS Appian Platform UPGRADE CONSIDERATIONS Appian Platform ArchiTECH Solutions LLC 7700 Leesburg Pike #204 www.architechsolutions.com 703-972-9155 atsdelivery@architechsolutions.com TABLE OF CONTENTS Introduction... 3 Upgrade

More information

P. 1. Identify the Differences between ISO9001:2000 與 ISO9001:2008 ISO9001:2008 ISO9001:2000 版本的異同. 5 January 2009 ISO 9000 SERIES

P. 1. Identify the Differences between ISO9001:2000 與 ISO9001:2008 ISO9001:2008 ISO9001:2000 版本的異同. 5 January 2009 ISO 9000 SERIES Identify the Differences between ISO9001:2000 and ISO 9001:2008 審視 ISO9001:2000 與 ISO9001:2008 版本的異同 ISO 9000 SERIES ISO 19011 ISO9000 5 January 2009 ISO9001 ISO9004 2 ISO 9000 SERIES ISO 9001 ISO 9000

More information

IIBA Global Business Analysis Core Standard. A Companion to A Guide to the Business Analysis Body of Knowledge (BABOK Guide) Version 3

IIBA Global Business Analysis Core Standard. A Companion to A Guide to the Business Analysis Body of Knowledge (BABOK Guide) Version 3 IIBA Global Business Analysis Core Standard A Companion to A Guide to the Business Analysis Body of Knowledge (BABOK Guide) Version 3 International Institute of Business Analysis, Toronto, Ontario, Canada.

More information

ROUND LAKE AREA SCHOOLS DISTRICT 116: LIMITED COMMISSIONING GUIDELINES INTRODUCTION

ROUND LAKE AREA SCHOOLS DISTRICT 116: LIMITED COMMISSIONING GUIDELINES INTRODUCTION INTRODUCTION Commissioning (Cx) is a quality assurance process that works to ensure the design intent of a building is fully realized. It requires a systematic approach to review, verify, and document

More information

See What's Coming in Oracle Talent Management Cloud

See What's Coming in Oracle Talent Management Cloud See What's Coming in Oracle Talent Management Cloud Release 9 Release Content Document 1 TABLE OF CONTENTS REVISION HISTORY... 3 HCM COMMON FEATURES... 4 HCM Extracts... 4 Deliver Extracts Using HCM Connect...

More information

Air Monitoring Directive Chapter 5: Quality System

Air Monitoring Directive Chapter 5: Quality System Air Monitoring Directive Chapter 5: Quality System Version Dec 16, 2016 Amends the original Air Monitoring Directive published June, 1989 Title: Air Monitoring Directive Chapter 5: Quality System Number:

More information

Building an Enterprise QA Centre of Excellence Best Practices Discussion IBM Corporation

Building an Enterprise QA Centre of Excellence Best Practices Discussion IBM Corporation Building an Enterprise QA Centre of Excellence Best Practices Discussion 2015 IBM Corporation Objectives Pleased to take this opportunity to present the Enterprise QACoE The objectives of the session are

More information

ISTQB Sample Question Paper Dump #11

ISTQB Sample Question Paper Dump #11 ISTQB Sample Question Paper Dump #11 1. Which of the following is true a. Testing is the same as quality assurance b. Testing is a part of quality assurance c. Testing is not a part of quality assurance

More information

Oracle Fusion Applications Project Management, Project Costs Guide. 11g Release 1 (11.1.4) Part Number E

Oracle Fusion Applications Project Management, Project Costs Guide. 11g Release 1 (11.1.4) Part Number E Oracle Fusion Applications Project Management, Project Costs Guide 11g Release 1 (11.1.4) Part Number E22600-04 March 2012 Oracle Fusion Applications Project Management, Project Costs Guide Part Number

More information

Rational ClearQuest 8.0 Release Report

Rational ClearQuest 8.0 Release Report Rational ClearQuest 8.0 Release Report Dated: 01 November, 2011 Updated: 18 July, 2012 IBM Corporation 2012 Trademarks IBM, the IBM logo, and ibm.com are trademarks of International Business Machines Corp.,

More information

IBM Tivoli Monitoring

IBM Tivoli Monitoring Monitor and manage critical resources and metrics across disparate platforms from a single console IBM Tivoli Monitoring Highlights Proactively monitor critical components Help reduce total IT operational

More information

POLICY MANUAL FOR ISO 9001:2008. Document: PM-9001:2008 Date: April 7, Uncontrolled Copy

POLICY MANUAL FOR ISO 9001:2008. Document: PM-9001:2008 Date: April 7, Uncontrolled Copy POLICY MANUAL FOR ISO 9001:2008 Document: PM-9001:2008 Date: April 7, 2015 REVIEWED BY: Tim Powers DATE: 4-7-2015 APPROVED BY: C._Bickford Uncontrolled Copy DATE: 4-7-2015 1.0 GENERAL ISS: 1 REV: E Page:

More information

State of Washington. WIC Cascades Project MIS Transfer and Implementation Scope of Work. Department of Health

State of Washington. WIC Cascades Project MIS Transfer and Implementation Scope of Work. Department of Health State of Washington Department of Health Prevention & Community Health Division Office of Nutrition Services Women, Infants, and Children (WIC) Program MIS Transfer and Implementation Scope of Work July

More information

Document ID: Revision: Date. Approved:

Document ID: Revision: Date. Approved: Document ID: Q&EMSM Standard: ISO 9001 / ISO 14001 Title: Quality and Environmental Management System Manual Approved By: Revision: ED170616 Date Approved: 6/26/17 Quality and Environmental Management

More information

Agile Product Lifecycle Management

Agile Product Lifecycle Management Agile Product Lifecycle Management Agile Plug-in for Enterprise Manager User Guide Release 9.3.3 E39304-02 December 2013 Agile Plug-in for Enterprise Manager User Guide, Release 9.3.3 E39304-02 Copyright

More information

DESIGN & CONSTRUCTION PHASE COMMISSIONING PLAN TEMPLATE

DESIGN & CONSTRUCTION PHASE COMMISSIONING PLAN TEMPLATE DESIGN & CONSTRUCTION PHASE COMMISSIONING PLAN TEMPLATE Based upon B3 Minnesota Sustainable Building Guidelines VERSION 2.1 Notes to the reader have been added to this document within numerous text boxes

More information

Errata 1 st Printing. Errata 2 nd Printing

Errata 1 st Printing. Errata 2 nd Printing Errata 1 st Printing NOTE: The following errata only pertain to the first printing of the PMBOK Guide Fifth Edition. In order to verify the print run of your book (or PDF), refer to the bottom of the copyright

More information

CMMI-SVC V1.3 CMMI for Services Version 1.3 Quick Reference Guide

CMMI-SVC V1.3 CMMI for Services Version 1.3 Quick Reference Guide processlabs CMMI-SVC V1.3 CMMI for Services Version 1.3 Quick Reference Guide CMMI-SVC V1.3 Process Areas Alphabetically by Process Area Acronym processlabs CAM - Capacity and Availability Management...

More information

CQR-1. CONTRACTOR QUALITY REQUIREMENTS for CONSTRUCTION SERVICES Revision Date: 6/8/2015

CQR-1. CONTRACTOR QUALITY REQUIREMENTS for CONSTRUCTION SERVICES Revision Date: 6/8/2015 CQR-1 CONTRACTOR QUALITY REQUIREMENTS for CONSTRUCTION SERVICES Revision Date: 6/8/2015 SCOPE This document establishes the minimum quality program requirements for a contractor providing equipment, material,

More information

GENERAL PRINCIPLES OF SOFTWARE VALIDATION

GENERAL PRINCIPLES OF SOFTWARE VALIDATION GUIDANCE FOR INDUSTRY GENERAL PRINCIPLES OF SOFTWARE VALIDATION DRAFT GUIDANCE Version 1.1 This guidance is being distributed for comment purposes only. Draft released for comment on: June 9, 1997 Comments

More information

AS9003A QUALITY MANUAL

AS9003A QUALITY MANUAL AS9003A QUALITY MANUAL Origination Date: (month/year) Document Identifier: Date: Document Status: Document Link: AS9003A Latest Revision Date Draft, Redline, Released, Obsolete Location on Server (if used)

More information

Low-Level Design Validation and Testing

Low-Level Design Validation and Testing Low-Level Design Validation and Testing Service Description Document November 2009 Contents 1. Introduction... 2 2. Eligibility and Prerequisites... 2 3. Service Features and Deliverables... 2 4. Customer

More information

Best Practices for Implementing Contact Center Experiences

Best Practices for Implementing Contact Center Experiences Best Practices for Implementing Contact Center Experiences Oracle Service Cloud Agent Desktop O R A C L E B E S T P R A C T I C E P A P E R A U G U S T 2 0 1 6 Table of Contents Introduction 2 Understanding

More information

CHAPTER 2: IMPLEMENTATION PHASES AND OFFERINGS

CHAPTER 2: IMPLEMENTATION PHASES AND OFFERINGS CHAPTER 2: IMPLEMENTATION PHASES AND OFFERINGS Objectives Introduction The objectives are: Describe the purpose of the phase planning activity, preconditions, and deliverables in the implementation methodology.

More information

Manufacturing Routing and Work Centers

Manufacturing Routing and Work Centers and Work Centers November 8, 2017 2017.2 Copyright 2005, 2017, Oracle and/or its affiliates. All rights reserved. This software and related documentation are provided under a license agreement containing

More information

Definitions contained in the above mentioned Specifications and Industry Standards are applicable herein.

Definitions contained in the above mentioned Specifications and Industry Standards are applicable herein. 1. SCOPE Quality Specification TEC-1019 12 Jul 11 Rev C All Paragraphs Revised Global Quality Management System Supplement for the Aerospace Industry Model, AS 9100 (C) 1.1. Content This specification

More information

SERVICE PROCEDURE NOVEMBER 2011

SERVICE PROCEDURE NOVEMBER 2011 DERBYSHIRE FIRE & RESCUE SERVICE SERVICE PROCEDURE INCIDENT COMMAND TRAINING AND ASSESSMENT NOVEMBER 2011 VERSION 2.0 CONTENTS INTRODUCTION Introduction Procedure Training Courses Assessments Appeals Maintenance

More information

EXHIBIT A Scope of Services Program Services Scheduling, GIS and Web Support

EXHIBIT A Scope of Services Program Services Scheduling, GIS and Web Support EXHIBIT A Scope of Services Program Services Scheduling, GIS and Web Support 1. OBJECTIVE The Florida Department of Transportation (FDOT) requires a Vendor to provide for services to support the District

More information

Testing. CxOne Standard

Testing. CxOne Standard Testing CxOne Standard CxStand_Testing.doc November 3, 2002 Advancing the Art and Science of Commercial Software Engineering Contents 1 INTRODUCTION... 1 1.1 OVERVIEW... 1 1.2 GOALS... 1 1.3 BACKGROUND...

More information

SKYCOAT, LLC QUALITY CONTROL MANUAL

SKYCOAT, LLC QUALITY CONTROL MANUAL QUALITY MANUAL SKYCOAT, LLC QUALITY CONTROL MANUAL Rev. 4 Rev. 2 1 TABLE OF CONTENTS Introduction 1 Organization 1.1 Scope 1.2 Scope Description 1.2.1 Scope Exclusions 1.2.2 Normative References 2.0 Terms

More information

KPMG s Major Projects Advisory Project Leadership Series: Stakeholder Management and Communication

KPMG s Major Projects Advisory Project Leadership Series: Stakeholder Management and Communication KPMG Global Energy Institute KPMG International KPMG s Major Projects Advisory Project Leadership Series: Stakeholder Management and Communication Stakeholder management and communication is critical to

More information

Quality Assurance Plan D9.1.1

Quality Assurance Plan D9.1.1 Quality Assurance Plan D9.1.1 Deliverable Number: D9.1.1 Contractual Date of Delivery: month 3 Actual Date of Delivery: 27/07/2001 Title of Deliverable: Quality Assurance Plan Work-Package contributing

More information

ATTACHMENT D SCOPE OF SERVICES

ATTACHMENT D SCOPE OF SERVICES ATTACHMENT D SCOPE OF SERVICES OBJECTIVE Owner s Capital Improvement Program (major capital, minor construction, repair, and rehabilitation projects) includes numerous construction and renovation projects.

More information

A02 Assessment Rating Guide Revision 2.9 August 21, 2016

A02 Assessment Rating Guide Revision 2.9 August 21, 2016 Revision 2.9 August 21, 2016 Laboratory Name: Assessment Date: (Lead) Assessor: Signature: ! TABLE OF CONTENTS 1.0! INTRODUCTION... 1 2.0! ASSESSOR NOTES ON THE USE OF CALA A02 ASSESSMENT RATING GUIDE...

More information

FEC QUALITY MANUAL. Federal Equipment Company River Rd. Cincinnati, Ohio

FEC QUALITY MANUAL. Federal Equipment Company River Rd. Cincinnati, Ohio QMS-20 2016 FEC QUALITY MANUAL Federal Equipment Company 5298 River Rd Cincinnati, Ohio 45233 www.federalequipment.com www.fecheliports.com www.usdrillhead.com www.orionseals.com www.tkf.com REVISION:

More information

Quality Manual. This manual complies with the requirements of the ISO 9001:2015 International Standard.

Quality Manual. This manual complies with the requirements of the ISO 9001:2015 International Standard. Quality Manual This manual complies with the requirements of the ISO 9001:2015 International Standard. Northeast Power Systems, Inc. 66 Carey Road Queensbury, New York 12804 Quality Manual Rev 0 Printed

More information

Quality Manual ISO 9001:2000

Quality Manual ISO 9001:2000 Quality Manual ISO 9001:2000 Page 2 of 23 TABLE OF CONTENTS COVER PAGE...1 TABLE OF CONTENTS...2 SIGNATURES...3 QUALITY POLICY...4 INTRODUCTION...5 CORPORATE PROFILE...9 4.O QUALITY MANAGEMENT SYSTEM...10

More information

Oracle Systems Optimization Support

Oracle Systems Optimization Support Oracle Systems Optimization Support Oracle Systems Optimization Support offerings provide customers with welldefined packaged services. Let Oracle Advanced Customer Support help you make the most of your

More information

Effective Test Automation of SAP Implementations

Effective Test Automation of SAP Implementations Effective Test Automation of SAP Implementations Vipin Kumar Managing Director & Software Engineering Evangelist Astra Infotech Pvt Ltd vk@astrainfotech.com QM15 2009 IBM Corporation Agenda Introduction

More information

MANUAL QUALITY CONTROL & QUALITY ASSURANCE

MANUAL QUALITY CONTROL & QUALITY ASSURANCE MANUAL QUALITY CONTROL & QUALITY ASSURANCE METROTEC ENGINEERING LLC P.O. BOX: 26045, DUBAI U.A.E TEL : 043889771 FAX:043889772 E Mail: metrotecengg@yahoo.com info@metrotec.ae Web: www.metrotec.ae 2 TABLE

More information

Machined Integrations, LLC

Machined Integrations, LLC QUALITY MANUAL Machined Integrations, LLC ISO9001: 2008 Electronically Controlled by Quality Representative, Rev2, January 2014 Page 2 of 25 TABLE OF CONTENTS SECTION ELEMENT PAGE No A Revision and Approval

More information

SAP Road Map for Governance, Risk, and Compliance Solutions

SAP Road Map for Governance, Risk, and Compliance Solutions SAP Road Map for Governance, Risk, and Compliance Solutions Q4 2016 Customer Disclaimer The information in this presentation is confidential and proprietary to SAP and may not be disclosed without the

More information

9100 revision Changes presentation clause-by-clause. IAQG 9100 Team November 2016

9100 revision Changes presentation clause-by-clause. IAQG 9100 Team November 2016 Changes presentation clause-by-clause IAQG 9100 Team November 2016 INTRODUCTION In September 2016, a revision of the 9100 standard has been published by the IAQG (International Aerospace Quality Group)

More information

BAITY SCREW MACHINE PRODUCTS QUALITY MANUAL

BAITY SCREW MACHINE PRODUCTS QUALITY MANUAL BAITY SCREW MACHINE PRODUCTS QUALITY MANUAL Page 1 of 33 TABLE OF CONTENTS SECTION TITLE PAGE 0 Company Introduction... 4 0 Organizational Chart.. 5 1 Scope.. 6 2 Related Documents... 6 3 Terminology...

More information

SAP SuccessFactors Foundation

SAP SuccessFactors Foundation SAP SuccessFactors Foundation Technical and Functional Specifications CUSTOMER TABLE OF CONTENTS KEY FEATURES AND FUNCTIONALITIES... 3 INTELLIGENT SERVICES... 4 Activities... 4 Administration... 4 INTEGRATION

More information

SQR-009 QUALITY ASSURANCE ACCEPTANCE SAMPLING REQUIREMENTS FOR SUPPLIERS

SQR-009 QUALITY ASSURANCE ACCEPTANCE SAMPLING REQUIREMENTS FOR SUPPLIERS SQR-009 QUALITY ASSURANCE ACCEPTANCE SAMPLING REQUIREMENTS FOR SUPPLIERS Revision: A VOUGHT SUPPLIER QUALITY REQUIREMENTS SQR-009 Table of Contents Section Page Revision Record 3 Introduction 4 General

More information

ENOVIA Product Quality Central

ENOVIA Product Quality Central ENOVIA Product Quality Central ENOVIA Product Quality Central manages complaints and non-conformance reports (NCRs) so companies can avoid compliance risk, reduce waste and increase the ability to leverage

More information

Oracle Fusion Applications Workforce Development Guide. 11g Release 1 (11.1.4) Part Number E

Oracle Fusion Applications Workforce Development Guide. 11g Release 1 (11.1.4) Part Number E Oracle Fusion Applications Workforce Development Guide 11g Release 1 (11.1.4) Part Number E22777-04 March 2012 Oracle Fusion Applications Workforce Development Guide Part Number E22777-04 Copyright 2011-2012,

More information