NI TestStand I: Introduction Course Manual
Course Software Version 4.0
July 2007 Edition
Part Number J-01

Copyright National Instruments Corporation. All rights reserved. Under the copyright laws, this publication may not be reproduced or transmitted in any form, electronic or mechanical, including photocopying, recording, storing in an information retrieval system, or translating, in whole or in part, without the prior written consent of National Instruments Corporation.

National Instruments respects the intellectual property of others, and we ask our users to do the same. NI software is protected by copyright and other intellectual property laws. Where NI software may be used to reproduce software or other materials belonging to others, you may use NI software only to reproduce materials that you may reproduce in accordance with the terms of any applicable license or other legal restriction.

Trademarks
National Instruments, NI, ni.com, NI TestStand, and LabVIEW are trademarks of National Instruments Corporation. Refer to the Terms of Use section on ni.com/legal for more information about National Instruments trademarks. Other product and company names mentioned herein are trademarks or trade names of their respective companies. Members of the National Instruments Alliance Partner Program are business entities independent from National Instruments and have no agency, partnership, or joint-venture relationship with National Instruments.

Patents
For patents covering National Instruments products, refer to the appropriate location: Help»Patents in your software, the patents.txt file on your CD, or ni.com/legal/patents.

Worldwide Technical Support and Product Information
ni.com

National Instruments Corporate Headquarters
North Mopac Expressway, Austin, Texas, USA

Worldwide Offices
Australia, Austria, Belgium, Brazil, Canada, China, Czech Republic, Denmark, Finland, France, Germany, India, Israel, Italy, Japan, Korea, Lebanon, Malaysia, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Russia, Singapore, Slovenia, South Africa, Spain, Sweden, Switzerland, Taiwan, Thailand, Turkey, United Kingdom

For further support information, refer to the Additional Information and Resources appendix. To comment on National Instruments documentation, refer to the National Instruments Web site at ni.com/info and enter the info code feedback.

Contents

Student Guide
A. NI Certification
B. Course Description
C. What You Need to Get Started
D. Installing the Course Software
E. Course Goals
F. Course Conventions

Lesson 1: Introduction to Testing
A. Types of Tests
B. Automated Testing
C. Test System Terminology
D. Components of an Automated Test System
E. Testing Roles
F. Test System Development Process
G. Covering and Tracking Requirements
Summary Quiz

Lesson 2: Introduction to TestStand
A. Role of Test Management Software
B. TestStand Terminology
C. TestStand Components
D. Using the TestStand Sequence Editor
E. TestStand Execution Architecture
Summary Quiz

Lesson 3: Analyzing Sequences
A. Debugging and Diagnostics
B. Informational Tools
Summary Quiz

Lesson 4: Creating Sequences
A. Overview
B. Creating Steps
C. Code Modules
D. Subsequences
E. Sequence Properties
F. Sequence File Properties
Summary Quiz

Lesson 5: Managing Data
A. TestStand Data Layout
B. Expressions
C. Custom Data Types
D. Variables
E. Importing and Exporting Properties
Summary Quiz

Lesson 6: Overriding Callbacks
A. Callbacks
B. Process Model Callbacks
C. Engine Callbacks
Summary Quiz

Lesson 7: Configuring TestStand
A. Station Options
B. Adapter Configuration
C. Edit Search Directories
D. Report Options
E. Database Options
Summary Quiz

Lesson 8: Executing Tests in Parallel
A. Multi-UUT Testing
B. Multithreading
C. Executing Sequences in New Threads
D. Multithreaded Process Models
E. Multithreading Caveats
F. Synchronization Step Types
Summary Quiz

Lesson 9: Best Practices for Test Development
A. Creating Modular Test Systems
B. Using Appropriate Data Scope
C. Using Setup and Cleanup Groups
D. Handling Non Product-Specific Operations
E. Documenting Test Systems
Summary Quiz

Lesson 10: User Management
A. TestStand User Management
B. User Groups
C. Privileges
D. Synchronizing with Other User Management Systems
Summary Quiz

Lesson 11: Deploying a Test System
A. Introduction to Deployment
B. Deployment Considerations
C. Deployment Process
Summary Quiz

Appendix A: Additional Information and Resources

Course Evaluation

Lesson 1: Introduction to Testing

TOPICS
A. Types of Tests
B. Automated Testing
C. Test System Terminology
D. Components of an Automated Test System
E. Testing Roles
F. Test System Development Process
G. Covering and Tracking Requirements

This course teaches you to create test routines to perform automated tests on products or devices. To create successful test routines, you must first be familiar with testing terms and concepts. This lesson introduces fundamental concepts of tests and test systems, and the associated industry terminology.

A. Types of Tests

Validation Tests
- Does the item being tested satisfy its intended usage? Did you build the right product?
- High level and exploratory
- Example: Usability Test

Verification Tests
- Does the item being tested satisfy its requirements or design? Did you build the product right?
- Detailed and conclusive
- Example: Manufacturing Test

Testing encompasses many tasks, industries, and applications. There are two main types of tests: validation tests and verification tests.

Validation Tests
Validation tests examine or evaluate a product, device, or theory to describe how well the subject of the test satisfies its intended purpose. Validation tests answer the question, "Did you build the right product?" Validation tests evaluate a product at a high level and are subjective. The results of validation tests are exploratory; that is, the results describe how the subject of the test could better satisfy its purpose rather than conclusively determine whether the purpose was satisfied. An example of a validation test is a usability test, in which you observe people using the product to discover areas of difficulty or ways to improve the comprehension or efficiency of the product. Usability testing is measurable, but the results suggest ways to improve the product rather than provide a conclusive statement about its usability.

Verification Tests
Verification tests subject a product to various conditions or operations that result in the acceptance or rejection of the product. Verification tests answer the question, "Did you build the product right?" Successful verification tests are very detailed and require a measurable list of the capabilities and attributes of the product, known as requirements. Verification tests check the device against each requirement; if the device meets all requirements, the device conclusively fulfills its requirements. An example of a verification test is a manufacturing test, which subjects a product to a series of tests to ensure that the product has no flaws or malfunctions. The result of a manufacturing test is a conclusive statement that the device has passed or failed the test.

The subjective nature of validation tests makes them difficult to automate. Because this course focuses on automating test systems, it concentrates on verification tests.

Types of Tests: Verification

Design Verification
- Prototype test: requirement acceptance test, design selection test
- Development test: unit test, hardware-in-the-loop (HIL) test, integration test

Quality Assurance
- System test: performance test, functional test, environment/configuration test, failure/stress test
- Manufacturing test: production test, statistical process control

You can perform verification at multiple points in the development of a product, and at each point there may be multiple ways to verify the product. Verification has two main purposes, design verification and quality assurance, which correspond to two types of testing. During product development, design verification ensures that the chosen design is capable of meeting the requirements of the system. After development, quality assurance verification ensures that the product meets quality standards.

The distinction between design verification and quality assurance is gradual, not black and white. For example, many development tests can detect potential faults or performance problems that may affect the quality of the product, and failed system tests may indicate that changes to the design of the product are necessary.

Design Verification
Design verification tests include prototype tests and development tests.

Prototype Tests
When developing a product, the first testable item is often a prototype, a working model of system components that you develop quickly to prove concepts. A common verification test for a prototype evaluates whether it satisfies a specific requirement. This evaluation can help determine whether the requirement is realistic for the final product. Testing prototypes can save considerable development effort by identifying requirements that are not technically feasible or that conflict with other requirements. You can also use prototype testing to compare alternate designs to help you select a design that satisfies the overall requirements and minimizes development time, component cost, or other variables.

Development Tests
Development verification tests ensure that each deliverable component and phase of development meets the requirements and does not introduce faults into the overall system. Unit testing individually validates each component of a system by testing the component with inputs that produce a known output and verifying that the component behaves as expected (a short code sketch illustrating this idea appears after the System Tests discussion below). When testing hardware or embedded components, perform unit testing by using hardware-in-the-loop simulation to simulate the operating environment and other components of the overall system.

After you develop and individually unit test each component, you systematically combine components to form higher level components, eventually forming the overall system. As you combine components, perform integration testing to confirm that each component communicates and behaves correctly in the larger system. Integration testing frequently involves regression testing, in which you rerun a subset of previously passed tests that are not directly related to the most recently integrated component. Regression tests ensure that the new component is not interacting with other components in unforeseen ways.

Quality Assurance
Quality assurance tests include system tests and manufacturing tests.

System Tests
After the system is complete, you must test it to ensure that it meets all the requirements and does not contain flaws or defects that may reduce its effectiveness or require repair.

Performance testing ensures that the system can meet all performance-related requirements. Performance requirements vary greatly between products but tend to be variable quantities rather than attributes that distinctly pass or fail. Although you may want to optimize system performance as much as possible, good requirements specifications always specify a minimum acceptable level of performance so you can determine whether the system performs adequately.

Functional testing verifies that each functional requirement is met and operates correctly. When you properly perform development testing, functional testing at the system level verifies that all requirements or components are met or present in the finished product.

Environmental or configuration testing ensures that the system operates correctly in the various environments in which it is intended for use. For hardware products, environmental testing consists of testing under different temperature, moisture, electrical, or other environmental conditions. For software products, configuration testing consists of testing under different operating systems, hardware environments, and other potential variations in software.

Failure or stress testing is similar to environmental/configuration testing, but instead of testing under conditions in which the product should work, failure or stress tests focus on finding the limits at which the product ceases to work. Failure testing identifies the extreme environmental conditions for hardware products and attempts to ensure that the product fails safely outside of the allowable operating environment. Stress testing pushes software beyond its normal operating conditions by running for extended periods of time or with unusually large amounts of data. It attempts to locate problems when the software is stressed beyond normal limits and to ensure that the software fails safely and provides sufficient information to the user.
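The following C sketch illustrates the unit testing idea described above: drive a single component with inputs whose outputs are known and verify that it behaves as expected. The component, function names, and limits are hypothetical examples and are not part of the course project or the TestStand API.

```c
#include <math.h>
#include <stdio.h>

/* Component under test (hypothetical): converts a raw 12-bit ADC
   reading to a voltage on a 0-10 V range. */
static double AdcToVolts(int rawCount)
{
    return (rawCount / 4095.0) * 10.0;
}

/* Unit test: check the component against inputs with known outputs. */
static int TestAdcToVolts(void)
{
    struct { int raw; double expected; } cases[] = {
        { 0,    0.0  },
        { 4095, 10.0 },
        { 2048, 5.0  },   /* approximately mid-scale */
    };
    int numCases = (int)(sizeof(cases) / sizeof(cases[0]));

    for (int i = 0; i < numCases; i++) {
        double actual = AdcToVolts(cases[i].raw);
        if (fabs(actual - cases[i].expected) > 0.01) {
            printf("FAIL: raw=%d expected=%.3f got=%.3f\n",
                   cases[i].raw, cases[i].expected, actual);
            return 0;
        }
    }
    return 1;   /* all cases passed */
}

int main(void)
{
    printf("AdcToVolts unit test: %s\n", TestAdcToVolts() ? "PASS" : "FAIL");
    return 0;
}
```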

Manufacturing Tests
After system verification, you often replicate the system in a manufacturing or mass production environment. You perform further verification tests on the resulting products to ensure that the manufacturing process has not deviated from the original design of the product in a way that causes the product to have flaws or to malfunction. A production test performs a series of tests to verify that each item produced is functional and satisfies its requirements. Often, verifying every item produced is not practical, so you test a subset of the items and use statistical process control to continually verify the manufacturing process as a whole.
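As a rough illustration of statistical process control, the C sketch below compares measurements from a sample of produced units against control limits derived from the historical process mean and standard deviation. The limits and data are hypothetical, and the check is deliberately minimal; real SPC applies additional run rules and control charts.

```c
#include <stdio.h>

/* Hypothetical SPC check in the style of an individuals control chart:
   each measurement is compared against control limits of mean +/- 3 sigma,
   where the mean and sigma come from historical process data. */
int main(void)
{
    const double processMean  = 5.00;   /* volts, historical average */
    const double processSigma = 0.02;   /* volts, historical std. dev. */
    const double ucl = processMean + 3.0 * processSigma;  /* upper control limit */
    const double lcl = processMean - 3.0 * processSigma;  /* lower control limit */

    /* Measurements from a sample of produced units (hypothetical data). */
    const double sample[] = { 5.01, 4.98, 5.02, 5.00, 4.99 };
    const int n = (int)(sizeof(sample) / sizeof(sample[0]));

    int outOfControl = 0;
    for (int i = 0; i < n; i++) {
        if (sample[i] < lcl || sample[i] > ucl) {
            printf("Unit %d: %.3f V is outside control limits (%.3f to %.3f V)\n",
                   i + 1, sample[i], lcl, ucl);
            outOfControl = 1;
        }
    }
    printf(outOfControl ? "Process out of control: investigate the line.\n"
                        : "Process in control.\n");
    return 0;
}
```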

B. Automated Testing
- Involves developing a system that performs some or all of the tasks associated with a test with little to no human interaction
- Requires an investment to develop
- Reduces the cost of performing a test repeatedly
- Increases the consistency of testing
- Automating tests that are performed many times, such as manufacturing tests, has the highest return on investment

The primary focus of this course is to automate a testing process. An automated test system can reduce or eliminate the human effort needed to test a product by performing some or all of the tasks without human interaction. An automated test system reduces the cost of performing a test repeatedly by increasing the speed of the test and/or reducing the amount of time a human operator needs to spend performing the test. A well-designed automated test system also increases the consistency of tests by reducing the possibility for human variance or error.

To varying degrees, you can automate the tests described in the Types of Tests section. However, developing an automated test system requires a significant investment of time or money, so it is most effective for tests that you perform many times. Automating a prototype test system rarely makes sense unless you need to test a large number of prototypes, because the effort required to create the automated system is likely greater than the effort to perform a small number of tests manually. In contrast, automating a manufacturing test system, which might test dozens or hundreds of devices a day, provides a much better return on investment.

C. Test System Terminology

The following terms describe automated test systems:

Unit Under Test (UUT): An automated test system repeatedly tests many instances of the same item. The specific item that the automated test system is currently testing is the unit under test (UUT), also known as the device under test (DUT). You often track UUTs by a unique identifier such as a serial number.

Test Operator: While some tests are completely automated and require no human interaction, most test systems initiate tests at the request of a test operator, who performs any manual tasks associated with the test, such as connecting the unit under test.

Test Routine: A series of actions that describes the process of testing a UUT. In TestStand, test routines are called sequences.
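To make the terminology concrete, the short C sketch below represents a UUT by its serial number and a test routine as a function that produces a pass/fail result for that UUT. The structure and function names are hypothetical illustrations, not TestStand constructs.

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* A UUT is tracked by a unique identifier such as a serial number. */
typedef struct {
    char serialNumber[32];
    bool passed;
} UUT;

/* A test routine is a series of actions that tests one UUT.
   (In TestStand, test routines are called sequences.) */
static void RunTestRoutine(UUT *uut)
{
    /* Placeholder checks standing in for real measurements on the UUT. */
    bool powerOk  = true;
    bool signalOk = true;
    uut->passed = powerOk && signalOk;
}

int main(void)
{
    UUT uut;
    strcpy(uut.serialNumber, "SN-000123");   /* hypothetical serial number */
    RunTestRoutine(&uut);
    printf("UUT %s: %s\n", uut.serialNumber, uut.passed ? "PASS" : "FAIL");
    return 0;
}
```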

D. Components of an Automated Test System

An automated test system contains the following hardware and software components to analyze and interface with the UUT: test stations, test equipment, test software (operator interface, testing routines, testing framework), data storage, and analysis applications.

Test Stations: Computers that run the software for the automated test, typically a PC or PXI chassis with a user interface to interact with the test operator. Test stations are connected to one or more pieces of test equipment.

Test Equipment: Hardware devices that communicate with, measure, or manipulate the UUT. DAQ devices and cameras are common examples of test equipment.

An automated test system may consist of one or more test stations that test multiple UUTs at the same time. Test stations may also be multi-purpose, capable of testing different types of UUTs.

Software for an automated test system contains the following components:

Operator Interface: User interface that prompts the test operator and reports the results of the test. A single operator interface may be capable of testing different types of UUTs.

Testing Routines: UUT-specific sequences of instructions.

Testing Framework: Code that is common to all types of UUTs. The testing framework executes test routines, updates the operator interface, and performs other testing-related tasks such as logging data to databases, generating reports, and maintaining the configuration of the test system.

Data Storage: A permanent data repository, such as a database. The test system may include additional applications for maintaining or querying the database.

Analysis Applications: Applications that provide access to data from the database and analyze trends or locate problems in the test system or specific UUTs.
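One way to picture the split between the testing framework and UUT-specific testing routines is a framework loop that runs whichever routine it is given and handles the common tasks itself (operator prompts and result logging, reduced here to console output). This is only an illustrative C sketch; the function names are hypothetical and do not correspond to the TestStand API.

```c
#include <stdbool.h>
#include <stdio.h>

/* UUT-specific test routine supplied by a test developer. */
typedef bool (*TestRoutine)(const char *serialNumber);

/* Hypothetical routine for one product type. */
static bool TestWidgetModelA(const char *serialNumber)
{
    printf("  Testing widget %s...\n", serialNumber);
    return true;   /* placeholder result */
}

/* Framework code common to all UUT types: prompts the operator,
   runs the routine, and logs the result (to stdout here, instead of
   a report file or database). */
static void RunFramework(TestRoutine routine, int numUUTs)
{
    char serial[32];
    for (int i = 0; i < numUUTs; i++) {
        snprintf(serial, sizeof(serial), "SN-%04d", i + 1);
        printf("Operator: connect UUT %s and start the test.\n", serial);
        bool passed = routine(serial);
        printf("Log: %s %s\n", serial, passed ? "PASS" : "FAIL");
    }
}

int main(void)
{
    RunFramework(TestWidgetModelA, 3);
    return 0;
}
```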

E. Testing Roles

There are four roles associated with testing a product or device: operator, technician, test developer, and framework developer.

Operator: Executes tests, responds to instructions in the test routine, and performs any portions of the test that require human interaction. Operators generally need minimal technical understanding of the test system or UUT.

Technician: Analyzes the devices or products that fail tests and diagnoses the issues that are causing them to fail. Technicians need extensive knowledge of the UUT and must also be able to analyze the test routine to isolate the reasons for failure. Technicians often serve an additional role by using debugging tools to identify flaws in the test system, which they report to test or framework developers.

Test Developer: Writes test routines for individual devices. The test developer creates the code to test individual aspects or components of a device and the logic to combine and sequence these individual elements into a cohesive test routine. Test developers must have the technical ability to create test routines and write test modules in one or more development languages. Test developers need enough understanding of the UUT to write code that tests the device requirements and must understand the test framework in order to write test routines that utilize it.

Framework Developer: Creates the test framework under which all test routines are executed. Framework developers must have extensive knowledge of the technology behind the test framework and sufficient proficiency in one or more development languages to write operator interfaces and tools. Framework developers often need knowledge of other technologies or applications, such as databases or analysis tools. Framework developers need only minimal knowledge of the UUT because test developers implement device-specific code. However, framework developers must have a broad enough understanding of the UUTs to identify appropriate features for the test framework and design the most effective operator interfaces.

The NI TestStand I: Introduction course focuses on providing technicians and test developers with the tools they need to perform their roles. The NI TestStand II: Customization course builds on the foundation taught in NI TestStand I: Introduction to help framework developers create custom frameworks.

F. Test System Development Process

1. Analyze UUT requirements
2. Develop test requirements
3. Select or create a test framework
4. Design a test routine
5. Create code for each step of the routine
6. Test the test system
7. Deploy the test system

Use the following process to develop an automated test system. Your organization may have additional steps or may perform steps in a different order.

1. Analyze UUT requirements: Determine what to test. A test system should verify that the UUT meets its requirements with the minimum amount of time and effort. Analyze the requirements of the UUT to avoid developing tests that are not essential to verifying the UUT. Well-written requirements should always be testable, but not all requirements of the UUT may be testable by an automated test system.

2. Develop test requirements: Define a list of all of the aspects of the UUT that you must test to verify the UUT, using information from the analysis of the UUT requirements.

3. Select or create a test framework: Select or create an appropriate test framework. The same test framework can apply to multiple types of UUTs, so this step is required only when starting a new automated test system. Verify that the selected or created framework meets the specific requirements of the test routine you are creating.

4. Design a test routine: Design a technique to perform the necessary tests for each aspect of the UUT that you must test. Create an ordered list of all actions and tests that must be performed to verify the UUT.

5. Create code for each step of the routine: Write code to perform each test or action in the routine, using a development environment such as LabVIEW or LabWindows/CVI. (A minimal sketch of such a code module follows this list.)

6. Test the test system: Verify that the test system correctly meets its requirements. Verify that invalid UUTs do not return positive test results and that valid UUTs never return negative test results.

7. Deploy the test system: Create one or more test stations with the appropriate test equipment and deploy the test software to each station.
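The sketch below shows what a code module for one step of a routine might look like in C, in the spirit of a LabWindows/CVI test function: perform a measurement and compare it against limits. The prototype, limits, and fixed reading are hypothetical; TestStand's actual code module interfaces are introduced in later lessons.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical code module for one step of a test routine:
   measure a supply voltage and compare it against limits. */
static bool TestSupplyVoltage(double *measured)
{
    const double lowLimit  = 4.75;   /* volts */
    const double highLimit = 5.25;   /* volts */

    /* A real module would read this value from test equipment;
       this placeholder returns a fixed reading. */
    *measured = 5.02;

    return (*measured >= lowLimit) && (*measured <= highLimit);
}

int main(void)
{
    double volts;
    bool passed = TestSupplyVoltage(&volts);
    printf("Supply voltage step: %.2f V -> %s\n", volts, passed ? "PASS" : "FAIL");
    return 0;
}
```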

G. Covering and Tracking Requirements

- A development project should provide proof that a given UUT meets all of its requirements
- Requirements tracking organizes development efforts and provides proof that requirements are implemented and tested
- Track requirements manually or with professional tools such as NI Requirements Gateway

For any development effort, you must define the requirements of the project and track and maintain the list of requirements. Covering and tracking requirements provides proof that the finished product meets all of its requirements when development and testing are complete. For small projects with only a few requirements, you may be able to cover and track requirements without a structured process. For large projects, which may include hundreds or thousands of requirements, you must use a formal process to track each requirement.

A formal requirements tracking process ensures that you implement and test each requirement, and helps organize and monitor development and testing efforts. You can use requirements tracking to determine which requirements you have not implemented or tested. Additionally, requirements tracking helps you determine whether components of the implementation or test routines do not directly support the project requirements. This helps limit or control the implementation of undefined requirements or tests, also known as feature creep.

You can track requirements manually, using a spreadsheet or other tool to check each requirement. Alternatively, you can build requirements tracking into the implementation and test systems and monitor progress with a software tool. The National Instruments Requirements Gateway application tracks requirements. Refer to ni.com/software and browse to NI Requirements Gateway for more information about this software tool.

Covering and Tracking Requirements (continued)

Each requirement is linked to:
- A specific piece of code or hardware that implements the requirement
- A specific section of a test routine that tests the requirement

Requirements tracking starts with the requirements document. Each requirement in the requirements document should be specific enough that you can identify a specific component or set of components that cover the requirement. Write each requirement in such a way that you can generate a test to conclusively show that the requirement is implemented and working. For the purposes of requirements tracking, you should also give each requirement in the document a unique name or other identifier to make it easy to reference.

As you implement components of the product, identify which requirement or requirements each component covers. For software products, you can track requirements by including a comment in each function or VI description, listing the unique identifiers of any covered requirements, as shown in the sketch below. For hardware products, you can include requirement identifiers in the hardware design documents or schematics.

As you develop a test routine for the product, identify which requirements each section of the test covers. Include the requirement identifiers in the documentation for the test routine.
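For example, a test module written in C might carry the identifiers of the requirements it covers in a header comment. The requirement IDs, limits, and function below are hypothetical; a tracking tool, or even a simple script, can then scan source files for identifiers formatted this way.

```c
#include <stdbool.h>

/*
 * Covers: REQ-PWR-004 (supply voltage within 5 V +/- 5%)
 *         REQ-PWR-007 (output ripple below 50 mV)
 * The requirement identifiers are hypothetical examples of the unique
 * names assigned to requirements in the requirements document.
 */
static bool TestPowerSupply(double volts, double rippleMv)
{
    bool voltageOk = (volts >= 4.75) && (volts <= 5.25);   /* REQ-PWR-004 */
    bool rippleOk  = (rippleMv < 50.0);                    /* REQ-PWR-007 */
    return voltageOk && rippleOk;
}

int main(void)
{
    /* Example call with hypothetical measured values. */
    return TestPowerSupply(5.02, 12.0) ? 0 : 1;
}
```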

Exercise 1-1: Reviewing a Test Requirements Document

GOAL: Review the requirements document for the course project, a CD test system.

Summary Quiz

1. Usability testing is an example of:
   a. Validation
   b. Verification

2. Which roles require extensive knowledge of the UUT? (multiple answers)
   a. Operator
   b. Technician
   c. Test Developer
   d. Framework Developer

3. For requirements tracking, each requirement should be linked to which of the following? (multiple answers)
   a. A section of code or a hardware component
   b. A specific piece of test equipment
   c. A specific section of a test routine
   d. A component of the test framework

4. Which of the following components are unique to a specific UUT?
   a. Test station
   b. Operator interface
   c. Test framework
   d. Test routine
   e. Test equipment

Quiz Answers

1. Usability testing is an example of:
   a. Validation

2. Which roles require extensive knowledge of the UUT?
   b. Technician
   c. Test Developer

3. For requirements tracking, each requirement should be linked to which of the following?
   a. A section of code or a hardware component
   c. A specific section of a test routine

4. Which of the following components are unique to a specific UUT?
   d. Test routine