Systematic Testing with Quality-Oriented Test Strategies


1 Systematic Testing with Quality-Oriented Test Strategies
Dr. Simon Burton, Manager, Vector Consulting Services GmbH
Vector Consulting Services GmbH. All rights reserved. Any distribution or copying is subject to prior written approval by Vector.

2 Agenda
- Approaches for achieving quality
- What makes a good test strategy?
- 5 steps to a better test strategy
- Test strategies and the safety case
- Summary

3 Approaches for achieving quality
Avoiding failures: systematic development processes, development tools, modeling and coding guidelines, configuration management, training, ...
Detecting failures (the focus of this presentation): reviews, static and dynamic code analysis, formal verification, test-driven development, unit tests, integration tests, system validation tests, ...
The right combination and targeted use of individual methods is essential for an effective and efficient approach to achieving quality.

4 What makes a good test strategy? (1/3)
A good test strategy is based on an understanding of the effectiveness of individual methods at detecting particular types of error.
[V-model figure: Requirements Analysis pairs with the Acceptance Test, System Design with the Integration Test, Unit Design with the Unit Test, and Implementation sits at the base. Annotations:]
- Requirements Review: Are all requirements unambiguously defined?
- Design Review: Are all requirements considered in the design?
- Unit Test: Are the algorithms implemented correctly?
- Code Analysis: Is the software free from run-time exceptions?
- Acceptance Test: Does the system behave in its environment as expected?

5 What makes a good test strategy? (2/3)
A good test strategy sets priorities and ensures an optimal use of resources based on an understanding of the efficiency of the methods.

| Activity             | Maximum Error Detection Rate | Typical Error Detection Rate | Effort for Detection and Correction [Hours/Failure] |
| Requirements Reviews | 10-15%                       | 5-10%                        | ~2                                                  |
| Design Reviews       | 10%                          | 5-10%                        | 3-5                                                 |
| Static Code Analysis | 20%                          | 10-20%                       | 3-5                                                 |
| Code Reviews         | 40%                          | 20-30%                       | 3-5                                                 |
| Unit Test            | 30%                          | 10-30%                       | 3-5                                                 |
| Integration Test     | 20%                          | 5-20%                        |                                                     |
| Acceptance Test      | 5%                           | 1-5%                         | >30                                                 |

Source: Software Measurement: Establish - Extract - Evaluate - Execute, Christof Ebert and Reiner Dumke, Springer
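The efficiency figures above can be turned into a simple prioritization model. The following sketch (illustrative only; the midpoint values and the cost-per-detection metric are assumptions for demonstration, not from the cited source) ranks activities by hours of effort per percentage point of errors found. Integration Test is omitted because its effort figure is not given in the table.

```python
# Toy model: rank verification activities by cost per percentage point
# of errors detected, using assumed midpoints of the table's typical
# detection rates and effort figures.

def cost_per_percent_detected(effort_hours, detection_rate):
    """Hours of detection/correction effort per percentage point of errors found."""
    return effort_hours / detection_rate

# (activity, assumed typical detection rate midpoint in %, assumed effort midpoint in hours/failure)
activities = [
    ("Requirements Reviews", 7.5, 2.0),
    ("Design Reviews", 7.5, 4.0),
    ("Static Code Analysis", 15.0, 4.0),
    ("Code Reviews", 25.0, 4.0),
    ("Unit Test", 20.0, 4.0),
    ("Acceptance Test", 3.0, 30.0),
]

# Cheapest detection first; late-phase acceptance testing ranks last by far.
ranked = sorted(activities, key=lambda a: cost_per_percent_detected(a[2], a[1]))
for name, rate, effort in ranked:
    print(f"{name}: {cost_per_percent_detected(effort, rate):.2f} h per % detected")
```

Even with rough numbers, the model reproduces the slide's message: early reviews and unit-level checks find errors far more cheaply than acceptance tests.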

6 What makes a good test strategy? (3/3)
A good test strategy uses methods that complement each other.
[Figure: overlap of Reviews, Code Analysis and Unit Tests]
- Focus code reviews where static analysis has shown a high level of complexity and the potential for run-time exceptions.
- Focus code reviews on software paths not covered by the tests.
- Avoid investing large amounts of effort in formal verification until automated tests no longer find errors.
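The first two points above amount to a simple selection rule: review the modules that static analysis flags as complex and that the automated tests leave poorly covered. A minimal sketch of that rule, with invented module names, metrics and thresholds:

```python
# Hypothetical sketch: pick code-review candidates where complexity is high
# AND test coverage is low. Thresholds and module data are invented examples.

def review_candidates(modules, max_complexity=10, min_coverage=0.8):
    """Return names of modules exceeding the complexity threshold
    while falling short of the coverage target."""
    return [m["name"] for m in modules
            if m["complexity"] > max_complexity and m["coverage"] < min_coverage]

modules = [
    {"name": "signal_filter.c", "complexity": 24, "coverage": 0.55},
    {"name": "can_driver.c",    "complexity": 8,  "coverage": 0.60},
    {"name": "state_machine.c", "complexity": 15, "coverage": 0.92},
    {"name": "diag_handler.c",  "complexity": 18, "coverage": 0.70},
]

print(review_candidates(modules))  # ['signal_filter.c', 'diag_handler.c']
```

In practice the complexity and coverage figures would come from the static-analysis and unit-test tools already in the strategy, so the methods feed each other rather than overlap.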

7 5 Steps to a better test strategy
1. Prioritize quality attributes.
2. Identify analysis, review and test phases and allocate quality goals.
3. Define measurable criteria for the successful completion of the reviews, analyses and tests.
4. Select methods and tools for meeting the criteria.
5. Implement, measure and track; continuously improve.

8 1. Prioritize quality attributes
Example prioritization per development phase:

| Quality attribute       | Prototype | Pre-series sample | Production release |
| Functionality           | 70%       | 30%               | 30%                |
| Reliability/Safety      | -         | 30%               | 60%                |
| Usability               | 20%       | 10%               | -                  |
| Efficiency              | -         | 20%               | 10%                |
| Maintainability         | 5%        | 5%                | -                  |
| Portability/Reusability | 5%        | 5%                | -                  |

Prioritize quality goals per development phase to catch critical errors as soon as possible.
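One direct use of such a weighting is to split a verification budget across the quality attributes for a given phase. A minimal sketch, assuming a hypothetical 500-hour budget (the budget figure is invented; the weights are the pre-series values from the table):

```python
# Hypothetical sketch: allocate a verification budget proportionally to
# per-phase quality-attribute weights. The 500-hour total is an assumption.

def allocate_budget(total_hours, weights):
    """Distribute total_hours proportionally to attribute weights given in percent."""
    return {attr: total_hours * pct / 100 for attr, pct in weights.items()}

pre_series_weights = {
    "Functionality": 30, "Reliability/Safety": 30, "Usability": 10,
    "Efficiency": 20, "Maintainability": 5, "Portability/Reusability": 5,
}

allocation = allocate_budget(500, pre_series_weights)
print(allocation["Reliability/Safety"])  # 150.0
```

Re-running the same allocation with the prototype or production-release weights shows how the emphasis shifts from functionality toward reliability and safety as the project matures.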

9 2. Identify analysis, review and test phases
Overall quality goals (pre-series sample): Functionality (30%), Reliability/Safety (30%), Usability (10%), Efficiency (20%), Maintainability (5%), Portability/Reusability (5%)

| Activity             | Example goals (pre-series sample)                                                  |
| Requirements Reviews | Requirements complete, unambiguous, agreed.                                        |
| Design Reviews       | Non-functional requirements are considered.                                        |
| Code Analysis        | Code does not contain run-time exceptions.                                         |
| Code Reviews         | Functionality, maintainability and portability of the code are confirmed.          |
| Unit Test            | Functionality and efficiency of the code are confirmed.                            |
| Integration Test     | Reliability and performance requirements confirmed.                                |
| Acceptance Test      | Usability and critical scenarios from the point of view of the customer are confirmed. |

Example: Increase the number of errors found by focusing the goals of unit testing on functionality and code efficiency, and ensuring the specifications for achieving this are of sufficient quality.

10 3. Define measurable criteria

| Activity             | Example criteria                                                                  |
| Requirements Reviews | Requirements 100% agreed, all checklist criteria confirmed                        |
| Design Reviews       | All design rules are adhered to                                                   |
| Code Analysis        | All MISRA-C rules are adhered to                                                  |
| Code Reviews         | All checklist criteria confirmed, no unexplained deviations from the coding guidelines |
| Unit Test            | 100% coverage of the functional requirements, WCET* determined and within limits  |
| Integration Test     | All operating profiles, configurations and interfaces covered                     |
| Acceptance Test      | All driving maneuvers and critical scenarios covered                              |

Test planning should be based on measurable criteria that have a direct relationship to the quality goals and ensure a balanced cost/benefit trade-off.
* WCET: worst-case execution time
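A criterion such as "100% coverage of the functional requirements" is measurable precisely because it can be checked mechanically from a test-to-requirement trace. A minimal sketch of such a check (requirement IDs, test case names and the trace data are invented):

```python
# Hypothetical sketch: verify the "100% functional requirements covered"
# exit criterion from a traceability record. All IDs and data are invented.

def uncovered_requirements(requirements, trace):
    """Return requirement IDs not covered by any *passed* test case."""
    covered = {req for test in trace if test["passed"] for req in test["covers"]}
    return sorted(set(requirements) - covered)

requirements = ["REQ-001", "REQ-002", "REQ-003"]
trace = [
    {"test": "TC-10", "covers": ["REQ-001"], "passed": True},
    {"test": "TC-11", "covers": ["REQ-002"], "passed": False},  # failed: no credit
]

missing = uncovered_requirements(requirements, trace)
print(missing)  # ['REQ-002', 'REQ-003']
```

The criterion is met only when the returned list is empty; counting a failed test as coverage would make the metric meaningless, hence the `passed` filter.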

11 4. Select methods and tools

| Activity             | Example methods and tools                                                         |
| Requirements Reviews | Reviews on the basis of checklists                                                |
| Design Reviews       | Inspections on the basis of checklists                                            |
| Code Analysis        | QA-C (MISRA-C rules), Polyspace                                                   |
| Code Reviews         | Inspections on the basis of checklists                                            |
| Unit Test            | Equivalence classes, boundary value analysis, CUnit incl. code coverage, MATLAB/Simulink simulation, SiL tests |
| Integration Test     | Automated HiL tests with rest-bus simulation, interface analysis, performance measurements |
| Acceptance Test      | Test drives on the basis of standard driving maneuvers                            |

Example: Improve the effectiveness and efficiency of reviews through the periodic update of checklists based on errors typically found later in the process.
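Boundary value analysis, one of the unit-test design methods listed above, can be sketched in a few lines: for an inclusive numeric input range, test at and immediately around each boundary, where off-by-one errors concentrate. The example range is an invented illustration:

```python
# Sketch of boundary value analysis: classic test points at and around
# the edges of an inclusive [low, high] input range.

def boundary_values(low, high):
    """Return the standard boundary test values for an inclusive range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# e.g. a hypothetical vehicle-speed input specified as 0..250 km/h
print(boundary_values(0, 250))  # [-1, 0, 1, 249, 250, 251]
```

Combined with equivalence classes (one representative value per valid and invalid partition), this gives a small, systematic test set instead of ad-hoc inputs.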

12 5. Measurement and continuous improvement

| Example measurement                                                        | Consequence for improvement                                                       |
| Are all criteria met?                                                      | Is the test strategy being implemented with the necessary rigor?                  |
| Planned and actual effort/cost per activity                                | Prioritization of the activities based on cost effectiveness (effort/error)       |
| In which phase are errors introduced, and in which phase are they detected? | Selection of methods to detect errors as close as possible to their source        |
| Which software modules contain which proportion of errors?                 | Prioritization of expensive manual verification on those software modules with the highest density of critical errors |

Consolidate experiences and continuously optimize the test strategy on the basis of adjusted goals and improved methods.
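The "introduced vs. detected" measurement above is commonly summarized as an escape rate: the fraction of defects found in a later phase than the one that injected them. A minimal sketch, with invented defect records:

```python
# Hypothetical sketch of phase-containment measurement: how many defects
# escape the phase that introduced them? The defect records are invented.

PHASES = ["Requirements", "Design", "Implementation", "Integration", "Acceptance"]

def escape_rate(defects):
    """Fraction of defects detected in a later phase than they were introduced."""
    escaped = sum(1 for d in defects
                  if PHASES.index(d["detected"]) > PHASES.index(d["introduced"]))
    return escaped / len(defects)

defects = [
    {"introduced": "Requirements",   "detected": "Requirements"},
    {"introduced": "Requirements",   "detected": "Acceptance"},   # late, expensive escape
    {"introduced": "Implementation", "detected": "Implementation"},
    {"introduced": "Design",         "detected": "Integration"},  # escape
]

print(escape_rate(defects))  # 0.5
```

A falling escape rate over successive projects is direct evidence that the strategy is moving defect detection closer to the defect source, which is exactly the improvement goal stated above.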

13 ISO 26262 and test strategies
ISO 26262 prescribes a number of methods to be applied to hardware, software and system tests, dependent on the ASIL. It does not give direct guidance on which combination of methods is most suitable for ensuring the safety of the product under development. A comprehensive analysis, review and test strategy is therefore required to ensure that the methods deliver a convincing contribution to the safety case. The assertions made by the safety case should be used to derive additional quality goals and criteria for the test strategy.

14 Example: contribution of tests to the safety case
Assumptions made during the functional safety concept are confirmed through tests, e.g. the controllability of the vehicle under specific conditions.

15 Example: contribution of tests to the safety case
Assertions made in the safety case are backed up by verification evidence, e.g. that particular classes of hardware failure can be reliably detected under a wide range of conditions.

16 Example: contribution of tests to the safety case
Adherence to best practice, processes and standards is confirmed by verification measures, e.g. inspection of the system FMEDA by experienced engineers, based on a predefined checklist.

17 Summary
Measures for improving the effectiveness and cost efficiency of analyses, reviews and tests:
- Understand what is done today: how many errors are found, and when in the process.
- Understand the effect of the methods: which methods are best suited for detecting which types of error, and in which phase of development.
- Identify the critical aspects of the safety case to be supported by the analyses, reviews and tests.
- Develop a holistic strategy under consideration of all quality criteria.
- Rigorously apply, measure and track the strategy.
- Continuously improve the strategy with regard to quality and costs.

18 Thank you for your attention.
Further information about Vector Consulting Services can be found here:
Vector Consulting Services GmbH
Ingersheimer Str Stuttgart
Phone: Fax:
Your Partner in Achieving Engineering Excellence.