T Software Testing and Quality Assurance Test Planning


1 T Software Testing and Quality Assurance Test Planning Juha Itkonen

2 Outline Test planning: purpose and usage of a test plan. Topics of test planning. Exercise. References: IEEE Std 829, IEEE Standard for Software Test Documentation, IEEE; C. Kaner, J. Bach, and B. Pettichord, Lessons Learned in Software Testing: A Context-Driven Approach, John Wiley & Sons.

3 Great testing challenges Agency: All testers act on behalf of some client. How do testers faithfully represent them? Evaluation: Not all failures are obvious. When a bug occurs, how do testers recognize it? Coverage: There's too much to test. How do testers decide what to test and what to ignore? Logistics: Testing must pace and respond to the project. How do testers apply their efforts at the right time and place to be useful? Reporting: Testing is about information. How do testers report their work in such a way that the information will be received and used? Stopping: Even though there's too much to test, not all testing is worthwhile to perform. How do testers know when enough is enough? These are the test planning decisions. (Bach)

4 Test planning on different levels Test policy and test strategy are defined at the company or product level. A master (high-level) test plan is defined at the project level. Detailed test plans are written for the test levels within a project (e.g. unit and integration test plan, system test plan, acceptance test plan), together with an evaluation plan. The number of levels needed depends on the context (product, project, company, domain, development process, regulations, ...).

5 Test documentation by IEEE Std 829: the topic of this lecture.

6 Purpose of test planning To prescribe the scope, approach, resources, and schedule of testing activities To identify the items being tested, the features to be tested, and the testing tasks to be performed To assign responsibility for each task To identify the risks associated with the plan To make conscious decisions about all these things Communicating, not just recording Test planning is part of project planning Focuses on similar issues Is applied on a similar level of detail Project planning is to technical design as test planning is to test (case) design

7 Some functions of a test plan Support quality assessment that enables wise and timely product decisions. Support the initiation and organization of the test project, including preparations, staffing, delegation of responsibilities, facility acquisition, task planning and scheduling. Support daily management and evaluation of the test project and test strategy. Describe and justify the test strategy in relation to product requirements and product risks. Promote awareness of the benefits and limitations of the test strategy. Support effective coordination, collaboration, and other relations among members of the test team and between the test team and the rest of the project. Identify and manage any risks or issues that may impact the project. Specify the deliverables of the test project and the delivery process. Record historical information in support of process audits, process improvement, and future test projects.

8 Topics of test planning: what needs to be planned

9 Checklist for test planning 1. Overall test objectives (why) 2. What will and won't be tested (what) 3. Test approach (how) Test phases Test strategy, methods, techniques, ... Test case organization and tracking Defect reporting Metrics and statistics 4. Resource requirements (who) Tester assignments and responsibilities Test environments 5. Test tasks and schedule (when) 6. Risks and issues (what if)

10 1. Overall test objectives Why are we testing? What are the quality goals of the project/product? What are the most important quality attributes? What is to be achieved by the testing? What do we know after the testing? What are the main risks that testing focuses on?

11 Clarify your mission Examples of testing goals Find important problems fast Assure the customer that the quality is good enough Achieve certain quality attributes Perform a comprehensive quality assessment Conform to a specific (quality) standard Maximize testing effectiveness Minimize testing time or cost Rigorously conform to certain processes Satisfy particular stakeholders

12 2. What will and won't be tested (scope) Identify components and features of the software under test High-enough abstraction level Prioritize Both functional and non-functional aspects Consider time, resources and risks Everything can't be tested, and everything that is tested can't be tested thoroughly Identify separately the components and features that are not tested Provide reasoning Makes the prioritization decisions explicit and visible Steers towards the most important things

13 Analyze the product to know what to test Get to know the product and the underlying technology Learn how the product will be used Find out how the product is built Your testing will become better as the project proceeds, because you will be more of a product expert Ask yourself: Do designers think you understand the product? Can you visualize the product and predict behavior? Are you able to produce the test data? Can you configure and operate the product? Do you understand how the product will be used? Are you aware of gaps or inconsistencies in the design? Have you found implicit specifications as well as explicit ones?

14 Analyze the product: what and how What: users, use cases, structure, functions, data, platforms How: review product and project documentation, interview designers and users, compare with similar products, perform exploratory testing

15 Splitting features into sub-features A feature to be tested is split into sub-features (e.g. important subfeature 1, important subfeature 2, subfeature 3, subfeature 4) Find the most important features to test Not necessarily the same prioritization as for implementation; consider requirement importance and risks

16 Test environments Identification of test environments Hardware, software, OS, network, ... Prioritization and focusing of test suites on each The number of combinations can be huge Regression testing in different environments Scheduling implications Test lab Different hardware and software platforms Cleaning the machines Setting up the test data Moving from one platform to another People vs. hardware needs
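As a rough illustration of why the number of environment combinations grows so quickly, the sketch below enumerates full combinations of a few hypothetical platform factors; the factor values are assumptions, and in practice a plan would prioritize a subset (for example the platforms most common among users, or a pairwise selection).

from itertools import product

# Hypothetical environment factors; the real factors come from the test plan.
operating_systems = ["Windows 11", "macOS 14", "Ubuntu 22.04"]
browsers = ["Chrome", "Firefox", "Safari"]
databases = ["PostgreSQL", "MySQL"]

# Full combinatorial enumeration: 3 * 3 * 2 = 18 environments from just three factors.
combinations = list(product(operating_systems, browsers, databases))
print(len(combinations))

# A test plan would typically pick a prioritized subset rather than test all of these.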

17 3. Test approach How testing is performed How quality goals are achieved and what the role of testing is Test phases Test phases are part of development phases Planning is also a phase Define entry and exit criteria Test levels Types of testing, methods Black-box vs. white-box Testing techniques Test completion criteria and pass/fail criteria By feature/module if necessary Reporting and defect management procedures Communicating and collaborating with development Tools and automation Outsourcing

18 Analyze product risk How might this product fail in a way that matters? At first you'll have a general idea, at best Your test strategy and your testing will become better as the project proceeds, because you'll learn more about the failure dynamics of the product. Ask yourself: Will you be able to detect all significant kinds of problems, should they occur during testing? Do you know where to focus testing effort for maximum effectiveness? Can the designers do anything to make important problems easier to detect, or less likely to occur?

19 Design the test strategy What can you do to test rapidly and effectively based on the best information you have about the product? Make the best decisions you can up front, but let your strategy improve throughout the project Ask yourself: Is everything in the test strategy necessary? Can you actually carry out this strategy? Is the test strategy too generic; could it just as easily apply to any product? Is there any category of important problem that you know you are not testing for? Has the strategy made use of available resources?

20 Test cases How test cases are created and documented Storing, using, maintaining Testware architecture What is needed in addition to the test cases? Stubs, drivers, test data, expected result sets and outputs, test automation scripts, ... Plan how test cases are organized Prioritization, regression testing Test suites Requirements traceability
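To make the testware beyond the test cases concrete, here is a minimal sketch of a test driver that exercises a unit under test while a hand-written stub stands in for a dependency that is not yet available; all names and values are hypothetical.

import unittest
from unittest.mock import Mock

def calculate_invoice_total(order, tax_service):
    # Unit under test: sums the line items and adds tax from an external service.
    subtotal = sum(line["price"] * line["quantity"] for line in order["lines"])
    return subtotal + tax_service.tax_for(subtotal)

class InvoiceTotalTest(unittest.TestCase):
    def test_total_includes_tax_from_service(self):
        # Stub: the real tax service does not exist yet, so a canned answer is used.
        stub_tax_service = Mock()
        stub_tax_service.tax_for.return_value = 2.40
        order = {"lines": [{"price": 10.0, "quantity": 1}, {"price": 2.0, "quantity": 1}]}
        self.assertAlmostEqual(calculate_invoice_total(order, stub_tax_service), 14.40)

if __name__ == "__main__":
    unittest.main()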

21 Organizing test cases Prioritizing tests: the most severe failures, the most likely faults, end-user prioritization of the requirements, most faults in the past, most complex or critical Creating test suites: test-to-pass (positive testing), test-to-fail (negative testing), smoke test suite, regression test suite, functional suites, suites for different platforms, suites by priority and for different passes
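One lightweight way to realize such suites, sketched below with hypothetical test case identifiers and tags, is to tag each test case and select a suite by tag.

# Hypothetical test case inventory; the tags correspond to the suites named above.
test_cases = [
    {"id": "TC-001", "tags": {"smoke", "positive"}, "priority": "high"},
    {"id": "TC-014", "tags": {"regression", "negative"}, "priority": "medium"},
    {"id": "TC-102", "tags": {"regression", "platform-linux"}, "priority": "low"},
]

def suite(tag):
    # Select the suite for a given tag.
    return [case["id"] for case in test_cases if tag in case["tags"]]

print(suite("smoke"))        # ['TC-001']
print(suite("regression"))   # ['TC-014', 'TC-102']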

22 How to apply prioritization Run first-priority tests first, but think of prioritization more as a distribution of effort than an execution order, especially if the prioritization is based on requirement priorities assigned by functionality or module Use most of your time for high-priority tests Use some of your time/resources for low-priority tests Avoid omitting low-priority tests altogether: high-priority tests might take more time than planned, there is a risk of missing critical problems in low-priority functionality, and the prioritization can be wrong! (The slide illustrates this with a timeline in which high-priority and low-priority test runs are interleaved; H = high priority tests, L = low priority tests)
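A minimal sketch of prioritization as a distribution of effort: most of the budget goes to high-priority tests, but low-priority tests keep a non-zero share. The 60/30/10 split and the 80-hour budget are illustrative assumptions only.

# Assumed effort split across priority classes; tune to the project's risks.
EFFORT_SPLIT = {"high": 0.6, "medium": 0.3, "low": 0.1}

def allocate_effort(total_hours, split=EFFORT_SPLIT):
    # Return hours of test execution per priority class.
    return {priority: round(total_hours * share, 1) for priority, share in split.items()}

print(allocate_effort(80))  # {'high': 48.0, 'medium': 24.0, 'low': 8.0}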

23 Deliverables What documents are produced? Why? When? How are test plans, designs, and cases documented? How are test results logged and reported? How are defects managed?

24 Defect reporting Define a process to manage defects How are defects managed, and by whom? Bug lifecycle: defect states, priorities, severities, classification Who decides? How? When? Defect-tracking system: reporting, data query and retrieval (statistics); supports the defect resolution process What data and metrics are collected, and for what purpose?
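The bug lifecycle can be written down as states and allowed transitions, which is what a defect-tracking system enforces. The state names below are a common pattern chosen for illustration, not ones prescribed by the lecture.

from enum import Enum

class DefectState(Enum):
    NEW = "new"
    ASSIGNED = "assigned"
    FIXED = "fixed"
    VERIFIED = "verified"
    REOPENED = "reopened"
    CLOSED = "closed"

# Which target states are reachable from each state.
ALLOWED_TRANSITIONS = {
    DefectState.NEW: {DefectState.ASSIGNED, DefectState.CLOSED},
    DefectState.ASSIGNED: {DefectState.FIXED},
    DefectState.FIXED: {DefectState.VERIFIED, DefectState.REOPENED},
    DefectState.VERIFIED: {DefectState.CLOSED},
    DefectState.REOPENED: {DefectState.ASSIGNED},
    DefectState.CLOSED: {DefectState.REOPENED},
}

def transition(current, target):
    # Reject moves the process does not allow, e.g. NEW -> VERIFIED.
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"Illegal transition {current.name} -> {target.name}")
    return target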

25 Test completion criteria There are always risks involved: the risk of leaving major defects in the system, the risk of spending too much time and money, and the risk of losing market position and sales if testing is extended The weakest criterion is the deadline: testing stops when all the time and resources are used Setting the test completion criteria (stop-test criteria) is part of test planning When is testing (or a testing phase) completed? Based, e.g., on coverage of testing, defect rates, defect severities, schedule, ...: all the planned tests have been executed (and passed); all specified coverage criteria have been met; the goals for the number of detected defects have been reached; defect detection rates have fallen below a specified level
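A sketch of how such stop-test criteria could be checked against collected metrics; the specific thresholds and metric names are assumptions for illustration, not values from the lecture.

def testing_can_stop(metrics):
    # Each clause corresponds to one kind of completion criterion named above.
    planned_tests_done = metrics["tests_passed"] >= metrics["tests_planned"]
    coverage_met = metrics["statement_coverage"] >= 0.85       # assumed threshold
    detection_rate_low = metrics["new_defects_per_week"] <= 2  # assumed threshold
    no_open_blockers = metrics["open_critical_defects"] == 0
    return planned_tests_done and coverage_met and detection_rate_low and no_open_blockers

print(testing_can_stop({
    "tests_planned": 120, "tests_passed": 120,
    "statement_coverage": 0.88,
    "new_defects_per_week": 1,
    "open_critical_defects": 0,
}))  # True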

26 Release criteria Release criteria are not the same as test completion criteria Releasing is a business decision, and it is not the test team's or testers' responsibility to make release decisions Testing provides useful information on the quality of the product, so testing should be visible in the release criteria: test results, defect metrics and reports

27 4. Resource requirements People: how many, what experience, what expertise? Full-time, part-time, contractors, students? Responsibilities Equipment: computers, test hardware, printers, tools Office and lab space: where will they be located? How big will they be? How will they be arranged? Tools and documents: word processors, databases, custom tools; what will be purchased, what needs to be written? Outsourcing companies: will they be used? What criteria will be used for choosing them? How much will they cost? Miscellaneous supplies: disks, phones, reference books, training material; whatever else might be needed over the course of the project Define responsibilities Identify limited/critical resources Location and availability

28 Plan also high-level responsibilities Not just allocating individual tasks to people Example: testers (Mark, Sarah, Dennis, Louise, Dave, Paula, Jack) are assigned responsibility areas: formatting functionality, layout functionality, editing commands, configuration and compatibility, GUI (usability, appearance, accessibility), test data and regression testing, performance and load tests

29 5. Testing tasks and schedule Work Breakdown Structure (WBS) Testing tasks can be divided, e.g., by areas of the software, testable features, or phases, levels, and types of testing Assigning task responsibilities Mapping testing to the overall project schedule Both duration and effort Build schedule Number of test cycles Environments Regression tests Releases External links, e.g. beta testing Consider using relative dates

30 The idea of relative dates
Task              Start                                             Duration
Test planning     When initial requirement specification available  2 weeks
Test case design  Test plan complete                                12 weeks
Test round #1     Code complete build                               6 weeks
Test round #2     Beta build                                        6 weeks
Test round #3     Release build                                     4 weeks
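A minimal sketch of how relative dates keep the plan tied to project milestones: each task start is computed from a milestone, so when the milestone slips, the testing schedule shifts with it. The milestone dates below are made-up examples, not dates from the lecture.

from datetime import date, timedelta

# Hypothetical milestone dates; in practice these come from the project plan.
milestones = {"spec_ready": date(2024, 3, 1), "code_complete": date(2024, 6, 10)}

# (task, anchoring milestone, duration), as in the table above.
tasks = [
    ("Test planning", "spec_ready", timedelta(weeks=2)),
    ("Test round #1", "code_complete", timedelta(weeks=6)),
]

for name, anchor, duration in tasks:
    start = milestones[anchor]
    print(f"{name}: {start} .. {start + duration}")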

31 Estimating testing is not different An estimate can be: an estimate, an approximate calculation or judgment based on the professional understanding of an expert; a quote/quotation, a fixed price given by a contractor; or a target, a desired time or cost (set for political or commercial reasons) Estimating any job involves the following: identify the tasks; who should perform each task; what resources and what skills are needed; how much effort each task takes; when each task should start and finish; predictable dependencies, such as task precedence (design a test before running it) and technical precedence (the add & display feature before the edit feature)

32 Estimating testing is different Testing is not an independent activity Delivery schedules for testable items can change Test environments are critical Test iterations (rounds): testing should find faults How long does it take to find a fault? How many faults will be found? Faults need to be fixed How long does it take to fix a fault? After a fix, how much testing needs to be repeated? How many times does this happen? Twice? Twelve times? It takes time to report defects The more fault reports you write, the less testing you are able to do

33 Ways to estimate test effort Guessing Based on the size of the project/product COCOMO and heuristics Number of requirements or modules Consensus of experts (Delphi) Previous test effort In-project experience through incremental planning Detailed work breakdown of testing tasks Authorized deadlines Historical data helps estimation regardless of the method
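As one concrete sketch, a detailed work breakdown can be combined with historical data used as a rework factor for the fix-and-retest cycles described on the previous slide. The task hours and the 1.4 factor below are illustrative assumptions, not figures from the lecture.

# Hypothetical work breakdown of testing tasks (hours).
task_estimates_hours = {
    "test planning": 40,
    "test case design": 160,
    "test environment setup": 24,
    "test execution, round 1": 120,
    "defect reporting": 30,
}

# Assumed from historical data: past projects needed about 40% extra effort
# for re-testing after defect fixes.
HISTORICAL_REWORK_FACTOR = 1.4

base_effort = sum(task_estimates_hours.values())
estimate = base_effort * HISTORICAL_REWORK_FACTOR
print(f"Base {base_effort} h, with rework about {estimate:.0f} h")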

34 6. Risks and issues Risky areas of the test project and the plan (not the product): critical resources, dependencies For each risk: probability and consequences; prevention, how to prevent the risk altogether; mitigation, how to make the risk smaller; contingency plans, what to do if the risk is realized

35 7. Share the plan You are not alone The test process must serve the project So, involve the project in your test planning process At least chat with the key members of the team to get their perspective and implicit acceptance to pursue your plan Is the project team paying attention to the test plan? Does the project team, especially first-line management, understand the role of the test team? Does the project team feel that the test team has the best interests of the project at heart?

36 Share the plan Ways to share Actively solicit opinions about the test plan Help the developers understand how their actions impact testing Get designers and developers to review and approve testing materials Get people to review the plan in pieces Improve reviewability by minimizing unnecessary text in test plan documents Record and track agreements

37 Test plan heuristics: is this a good test plan? Important problems fast Test plans aren't generic Focus on risk Content or nothing Maximize diversity Don't program people Avoid over-scripting The test schedule depends Test to the intent Avoid the bottleneck We are not alone Rapid feedback Promote testability Testers aren't the only testers Review documentation A heuristic is a rule of thumb, an educated guess, a rule for action. Source: Kaner, Bach, Pettichord, Lessons Learned in Software Testing

38 Test plan quality criteria Usefulness: Will the test plan effectively serve its intended functions? Clarity: Is the test plan self-consistent and sufficiently unambiguous? Accuracy: Is the test plan document accurate with respect to any statements of fact? Adaptability: Will it tolerate reasonable change and unpredictability in the project? Efficiency: Does it make efficient use of available resources? Usability: Is the test plan document concise, maintainable, and helpfully organized? Compliance: Does the test plan meet externally imposed requirements? Foundation: Is the test plan the product of an effective test planning process? Feasibility: Is the test plan within the capability of the organization that must use it? Source: Kaner, Bach, Pettichord. Lessons Learned in Software Testing

39 Exercise: Contents of a test plan Compare the two sample test plans: Burnstein's sample test plan (College Course Scheduler) and the IEEE Std 829 sample test plan (Corporate Payroll System)

40 Exercise: Contents of a test plan Focus on the following sections: Introduction, Features to be tested, Features not to be tested, Approach, Item pass/fail criteria, Tasks and schedule What are the good parts, and why? What are the weaknesses and problems? How would you improve the plan? Which one of the plans is better: which better documents the planning decisions, and which gives guidance and supports the actual execution and management of the system testing project?