Risk-Based Testing: Analysis and Strategy
Presented at the Quality Assurance Institute QUEST Conference, Chicago, Ill.
Clyneice Chaney, CMQ/OE, PMP
April 21, 2009
Workshop Outline
Part I: Risk Management and Testing; Risk-Based Testing Process: Identification, Analysis
Part II: Risk-Based Testing Process: Response Planning, Test Scoping and Coverage; Test Process; Managing and Reporting from the Risk Perspective
Pressured Testers?
When and why should we stop testing?
How much testing is enough?
When is the product good enough for release?
How good is our testing anyway?
Risk Management and Testing
Risk Identification: test risk checklists and catalogs
Risk Analysis: risk matrix
Risk Mitigation: risk plan, test plan/strategy, test scoping, test process identification
Risk Resolution and Monitoring: testing, reviews, metrics, reports
Figure: Risk-Based E-Business Testing, Gerrard & Thompson
Risk-Based Test Process Answers: When to Stop?
Typical criteria:
All planned tests completed
All incidents raised resolved
All defects fixed and retested
All regression tests run without failure
Risk-based criteria:
What are the risks of stopping NOW?
What are the benefits of stopping NOW?
Have sufficient benefits been delivered?
What evidence do I have to support the release decision?
Are product risks and failure points resolved?
Categorizing Software and Risk
Process Risks: testing process, development process, project processes
Project Risks: late tester involvement, mandated dates
Product Risks: platform, component, integration, infrastructure, usability
Risk-Based Test Process
Risk Identification: consult business and technical staff; prepare a draft register of risks
Risk Analysis: discuss risks; assign probability and consequences; calculate exposure
Risk Response: formulate test objectives and test techniques; document dependencies, requirements, timescales; assign test effectiveness score; nominate responsibilities
Test Scoping: agree on scope of risks to be addressed; agree on responsibility and budget; agree on quality criteria
Test Process Definition: draft test process; complete test-stage definitions
Figure: Risk-Based E-Business Testing, Gerrard & Thompson
Risk Identification
Consult business and technical staff
Prepare a draft list of risks
Risk Identification Modes
Risk workshops
Individualized risk identification
Using Checklists/Catalogs
Project-based checklists
Quality-centric checklists
Product-based checklists
Probability of Failure: Generic Product Risk Checklist
Ratings: High = 5, Medium = 3, Low = 1, Not Applicable (N/A) = 0

Potential risk | Module A | Module B | Module C | Module D
1. Heavily used module | High-5 | N/A-0 | Medium-3 | High-5
2. Very complex module | High-5 | High-5 | Low-1 |
3. Modules fixed or updated often | Low-1 | Low-1 | Low-1 | Low-1
4. High-availability functions | N/A-0 | N/A-0 | N/A-0 | N/A-0
5. Functions requiring consistent performance levels | Medium-3 | N/A-0 | Low-1 | Medium-3
6. Functions using new tools and languages | N/A-0 | N/A-0 | N/A-0 | N/A-0
7. Functions with many interfaces | N/A-0 | N/A-0 | Low-1 | Medium-3
8. Developed by inexperienced developers | N/A-0 | N/A-0 | N/A-0 | N/A-0
9. Developed with inadequate user involvement | High-5 | High-5 | High-5 | High-5
10. Developed by large development teams | High-5 | High-5 | High-5 | High-5
11. Completely new functions | High-5 | N/A-0 | N/A-0 | High-5
12. Developed under extreme time pressure | Medium-3 | Low-1 | Low-1 | High-5
13. Functions with large number of defects in previous versions | Medium-3 | Low-1 | Medium-3 | High-5
14. Most important function to stakeholders | N/A-0 | N/A-0 | Low-1 | High-5
Risk Score | 35 | 17 | 33 | 44
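The scoring behind this checklist is simple arithmetic: map each rating to its numeric value and sum down each module's column. A minimal Python sketch, using illustrative ratings rather than a row from the table above:

```python
# Checklist scoring as used above: High = 5, Medium = 3, Low = 1, N/A = 0.
# A module's risk score is the sum of its ratings across all checklist items.
# The example ratings below are illustrative, not taken from the table.
RATING_SCORES = {"High": 5, "Medium": 3, "Low": 1, "N/A": 0}

def risk_score(ratings):
    """Sum the numeric scores for one module's checklist ratings."""
    return sum(RATING_SCORES[r] for r in ratings)

example_module = ["High", "Medium", "N/A", "Low", "High"]
print(risk_score(example_module))  # prints 14 (5 + 3 + 0 + 1 + 5)
```

Modules with the highest totals (Module D at 44 in the table above) get the most testing attention.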
Project Risk Analysis (Test)
Each risk factor is rated Low, Medium, or High (or N/A), with probability, impact (H/M/L), and mitigation recorded alongside.

Risk Factor | Low | Medium | High
What is development's relationship to testing? | Supportive | Uninvolved | Adversarial
Is the test schedule time-constrained? | Based on estimates | Mandated, but flexible with date | Mandated, little or no flexibility with date

Key: N/A = not applicable, H = high, M = medium, L = low
Quality Risks Checklist
Scale: relative importance (High = 5, Medium = 3, Low = 1, N/A = 0)

Quality Expectation | Module A | Module B | Module C | Module D
1. Capability: Can it perform required features? | High-5 | N/A-0 | Medium-3 | High-5
2. Reliability: Will it work and resist failures in required situations? | High-5 | High-5 | Low-1 |
3. Usability: How easy is it for real users? | Low-1 | Low-1 | Low-1 | Low-1
4. Performance: How speedy and responsive is it? | N/A-0 | N/A-0 | N/A-0 | N/A-0
5. Installability: How easily can it be installed? | Medium-3 | N/A-0 | Low-1 | Medium-3
6. Compatibility: How well does it work with external components? | N/A-0 | N/A-0 | N/A-0 | N/A-0
7. Supportability: How economical will it be to provide support? | N/A-0 | N/A-0 | Low-1 | Medium-3
8. Testability: How effectively can it be tested? | High-5 | N/A-0 | Medium-3 | High-5
9. Maintainability: How economical is it to maintain? | High-5 | High-5 | Low-1 | Medium-3
10. Portability: How economical is it to port or reuse? | Low-1 | Low-1 | Low-1 | Low-1
11. Localizability: How economical is it to adapt to another language? | N/A-0 | N/A-0 | N/A-0 | N/A-0

N/A = not applicable
Product-Based Risk Approaches
Inside-out: study the product and ask questions
Outside-in: begin with a set of potential risks, as in a checklist
Inside-out: Questions
Study the product and ask repeatedly: What could go wrong here?
What if the function fails?
Can the function ever be invoked at the wrong time?
What error checking is done, and where?
What is the biggest load a process can handle?
Can any components be tampered with or influenced by other processes?
Vulnerabilities: What weaknesses or possible failures are there in this component?
Threats: What inputs or situations could exploit a vulnerability and trigger a failure in this component?
Victims: Who or what could be impacted by potential failures, and how badly?
Outside-in
Begin with a set of potential risks: risk checklists, quality criteria, generic risks, risk catalogs
Ask: What components have these kinds of risk?
Easier than inside-out
Documenting Product Risks
ID | Risk
Data Conversion 01 | Converting Social Security number to non-privacy field may cause records to be lost
Integration 04 | Records created in scanning must be successfully accepted by workflow engine
Risk Identification and Outcome
Product Risks: Module A is very complex; high usability requirement; security for Module A is critical
Process Risks: defect tracking process absent; configuration management issues
Project Risks: turnover in software developers; must be completed by conference
Risk Analysis
Discuss risks
Assign probability and consequences
Calculate exposure
Risk Analysis
Group the lists by similarity
For each item on the list:
Accept the risk as valid
Delegate the risk
Investigate the risk further
Determine its testability
Risk Analysis: Impact (Risk Consequences)
Consequence | Description | Score
Critical | Business objective cannot be accomplished | 5
High | Business objective will be undermined | 4
Moderate | Business objective will be affected | 3
Low | Business objective will be affected slightly | 2
Negligible | There will be no noticeable effect | 1
Risk Analysis: Assessing Probability
Probability (%) | Description | Score
81-100 | Almost certain, highly likely | 5
61-80 | Probable, likely, we believe | 4
41-60 | We doubt, improbable, better than even | 3
21-40 | Unlikely, probably not | 2
1-20 | Highly unlikely, chances are slight | 1
Risk Analysis
Exposure = Probability × Consequence
Documenting Risks
ID | Risk | Probability | Consequence | Exposure
Data Conversion 01 | Converting Social Security number to non-privacy field may cause records to be lost | 4 | 4 | 16
Integration 04 | Records created in scanning must be successfully accepted by workflow engine | 3 | 5 | 15
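The exposure column follows directly from the formula on the previous slide. A minimal Python sketch using the two documented risks:

```python
# Exposure = Probability x Consequence, each scored 1-5 per the rating
# tables above. The two risks and their scores come from the table on
# this slide.
def exposure(probability, consequence):
    return probability * consequence

risks = [
    ("Data Conversion 01", 4, 4),  # SSN converted to non-privacy field
    ("Integration 04", 3, 5),      # scanned records accepted by workflow engine
]
for risk_id, prob, cons in risks:
    print(risk_id, exposure(prob, cons))  # prints 16 and 15, matching the table
```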
Part 1 Summary
Describe the relationship of software and risk and the rise of risk-based testing.
Describe why standard project risk management is inadequate for testers.
Define the three high-level categories of risk.
Describe risk analysis in the risk-based testing process.
Describe the three components of risk analysis.
Describe a rating scheme for calculating risk exposure.
Risk Response
Identify and document strategies for process and project risks
Formulate test objectives and test techniques for product risks
Document dependencies, requirements, timescales
Assign test effectiveness score
Identify role responsibility
Project and Process Risk Response Planning
Preemptive measures (preventative)
Reactive measures (risk reduction)
Product Risk Response
Identify and document strategies for process and project risks
Formulate test objectives and test techniques for product risks
Document dependencies, requirements, timescales
Assign test effectiveness score
Identify role responsibility
Test Strategies
Finding the answers to:
What areas to test, and why?
How to find critical problems?
Which test techniques, tools, and test types?
When to test?
Developing a Risk-Based Test Strategy
Risk-Based Testing Goal: Find Important Problems
Figure summary: driven by quality expectations, business benefit, and uncertainty, testing goals range from finding the most problems and finding problems early (reducing rework) to the risk-based testing goal of finding the important problems.
Getting to a Risk-Based Test Strategy
Step 1: Review product risks; review quality requirements
Step 2: Identify what to test and how much
Step 3: Determine types of tests to use, and when
Step 4: Determine test techniques to be used
Deciding on Test Techniques: Considerations
Quality characteristics
Area of application
Extent of formality
Use of resources
Required knowledge and skill
Characteristics of Good Test Strategies
Diversified
Based on maturity
Risk-focused
Product-specific
Practical
Test Strategy
ID | Risk | Prob | Con | Exp | Objective | Test Types/Phase | Role
Data Conversion 01 | Converting Social Security number to non-privacy field may cause records to be lost | 4 | 4 | 16 | Verify traceability of Social Security number to all customer records | Unit test: functional; Integration: functional | Test, Dev
Integration 04 | Records created in scanning must be successfully accepted by workflow engine | | | | | |
Failure Mode: Test Process Worksheet
Column # | Column Heading | Column Description | Stage Created
1 | Risk (failure mode) | Brief description of risk | Identification
2 | Benefits threatened | Business benefit threatened | Identification
3 | Probability | Likelihood of system failure | Analysis
4 | Consequence | Impact of failure | Analysis
5 | Exposure | Calculated as column 3 × column 4 | Analysis
6 | Test effectiveness | Confidence in ability to address risk | Risk response
7 | Test priority number | Calculated from columns 3, 4, and 6 | Risk response
8 | Test objective | Objective used to address risk | Risk response
9 | Test technique | Test technique/method used | Risk response
10 | Dependencies | Tester assumptions | Risk response
11 | Effort | Effort required to test | Risk response
12 | Timescale | Elapsed time required to test | Risk response
13 | Test stage (A, B, etc.) | Group responsible for test activity | Risk response
Test Scoping
Agree on scope of risks to be addressed
Agree on responsibility and budget
Agree on quality criteria
Determining Scope
Evaluate the risks and associated test objectives
Evaluate generic test objectives
Utilize quality requirements
Determine testing scope with stakeholders and the team
Risk-Based Test Objectives
ID | Risk | Test Objective | Technique
01 | Links to other on-site objects don't work | Verify links to on-site objects load correctly | Link checking
04 | Links to server-based functionality don't work | Verify the correct component is referenced and data is passed to the component correctly | Transaction verification
06 | Middleware, connectivity, or custom-built components fail when used extensively | Demonstrate that integrated components do not fail with repeated use over an extended period | Transaction link testing
Generic Test Objectives
ID | Test Objective | Typical Test Activity
01 | Demonstrate component meets requirements | Component testing
02 | Demonstrate component is ready for reuse in subsystem | Component testing
03 | Demonstrate integrated components work together | Integration testing
04 | Demonstrate system meets functional requirements | Functional testing
05 | Demonstrate system meets nonfunctional requirements | Nonfunctional testing
06 | Demonstrate system meets regulatory requirements | System/acceptance testing
07 | Demonstrate system meets contractual requirements | Contract acceptance
08 | Validate system meets business or user requirements | User acceptance testing
09 | Demonstrate system, processes, and people meet business requirements | User acceptance testing
Reprinted from Risk-Based E-Business Testing
Test Process Definition
Draft test process
Complete test activity definitions
Test Process Determination: Can the Strategy Be Implemented?
Evaluate the test strategy in light of resources and tools
Identify training needs to support the strategy
Are all tools/products in place to support it?
Are any new tools/products needed?
Does the strategy make use of available resources?
Is everything in the strategy necessary?
Finalizing the Test Strategy in the Test Plan
Entrance/exit criteria for phases
Test data needed
Environment needed
Test Activity Template
Test Phase/Activity Description: {Enter testing activity, such as unit, system, acceptance}
Object Under Test:
Test Objective:
Number of Tests Planned:
Entry Criteria:
Exit Criteria:
Environment:
Risk Analysis:
Risk Strategy:
Execution:
Management:
Sign-Off:
Reprinted from Gerrard & Thompson, Risk-Based E-Business Test Strategy
Master Test Plan
I. Introduction
A. Purpose and scope of document
B. Purpose and scope of the project
II. Risk analysis
A. Summarize risk process utilized
B. Summarize project constraints and/or contingencies
C. Summarize project assumptions
D. Summarize project, process, and product risks
III. Quality standard
A. Summarize agreed-upon quality criteria
B. Summarize identified and agreed-upon critical success factors
Risk Resolution and Monitoring
Process Risks: testing process, development process, project processes
Project Risks: late tester involvement, mandated dates
Product Risks: platform, component, integration, infrastructure, usability
System Risk Dashboard, Friday, April 6, 2007
High-Risk Table

Risk | Impact | Probability | Ranking | Priority
The system modernization may require the collection of additional data points from licensees | 5 | 5 | 25 | High
The system modernization may result in changes to the way licensees connect and provide data to the XXX system | 5 | 5 | 25 | High
Current schedule is based on timely identification, selection, and acquisition of software and hardware | 5 | 4 | 20 | High
The system will not have full failover and redundancy (Release 2) at the time of deployment, since failover requires the deployment of PI collectors at each data collection site; if failure occurs, full failover may not take place until sites have implemented the proper interface technologies | 4 | 5 | 20 | High
The PI user interface does not currently satisfy all 508 requirements (a failure to meet NRC requirements) for the prototype and may not be ready for the accelerated release schedule | 4 | 4 | 16 | High
As written, some requirements may not be fully testable | 5 | 3 | 15 | High
Coordination of production hardware installation within NRC may be difficult | 5 | 3 | 15 | High
CTF may not be available or have space when the system is ready for security testing, and the schedule does not have much slack built in for delays caused by the CTF | 5 | 3 | 15 | High
The complexity of the system and the requirements for a full C&A may prevent receipt of an ATO prior to the current operational date of early September 2007 | 5 | 3 | 15 | High
Relocation of the Region 4 office site (August/September time frame) may cause problems with the backup configuration (Release 1) | 5 | 3 | 15 | High
Screen requirements have not been fully developed or agreed to | 5 | 4 | 20 | High
The volume of support material, and the amount of time required to complete it, has placed a strain on the budget | 4 | 5 | 20 | High
Selected software is not currently 508 compliant; if it remains out of compliance when required to go into production, it will violate government regulations | 5 | 4 | 20 | High
If prototype equipment does not arrive in time for development of the prototype, day-to-day schedule slippage will occur | 4 | 5 | 20 | High

Status key (open vs. closed risks): on hold or awaiting clarification; currently being mitigated; volatile or currently unmitigated
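The dashboard's ranking column reuses the same arithmetic as the exposure formula (impact × probability), with risks reported highest ranking first. A minimal sketch, with risk names abbreviated from the dashboard rather than quoted in full:

```python
# Sketch of the dashboard ranking: ranking = impact x probability, and the
# report lists risks highest ranking first. Entries are abbreviated from the
# dashboard above; this is illustrative, not the full risk list.
risks = [
    {"risk": "Additional data points from licensees", "impact": 5, "probability": 5},
    {"risk": "PI user interface not 508 compliant", "impact": 4, "probability": 4},
    {"risk": "CTF unavailable for security testing", "impact": 5, "probability": 3},
]
for entry in risks:
    entry["ranking"] = entry["impact"] * entry["probability"]

for entry in sorted(risks, key=lambda e: e["ranking"], reverse=True):
    print(entry["ranking"], entry["risk"])  # prints 25, 16, 15 in that order
```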
Managing Test Project Risks
Develop a workable schedule with frequent milestones to use in tracking the testing project
Aggressively fight slippage
Predict and track the likely causes of delay
Aggressively manage defect aging
Have an early-warning process to identify test bottlenecks and resolve them quickly
Measure how time is being used
Be proactive, not passive
Risk- and Benefit-Based Reporting
Base reports on the number of risks mitigated or addressed
Use metrics to track progress
Base reports on the number of benefits/critical success factors attained
Risk-to-Benefit Reporting
Risk (Status) | Benefit 1 | Benefit 2 | Goal 1 | Goal 2
1. Interfaces (Open) | X | X | X |
2. Security (Closed) | X | X | |
3. Usability (Open) | X | | |
4. Performance (Open) | X | | |
5. Load (Closed) | X | X | X |
Summary: Risk-Based Test Process
Risk Identification: consult business and technical staff; prepare a draft register of risks
Risk Analysis: discuss risks; assign probability and consequences; calculate exposure
Risk Response: formulate test objectives and test techniques; document dependencies, requirements, timescales; assign test effectiveness score; nominate responsibilities
Test Scoping: agree on scope of risks to be addressed; agree on responsibility and budget; agree on quality criteria
Test Process Definition: draft test process; complete test-stage definitions
Figure: Risk-Based E-Business Testing, Gerrard & Thompson
References
Gerrard, Paul, and Neil Thompson. Risk-Based E-Business Testing. Artech House, Boston, 2002.
Collard, Ross. Manage and Strengthen Testing: Speeding the Software Delivery Process, Part 1.
Collard, Ross. Conduct Early and Streamlined Testing: Speeding the Software Delivery Process, Part 2.
Collard, Ross. Manage the Risks and the Process: Speeding the Software Delivery Process, Part 3.
Bach, James. Troubleshooting Risk-Based Testing. Satisfice, Inc., 2003.