True stories about testing based on experiences - University of Antwerp
Patrice Willemot, Pre-sales Test Consultant
Petra Haldermans, Test Consultant / Test Manager
27/04/2011
CTG - Company overview
Corporate Headquarters: Buffalo, NY (founded in 1966)
CTG - Company overview
European Headquarters: Diegem, Belgium (founded in 1976)
Other European offices: Bertrange, Luxembourg; Reading, United Kingdom
CTG - Company overview
Over 3,550 employees worldwide; 350 employees in Belgium
Quality certifications: ISO 9001, TickIT
"Beste Werkgever" (Best Employer) award in 2003, 2005, 2007, 2009 and 2011
Investor in People
CTG Belgium
Testing unit @ CTG
Market leader in Belgium:
- Thought leadership, strong and active partnerships
- Early adopter in software testing since 1999
- 150 test professionals
- Hosts Belgium's largest software test seminar
- 3-time winner of the EuroSTAR Best Paper Award; international speakers
- HP Platinum Partner: 3 times HP Partner of the Year in 5 years
- Microsoft Gold Partner
Intellectual property:
- 10+ years of experience consolidated in IP tools: STBoX, FASTBoX, PTBoX
- CTGLabs, STAcademy, Centers of Excellence
We commit:
- Prove savings, search for improvements, report on the value
- Invest in market best practices; share risks and rewards
- Always finish what we start
Focus on innovation and training
Testing Services @ CTG
Lifecycle: Plan, Design, Transition, Execute
Drive IT (CTG as strategic partner):
- Test Maturity Assessment
- Test Practices Improvement
- Test Governance: optimizing the software and service lifecycle
Do IT (CTG as operational partner):
- Performance Testing
- Test Automation Implementation
- Training and Coaching
- Testing within reach; excel within the lifecycle
Knowledge Center
IP tools: STBoX Essentials, STBoX Prime, STBoX Agile, FASTBoX, PTBoX
Each tool bundles: workflows, templates, best practices, utilities, functional libraries, instructions, examples and supported technologies
Knowledge sharing: ST Academy Training Center, Centers of Excellence, CTG Labs, Annual Convention, Knowledge Base Wiki, White Papers
True stories about testing based on experiences
Three dilemmas:
- V-model
- Test Plan and risks
- Test Design and Execution
One how-to:
- Implementation of a test process
Customer Case Company A: introduction
Assignment: UAT testing, Test Management and Test Governance
Difficulties:
- Different parties: requirements definition by Company A; development and testing (except UAT) by Company B; UAT testing by Company A and CTG
- Meetings via Skype (companies in Belgium and the Netherlands)
Customer Case Company B: introduction
Assignment: Test Management; System Testing and Acceptance Testing; Test Automation
Difficulties:
- Distributed test teams (Trier, Paris, Gent, Bordeaux, Vienna)
- Different test approaches
- Different development approaches (Scrum, V-model, Waterfall)
- Different levels of test maturity (very low, low, medium, high)
Dilemma 1: V-model
- No business users available for Acceptance Testing
- A company not familiar with the V-model
- No distinction between, for example, Component Testing and Acceptance Testing
Dilemma 1: The theory (V-model)
User Requirements <-> Acceptance Testing
System Requirements <-> System Testing / System Integration Testing
Global Design <-> Component Integration Testing
Detailed Design <-> Component Testing
Code/Build
Dilemma 1: The theory (annotated V-model diagram)
Dilemma 1: The solution
Goal:
- Increase the business knowledge of the tester
- Change the mindset of the end user
How:
- Combine end users and testers: organize guided user acceptance test sessions
- Split test tasks between testers and end users: test design on business level by IT testers, review of test cases by the business, test execution by the business
- Promote early involvement of testing
Conclusion: think about possible risks and workarounds during test planning!
Dilemma 1: The solution (V-model diagram, same levels as the theory slide)
Customer Case: Company A
Testing done in NL: Unit Testing, Unit Integration Testing, System Testing
Testing done in BE: User Acceptance Testing, by both CTG and Company A (business users)
- Test design by CTG
- Review by Company A (business users)
- Execution by CTG and Company A
Customer Case: Company A (V-model)
User Requirements <-> Acceptance Testing (CTG and Company A)
System Requirements <-> System Testing / System Integration Testing
Global Design <-> Component Integration Testing
Detailed Design <-> Component Testing
Code/Build and the lower test levels: Company B (Netherlands)
Customer Case: Company B
Phase 1: CTG responsible for System Integration and Acceptance Testing (test design and test execution)
Phase 2:
- CTG responsible for System Integration Testing (test design and test execution)
- Company B responsible for Acceptance Testing (test execution of test cases created by CTG)
Customer Case: Company B
Phase 3:
- CTG responsible for testing during a Scrum sprint (test design and test execution)
- Company B responsible for Acceptance Testing (test execution of test cases created by CTG)
Phase 4:
- CTG responsible for testing during a Scrum sprint (test design and test execution)
- CTG responsible for testing the integration of different Scrum sprints (Scrum of Scrums)
- Company B responsible for Acceptance Testing (test execution of test cases created by CTG)
Customer Case: Company B (diagram slides for Phase 1, Phase 2, Phase 3 with Acceptance Testing, and Phase 4 with Acceptance Testing)
Dilemma 2: Test Plans and Risks
- "The client doesn't want to have a complete Test Plan"
- "A Test Plan: what do I need to add?"
- "How to deal with risks? We never thought about them"
- "We have a Test Plan, but nobody reads it"
Dilemma 2: The theory
Important things to consider (risk- and requirements-based testing):
- Product risks and requirements: features to test
- Project risks
- Test levels and quality attributes
- Test types
- Priorities
- Constraints: available resources (people and expertise, infrastructure, tools and techniques), time and budget
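The risk-based prioritization above can be sketched in a few lines of code. This is a minimal illustration, not part of the original talk: the feature names, the 1-5 likelihood/impact scale, and the depth thresholds are all assumptions.

```python
# Minimal sketch of risk-based test prioritization: each feature gets a
# likelihood and an impact rating (1-5, illustrative scale); the product
# risk = likelihood * impact decides test order and test depth.

def prioritize(features):
    """Return (name, risk) pairs sorted by descending risk score."""
    scored = [(name, likelihood * impact) for name, likelihood, impact in features]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

features = [
    ("payment processing", 4, 5),  # changes often, money at stake
    ("report layout",      2, 2),  # stable, cosmetic impact only
    ("user login",         3, 5),  # moderate change rate, blocks everything
]

for name, risk in prioritize(features):
    depth = "thorough" if risk >= 15 else "standard" if risk >= 6 else "light"
    print(f"{name}: risk={risk} -> {depth} testing")
```

In practice the ratings come out of the planning workshop with the stakeholders; the code only makes the ordering repeatable.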
Dilemma 2: The theory (diagram)
Dilemma 2: The theory
IEEE 829 test plan template: 16 sections
1. Test plan identifier
2. Introduction
3. Test items
4. Features to be tested
5. Features not to be tested
6. Approach
7. Item pass/fail criteria
8. Suspension and resumption criteria
9. Test deliverables
10. Testing tasks
11. Environmental needs
12. Responsibilities
13. Staffing and training needs
14. Schedule
15. Risks and contingencies
16. Approvals
Dilemma 2: The solution
Goal: create awareness of and pay attention to the importance of a Test Plan
How:
- Organize a workshop about the content of the Test Plan (what is needed)
- Point out the consequences if no Test Plan is agreed upon
- Check consistency with the Project Plan (e.g. escalation procedures)
- Rely on the Test Policy
- Consider the type of project (agile vs. classic development)
Conclusion:
- Convince the client that a Test Plan has many advantages, even when a company is used to working the same way on every project
- Reach consensus on a list of known risks
Customer Case: Company A
A full-version Test Plan with extra attention to:
- Suspension and resumption criteria
- Test project risks
- Objectives per test level (and thus per company)
- Entry and exit criteria between UAT and the other test levels
- Description of the UAT process, with pre-defined milestones / quality gates
Strengths and must-haves:
- Penalties attached to the risks
- Formal agreement on the document
Customer Case: Company B
A full-version test plan was created based on the Company B template: test items, scope, test approach, organization, test deliverables, environmental needs, staffing and training needs, risks and contingencies, agreements, glossary
- The test plan was never reviewed or read by stakeholders
- The test plan was ignored during the test project
A new test plan template was created:
- Process approach
- Roles and responsibilities
- Environmental needs
Dilemma 3: Test Design and Execution
- "We only want to do exploratory testing"
- "My team has a lot of knowledge about test design techniques"
- "I am used to working with business users, but my new test team is not able to execute the existing test cases: what should be my level of detail?"
Dilemma 3: The theory
Input -> Process -> Expected Result
- A high-level test case describes the input data and the predicted result on an abstract level. It can contain one or more low-level test cases. Example: a man, age > 35, with > 2 children.
- In a low-level test case, the abstract values assigned to the input and predicted output of the high-level test case are replaced by concrete values. Example: Dirk Honda, 70 years, 4 children.
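The high-level/low-level distinction above can be shown in code. This is a sketch only: the rule under test (a discount for men over 35 with more than 2 children) and the function name are illustrative inventions built around the slide's example, not something from the talk.

```python
# One high-level test case ("a man, age > 35, with > 2 children ->
# qualifies") refined into several concrete low-level test cases.

def qualifies_for_family_discount(gender, age, children):
    # Hypothetical business rule matching the slide's abstract example.
    return gender == "M" and age > 35 and children > 2

# Low-level cases: concrete values picked from the abstract high-level case.
low_level_cases = [
    ("Dirk Honda", "M", 70, 4, True),   # the concrete example from the slide
    ("boundary",   "M", 36, 3, True),   # just inside both boundaries
    ("too young",  "M", 35, 4, False),  # age boundary violated
]

for label, gender, age, children, expected in low_level_cases:
    actual = qualifies_for_family_discount(gender, age, children)
    assert actual == expected, label
print("all low-level cases pass")
```

The high-level case is stable documentation; the low-level cases are what a less experienced team (or a test tool) actually executes.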
Dilemma 3: The theory
Test design techniques:
- Coverage-based: line/statement coverage, decision coverage (= branch coverage), condition coverage, decision/condition coverage, multiple condition coverage, condition determination coverage
- Experience-based: real-life test, error guessing, random test, idiot proofing
- Specification-based: equivalence partitioning, boundary value analysis, algorithm test, decision table test, syntactic test, semantic test, elementary comparison test (ECT), data cycle test / entity life cycle test, process cycle test, program interface test
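Two of the techniques listed, equivalence partitioning and boundary value analysis, can be illustrated together. The validated field (an age input accepting 18 to 65) is an assumed example, not one from the talk.

```python
# Equivalence partitioning + boundary value analysis for a hypothetical
# input field that accepts ages 18..65 inclusive.

def is_valid_age(age):
    return 18 <= age <= 65

# Partitions: below range, in range, above range.
# Boundary values: the values on and just outside each boundary.
cases = [
    (17, False),  # just below the lower boundary
    (18, True),   # lower boundary itself
    (65, True),   # upper boundary itself
    (66, False),  # just above the upper boundary
    (40, True),   # one representative of the valid partition
]

for age, expected in cases:
    assert is_valid_age(age) == expected, f"age {age}"
print("boundary and partition cases covered")
```

Five test cases cover all three partitions and both boundaries, instead of testing dozens of arbitrary ages.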
Dilemma 3: The theory
Exploratory testing:
- Simultaneous learning, test design and test execution
- An interactive test process
- Uses the information gained while testing to design new and better tests
- A formal process: defined tasks, objectives and deliverables; test sessions and charters
- Testers have the skills to listen, read, think and report rigorously and effectively
Dilemma 3: The solution
Goal: define the right fit for the company
How:
- Learn from experience: what went OK and what went wrong in the past?
- Define the advantages and disadvantages of high-level and low-level test cases
- Decide whether, and which, test design techniques you want to mandate
- Find out the knowledge level of your test team
Conclusion: define the test strategy based on the needs and the people within your test team
Customer Case: Company A
- Unit Testing: no written test cases
- Unit Integration Testing: no written test cases
- System Testing: written test cases in Excel; titles imported into Microsoft Team Foundation Server 2010
- User Acceptance Testing: high-level written test cases in TFS, written (use case testing) by CTG and reviewed by the business users of Company A; a list of needed test data created and finally defined by Company A
- Extra: error guessing (ad hoc, not formally planned)
- Pitfall: all testing based on the same use case specification document, with defined objectives per test level
Customer Case: Company B
At the start:
- System integration test cases based on "back of a napkin" requirements: high-level test cases with high-level information (no "click on button" steps)
- Acceptance test cases based on "back of a napkin" requirements: high-level test cases with high-level information
Evolved to:
- Acceptance test cases based on "back of a napkin" requirements: high-level test cases used by the end users
- Scrum test cases based on user stories: only a title, no steps; uncontrolled exploratory testing
- Test automation based on user stories
True stories about testing based on experiences
Three dilemmas:
- V-model
- Test Plan and risks
- Test Design and Execution
One how-to:
- Implementation of a test process
Implementation of a test process: a best-practices approach
Two main phases, repeated as a continuous cycle:
Test Maturity Assessment -> Test Practices Improvement -> Test Maturity Assessment -> Test Practices Improvement -> ...
Test Maturity Assessment
Identification (interviews, metrics, documentation, workshop; based on STBoX): determine the AS IS situation for people, process and technology
Assessment: SWOT analysis; determine the TO BE situation; increase strengths and opportunities, eliminate weaknesses and threats
Improvements: business cases, project cards, roadmap -> improvements roadmap
From improvement goals to roadmap
Identification (TO BE): improvement goals 1-3
Improvements: improvement actions 1-8
Grouped into: improvement project 1, improvement project 2 and quick wins
Test Maturity Assessment: the size of the assessment depends on the size of the organization
- Quickscan (small): about 6 activities, informal, 1 formal deliverable
- Regular (medium): about 12 activities, formal, about 10 formal deliverables
- Profound (large): about 22 activities, formal, about 14 formal deliverables
The size also depends on: # departments, # languages, # interviews, # test levels, # test types
Scope: people, process, technology
Test Practices Improvement
Outcome of the assessment: quality roadmap, business cases, project cards
Reworked roadmap: re-prioritization -> prioritized roadmap (people, process, technology)
Implementation: lifelong learning
Continuous improvement: retrospective, Test Maturity (re-)assessment, self-assessment
Questions and Answers
patrice.willemot@ctg.com | petra.haldermans@ctg.com
http://www.ctg.com/ | http://jobs.ctg.eu/